# io.js ChangeLog

## 2015-01-16, Version 1.0.2, @rvagg

### Notable changes

* Windows installer fixes
* Bundled node-gyp fixes for Windows
* http_parser v2.4.1 upgrade
* libuv v1.2.1 upgrade

### Commits

* 265cb76 - build: add new installer config for OS X (Rod Vagg)
* 8cf6079 - doc: update AUTHORS list (Rod Vagg)
* c80a944 - doc: Add http keepalive behavior to CHANGELOG.md (Isaac Z. Schlueter)
* 9b81c3e - doc: fix author attribution (Tom Hughes)
* fd30eb2 - src: fix jslint errors (Yosuke Furukawa)
* 946eabd - tools: update closure linter to 2.3.17 (Yosuke Furukawa)
* 9e62ae4 - _debug_agent: use `readableObjectMode` option (Vladimir Kurchatkin)
* eec4c81 - doc: fix formatting in LICENSE for RTF generation (Rod Vagg)
* e789103 - doc: fix 404s for syntax highlighting js (Phil Hughes)
* ca039b4 - src: define AI_V4MAPPED for OpenBSD (Aaron Bieber)
* 753fcaa - doc: extend example of http.request by end event (Michal Tehnik)
* 8440cac - src: fix documentation url in help message (Shigeki Ohtsu)
* 24def66 - win,msi: warn that older io.js needs manual uninstall (Bert Belder)
* 59d9361 - win,msi: change UpgradeCode (Bert Belder)
* 5de334c - deps: make node-gyp work again on windows (Bert Belder)
* 07bd05b - deps: update libuv to 1.2.1 (Saúl Ibarra Corretgé)
* e177377 - doc: mention io.js alongside Node in Punycode docs (Mathias Bynens)
* 598efcb - deps: update http_parser to 2.4.1 (Fedor Indutny)
* 3dd7ebb - doc: update cluster entry in CHANGELOG (Ben Noordhuis)
* 0c5de1f - doc: fix double smalloc example (Mathias Buus)

## 2015-01-14, Version 1.0.1, @rvagg

Rebuild due to stale build slave git reflogs for 1.0.0 release

* doc: improve write style consistency (Rui Marinho)
* win,msi: correct doc website link (Bert Belder)

--------------------------------------

Below is a summary of the user-facing changes to be found in the io.js v1.0.0 release as compared to the current _stable_ Node.js release, v0.10.35. At the time of the v1.0.0 release, the latest _unstable_ Node.js release is v0.11.14, with much progress made towards a v0.11.15 release. The io.js codebase inherits the majority of the changes found in the v0.11 branch of the [joyent/node](https://github.com/joyent/node) repository and therefore can be seen as an extension to v0.11.

## Summary of changes from Node.js v0.10.35 to io.js v1.0.0

### General

- The V8 JavaScript engine bundled with io.js was upgraded dramatically, from version 3.14.5.9 in Node.js v0.10.35 (3.26.33 in Node.js v0.11.14) to 3.31.74.1 for io.js v1.0.0. This brings along many fixes and performance improvements, as well as additional support for new ES6 language features! For more information on this, check out [the io.js ES6 page](https://iojs.org/es6.html).
- Other bundled technologies were upgraded:
  - c-ares: 1.9.0-DEV to 1.10.0-DEV
  - http_parser: 1.0 to 2.3
  - libuv: 0.10.30 to 1.2.0
  - npm: 1.4.28 to 2.1.18
  - openssl: 1.0.1j to 1.0.1k
  - punycode: 1.2.0 to 1.3.2
- Performance and stability improvements on all platforms.
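The bundled component versions can be confirmed at run time via the long-standing `process.versions` object; a minimal sketch follows (the sample values in the comments are illustrative, taken from the lists above):

```js
// Print the versions of the dependencies this binary was built with.
// process.versions includes entries such as v8, uv, http_parser, openssl,
// zlib and ares alongside the runtime version itself.
console.log(process.versions.v8);          // e.g. '3.31.74.1'
console.log(process.versions.uv);          // e.g. '1.2.0' (1.2.1 as of v1.0.2)
console.log(process.versions.http_parser); // e.g. '2.3' (2.4.1 as of v1.0.2)
```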
### buffer https://iojs.org/api/buffer.html - Added `buf.writeUIntLE`, `buf.writeUIntBE`, `buf.writeIntLE`, `buf.writeIntBE`, `buf.readUIntLE`, `buf.readUIntBE`, `buf.readIntLE` and `buf.readIntBE` methods that read and write value up to 6 bytes. - Added `Buffer.compare()` which does a `memcmp()` on two Buffer instances. Instances themselves also have a `compare()`. - Added `buffer.equals()` that checks equality of Buffers by their contents. - Added `new Buffer(otherBuffer)` constructor. - Tweaked `SlowBuffer`'s semantics. - Updated the output of `buffer.toJSON()` to not be the same as an array. Instead it is an object specifically tagged as a buffer, which can be recovered by passing it to (a new overload of) the `Buffer` constructor. ### child_process https://iojs.org/api/child_process.html - Added a `shell` option to `child_process.exec`. - Added synchronous counterparts for the child process functions: `child_process.spawnSync`, `child_process.execSync`, and `child_process.execFileSync`. - Added the path to any `ENOENT` errors, for easier debugging. ### console https://iojs.org/api/console.html - Added an `options` parameter to `console.dir`. ### cluster https://iojs.org/api/cluster.html - Updated `cluster` to use round-robin load balancing by default on non-Windows platforms. The scheduling policy is configurable however. - `--debug` has been made cluster-aware. - Many bug fixes. ### crypto https://iojs.org/api/crypto.html - Added support for custom generator values to `DiffieHellman` (defaulting to 2 for backwards compatibility). - Added support for custom pbkdf2 digest methods. - Added support for elliptic curve-based Diffie-Hellman. - Added support for loading and setting the engine for some/all OpenSSL functions. - Added support for passing in a passphrase for decrypting the signing key to `Sign.sign()`. - Added support for private key passphrase in every method that accepts it. - Added support for RSA public/private encryption/decryption functionality. - Added support for setting and getting of authentication tags and setting additional authentication data when using ciphers such as AES-GCM. ### dgram https://iojs.org/api/dgram.html - Added support for receiving empty UDP packets. ### dns https://iojs.org/api/dns.html - Added `dns.resolveSoa`, `dns.getServers`, and `dns.setServers` methods. - Added `hostname` on error messages when available. - Improved error handling consistency. ### events https://iojs.org/api/events.html - Added chaining support to `EventEmitter.setMaxListeners`. - Updated `require('events')` to return the `EventEmitter` constructor, allowing the module to be used like `var EventEmitter = require('events')` instead of `var EventEmitter = require('events').EventEmitter`. ### fs https://iojs.org/api/fs.html - Added `fs.access`, and deprecated `fs.exists`. Please read the documentation carefully. - Added more informative errors and method call site details when the `NODE_DEBUG` environment is set to ease debugging. - Added option to `fs.watch` for recursive sub-directory support (OS X only). - Fixed missing callbacks errors just being printed instead of thrown. ### http https://iojs.org/api/http.html - Added support for `response.write` and `response.end` to receive a callback to know when the operation completes. - Added support for 308 status code (see RFC 7238). - Added `http.METHODS` array, listing the HTTP methods supported by the parser. - Added `request.flush` method. 
- Added `response.getHeader('header')` method that may be used before headers are flushed. - Added `response.statusMessage` property. - Added Client Keep-Alive behavior. Set `keepAlive:true` in request options to reuse connections indefinitely. - Added `rawHeaders` and `rawTrailers` members on incoming message. - Removed default chunked encoding on `DELETE` and `OPTIONS`. ### os https://iojs.org/api/os.html - Added MAC addresses, netmasks and scope IDs for IPv6 addresses to `os.networkInterfaces` method output. - Updated `os.tmpdir` on Windows to use the `%SystemRoot%` or `%WINDIR%` environment variables instead of the hard-coded value of `c:\windows` when determining the temporary directory location. ### path https://iojs.org/api/path.html - Added `path.isAbsolute` and `path.parse` methods. - Added `path.win32` and `path.posix` objects that contain platform-specific versions of the various `path` functions. - Improved `path.join` performance. ### process https://iojs.org/api/process.html - Added `beforeExit` event. - Added `process.mainModule` and `process.exitCode`. ### querystring https://iojs.org/api/querystring.html - Added the ability to pass custom versions of `encodeURIComponent` and `decodeURIComponent` when stringifying or parsing a querystring. - Fixed several issues with the formatting of query strings in edge cases. ### smalloc https://iojs.org/api/smalloc.html `smalloc` is a new core module for doing (external) raw memory allocation/deallocation/copying in JavaScript. ### streams https://iojs.org/api/stream.html The changes to streams are not as drastic as the transition from streams1 to streams2: they are a refinement of existing ideas, and should make the API slightly less surprising for humans and faster for computers. As a whole the changes are referred to as "streams3", but the changes should largely go unnoticed by the majority of stream consumers and implementers. #### Readable streams The distinction between "flowing" and "non-flowing" modes has been refined. Entering "flowing" mode is no longer an irreversible operation—it is possible to return to "non-flowing" mode from "flowing" mode. Additionally, the two modes now flow through the same machinery instead of replacing methods. Any time data is returned as a result of a `.read` call that data will *also* be emitted on the `"data"` event. As before, adding a listener for the `"readable"` or `"data"` event will start flowing the stream; as will piping to another stream. #### Writable streams The ability to "bulk write" to underlying resources has been added to `Writable` streams. For stream implementers, one can signal that a stream is bulk-writable by specifying a [_writev](https://iojs.org/api/stream.html#stream_writable_writev_chunks_callback) method. Bulk writes will occur in two situations: 1. When a bulk-writable stream is clearing its backlog of buffered write requests, 2. or if an end user has made use of the new `.cork()` and `.uncork()` API methods. `.cork` and `.uncork` allow the end user to control the buffering behavior of writable streams separate from exerting backpressure. `.cork` indicates that the stream should accept new writes (up to `highWaterMark`), while `.uncork` resets that behavior and attempts to bulk-write all buffered writes to the underlying resource. The only core stream API that **currently** implements `_writev` is `net.Socket`. 
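To make the bulk-write mechanics above concrete, here is a minimal, illustrative sketch (the `BatchWriter` name and the logging are invented for this example): a `Writable` implementer advertises bulk-writability by providing `_writev`, and a consumer groups writes with `.cork()`/`.uncork()`.

```js
var stream = require('stream');
var util = require('util');

// Hypothetical bulk-writable stream: providing _writev signals that buffered
// writes may be handed over as a single batch.
function BatchWriter(options) {
  stream.Writable.call(this, options);
}
util.inherits(BatchWriter, stream.Writable);

// Receives the backlog of buffered writes; `chunks` is an array of
// { chunk, encoding } objects.
BatchWriter.prototype._writev = function(chunks, callback) {
  var total = chunks.reduce(function(n, c) { return n + c.chunk.length; }, 0);
  console.log('bulk write: %d chunks, %d bytes', chunks.length, total);
  callback();
};

// Fallback used for ordinary, non-batched writes.
BatchWriter.prototype._write = function(chunk, encoding, callback) {
  console.log('single write: %d bytes', chunk.length);
  callback();
};

var w = new BatchWriter();
w.cork();        // buffer subsequent writes instead of flushing them one by one
w.write('hello ');
w.write('world\n');
w.uncork();      // attempt to deliver the buffered writes in one _writev call
```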
In addition to the bulk-write changes, the performance of repeated small writes to non-bulk-writable streams (such as `fs.WriteStream`) has been drastically improved. Users piping high volume log streams to disk should see an improvement. For a detailed overview of how streams3 interact, [see this diagram](https://cloud.githubusercontent.com/assets/37303/5728694/f9a3e300-9b20-11e4-9e14-a6938b3327f0.png). ### timers https://iojs.org/api/timers.html - Removed `process.maxTickDepth`, allowing `process.nextTick` to be used recursively without limit. - Updated `setImmediate` to process the full queue each turn of the event loop, instead of one per queue. ### tls https://iojs.org/api/tls.html - Added `detailed` boolean flag to `getPeerCertificate` to return detailed certificate information (with raw DER bytes). - Added `renegotiate(options, callback)` method for session renegotiation. - Added `setMaxSendFragment` method for varying TLS fragment size. - Added a `dhparam` option for DH ciphers. - Added a `ticketKeys` option for TLS ticket AES encryption keys setup. - Added async OCSP-stapling callback. - Added async session storage events. - Added async SNI callback. - Added multi-key server support (for example, ECDSA+RSA server). - Added optional callback to `checkServerIdentity` for manual certificate validation in user-land. - Added support for ECDSA/ECDHE cipher. - Implemented TLS streams in C++, boosting their performance. - Moved `createCredentials` to `tls` and renamed it to `createSecureContext`. - Removed SSLv2 and SSLv3 support. ### url https://iojs.org/api/url.html - Added support for `path` option in `url.format`, which encompasses `pathname`, `query`, and `search`. - Improved escaping of certain characters. - Improved parsing speed. ### util https://iojs.org/api/util.html - Added `util.debuglog`. - Added a plethora of new type-testing methods. See [the docs](https://iojs.org/api/util.html). - Updated `util.format` to receive several changes: - `-0` is now displayed as such, instead of as `0`. - Anything that is `instanceof Error` is now formatted as an error. - Circular references in JavaScript objects are now handled for the `%j` specifier. - Custom `inspect` functions are now allowed to return an object. - Custom `inspect` functions now receive any arguments passed to `util.inspect`. ## v8 https://iojs.org/api/v8.html `v8` is a new core module for interfacing directly with the V8 engine. ### vm https://iojs.org/api/vm.html The `vm` module has been rewritten to work better, based on the excellent [Contextify](https://github.com/brianmcd/contextify) native module. All of the functionality of Contextify is now in core, with improvements! - Added `vm.isContext(object)` method to determine whether `object` has been contextified. - Added `vm.runInDebugContext(code)` method to compile and execute `code` inside the V8 debug context. - Updated `vm.createContext(sandbox)` to "contextify" the sandbox, making it suitable for use as a global for `vm` scripts, and then return it. It no longer creates a separate context object. - Updated most `vm` and `vm.Script` methods to accept an `options` object, allowing you to configure a timeout for the script, the error display behavior, and sometimes the filename (for stack traces). - Updated the supplied sandbox object to be used directly as the global, remove error-prone copying of properties back and forth between the supplied sandbox object and the global that appears inside the scripts run by the `vm` module. 
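As a brief sketch of the rewritten `vm` behaviour described above (the sandbox contents and file name are invented for the example):

```js
var vm = require('vm');

// createContext "contextifies" the sandbox and returns the same object,
// rather than producing a separate context object.
var sandbox = { count: 0 };
vm.createContext(sandbox);
console.log(vm.isContext(sandbox)); // true

// Most methods now accept an options object; `timeout` (in milliseconds) and
// `filename` (used in stack traces) are shown here.
vm.runInContext('count += 1; fromScript = "set inside the script";', sandbox, {
  filename: 'example-script.js',
  timeout: 100
});

console.log(sandbox.count);      // 1
console.log(sandbox.fromScript); // 'set inside the script'
```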
For more information, see the `vm` documentation linked above.

### zlib

https://iojs.org/api/zlib.html

- Added support for `zlib.flush` to specify a particular flush method (defaulting to `Z_FULL_FLUSH`).
- Added support for `zlib.params` to dynamically update the compression level and strategy when deflating.
- Added synchronous versions of the zlib methods.

### C++ API Changes

https://iojs.org/api/addons.html

In general it is recommended that you use [NAN](https://github.com/rvagg/nan) as a compatibility layer for your addons. This will also help with future changes in the V8 and Node/io.js C++ API. Most of the following changes are already handled by NAN-specific wrappers.

#### V8 highlights

- Exposed method signature has changed from `Handle<Value> Method(const Arguments& args)` to `void Method(const v8::FunctionCallbackInfo<Value>& args)`, with the newly introduced `FunctionCallbackInfo` also taking the return value via `args.GetReturnValue().Set(value)` instead of `scope.Close(value)`; `Arguments` has been removed.
- Exposed setter signature has changed from `void Setter(Local<String> property, Local<Value> value, const v8::AccessorInfo& args)` to `void Setter(Local<String> property, Local<Value> value, const v8::PropertyCallbackInfo<void>& args)`.
- Exposed getter signature has changed from `Handle<Value> Getter(Local<String> property, const v8::AccessorInfo& args)` to `void Getter(Local<String> property, const v8::PropertyCallbackInfo<Value>& args)`.
- Exposed property setter signature has changed from `Handle<Value> Setter(Local<String> property, Local<Value> value, const v8::AccessorInfo& args)` to `void Setter(Local<String> property, Local<Value> value, const v8::PropertyCallbackInfo<Value>& args)`.
- Exposed property getter signature has changed from `Handle<Value> Getter(Local<String> property, const v8::AccessorInfo& args)` to `void Getter(Local<String> property, const v8::PropertyCallbackInfo<Value>& args)`.
- Similar changes have been made to property enumerators, property deleters, property query, index getter, index setter, index enumerator, index deleter and index query callbacks.
- V8 objects instantiated in C++ now require an `Isolate*` argument as the first argument. In most cases it is OK to simply pass `v8::Isolate::GetCurrent()`, e.g. `Date::New(Isolate::GetCurrent(), time)`, or `String::NewFromUtf8(Isolate::GetCurrent(), "foobar")`.
- `HandleScope scope` now requires an `Isolate*` argument, i.e. `HandleScope scope(isolate)`; in most cases `v8::Isolate::GetCurrent()` is OK.
- Similar changes have been made to `Locker` and `Unlocker`.
- V8 objects that need to "escape" a scope should be enclosed in an `EscapableHandleScope` rather than a `HandleScope` and should be returned with `scope.Escape(value)`.
- Exceptions are now thrown from isolates with `isolate->ThrowException(ExceptionObject)`.
- `Context::GetCurrent()` must now be done on an isolate, e.g. `Isolate::GetCurrent()->GetCurrentContext()`.
- `String::NewSymbol()` has been removed, use plain strings instead.
- `String::New()` has been removed, use `String::NewFromUtf8()` instead.
- `Persistent` objects no longer inherit from `Handle` and cannot be instantiated with another object. Instead, the `Persistent` should simply be declared, e.g. `Persistent<Object> handle`, and then have a `Local` assigned to it with `handle.Reset(isolate, value)`. To get a `Local` from a `Persistent` you must instantiate it as the argument, i.e. `Local<Object>::New(Isolate*, Persistent<Object>)`.

#### Node / io.js

- Updated `node::Buffer::New()` to return a `Handle<Object>` directly so you no longer need to fetch the `handle_` property.
- Updated `node::MakeCallback()` to require an `Isolate*` as the first argument. Generally `Isolate::GetCurrent()` will be OK for this. -------------------------------------- **The changelog below was inherited from joyent/node prior to the io.js fork.** ## 2014.09.24, Version 0.11.14 (Unstable) * uv: Upgrade to v1.0.0-rc1 * http_parser: Upgrade to v2.3.0 * npm: Upgrade to v2.0.0 * openssl: Upgrade to v1.0.1i * v8: Upgrade to 3.26.33 * Add fast path for simple URL parsing (Gabriel Wicke) * Added support for options parameter in console.dir() (Xavi Magrinyà) * Cluster: fix shared handles on Windows (Alexis Campailla) * buffer: Fix incorrect Buffer.compare behavior (Feross Aboukhadijeh) * buffer: construct new buffer from buffer toJSON() output (cjihrig) * buffer: improve Buffer constructor (Kang-Hao Kenny) * build: linking CoreFoundation framework for OSX (Thorsten Lorenz) * child_process: accept uid/gid everywhere (Fedor Indutny) * child_process: add path to spawn ENOENT Error (Ryan Cole) * child_process: copy spawnSync() cwd option to proper buffer (cjihrig) * child_process: do not access stderr when stdio set to 'ignore' (cjihrig) * child_process: don't throw on EAGAIN (Charles) * child_process: don't throw on EMFILE/ENFILE (Ben Noordhuis) * child_process: use full path for cmd.exe on Win32 (Ed Morley) * cluster: allow multiple calls to setupMaster() (Ryan Graham) * cluster: centralize removal from workers list. (Julien Gilli) * cluster: enable error/message events using .worker (cjihrig) * cluster: include settings object in 'setup' event (Ryan Graham) * cluster: restore v0.10.x setupMaster() behaviour (Ryan Graham) * cluster: support options in Worker constructor (cjihrig) * cluster: test events emit on cluster.worker (Sam Roberts) * console: console.dir() accepts options object (Xavi Magrinyà) * crypto: add `honorCipherOrder` argument (Fedor Indutny) * crypto: allow padding in RSA methods (Fedor Indutny) * crypto: clarify RandomBytes() error msg (Mickael van der Beek) * crypto: never store pointer to conn in SSL_CTX (Fedor Indutny) * crypto: unsigned value can't be negative (Brian White) * dgram: remove new keyword from errnoException (Jackson Tian) * dns: always set variable family in lookup() (cjihrig) * dns: include host name in error message if available (Maciej Małecki) * dns: introduce lookupService function (Saúl Ibarra Corretgé) * dns: send lookup c-ares errors to callback (Chris Dickinson) * dns: throw if hostname is not string or falsey (cjihrig) * events: Output the event that is leaking (Arnout Kazemier) * fs: close file if fstat() fails in readFile() (cjihrig) * fs: fs.readFile should not throw uncaughtException (Jackson Tian) * http: add 308 status_code, see RFC7238 (Yazhong Liu) * http: don't default OPTIONS to chunked encoding (Nick Muerdter) * http: fix bailout for writeHead (Alex Kocharin) * http: remove unused code block (Fedor Indutny) * http: write() after end() emits an error. 
(Julien Gilli) * lib, src: add vm.runInDebugContext() (Ben Noordhuis) * lib: noisy deprecation of child_process customFds (Ryan Graham) * module: don't require fs several times (Robert Kowalski) * net,dgram: workers can listen on exclusive ports (cjihrig) * net,stream: add isPaused, don't read() when paused (Chris Dickinson) * net: Ensure consistent binding to IPV6 if address is absent (Raymond Feng) * net: add remoteFamily for socket (Jackson Tian) * net: don't emit listening if handle is closed (Eli Skeggs) * net: don't prefer IPv4 addresses during resolution (cjihrig) * net: don't throw on net.Server.close() (cjihrig) * net: reset `errorEmitted` on reconnect (Ed Umansky) * node: set names for prototype methods (Trevor Norris) * node: support v8 microtask queue (Vladimir Kurchatkin) * path: fix slice OOB in trim (Lucio M. Tato) * path: isAbsolute() should always return boolean (Herman Lee) * process: throw TypeError if kill pid not a number (Sam Roberts) * querystring: custom encode and decode (fengmk2) * querystring: do not add sep for empty array (cjihrig) * querystring: remove prepended ? from query field (Ezequiel Rabinovich) * readline: fix close event of readline.Interface() (Yazhong Liu) * readline: fixes scoping bug (Dan Kaplun) * readline: implements keypress buffering (Dan Kaplun) * repl: fix multi-line input (Fedor Indutny) * repl: fix overwrite for this._prompt (Yazhong Liu) * repl: proper `setPrompt()` and `multiline` support (Fedor Indutny) * stream: don't try to finish if buffer is not empty (Vladimir Kurchatkin) * stream: only end reading on null, not undefined (Jonathan Reem) * streams: set default hwm properly for Duplex (Andrew Oppenlander) * string_bytes: ucs2 support big endian (Andrew Low) * tls, crypto: add DHE support (Shigeki Ohtsu) * tls: `checkServerIdentity` option (Trevor Livingston) * tls: add DHE-RSA-AES128-SHA256 to the def ciphers (Shigeki Ohtsu) * tls: better error reporting at cert validation (Fedor Indutny) * tls: support multiple keys/certs (Fedor Indutny) * tls: throw an error, not string (Jackson Tian) * udp: make it possible to receive empty udp packets (Andrius Bentkus) * url: treat \ the same as / (isaacs) ## 2014.05.01, Version 0.11.13 (Unstable) https://github.com/iojs/io.js/commit/99c9930ad626e2796af23def7cac19b65c608d18 * v8: upgrade to 3.24.35.22 * buffer: add compare and equals methods (Sean McArthur) * buffer: improve {read,write}{U}Int* methods (Nick Apperson) * buffer: return uint if MSB is 1 in readUInt32 (goussardg) * buffer: truncate buffer after string decode (Fedor Indutny) * child_process: fix assertion error in spawnSync (Shigeki Ohtsu) * crypto: fix memory leak in CipherBase::Final (Fedor Indutny) * crypto: improve error messages (Ingmar Runge) * crypto: move `createCredentials` to tls (Fedor Indutny) * crypto: work around OpenSSL oddness (Fedor Indutny) * dgram: introduce `reuseAddr` option (Fedor Indutny) * domain: don't crash on "throw null" (Alex Kocharin) * events: check if _events is an own property (Vladimir Kurchatkin) * fs: improve performance of all stat functions (James Pickard) * fs: return blksize on stats object (Trevor Norris) * http: add request.flush() method (Ben Noordhuis) * http: better client "protocol not supported" error (Nathan Rajlich) * http: use defaultAgent.protocol in protocol check (Nathan Rajlich) * main: Handle SIGINT properly. 
(Geir Hauge) * net: bind to `::` TCP address by default (Fedor Indutny) * readline: consider newlines for cursor position (Yazhong Liu) * stream: split `objectMode` for Duplex (Vladimir Kurchatkin) * tls: `getPeerCertificate(detailed)` (Fedor Indutny) * tls: do not call SNICallback unless present (Fedor Indutny) * tls: force readable/writable to `true` (Fedor Indutny) * tls: support OCSP on client and server (Fedor Indutny) * util: made util.isArray a direct alias for Array.isArray (Evan Carroll) ## 2014.03.11, Version 0.11.12 (Unstable) https://github.com/iojs/io.js/commit/7d6b8db40f32e817ff145b7cfe6b3aec3179fba7 * uv: Upgrade to v0.11.22 (Timothy J Fontaine) * buffer: allow toString to accept Infinity for end (Brian White) * child_process: add spawnSync/execSync (Bert Belder, Timothy J Fontaine) * cluster: handle bind errors on Windows (Alexis Campailla) * contextify: handle infinite recursion errors (Fedor Indutny) * crypto: allow custom generator for DiffieHellman (Brian White) * crypto: allow setting add'l authenticated data (Brian White) * crypto: fix CipherFinal return value check (Brian White) * crypto: make NewSessionDoneCb public (Fedor Indutny) * dgram: pass the bytes sent to the send callback (Timothy J Fontaine) * dns: validate arguments in resolver (Kenan Sulayman) * dns: verify argument is valid function in resolve (Kenan Sulayman) * http: avoid duplicate keys in writeHead (David Björklund) * net: add localPort to connect options (Timothy J Fontaine) * node: do not print SyntaxError hints to stderr (Fedor Indutny) * node: invoke `beforeExit` again if loop was active (Fedor Indutny) * node: make AsyncListenerInst field more explicit (Trevor Norris) * os: networkInterfaces include scopeid for ipv6 (Xidorn Quan) * process: allow changing `exitCode` in `on('exit')` (Fedor Indutny) * readline: fix `line` event, if input emit 'end' (Yazhong Liu) * src: add tracing.v8.on('gc') statistics hooks (Ben Noordhuis) * src: add v8.getHeapStatistics() function (Ben Noordhuis) * src: emit 'beforeExit' event on process object (Ben Noordhuis) * src: move AsyncListener from process to tracing (Trevor Norris) * tls: fix crash in SNICallback (Fedor Indutny) * tls: introduce asynchronous `newSession` (Fedor Indutny) * util: show meaningful values for boxed primitives (Nathan Rajlich) * vm: don't copy Proxy object from parent context (Ben Noordhuis) * windows: make stdout/sterr pipes blocking (Alexis Campailla) * zlib: add sync versions for convenience methods (Nikolai Vavilov) ## 2014.01.29, Version 0.11.11 (Unstable) https://github.com/iojs/io.js/commit/b46e77421581ea358e221a8a843d057c747f7e90 * v8: Upgrade to 3.22.24.19 * http_parser: Upgrade to 2.2.1 * openssl: Upgrade to 1.0.1f * uv: Upgrade to 0.11.18 * async-listener: revamp of subsystem (Trevor Norris) * node: do not ever close stdio (Fedor Indutny) * http: use writev on chunked encoding (Trevor Norris) * async_wrap/timers: remove Add/RemoveAsyncListener (Trevor Norris) * child_process: better error reporting for exec (Fedor Indutny) * crypto: add newline to cert and key if not present (Fedor Indutny) * crypto: clear error in GetPeerCertificate (Fedor Indutny) * crypto: honor default ciphers in client mode (Jacob Hoffman-Andrews) * crypto: introduce .setEngine(engine, [flags]) (Fedor Indutny) * crypto: support custom pbkdf2 digest methods (Ben Noordhuis) * domain: fix off-by-one in Domain.exit() (Ryan Graham) * http: concatenate duplicate headers by default (Alex Kocharin) * http: do not emit EOF non-readable socket (Fedor Indutny) * node: 
fix argument parsing with -p arg (Alexis Campailla) * path: improve POSIX path.join() performance (Jo Liss) * tls: emit `clientError` on early socket close (Fedor Indutny) * tls: introduce `.setMaxSendFragment(size)` (Fedor Indutny) * tls: make cert/pfx optional in tls.createServer() (Ben Noordhuis) * tls: process accumulated input (Fedor Indutny) * tls: show human-readable error messages (Ben Noordhuis) * util: handle escaped forward slashes correctly (Tom Gallacher) ## 2013.12.31, Version 0.11.10 (Unstable) https://github.com/iojs/io.js/commit/66931791f06207d1cdfea5ec1529edf3c94026d3 * http_parser: update to 2.2 * uv: Upgrade to v0.11.17 * v8: Upgrade to 3.22.24.10 * buffer: optimize writeInt* methods (Paul Loyd) * child_process: better error handling (Alexis Campailla) * cluster: do not synchronously emit 'setup' event (Sam Roberts) * cluster: restore backwards compatibility and various fixes (Sam Roberts) * crypto: remove unnecessary OpenSSL_add_all_digests (Yorkie) * crypto: support GCM authenticated encryption mode. (Ingmar Runge) * dns: add resolveSoa and 'SOA' rrtype (Tuğrul Topuz) * events: move EE c'tor guts to EventEmitter.init (Bert Belder) * http: DELETE shouldn't default to chunked encoding (Lalit Kapoor) * http: parse the status message in a http response. (Cam Swords) * node: fix removing AsyncListener in callback (Vladimir Kurchatkin) * node: follow specification, zero-fill ArrayBuffers (Trevor Norris) * openssl: use ASM optimized routines (Fedor Indutny) * process: allow nextTick infinite recursion (Trevor Norris) * querystring: remove `name` from `stringify()` (Yorkie) * timers: setImmediate v8 optimization fix (pflannery) * tls: add serialNumber to getPeerCertificate() (Ben Noordhuis) * tls: reintroduce socket.encrypted (Fedor Indutny) * tls: fix handling of asterisk in SNI context (Fedor Indutny) * util: Format negative zero as '-0' (David Chan) * vm: fix race condition in timeout (Alexis Campailla) * windows: fix dns lookup of localhost with ipv6 (Alexis Campailla) ## 2013.11.20, Version 0.11.9 (Unstable) https://github.com/iojs/io.js/commit/dcfd032bdd69dfb38c120e18438d6316ae522edc * uv: upgrade to v0.11.15 (Timothy J Fontaine) * v8: upgrade to 3.22.24.5 (Timothy J Fontaine) * buffer: remove warning when no encoding is passed (Trevor Norris) * build: make v8 use random seed for hash tables (Ben Noordhuis) * crypto: build with shared openssl without NPN (Ben Noordhuis) * crypto: update root certificates (Ben Noordhuis) * debugger: pass on v8 debug switches (Ben Noordhuis) * domain: use AsyncListener API (Trevor Norris) * fs: add recursive subdirectory support to fs.watch (Nick Simmons) * fs: make fs.watch() non-recursive by default (Ben Noordhuis) * http: cleanup freeSockets when socket destroyed (fengmk2) * http: force socket encoding to be null (isaacs) * http: make DELETE requests set `req.method` (Nathan Rajlich) * node: add AsyncListener support (Trevor Norris) * src: remove global HandleScope that hid memory leaks (Ben Noordhuis) * tls: add ECDH ciphers support (Erik Dubbelboer) * tls: do not default to 'localhost' servername (Fedor Indutny) * tls: more accurate wrapping of connecting socket (Fedor Indutny) ## 2013.10.30, Version 0.11.8 (Unstable) https://github.com/iojs/io.js/commit/f8d86e24f3463c36f7f3f4c3b3ec779e5b6201e1 * uv: Upgrade to v0.11.14 * v8: upgrade 3.21.18.3 * assert: indicate if exception message is generated (Glen Mailer) * buffer: add buf.toArrayBuffer() API (Trevor Norris) * cluster: fix premature 'disconnect' event (Ben Noordhuis) * crypto: 
add SPKAC support (Jason Gerfen) * debugger: count space for line numbers correctly (Alex Kocharin) * debugger: make busy loops SIGUSR1-interruptible (Ben Noordhuis) * debugger: repeat last command (Alex Kocharin) * debugger: show current line, fix for [#6150](https://github.com/joyent/node/issues/6150) (Alex Kocharin) * dgram: send() can accept strings (Trevor Norris) * dns: rename domain to hostname (Ben Noordhuis) * dns: set hostname property on error object (Ben Noordhuis) * dtrace, mdb_v8: support more string, frame types (Dave Pacheco) * http: add statusMessage (Patrik Stutz) * http: expose supported methods (Ben Noordhuis) * http: provide backpressure for pipeline flood (isaacs) * process: Add exitCode property (isaacs) * tls: socket.renegotiate(options, callback) (Fedor Indutny) * util: format as Error if instanceof Error (Rod Vagg) ## 2013.08.21, Version 0.11.7 (Unstable) https://github.com/iojs/io.js/commit/be52549bfa5311208b5fcdb3ba09210460fa9ceb * uv: upgrade to v0.11.13 * v8: upgrade to 3.20.17 * buffer: adhere to INSPECT_MAX_BYTES (Timothy J Fontaine) * buffer: fix regression for large buffer creation (Trevor Norris) * buffer: don't throw if slice length too long (Trevor Norris) * buffer: Buffer(buf) constructor copies into the proper buffer (Ben Noordhuis) * cli: remove --max-stack-size (Ben Noordhuis) * cli: unknown command line options are errors (Ben Noordhuis) * child_process: exec accept buffer as an encoding (Seth Fitzsimmons) * crypto: make randomBytes/pbkdf2 callbacks domain aware (Ben Noordhuis) * domain: deprecate domain.dispose(). (Forrest L Norvell) * fs: Expose birthtime on stat objects (isaacs) * http: Only send connection:keep-alive if necessary (isaacs) * repl: Catch syntax errors better (isaacs, Nathan Rajlich) * stream: change default highWaterMark for objectMode to 16 (Mathias Buus) * stream: make setEncoding/pause/resume chainable (Julian Gruber, isaacs) * util: pass opts to custom inspect functions (Timothy J Fontaine) * vm: rewritten to behave like Contextify (Domenic Denicola) ## 2013.08.21, Version 0.11.6 (Unstable) https://github.com/iojs/io.js/commit/04018d4b3938fd30ba14822e79195e4af2be36f6 * uv: Upgrade to v0.11.8 * v8: upgrade v8 to 3.20.14.1 * build: disable SSLv2 by default (Ben Noordhuis) * build: don't auto-destroy existing configuration (Ben Noordhuis) * crypto: add TLS 1.1 and 1.2 to secureProtocol list (Matthias Bartelmeß) * crypto: fix memory leak in randomBytes() error path (Ben Noordhuis) * dgram: don't call into js when send cb is omitted (Ben Noordhuis) * dgram: fix regression in string argument handling (Ben Noordhuis) * domains: performance improvements (Trevor Norris) * events: EventEmitter = require('events') (Jake Verbaten) * http: Add write()/end() callbacks (isaacs) * http: Consistent 'finish' event semantics (isaacs) * http: Prefer 'binary' over 'ascii' (isaacs) * http: Support legacy agent.addRequest API (isaacs) * http: Write hex/base64 chunks properly (isaacs) * http: add agent.maxFreeSockets option (isaacs) * http: provide access to raw headers/trailers (isaacs) * http: removed headers stay removed (James Halliday) * http,timers: improve callback performance (Ben Noordhuis) * net: family option in net.connect (Vsevolod Strukchinsky) * readline: pause stdin before turning off terminal raw mode (Daniel Chatfield) * smalloc: allow different external array types (Trevor Norris) * smalloc: expose ExternalArraySize (Trevor Norris) * stream: Short-circuit buffer pushes when flowing (isaacs) * tls: handle errors on socket before 
releasing it (Fedor Indutny) * util: fix isPrimitive check (Trevor Norris) * util: isObject should always return boolean (Trevor Norris) ## 2013.08.06, Version 0.11.5 (Unstable) https://github.com/iojs/io.js/commit/6f92da2dd106b0c63fde563284f83e08e2a521b5 * v8: upgrade to 3.20.11 * uv: upgrade to v0.11.7 * buffer: return offset for end of last write (Trevor Norris) * build: embed the mdb_v8.so into the binary (Timothy J Fontaine) * build: fix --without-ssl build (Ben Noordhuis) * child_process: add 'shell' option to .exec() (Ben Noordhuis) * dgram: report send errors to cb, don't pass bytes (Ben Noordhuis) * fs: write strings directly to disk (Trevor Norris) * https: fix default port (Koichi Kobayashi) * openssl: use asm for sha, md5, rmd (Fedor Indutny) * os: add mac address to networkInterfaces() output (Brian White) * smalloc: introduce smalloc module (Trevor Norris) * stream: Simplify flowing, passive data listening (streams3) (isaacs) * tls: asynchronous SNICallback (Fedor Indutny) * tls: share tls tickets key between cluster workers (Fedor Indutny) * util: don't throw on circular %j input to format() (Ben Noordhuis) ## 2013.07.12, Version 0.11.4 (Unstable) https://github.com/iojs/io.js/commit/b5b84197ed037918fd1a26e5cb87cce7c812ca55 * npm: Upgrade to 1.3.4 * v8: Upgrade to v3.20.2 * c-ares: Upgrade to piscisaureus/cares@805d153 * timers: setImmediate process full queue each turn (Ben Noordhuis) * http: Add agent.get/request methods (isaacs) * http: Proper KeepAlive behavior (isaacs) * configure: fix the --without-ssl option (Nathan Rajlich) * buffer: propagate originating parent (Trevor Norris) * tls_wrap: return Error not throw for missing cert (Timothy J Fontaine) * src: enable native v8 typed arrays (Ben Noordhuis) * stream: objectMode transform should allow falsey values (Jeff Barczewski) * slab_allocator: remove SlabAllocator (Trevor Norris) * crypto: fix memory leak in LoadPKCS12 (Fedor Indutny) * tls: export TLSSocket (Fedor Indutny) * zlib: allow changing of level and strategy (Brian White) * zlib: allow custom flush type for flush() (Brian White) ## 2013.06.26, Version 0.11.3 (Unstable) https://github.com/iojs/io.js/commit/38c0c47bbe280ddc42054418091571e532d82a1e * uv: Upgrade to v0.11.5 * c-ares: upgrade to 1.10.0 * v8: upgrade to v3.19.13 * punycode: update to v1.2.3 (Mathias Bynens) * debugger: break on uncaught exception (Miroslav Bajtos) * child_process: emit 'disconnect' asynchronously (Ben Noordhuis) * dtrace: enable uv's probes if enabled (Timothy J Fontaine) * dtrace: unify dtrace and systemtap interfaces (Timothy J Fontaine) * buffer: New API for backing data store (Trevor Norris) * buffer: return `this` in fill() for chainability (Brian White) * build: fix include order for building on windows (Timothy J Fontaine) * build: add android support (Linus Mårtensson) * readline: strip ctrl chars for prompt width calc (Krzysztof Chrapka) * tls: introduce TLSSocket based on tls_wrap binding (Fedor Indutny) * tls: add localAddress and localPort properties (Ben Noordhuis) * crypto: free excessive memory in NodeBIO (Fedor Indutny) * process: remove maxTickDepth (Trevor Norris) * timers: use uv_now instead of Date.now (Timothy J Fontaine) * util: Add debuglog, deprecate console lookalikes (isaacs) * module: use path.sep instead of a custom solution (Robert Kowalski) * http: don't escape request path, reject bad chars (Ben Noordhuis) * net: emit dns 'lookup' event before connect (Ben Noordhuis) * dns: add getServers and setServers (Timothy J Fontaine) ## 2013.05.13, Version 
0.11.2 (Unstable) https://github.com/iojs/io.js/commit/5d3dc0e4c3369dfb00b7b13e08936c2e652fa696 * uv: Upgrade to 0.11.2 * V8: Upgrade to 3.19.0 * npm: Upgrade to 1.2.21 * build: Makefile should respect configure --prefix (Timothy J Fontaine) * cluster: use round-robin load balancing (Ben Noordhuis) * debugger, cluster: each worker has new debug port (Miroslav Bajtoš) * debugger: `restart` with custom debug port (Miroslav Bajtoš) * debugger: breakpoints in scripts not loaded yet (Miroslav Bajtoš) * event: EventEmitter#setMaxListeners() returns this (Sam Roberts) * events: add EventEmitter.defaultMaxListeners (Ben Noordhuis) * install: Support $(PREFIX) install target directory prefix (Olof Johansson) * os: Include netmask in os.networkInterfaces() (Ben Kelly) * path: add path.isAbsolute(path) (Ryan Doenges) * stream: Guarantee ordering of 'finish' event (isaacs) * streams: introduce .cork/.uncork/._writev (Fedor Indutny) * vm: add support for timeout argument (Andrew Paprocki) ## 2013.04.19, Version 0.11.1 (Unstable) https://github.com/iojs/io.js/commit/4babd2b46ebf9fbea2c9946af5cfae25a33b2b22 * V8: upgrade to 3.18.0 * uv: Upgrade to v0.11.1 * http: split into multiple separate modules (Timothy J Fontaine) * http: escape unsafe characters in request path (Ben Noordhuis) * url: Escape all unwise characters (isaacs) * build: depend on v8 postmortem-metadata if enabled (Paddy Byers) * etw: update prototypes to match dtrace provider (Timothy J Fontaine) * buffer: change output of Buffer.prototype.toJSON() (David Braun) * dtrace: actually use the _handle.fd value (Timothy J Fontaine) * dtrace: pass more arguments to probes (Dave Pacheco) * build: allow building with dtrace on osx (Dave Pacheco) * zlib: allow passing options to convenience methods (Kyle Robinson Young) ## 2013.03.28, Version 0.11.0 (Unstable) https://github.com/iojs/io.js/commit/bce38b3d74e64fcb7d04a2dd551151da6168cdc5 * V8: update to 3.17.13 * os: use %SystemRoot% or %windir% in os.tmpdir() (Suwon Chae) * util: fix util.inspect() line width calculation (Marcin Kostrzewa) * buffer: remove _charsWritten (Trevor Norris) * fs: uv_[fl]stat now reports subsecond resolution (Timothy J Fontaine) * fs: Throw if error raised and missing callback (bnoordhuis) * tls: expose SSL_CTX_set_timeout via tls.createServer (Manav Rathi) * tls: remove harmful unnecessary bounds checking (Marcel Laverdet) * buffer: write ascii strings using WriteOneByte (Trevor Norris) * dtrace: fix generation of v8 constants on freebsd (Fedor Indutny) * dtrace: x64 ustack helper (Fedor Indutny) * readline: handle wide characters properly (Nao Iizuka) * repl: Use a domain to catch async errors safely (isaacs) * repl: emit 'reset' event when context is reset (Sami Samhuri) * util: custom `inspect()` method may return an Object (Nathan Rajlich) * console: `console.dir()` bypasses inspect() methods (Nathan Rajlich) ## 2014.12.22, Version 0.10.35 (Stable) * tls: re-add 1024-bit SSL certs removed by f9456a2 (Chris Dickinson) * timers: don't close interval timers when unrefd (Julien Gilli) * timers: don't mutate unref list while iterating it (Julien Gilli) ## 2014.12.17, Version 0.10.34 (Stable) https://github.com/iojs/io.js/commit/52795f8fcc2de77cf997e671ea58614e5e425dfe * uv: update to v0.10.30 * zlib: upgrade to v1.2.8 * child_process: check execFile args is an array (Sam Roberts) * child_process: check fork args is an array (Sam Roberts) * crypto: update root certificates (Ben Noordhuis) * domains: fix issues with abort on uncaught (Julien Gilli) * timers: Avoid linear 
scan in _unrefActive. (Julien Gilli) * timers: fix unref() memory leak (Trevor Norris) * v8: add api for aborting on uncaught exception (Julien Gilli) * debugger: fix when using "use strict" (Julien Gilli) ## 2014.10.20, Version 0.10.33 (Stable) https://github.com/iojs/io.js/commit/8d045a30e95602b443eb259a5021d33feb4df079 * openssl: Update to 1.0.1j (Addressing multiple CVEs) * uv: Update to v0.10.29 * child_process: properly support optional args (cjihrig) * crypto: Disable autonegotiation for SSLv2/3 by default (Fedor Indutny, Timothy J Fontaine, Alexis Campailla) This is a behavior change, by default we will not allow the negotiation to SSLv2 or SSLv3. If you want this behavior, run Node.js with either `--enable-ssl2` or `--enable-ssl3` respectively. This does not change the behavior for users specifically requesting `SSLv2_method` or `SSLv3_method`. While this behavior is not advised, it is assumed you know what you're doing since you're specifically asking to use these methods. ## 2014.09.16, Version 0.10.32 (Stable) https://github.com/iojs/io.js/commit/0fe0d121551593c23a565db8397f85f17bb0f00e * npm: Update to 1.4.28 * v8: fix a crash introduced by previous release (Fedor Indutny) * configure: add --openssl-no-asm flag (Fedor Indutny) * crypto: use domains for any callback-taking method (Chris Dickinson) * http: do not send `0\r\n\r\n` in TE HEAD responses (Fedor Indutny) * querystring: fix unescape override (Tristan Berger) * url: Add support for RFC 3490 separators (Mathias Bynens) ## 2014.08.19, Version 0.10.31 (Stable) https://github.com/iojs/io.js/commit/7fabdc23d843cb705d2d0739e7bbdaaf50aa3292 * v8: backport CVE-2013-6668 * openssl: Update to v1.0.1i * npm: Update to v1.4.23 * cluster: disconnect should not be synchronous (Sam Roberts) * fs: fix fs.readFileSync fd leak when get RangeError (Jackson Tian) * stream: fix Readable.wrap objectMode falsy values (James Halliday) * timers: fix timers with non-integer delay hanging. (Julien Gilli) ## 2014.07.31, Version 0.10.30 (Stable) https://github.com/iojs/io.js/commit/bc0ff830aff1e016163d855e86ded5c98b0899e8 * uv: Upgrade to v0.10.28 * npm: Upgrade to v1.4.21 * v8: Interrupts must not mask stack overflow. * Revert "stream: start old-mode read in a next tick" (Fedor Indutny) * buffer: fix sign overflow in `readUIn32BE` (Fedor Indutny) * buffer: improve {read,write}{U}Int* methods (Nick Apperson) * child_process: handle writeUtf8String error (Fedor Indutny) * deps: backport 4ed5fde4f from v8 upstream (Fedor Indutny) * deps: cherry-pick eca441b2 from OpenSSL (Fedor Indutny) * lib: remove and restructure calls to isNaN() (cjihrig) * module: eliminate double `getenv()` (Maciej Małecki) * stream2: flush extant data on read of ended stream (Chris Dickinson) * streams: remove unused require('assert') (Rod Vagg) * timers: backport f8193ab (Julien Gilli) * util.h: interface compatibility (Oguz Bastemur) * zlib: do not crash on write after close (Fedor Indutny) ## 2014.06.05, Version 0.10.29 (Stable) https://github.com/iojs/io.js/commit/ce82d6b8474bde7ac7df6d425fb88fb1bcba35bc * openssl: to 1.0.1h (CVE-2014-0224) * npm: upgrade to 1.4.14 * utf8: Prevent Node from sending invalid UTF-8 (Felix Geisendörfer) - *NOTE* this introduces a breaking change, previously you could construct invalid UTF-8 and invoke an error in a client that was expecting valid UTF-8, now unmatched surrogate pairs are replaced with the unknown UTF-8 character. To restore the old functionality simply have NODE_INVALID_UTF8 environment variable set. 
* child_process: do not set args before throwing (Greg Sabia Tucker) * child_process: spawn() does not throw TypeError (Greg Sabia Tucker) * constants: export O_NONBLOCK (Fedor Indutny) * crypto: improve memory usage (Alexis Campailla) * fs: close file if fstat() fails in readFile() (cjihrig) * lib: name EventEmitter prototype methods (Ben Noordhuis) * tls: fix performance issue (Alexis Campailla) ## 2014.05.01, Version 0.10.28 (Stable) https://github.com/iojs/io.js/commit/b148cbe09d4657766fdb61575ba985734c2ff0a8 * npm: upgrade to v1.4.9 ## 2014.05.01, Version 0.10.27 (Stable) https://github.com/iojs/io.js/commit/cb7911f78ae96ef7a540df992cc1359ba9636e86 * npm: upgrade to v1.4.8 * openssl: upgrade to 1.0.1g * uv: update to v0.10.27 * dns: fix certain txt entries (Fedor Indutny) * assert: Ensure reflexivity of deepEqual (Mike Pennisi) * child_process: fix deadlock when sending handles (Fedor Indutny) * child_process: fix sending handle twice (Fedor Indutny) * crypto: do not lowercase cipher/hash names (Fedor Indutny) * dtrace: workaround linker bug on FreeBSD (Fedor Indutny) * http: do not emit EOF non-readable socket (Fedor Indutny) * http: invoke createConnection when no agent (Nathan Rajlich) * stream: remove useless check (Brian White) * timer: don't reschedule timer bucket in a domain (Greg Brail) * url: treat \ the same as / (isaacs) * util: format as Error if instanceof Error (Rod Vagg) ## 2014.02.18, Version 0.10.26 (Stable) https://github.com/iojs/io.js/commit/cc56c62ed879ad4f93b1fdab3235c43e60f48b7e * uv: Upgrade to v0.10.25 (Timothy J Fontaine) * npm: upgrade to 1.4.3 (isaacs) * v8: support compiling with VS2013 (Fedor Indutny) * cares: backport TXT parsing fix (Fedor Indutny) * crypto: throw on SignFinal failure (Fedor Indutny) * crypto: update root certificates (Ben Noordhuis) * debugger: Fix breakpoint not showing after restart (Farid Neshat) * fs: make unwatchFile() insensitive to path (iamdoron) * net: do not re-emit stream errors (Fedor Indutny) * net: make Socket destroy() re-entrance safe (Jun Ma) * net: reset `endEmitted` on reconnect (Fedor Indutny) * node: do not close stdio implicitly (Fedor Indutny) * zlib: avoid assertion in close (Fedor Indutny) ## 2014.01.23, Version 0.10.25 (Stable) https://github.com/iojs/io.js/commit/b0e5f195dfce3e2b99f5091373d49f6616682596 * uv: Upgrade to v0.10.23 * npm: Upgrade to v1.3.24 * v8: Fix enumeration for objects with lots of properties * child_process: fix spawn() optional arguments (Sam Roberts) * cluster: report more errors to workers (Fedor Indutny) * domains: exit() only affects active domains (Ryan Graham) * src: OnFatalError handler must abort() (Timothy J Fontaine) * stream: writes may return false but forget to emit drain (Yang Tianyang) ## 2013.12.18, Version 0.10.24 (Stable) https://github.com/iojs/io.js/commit/b7fd6bc899ccb629d790c47aee06aba87e535c41 * uv: Upgrade to v0.10.21 * npm: upgrade to 1.3.21 * v8: backport fix for CVE-2013-{6639|6640} * build: unix install node and dep library headers (Timothy J Fontaine) * cluster, v8: fix --logfile=%p.log (Ben Noordhuis) * module: only cache package main (Wyatt Preul) ## 2013.12.12, Version 0.10.23 (Stable) https://github.com/iojs/io.js/commit/0462bc23564e7e950a70ae4577a840b04db6c7c6 * uv: Upgrade to v0.10.20 (Timothy J Fontaine) * npm: Upgrade to 1.3.17 (isaacs) * gyp: update to 78b26f7 (Timothy J Fontaine) * build: include postmortem symbols on linux (Timothy J Fontaine) * crypto: Make Decipher._flush() emit errors. 
(Kai Groner) * dgram: fix abort when getting `fd` of closed dgram (Fedor Indutny) * events: do not accept NaN in setMaxListeners (Fedor Indutny) * events: avoid calling `once` functions twice (Tim Wood) * events: fix TypeError in removeAllListeners (Jeremy Martin) * fs: report correct path when EEXIST (Fedor Indutny) * process: enforce allowed signals for kill (Sam Roberts) * tls: emit 'end' on .receivedShutdown (Fedor Indutny) * tls: fix potential data corruption (Fedor Indutny) * tls: handle `ssl.start()` errors appropriately (Fedor Indutny) * tls: reset NPN callbacks after SNI (Fedor Indutny) ## 2013.11.12, Version 0.10.22 (Stable) https://github.com/iojs/io.js/commit/cbff8f091c22fb1df6b238c7a1b9145db950fa65 * npm: Upgrade to 1.3.14 * uv: Upgrade to v0.10.19 * child_process: don't assert on stale file descriptor events (Fedor Indutny) * darwin: Fix "Not Responding" in Mavericks activity monitor (Fedor Indutny) * debugger: Fix bug in sb() with unnamed script (Maxim Bogushevich) * repl: do not insert duplicates into completions (Maciej Małecki) * src: Fix memory leak on closed handles (Timothy J Fontaine) * tls: prevent stalls by using read(0) (Fedor Indutny) * v8: use correct timezone information on Solaris (Maciej Małecki) ## 2013.10.18, Version 0.10.21 (Stable) https://github.com/iojs/io.js/commit/e2da042844a830fafb8031f6c477eb4f96195210 * uv: Upgrade to v0.10.18 * crypto: clear errors from verify failure (Timothy J Fontaine) * dtrace: interpret two byte strings (Dave Pacheco) * fs: fix fs.truncate() file content zeroing bug (Ben Noordhuis) * http: provide backpressure for pipeline flood (isaacs) * tls: fix premature connection termination (Ben Noordhuis) ## 2013.09.30, Version 0.10.20 (Stable) https://github.com/iojs/io.js/commit/d7234c8d50a1af73f60d2d3c0cc7eed17429a481 * tls: fix sporadic hang and partial reads (Fedor Indutny) - fixes "npm ERR! cb() never called!" 
## 2013.09.24, Version 0.10.19 (Stable) https://github.com/iojs/io.js/commit/6b5e6a5a3ec8d994c9aab3b800b9edbf1b287904 * uv: Upgrade to v0.10.17 * npm: upgrade to 1.3.11 * readline: handle input starting with control chars (Eric Schrock) * configure: add mips-float-abi (soft, hard) option (Andrei Sedoi) * stream: objectMode transforms allow falsey values (isaacs) * tls: prevent duplicate values returned from read (Nathan Rajlich) * tls: NPN protocols are now local to connections (Fedor Indutny) ## 2013.09.04, Version 0.10.18 (Stable) https://github.com/iojs/io.js/commit/67a1f0c52e0708e2596f3f2134b8386d6112561e * uv: Upgrade to v0.10.15 * stream: Don't crash on unset _events property (isaacs) * stream: Pass 'buffer' encoding with decoded writable chunks (isaacs) ## 2013.08.21, Version 0.10.17 (Stable) https://github.com/iojs/io.js/commit/469a4a5091a677df62be319675056b869c31b35c * uv: Upgrade v0.10.14 * http_parser: Do not accept PUN/GEM methods as PUT/GET (Chris Dickinson) * tls: fix assertion when ssl is destroyed at read (Fedor Indutny) * stream: Throw on 'error' if listeners removed (isaacs) * dgram: fix assertion on bad send() arguments (Ben Noordhuis) * readline: pause stdin before turning off terminal raw mode (Daniel Chatfield) ## 2013.08.16, Version 0.10.16 (Stable) https://github.com/iojs/io.js/commit/50b4c905a4425430ae54db4906f88982309e128d * v8: back-port fix for CVE-2013-2882 * npm: Upgrade to 1.3.8 * crypto: fix assert() on malformed hex input (Ben Noordhuis) * crypto: fix memory leak in randomBytes() error path (Ben Noordhuis) * events: fix memory leak, don't leak event names (Ben Noordhuis) * http: Handle hex/base64 encodings properly (isaacs) * http: improve chunked res.write(buf) performance (Ben Noordhuis) * stream: Fix double pipe error emit (Eran Hammer) ## 2013.07.25, Version 0.10.15 (Stable) https://github.com/iojs/io.js/commit/2426d65af860bda7be9f0832a99601cc43c6cf63 * src: fix process.getuid() return value (Ben Noordhuis) ## 2013.07.25, Version 0.10.14 (Stable) https://github.com/iojs/io.js/commit/fdf57f811f9683a4ec49a74dc7226517e32e6c9d * uv: Upgrade to v0.10.13 * npm: Upgrade to v1.3.5 * os: Don't report negative times in cpu info (Ben Noordhuis) * fs: Handle large UID and GID (Ben Noordhuis) * url: Fix edge-case when protocol is non-lowercase (Shuan Wang) * doc: Streams API Doc Rewrite (isaacs) * node: call MakeDomainCallback in all domain cases (Trevor Norris) * crypto: fix memory leak in LoadPKCS12 (Fedor Indutny) ## 2013.07.09, Version 0.10.13 (Stable) https://github.com/iojs/io.js/commit/e32660a984427d46af6a144983cf7b8045b7299c * uv: Upgrade to v0.10.12 * npm: Upgrade to 1.3.2 * windows: get proper errno (Ben Noordhuis) * tls: only wait for finish if we haven't seen it (Timothy J Fontaine) * http: Dump response when request is aborted (isaacs) * http: use an unref'd timer to fix delay in exit (Peter Rust) * zlib: level can be negative (Brian White) * zlib: allow zero values for level and strategy (Brian White) * buffer: add comment explaining buffer alignment (Ben Noordhuis) * string_bytes: properly detect 64bit (Timothy J Fontaine) * src: fix memory leak in UsingDomains() (Ben Noordhuis) ## 2013.06.18, Version 0.10.12 (Stable) https://github.com/iojs/io.js/commit/a088cf4f930d3928c97d239adf950ab43e7794aa * npm: Upgrade to 1.2.32 * readline: make `ctrl + L` clear the screen (Yuan Chuan) * v8: add setVariableValue debugger command (Ben Noordhuis) * net: Do not destroy socket mid-write (isaacs) * v8: fix build for mips32r2 architecture (Andrei Sedoi) * configure: 
fix cross-compilation host_arch_cc() (Andrei Sedoi) ## 2013.06.13, Version 0.10.11 (Stable) https://github.com/iojs/io.js/commit/d9d5bc465450ae5d60da32e9ffcf71c2767f1fad * uv: upgrade to 0.10.11 * npm: Upgrade to 1.2.30 * openssl: add missing configuration pieces for MIPS (Andrei Sedoi) * Revert "http: remove bodyHead from 'upgrade' events" (isaacs) * v8: fix pointer arithmetic undefined behavior (Trevor Norris) * crypto: fix utf8/utf-8 encoding check (Ben Noordhuis) * net: Fix busy loop on POLLERR|POLLHUP on older linux kernels (Ben Noordhuis, isaacs) ## 2013.06.04, Version 0.10.10 (Stable) https://github.com/iojs/io.js/commit/25e51c396aa23018603baae2b1d9390f5d9db496 * uv: Upgrade to 0.10.10 * npm: Upgrade to 1.2.25 * url: Properly parse certain oddly formed urls (isaacs) * stream: unshift('') is a noop (isaacs) ## 2013.05.30, Version 0.10.9 (Stable) https://github.com/iojs/io.js/commit/878ffdbe6a8eac918ef3a7f13925681c3778060b * npm: Upgrade to 1.2.24 * uv: Upgrade to v0.10.9 * repl: fix JSON.parse error check (Brian White) * tls: proper .destroySoon (Fedor Indutny) * tls: invoke write cb only after opposite read end (Fedor Indutny) * tls: ignore .shutdown() syscall error (Fedor Indutny) ## 2013.05.24, Version 0.10.8 (Stable) https://github.com/iojs/io.js/commit/30d9e9fdd9d4c33d3d95a129d021cd8b5b91eddb * v8: update to 3.14.5.9 * uv: upgrade to 0.10.8 * npm: Upgrade to 1.2.23 * http: remove bodyHead from 'upgrade' events (Nathan Zadoks) * http: Return true on empty writes, not false (isaacs) * http: save roundtrips, convert buffers to strings (Ben Noordhuis) * configure: respect the --dest-os flag consistently (Nathan Rajlich) * buffer: throw when writing beyond buffer (Trevor Norris) * crypto: Clear error after DiffieHellman key errors (isaacs) * string_bytes: strip padding from base64 strings (Trevor Norris) ## 2013.05.17, Version 0.10.7 (Stable) https://github.com/iojs/io.js/commit/d2fdae197ac542f686ee06835d1153dd43b862e5 * uv: upgrade to v0.10.7 * npm: Upgrade to 1.2.21 * crypto: Don't ignore verify encoding argument (isaacs) * buffer, crypto: fix default encoding regression (Ben Noordhuis) * timers: fix setInterval() assert (Ben Noordhuis) ## 2013.05.14, Version 0.10.6 (Stable) https://github.com/iojs/io.js/commit/5deb1672f2b5794f8be19498a425ea4dc0b0711f * module: Deprecate require.extensions (isaacs) * stream: make Readable.wrap support objectMode, empty streams (Daniel Moore) * child_process: fix handle delivery (Ben Noordhuis) * crypto: Fix performance regression (isaacs) * src: DRY string encoding/decoding (isaacs) ## 2013.04.23, Version 0.10.5 (Stable) https://github.com/iojs/io.js/commit/deeaf8fab978e3cadb364e46fb32dafdebe5f095 * uv: Upgrade to 0.10.5 (isaacs) * build: added support for Visual Studio 2012 (Miroslav Bajtoš) * http: Don't try to destroy nonexistent sockets (isaacs) * crypto: LazyTransform on properties, not methods (isaacs) * assert: put info in err.message, not err.name (Ryan Doenges) * dgram: fix no address bind() (Ben Noordhuis) * handle_wrap: fix NULL pointer dereference (Ben Noordhuis) * os: fix unlikely buffer overflow in os.type() (Ben Noordhuis) * stream: Fix unshift() race conditions (isaacs) ## 2013.04.11, Version 0.10.4 (Stable) https://github.com/iojs/io.js/commit/9712aa9f76073c30850b20a188b1ed12ffb74d17 * uv: Upgrade to 0.10.4 * npm: Upgrade to 1.2.18 * v8: Avoid excessive memory growth in JSON.parse (Fedor Indutny) * child_process, cluster: fix O(n*m) scan of cmd string (Ben Noordhuis) * net: fix socket.bytesWritten Buffers support (Fedor Indutny) * 
buffer: fix offset checks (Łukasz Walukiewicz) * stream: call write cb before finish event (isaacs) * http: Support write(data, 'hex') (isaacs) * crypto: dh secret should be left-padded (Fedor Indutny) * process: expose NODE_MODULE_VERSION in process.versions (Rod Vagg) * crypto: fix constructor call in crypto streams (Andreas Madsen) * net: account for encoding in .byteLength (Fedor Indutny) * net: fix buffer iteration in bytesWritten (Fedor Indutny) * crypto: zero is not an error if writing 0 bytes (Fedor Indutny) * tls: Re-enable check of CN-ID in cert verification (Tobias Müllerleile) ## 2013.04.03, Version 0.10.3 (Stable) https://github.com/iojs/io.js/commit/d4982f6f5e4a9a703127489a553b8d782997ea43 * npm: Upgrade to 1.2.17 * child_process: acknowledge sent handles (Fedor Indutny) * etw: update prototypes to match dtrace provider (Timothy J Fontaine) * dtrace: pass more arguments to probes (Dave Pacheco) * build: allow building with dtrace on osx (Dave Pacheco) * http: Remove legacy ECONNRESET workaround code (isaacs) * http: Ensure socket cleanup on client response end (isaacs) * tls: Destroy socket when encrypted side closes (isaacs) * repl: isSyntaxError() catches "strict mode" errors (Nathan Rajlich) * crypto: Pass options to ctor calls (isaacs) * src: tie process.versions.uv to uv_version_string() (Ben Noordhuis) ## 2013.03.28, Version 0.10.2 (Stable) https://github.com/iojs/io.js/commit/1e0de9c426e07a260bbec2d2196c2d2db8eb8886 * npm: Upgrade to 1.2.15 * uv: Upgrade to 0.10.3 * tls: handle SSL_ERROR_ZERO_RETURN (Fedor Indutny) * tls: handle errors before calling C++ methods (Fedor Indutny) * tls: remove harmful unnecessary bounds checking (Marcel Laverdet) * crypto: make getCiphers() return non-SSL ciphers (Ben Noordhuis) * crypto: check randomBytes() size argument (Ben Noordhuis) * timers: do not calculate Timeout._when property (Alexey Kupershtokh) * timers: fix off-by-one ms error (Alexey Kupershtokh) * timers: handle signed int32 overflow in enroll() (Fedor Indutny) * stream: Fix stall in Transform under very specific conditions (Gil Pedersen) * stream: Handle late 'readable' event listeners (isaacs) * stream: Fix early end in Writables on zero-length writes (isaacs) * domain: fix domain callback from MakeCallback (Trevor Norris) * child_process: don't emit same handle twice (Ben Noordhuis) * child_process: fix sending utf-8 to child process (Ben Noordhuis) ## 2013.03.21, Version 0.10.1 (Stable) https://github.com/iojs/io.js/commit/c274d1643589bf104122674a8c3fd147527a667d * npm: upgrade to 1.2.15 * crypto: Improve performance of non-stream APIs (Fedor Indutny) * tls: always reset this.ssl.error after handling (Fedor Indutny) * tls: Prevent mid-stream hangs (Fedor Indutny, isaacs) * net: improve arbitrary tcp socket support (Ben Noordhuis) * net: handle 'finish' event only after 'connect' (Fedor Indutny) * http: Don't hot-path end() for large buffers (isaacs) * fs: Missing cb errors are deprecated, not a throw (isaacs) * fs: make write/appendFileSync correctly set file mode (Raymond Feng) * stream: Return self from readable.wrap (isaacs) * stream: Never call decoder.end() multiple times (Gil Pedersen) * windows: enable watching signals with process.on('SIGXYZ') (Bert Belder) * node: revert removal of MakeCallback (Trevor Norris) * node: Unwrap without aborting in handle fd getter (isaacs) ## 2013.03.11, Version 0.10.0 (Stable) https://github.com/iojs/io.js/commit/163ca274230fce536afe76c64676c332693ad7c1 * npm: Upgrade to 1.2.14 * core: Append filename properly in dlopen on 
windows (isaacs) * zlib: Manage flush flags appropriately (isaacs) * domains: Handle errors thrown in nested error handlers (isaacs) * buffer: Strip high bits when converting to ascii (Ben Noordhuis) * win/msi: Enable modify and repair (Bert Belder) * win/msi: Add feature selection for various node parts (Bert Belder) * win/msi: use consistent registry key paths (Bert Belder) * child_process: support sending dgram socket (Andreas Madsen) * fs: Raise EISDIR on Windows when calling fs.read/write on a dir (isaacs) * unix: fix strict aliasing warnings, macro-ify functions (Ben Noordhuis) * unix: honor UV_THREADPOOL_SIZE environment var (Ben Noordhuis) * win/tty: fix typo in color attributes enumeration (Bert Belder) * win/tty: don't touch insert mode or quick edit mode (Bert Belder) ## 2013.03.06, Version 0.9.12 (Unstable) https://github.com/iojs/io.js/commit/0debf5a82934da805592b6496756cdf27c993abc * stream: Allow strings in Readable.push/unshift (isaacs) * stream: Remove bufferSize option (isaacs) * stream: Increase highWaterMark on large reads (isaacs) * stream: _write: takes an encoding argument (isaacs) * stream: _transform: remove output() method, provide encoding (isaacs) * stream: Don't require read(0) to emit 'readable' event (isaacs) * node: Add --throw-deprecation (isaacs) * http: fix multiple timeout events (Eugene Girshov) * http: More useful setTimeout API on server (isaacs) * net: use close callback, not process.nextTick (Ben Noordhuis) * net: Provide better error when writing after FIN (isaacs) * dns: Support NAPTR queries (Pavel Lang) * dns: fix ReferenceError in resolve() error path (Xidorn Quan) * child_process: handle ENOENT correctly on Windows (Scott Blomquist) * cluster: Rename destroy() to kill(signal=SIGTERM) (isaacs) * build: define nightly tag external to build system (Timothy J Fontaine) * build: make msi build work when spaces are present in the path (Bert Belder) * build: fix msi build issue with WiX 3.7/3.8 (Raymond Feng) * repl: make compatible with domains (Dave Olszewski) * events: Code cleanup and performance improvements (Trevor Norris) ## 2013.03.01, Version 0.9.11 (Unstable) https://github.com/iojs/io.js/commit/83392403b7a9b7782b37c17688938c75010f81ba * V8: downgrade to 3.14.5 * openssl: update to 1.0.1e * darwin: Make process.title work properly (Ben Noordhuis) * fs: Support mode/flag options to read/append/writeFile (isaacs) * stream: _read() no longer takes a callback (isaacs) * stream: Add stream.unshift(chunk) (isaacs) * stream: remove lowWaterMark feature (isaacs) * net: omit superfluous 'connect' event (Ben Noordhuis) * build, windows: disable SEH (Ben Noordhuis) * core: remove errno global (Ben Noordhuis) * core: Remove the nextTick for running the main file (isaacs) * core: Mark exit() calls with status codes (isaacs) * core: Fix debug signal handler race condition lock (isaacs) * crypto: clear error stack (Ben Noordhuis) * test: optionally set common.PORT via env variable (Timothy J Fontaine) * path: Throw TypeError on non-string args to path.resolve/join (isaacs, Arianit Uka) * crypto: fix uninitialized memory access in openssl (Ben Noordhuis) ## 2013.02.19, Version 0.9.10 (Unstable) * V8: Upgrade to 3.15.11.15 * npm: Upgrade to 1.2.12 * fs: Change default WriteStream config, increase perf (isaacs) * process: streamlining tick callback logic (Trevor Norris) * stream_wrap, udp_wrap: add read-only fd property (Ben Noordhuis) * buffer: accept negative indices in Buffer#slice() (Ben Noordhuis) * tls: Cycle data when underlying socket drains (isaacs) 
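
As a quick, illustrative aside on the `Buffer#slice()` change listed just above for 0.9.10 (a minimal sketch, not part of the original changelog): negative offsets are measured from the end of the buffer, mirroring `Array.prototype.slice()`.

```js
// Sketch: negative indices in Buffer#slice(), per the 0.9.10 entry above.
// Negative start/end values count back from the end of the buffer.
var buf = new Buffer('hello world');        // Buffer(string) constructor style of this era
console.log(buf.slice(-5).toString());      // 'world'
console.log(buf.slice(0, -6).toString());   // 'hello'
```
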
* stream: read(0) should not always trigger _read(n,cb) (isaacs) * stream: Empty strings/buffers do not signal EOF any longer (isaacs) * crypto: improve cipher/decipher error messages (Ben Noordhuis) * net: Respect the 'readable' flag on sockets (isaacs) * net: don't suppress ECONNRESET (Ben Noordhuis) * typed arrays: copy Buffer in typed array constructor (Ben Noordhuis) * typed arrays: make DataView throw on non-ArrayBuffer (Ben Noordhuis) * windows: MSI installer enhancements (Scott Blomquist, Jim Schubert) ## 2013.02.07, Version 0.9.9 (Unstable) https://github.com/iojs/io.js/commit/4b9f0d190cd6b22853caeb0e07145a98ce1d1d7f * tls: port CryptoStream to streams2 (Fedor Indutny) * typed arrays: only share ArrayBuffer backing store (Ben Noordhuis) * stream: make Writable#end() accept a callback function (Nathan Rajlich) * buffer: optimize 'hex' handling (Ben Noordhuis) * dns, cares: don't filter NOTIMP, REFUSED, SERVFAIL (Ben Noordhuis) * readline: treat bare \r as a line ending (isaacs) * readline: make \r\n emit one 'line' event (Ben Noordhuis) * cluster: support datagram sockets (Bert Belder) * stream: Correct Transform class backpressure (isaacs) * addon: Pass module object to NODE_MODULE init function (isaacs, Rod Vagg) * buffer: slow buffer copy compatibility fix (Trevor Norris) * Add bytesWritten to tls.CryptoStream (Andy Burke) ## 2013.01.24, Version 0.9.8 (Unstable) https://github.com/iojs/io.js/commit/5f2f8400f665dc32c3e10e7d31d53d756ded9156 * npm: Upgrade to v1.2.3 * V8: Upgrade to 3.15.11.10 * streams: Support objects other than Buffers (Jake Verbaten) * buffer: remove float write range checks (Trevor Norris) * http: close connection on 304/204 responses with chunked encoding (Ben Noordhuis) * build: fix build with dtrace support on FreeBSD (Fedor Indutny) * console: Support formatting options in trace() (isaacs) * domain: empty stack on all exceptions (Dave Olszewski) * unix, windows: make uv_*_bind() error codes consistent (Andrius Bentkus) * linux: add futimes() fallback (Ben Noordhuis) ## 2013.01.18, Version 0.9.7 (Unstable) https://github.com/iojs/io.js/commit/9e7bebeb8305edd55735a95955a98fdbe47572e5 * V8: Upgrade to 3.15.11.7 * npm: Upgrade to 1.2.2 * punycode: Upgrade to 1.2.0 (Mathias Bynens) * repl: make built-in modules available by default (Felix Böhm) * windows: add support for '_Total' perf counters (Scott Blomquist) * cluster: make --prof work for workers (Ben Noordhuis) * child_process: do not keep list of sent sockets (Fedor Indutny) * tls: Follow RFC6125 more strictly (Fedor Indutny) * buffer: floating point read/write improvements (Trevor Norris) * TypedArrays: Improve dataview perf without endian param (Dean McNamee) * module: assert require() called with a non-empty string (Felix Böhm, James Campos) * stdio: Set readable/writable flags properly (isaacs) * stream: Properly handle large reads from push-streams (isaacs) ## 2013.01.11, Version 0.9.6 (Unstable) https://github.com/iojs/io.js/commit/9313fdc71ca8335d5e3a391c103230ee6219b3e2 * V8: update to 3.15.11.5 * node: remove ev-emul.h (Ben Noordhuis) * path: make basename and extname ignore trailing slashes (Bert Belder) * typed arrays: fix sunos signed/unsigned char issue (Ben Noordhuis) * child_process: Fix {stdio:'inherit'} regression (Ben Noordhuis) * child_process: Fix pipe() from child stdio streams (Maciej Małecki) * child_process: make fork() execPath configurable (Bradley Meck) * stream: Add readable.push(chunk) method (isaacs) * dtrace: x64 ustack helper (Fedor Indutny) * repl: fix floating point 
number parsing (Nirk Niggler)
* repl: allow overriding builtins (Ben Noordhuis)
* net: add localAddress and localPort to Socket (James Hight)
* fs: make pool size coincide with ReadStream bufferSize (Shigeki Ohtsu)
* typed arrays: implement load and store swizzling (Dean McNamee)
* windows: fix perfctr crash on XP and 2003 (Scott Blomquist)
* dgram: fix double implicit bind error (Ben Noordhuis)

## 2012.12.30, Version 0.9.5 (Unstable)

https://github.com/iojs/io.js/commit/01994e8119c24f2284bac0779b32acb49c95bee7

* assert: improve support for new execution contexts (lukebayes)
* domain: use camelCase instead of snake_case (isaacs)
* domain: Do not use uncaughtException handler (isaacs)
* fs: make 'end' work with ReadStream without 'start' (Ben Noordhuis)
* https: optimize createConnection() (Ryunosuke SATO)
* buffer: speed up base64 encoding by 20% (Ben Noordhuis)
* doc: Colorize API stability index headers in docs (Luke Arduini)
* net: socket.readyState corrections (bentaber)
* http: Performance enhancements for http under streams2 (isaacs)
* stream: fix to emit end event on http.ClientResponse (Shigeki Ohtsu)
* stream: fix event handler leak in readstream pipe and unpipe (Andreas Madsen)
* build: Support ./configure --tag switch (Maciej Małecki)
* repl: don't touch `require.cache` (Nathan Rajlich)
* node: Emit 'exit' event when exiting for an uncaught exception (isaacs)

## 2012.12.21, Version 0.9.4 (Unstable)

https://github.com/iojs/io.js/commit/d86d83c75f6343b5368bb7bd328b4466a035e1d4

* streams: Update all streaming interfaces to use new classes (isaacs)
* node: remove idle gc (Ben Noordhuis)
* http: protect against response splitting attacks (Bert Belder)
* fs: Raise error when null bytes detected in paths (isaacs)
* fs: fix 'object is not a function' callback errors (Ben Noordhuis)
* fs: add autoClose=true option to fs.createReadStream (Farid Neshat)
* process: add getgroups(), setgroups(), initgroups() (Ben Noordhuis)
* openssl: optimized asm code on x86 and x64 (Bert Belder)
* crypto: fix leak in GetPeerCertificate (Fedor Indutny)
* add systemtap support (Jan Wynholds)
* windows: add ETW and PerfCounters support (Scott Blomquist)
* windows: fix normalization of UNC paths (Bert Belder)
* crypto: fix ssl error handling (Sergey Kholodilov)
* node: remove eio-emul.h (Ben Noordhuis)
* os: add os.endianness() function (Nathan Rajlich)
* readline: don't emit "line" events with a trailing '\n' char (Nathan Rajlich)
* build: add configure option to generate xcode build files (Timothy J Fontaine)
* build: allow linking against system libuv, cares, http_parser (Stephen Gallagher)
* typed arrays: add slice() support to ArrayBuffer (Anthony Pesch)
* debugger: exit and kill child on SIGTERM or SIGHUP (Fedor Indutny)
* url: url.format escapes delimiters in path and query (J. Lee Coltrane)

## 2012.10.24, Version 0.9.3 (Unstable)

https://github.com/iojs/io.js/commit/1ed4c6776e4f52956918b70565502e0f8869829d

* V8: Upgrade to 3.13.7.4
* crypto: Default to buffers instead of binary strings (isaacs, Fedor Indutny)
* crypto: add getHashes() and getCiphers() (Ben Noordhuis)
* unix: add custom thread pool, remove libeio (Ben Noordhuis)
* util: make `inspect()` accept an "options" argument (Nathan Rajlich)
* https: fix renegotiation attack protection (Ben Noordhuis)
* cluster: make 'listening' handler see actual port (Aaditya Bhatia)
* windows: use USERPROFILE to get the user's home dir (Bert Belder)
* path: add platform specific path delimiter (Paul Serby)
* http: add response.headersSent property (Pavel Lang)
* child_process: make .fork()'d child auto-exit (Ben Noordhuis)
* events: add 'removeListener' event (Ben Noordhuis)
* string_decoder: Add 'end' method, do base64 properly (isaacs)
* buffer: include encoding value in exception when invalid (Ricky Ng-Adam)
* http: make http.ServerResponse no longer emit 'end' (isaacs)
* streams: fix pipe is destructed by 'end' from destination (koichik)

## 2012.09.17, Version 0.9.2 (Unstable)

https://github.com/iojs/io.js/commit/6e2055889091a424fbb5c500bc3ab9c05d1c28b4

* http_parser: upgrade to ad3b631
* openssl: upgrade 1.0.1c
* darwin: use FSEvents to watch directory changes (Fedor Indutny)
* unix: support missing API on NetBSD (Shigeki Ohtsu)
* unix: fix EMFILE busy loop (Ben Noordhuis)
* windows: un-break writable tty handles (Bert Belder)
* windows: map WSAESHUTDOWN to UV_EPIPE (Bert Belder)
* windows: make spawn with custom environment work again (Bert Belder)
* windows: map ERROR_DIRECTORY to UV_ENOENT (Bert Belder)
* tls, https: validate server certificate by default (Ben Noordhuis)
* tls, https: throw exception on missing key/cert (Ben Noordhuis)
* tls: async session storage (Fedor Indutny)
* installer: don't install header files (Ben Noordhuis)
* buffer: implement Buffer.prototype.toJSON() (Nathan Rajlich)
* buffer: added support for writing NaN and Infinity (koichik)
* http: make http.ServerResponse emit 'end' (Ben Noordhuis)
* build: ./configure --ninja (Ben Noordhuis, Timothy J Fontaine)
* installer: fix --without-npm (Ben Noordhuis)
* cli: make -p equivalent to -pe (Ben Noordhuis)
* url: Go much faster by using Url class (isaacs)

## 2012.08.28, Version 0.9.1 (Unstable)

https://github.com/iojs/io.js/commit/e6ce259d2caf338fec991c2dd447de763ce99ab7

* buffer: Add Buffer.isEncoding(enc) to test for valid encoding values (isaacs)
* Raise UV_ECANCELED on premature close.
(Ben Noordhuis) * Remove c-ares from libuv, move to a top-level node dependency (Bert Belder) * ref/unref for all HandleWraps, timers, servers, and sockets (Timothy J Fontaine) * addon: remove node-waf, superseded by node-gyp (Ben Noordhuis) * child_process: emit error on exec failure (Ben Noordhuis) * cluster: do not use internal server API (Andreas Madsen) * constants: add O_DIRECT (Ian Babrou) * crypto: add sync interface to crypto.pbkdf2() (Ben Noordhuis) * darwin: emulate fdatasync() (Fedor Indutny) * dgram: make .bind() always asynchronous (Ben Noordhuis) * events: Make emitter.listeners() side-effect free (isaacs, Joe Andaverde) * fs: Throw early on invalid encoding args (isaacs) * fs: fix naming of truncate/ftruncate functions (isaacs) * http: bubble up parser errors to ClientRequest (Brian White) * linux: improve cpuinfo parser on ARM and MIPS (Ben Noordhuis) * net: add support for IPv6 addresses ending in :: (Josh Erickson) * net: support Server.listen(Pipe) (Andreas Madsen) * node: don't scan add-on for "init" symbol (Ben Noordhuis) * remove process.uvCounters() (Ben Noordhuis) * repl: console writes to repl rather than process stdio (Nathan Rajlich) * timers: implement setImmediate (Timothy J Fontaine) * tls: fix segfault in pummel/test-tls-ci-reneg-attack (Ben Noordhuis) * tools: Move gyp addon tools to node-gyp (Nathan Rajlich) * unix: preliminary signal handler support (Ben Noordhuis) * unix: remove dependency on ev_child (Ben Noordhuis) * unix: work around darwin bug, don't poll() on pipe (Fedor Indutny) * util: Formally deprecate util.pump() (Ben Noordhuis) * windows: make active and closing handle state independent (Bert Belder) * windows: report spawn errors to the exit callback (Bert Belder) * windows: signal handling support with uv_signal_t (Bert Belder) ## 2012.07.20, Version 0.9.0 (Unstable) https://github.com/iojs/io.js/commit/f9b237f478c372fd55e4590d7399dcd8f25f3603 * punycode: update to v1.1.1 (Mathias Bynens) * c-ares: upgrade to 1.9.0 (Saúl Ibarra Corretgé) * dns: ignore rogue DNS servers reported by windows (Saúl Ibarra Corretgé) * unix: speed up uv_async_send() (Ben Noordhuis) * darwin: get cpu model correctly on mac (Xidorn Quan) * nextTick: Handle tick callbacks before any other I/O (isaacs) * Enable color customization of `util.inspect` (Pavel Lang) * tls: Speed and memory improvements (Fedor Indutny) * readline: Use one history item for reentered line (Vladimir Beloborodov) * Fix [#3521](https://github.com/joyent/node/issues/3521) Make process.env more like a regular Object (isaacs) ## 2013.06.13, Version 0.8.25 (maintenance) https://github.com/iojs/io.js/commit/0b9bdb2bc7e1c872f0ea4713517fda22a4b0b202 * npm: Upgrade to 1.2.30 * child_process: fix handle delivery (Ben Noordhuis) ## 2013.06.04, Version 0.8.24 (maintenance) https://github.com/iojs/io.js/commit/c1a1ab067721ea17ef7b05ec5c68b01321017f05 * npm: Upgrade to v1.2.24 * url: Properly parse certain oddly formed urls (isaacs) * http: Don't try to destroy nonexistent sockets (isaacs) * handle_wrap: fix NULL pointer dereference (Ben Noordhuis) ## 2013.04.09, Version 0.8.23 (maintenance) https://github.com/iojs/io.js/commit/c67f8d0500fe15637a623eb759d2ad7eb9fb3b0b * npm: Upgrade to v1.2.18 * http: Avoid EE warning on ECONNREFUSED handling (isaacs) * tls: Re-enable check of CN-ID in cert verification (Tobias Müllerleile) * child_process: fix sending utf-8 to child process (Ben Noordhuis) * crypto: check key type in GetPeerCertificate() (Ben Noordhuis) * win/openssl: mark assembled object files as seh safe 
(Bert Belder)
* windows/msi: fix msi build issue with WiX 3.7/3.8 (Raymond Feng)

## 2013.03.07, Version 0.8.22 (Stable)

https://github.com/iojs/io.js/commit/67a4cb4fe8c2346e30ffb83f7178e205cc2dab33

* npm: Update to 1.2.14
* cluster: propagate bind errors (Ben Noordhuis)
* crypto: don't assert when calling Cipher#final() twice (Ben Noordhuis)
* build, windows: disable SEH (Ben Noordhuis)

## 2013.02.25, Version 0.8.21 (Stable)

https://github.com/iojs/io.js/commit/530d8c05d4c546146f18e5ba811d7eb3b7b7c0c5

* http: Do not free the wrong parser on socket close (isaacs)
* http: Handle hangup writes more gently (isaacs)
* zlib: fix assert on bad input (Ben Noordhuis)
* test: add TAP output to the test runner (Timothy J Fontaine)
* unix: Handle EINPROGRESS from domain sockets (Ben Noordhuis)

## 2013.02.15, Version 0.8.20 (Stable)

https://github.com/iojs/io.js/commit/e10c75579b536581ddd7ae4e2c3bf8a9d550d343

* npm: Upgrade to v1.2.11
* http: Do not let Agent hand out destroyed sockets (isaacs)
* http: Raise hangup error on destroyed socket write (isaacs)
* http: protect against response splitting attacks (Bert Belder)

## 2013.02.06, Version 0.8.19 (Stable)

https://github.com/iojs/io.js/commit/53978bdf420622ff0121c63c0338c9e7c2e60869

* npm: Upgrade to v1.2.10
* zlib: pass object size hint to V8 (Ben Noordhuis)
* zlib: reduce memory consumption, release early (Ben Noordhuis)
* buffer: slow buffer copy compatibility fix (Trevor Norris)
* zlib: don't assert on malformed dictionary (Ben Noordhuis)
* zlib: don't assert on missing dictionary (Ben Noordhuis)
* windows: better ipv6 support (Bert Belder)
* windows: add error mappings related to unsupported protocols (Bert Belder)
* windows: map ERROR_DIRECTORY to UV_ENOENT (Bert Belder)

## 2013.01.18, Version 0.8.18 (Stable)

https://github.com/iojs/io.js/commit/2c4eef0d972838c51999d32c0d251857a713dc18

* npm: Upgrade to v1.2.2
* dns: make error message match errno (Dan Milon)
* tls: follow RFC6125 more strictly (Fedor Indutny)
* buffer: reject negative SlowBuffer offsets (Ben Noordhuis)
* install: add simplejson fallback (Chris Dent)
* http: fix "Cannot call method 'emit' of null" (Ben Noordhuis)

## 2013.01.09, Version 0.8.17 (Stable)

https://github.com/iojs/io.js/commit/c50c33e9397d7a0a8717e8ce7530572907c054ad

* npm: Upgrade to v1.2.0
  - peerDependencies (Domenic Denicola)
  - node-gyp v0.8.2 (Nathan Rajlich)
  - Faster installs from github user/project shorthands (Nathan Zadoks)
* typed arrays: fix 32 bit size/index overflow (Ben Noordhuis)
* http: Improve performance of single-packet responses (Ben Noordhuis)
* install: fix openbsd man page location (Ben Noordhuis)
* http: bubble up parser errors to ClientRequest (Brian White)

## 2012.12.13, Version 0.8.16 (Stable)

https://github.com/iojs/io.js/commit/1c9c6277d5cfcaaac8569c0c8f7daa64292048a9

* npm: Upgrade to 1.1.69
* fs: fix WriteStream/ReadStream fd leaks (Ben Noordhuis)
* crypto: fix leak in GetPeerCertificate (Fedor Indutny)
* buffer: Don't double-negate numeric buffer arg (Trevor Norris)
* net: More accurate IP address validation and IPv6 dotted notation. (Joshua Erickson)

## 2012.11.26, Version 0.8.15 (Stable)

https://github.com/iojs/io.js/commit/fdf91afb494a7a2fff2913d817f589c191a2c88f

* npm: Upgrade to 1.1.66 (isaacs)
* linux: use /proc/cpuinfo for CPU frequency (Ben Noordhuis)
* windows: map WSAESHUTDOWN to UV_EPIPE (Ben Noordhuis)
* windows: map ERROR_GEN_FAILURE to UV_EIO (Bert Belder)
* unix: do not set environ unless one is provided (Charlie McConnell)
* domains: don't crash if domain is set to null (Bert Belder)
* windows: fix the x64 debug build (Bert Belder)
* net, tls: fix connect() resource leak (Ben Noordhuis)

## 2012.10.25, Version 0.8.14 (Stable)

https://github.com/iojs/io.js/commit/b00527fcf05c3d9fb5d5d790f9472906a59fe218

* events: Don't clobber pre-existing _events obj in EE ctor (isaacs)

## 2012.10.25, Version 0.8.13 (Stable)

https://github.com/iojs/io.js/commit/ff4c974873f9a7cc6a5b042eb9b6389bb8dde6d6

* V8: Upgrade to 3.11.10.25
* npm: Upgrade to 1.1.65
* url: parse hostnames that start with - or _ (Ben Noordhuis)
* repl: Fix Windows 8 terminal issue (Bert Belder)
* typed arrays: use signed char for signed int8s (Aaron Jacobs)
* crypto: fix bugs in DiffieHellman (Ben Noordhuis)
* configure: turn on VFPv3 on ARMv7 (Ben Noordhuis)
* Re-enable OpenSSL UI for entering passphrases via tty (Ben Noordhuis)
* repl: ensure each REPL instance gets its own "context" (Nathan Rajlich)

## 2012.10.12, Version 0.8.12 (Stable)

https://github.com/iojs/io.js/commit/38c72d4e29574dec5205bcf23c2a85efe65331a4

* npm: Upgrade to 1.1.63
* crypto: Reduce stability index to 2-Unstable (isaacs)
* windows: fix handle leak in uv_fs_utime (Bert Belder)
* windows: fix application crashed popup in debug version (Bert Belder)
* buffer: report proper retained size in profiler (Ben Noordhuis)
* buffer: fix byteLength with UTF-16LE (koichik)
* repl: make "end of input" JSON.parse() errors throw in the REPL (Nathan Rajlich)
* repl: make invalid RegExp modifiers throw in the REPL (Nathan Rajlich)
* http: handle multiple Proxy-Authenticate values (Willi Eggeling)

## 2012.09.27, Version 0.8.11 (Stable)

https://github.com/iojs/io.js/commit/e1f39468fa580c1e4cb15fac621f87944ee625dc

* fs: Fix stat() size reporting for large files (Ben Noordhuis)

## 2012.09.25, Version 0.8.10 (Stable)

https://github.com/iojs/io.js/commit/0bc273da4fcaa79b209ed755ad249a3e7be626a6

* npm: Upgrade to 1.1.62
* repl: make invalid RegExps throw in the REPL (Nathan Rajlich)
* v8: loosen artificial mmap constraint (Bryan Cantrill)
* process: fix setuid() and setgid() error reporting (Ben Noordhuis)
* domain: Properly exit() on domain disposal (isaacs)
* fs: fix watchFile() missing deletion events (Ben Noordhuis)
* fs: fix assert in fs.watch() (Ben Noordhuis)
* fs: don't segfault on deeply recursive stat() (Ben Noordhuis)
* http: Remove timeout handler when data arrives (Frédéric Germain)
* http: make the client "res" object gets the same domain as "req" (Nathan Rajlich)
* windows: don't blow up when an invalid FD is used (Bert Belder)
* unix: map EDQUOT to UV_ENOSPC (Charlie McConnell)
* linux: improve /proc/cpuinfo parser (Ben Noordhuis)
* win/tty: reset background brightness when color is set to default (Bert Belder)
* unix: put child process stdio fds in blocking mode (Ben Noordhuis)
* unix: fix EMFILE busy loop (Ben Noordhuis)
* sunos: don't set TCP_KEEPALIVE (Ben Noordhuis)
* tls: Use slab allocator for memory management (Fedor Indutny)
* openssl: Use optimized assembly code for x86 and x64 (Bert Belder)

## 2012.09.11, Version 0.8.9 (Stable)
https://github.com/iojs/io.js/commit/b88c3902b241cf934e75443b934f2033ad3915b1 * v8: upgrade to 3.11.10.22 * GYP: upgrade to r1477 * npm: Upgrade to 1.1.61 * npm: Don't create world-writable files (isaacs) * windows: fix single-accept mode for shared server sockets (Bert Belder) * windows: fix uninitialized memory access in uv_update_time() (Bert Belder) * windows: don't throw when a signal handler is attached (Bert Belder) * unix: fix memory leak in udp (Ben Noordhuis) * unix: map errno ESPIPE (Ben Noordhuis) * unix, windows: fix memory corruption in fs-poll.c (Ben Noordhuis) * sunos: fix os.cpus() on x86_64 (Ben Noordhuis) * child process: fix processes with IPC channel don't emit 'close' (Bert Belder) * build: add a "--dest-os" option to force a gyp "flavor" (Nathan Rajlich) * build: set `process.platform` to "sunos" on SunOS (Nathan Rajlich) * build: fix `make -j` fails after `make clean` (Bearice Ren) * build: fix openssl configuration for "arm" builds (Nathan Rajlich) * tls: support unix domain socket/named pipe in tls.connect (Shigeki Ohtsu) * https: make https.get() accept a URL (koichik) * http: respect HTTP/1.0 TE header (Ben Noordhuis) * crypto, tls: Domainify setSNICallback, pbkdf2, randomBytes (Ben Noordhuis) * stream.pipe: Don't call destroy() unless it's a function (isaacs) ## 2012.08.22, Version 0.8.8 (Stable) https://github.com/iojs/io.js/commit/a299c97bbc701f4d460e91214d7bfe7a9589d361 * V8: upgrade to 3.11.10.19 * npm: upgrade to 1.1.59 * windows: fix uninitialized memory access in uv_update_time() (Bert Belder) * unix, windows: fix memory corruption in fs-poll.c (Ben Noordhuis) * unix: fix integer overflow in uv_hrtime (Tim Holy) * sunos: fix uv_cpu_info() on x86_64 (Ben Noordhuis) * tls: update default cipher list (Ben Noordhuis) * unix: Fix llvm and older gcc duplicate symbol warnings (Bert Belder) * fs: fix use after free in stat watcher (Ben Noordhuis) * build: Fix using manually compiled gcc on OS X (Nathan Rajlich) * windows: make junctions work again (Bert Belder) ## 2012.08.15, Version 0.8.7 (Stable) https://github.com/iojs/io.js/commit/f640c5d35cba96634cd8176a525a1d876e361a61 * npm: Upgrade to 1.1.49 * website: download page (Golo Roden) * crypto: fix uninitialized memory access in openssl (Ben Noordhuis) * buffer, crypto: fix buffer decoding (Ben Noordhuis) * build: compile with -fno-tree-vrp when gcc >= 4.0 (Ben Noordhuis) * tls: handle multiple CN fields when verifying cert (Ben Noordhuis) * doc: remove unused util from child_process (Kyle Robinson Young) * build: rework -fvisibility=hidden detection (Ben Noordhuis) * windows: don't duplicate invalid stdio handles (Bert Belder) * windows: fix typos in process-stdio.c (Bert Belder) ## 2012.08.07, Version 0.8.6 (Stable) https://github.com/iojs/io.js/commit/0544a586ca6b6b900a42e164033dbf350765700a * npm: Upgrade to v1.1.48 * Add 'make binary' to build binary tarballs for all Unixes (Nathan Rajlich) * zlib: Emit 'close' on destroy(). 
(Dominic Tarr) * child_process: Fix stdout=null when stdio=['pipe'] (Tyler Neylon) * installer: prevent ETXTBSY errors (Ben Noordhuis) * installer: honor --without-npm, default install path (Ben Noordhuis) * net: make pause work with connecting sockets (Bert Belder) * installer: fix cross-compile installs (Ben Noordhuis) * net: fix .listen({fd:0}) (Ben Noordhuis) * windows: map WSANO_DATA to UV_ENOENT (Bert Belder) ## 2012.08.02, Version 0.8.5 (Stable) https://github.com/iojs/io.js/commit/9b86a4453f0c76f2707a75c0b2343aba33ec63bc * node: tag Encode and friends NODE_EXTERN (Ben Noordhuis) * fs: fix ReadStream / WriteStream missing callback (Gil Pedersen) * fs: fix readFileSync("/proc/cpuinfo") regression (Ben Noordhuis) * installer: don't assume bash is installed (Ben Noordhuis) * Report errors properly from --eval and stdin (isaacs) * assert: fix throws() throws an error without message property (koichik) * cluster: fix libuv assert in net.listen() (Ben Noordhuis) * build: always link sunos builds with libumem (Trent Mick) * build: improve armv7 / hard-float detection (Adam Malcontenti-Wilson) * https: Use host header as effective servername (isaacs) * sunos: work around OS bug to prevent fs.watch() from spinning (Bryan Cantrill) * linux: fix 'two watchers, one path' segfault (Ben Noordhuis) * windows: fix memory leaks in many fs functions (Bert Belder) * windows: don't allow directories to be opened for writing/appending (Bert Belder) * windows: make fork() work even when not all stdio handles are valid (Bert Belder) * windows: make unlink() not remove mount points, and improve performance (Bert Belder) * build: Sign pkg installer for OS X (isaacs) ## 2012.07.25, Version 0.8.4 (Stable) https://github.com/iojs/io.js/commit/f98562fcd7d1cab573ca4dc1612157d6999befd4 * V8: Upgrade to 3.11.10.17 * npm: Upgrade to 1.1.45 * net: fix Socket({ fd: 42 }) api (Ben Noordhuis) * readline: Remove event listeners on close (isaacs) * windows: correctly prep long path for fs.exists(Sync) (Bert Belder) * debugger: wake up the event loop when a debugger command is dispatched (Peter Rybin) * tls: verify server's identity (Fedor Indutny) * net: ignore socket.setTimeout(Infinity or NaN) (Fedor Indutny) ## 2012.07.19, Version 0.8.3 (Stable) https://github.com/iojs/io.js/commit/60bf2d6cb33e4ce55604f73889ab840a9de8bdab * V8: upgrade to 3.11.10.15 * npm: Upgrade to 1.1.43 * net: fix net.Server.listen({fd:x}) error reporting (Ben Noordhuis) * net: fix bogus errno reporting (Ben Noordhuis) * build: Move npm shebang logic into an npm script (isaacs) * build: fix add-on loading on freebsd (Ben Noordhuis) * build: disable unsafe optimizations (Ben Noordhuis) * build: fix spurious mksnapshot crashes for good (Ben Noordhuis) * build: speed up genv8constants (Dave Pacheco) * fs: make unwatchFile() remove a specific listener (Ben Noordhuis) * domain: Remove first arg from intercepted fn (Toshihiro Nakamura) * domain: Fix memory leak on error (isaacs) * events: Fix memory leak from removeAllListeners (Nathan Rajlich) * zlib: Fix memory leak in Unzip class. 
(isaacs)
* crypto: Fix memory leak in DecipherUpdate() (Ben Noordhuis)

## 2012.07.09, Version 0.8.2 (Stable)

https://github.com/iojs/io.js/commit/cc6084b9ac5cf1d4fe5e7165b71e8fc05d11be1f

* npm: Upgrade to 1.1.36
* readline: don't use Function#call() (Nathan Rajlich)
* Code cleanup to pass 'use strict' (Jonas Westerlund)
* module: add filename to require() json errors (TJ Holowaychuk)
* readline: fix for unicode prompts (Tim Macfarlane)
* timers: fix handling of large timeouts (Ben Noordhuis)
* repl: fix passing an empty line inserting "undefined" into the buffer (Nathan Rajlich)
* repl: fix crashes when buffering command (Maciej Małecki)
* build: rename strict_aliasing to node_no_strict_aliasing (Ben Noordhuis)
* build: disable -fstrict-aliasing for any gcc < 4.6.0 (Ben Noordhuis)
* build: detect cc version with -dumpversion (Ben Noordhuis)
* build: handle output of localized gcc or clang (Ben Noordhuis)
* unix: fix memory corruption in freebsd.c (Ben Noordhuis)
* unix: fix 'zero handles, one request' busy loop (Ben Noordhuis)
* unix: fix busy loop on unexpected tcp message (Ben Noordhuis)
* unix: fix EINPROGRESS busy loop (Ben Noordhuis)

## 2012.06.29, Version 0.8.1 (stable)

https://github.com/iojs/io.js/commit/2134aa3d5c622fc3c3b02ccb713fcde0e0df479a

* V8: upgrade to v3.11.10.12
* npm: upgrade to v1.1.33
  - Support for parallel use of the cache folder
  - Retry on registry timeouts or network failures (Trent Mick)
  - Reduce 'engines' failures to a warning
  - Use new zsh completion if available (Jeremy Cantrell)
* Fix [#3577](https://github.com/joyent/node/issues/3577) Un-break require('sys')
* util: speed up formatting of large arrays/objects (Ben Noordhuis)
* windows: make fs.realpath(Sync) work with UNC paths (Bert Belder)
* build: fix --shared-v8 option (Ben Noordhuis)
* doc: `detached` is a boolean (Andreas Madsen)
* build: use proper python interpreter (Ben Noordhuis)
* build: expand ~ in `./configure --prefix=~/a/b/c` (Ben Noordhuis)
* build: handle CC env var with spaces (Gabriel de Perthuis)
* build: fix V8 build when compiling with gcc 4.5 (Ben Noordhuis)
* build: fix --shared-v8 option (Ben Noordhuis)
* windows msi: Fix icon issue which caused huge file size (Bert Belder)
* unix: assume that dlopen() may clobber dlerror() (Ben Noordhuis)
* sunos: fix memory corruption bugs (Ben Noordhuis)
* windows: better (f)utimes and (f)stat (Bert Belder)

## 2012.06.25, Version 0.8.0 (stable)

https://github.com/iojs/io.js/commit/8b8a7a7f9b41e74e1e810d0330738ad06fc302ec

* V8: upgrade to v3.11.10.10
* npm: Upgrade to 1.1.32
* Deprecate iowatcher (Ben Noordhuis)
* windows: update icon (Bert Belder)
* http: Hush 'MUST NOT have a body' warnings to debug() (isaacs)
* Move blog.nodejs.org content into repository (isaacs)
* Fix [#3503](https://github.com/joyent/node/issues/3503): stdin: resume() on pipe(dest) (isaacs)
* crypto: fix error reporting in SetKey() (Fedor Indutny)
* Add --no-deprecation and --trace-deprecation command-line flags (isaacs)
* fs: fix fs.watchFile() (Ben Noordhuis)
* fs: Fix fs.readfile() on pipes (isaacs)
* Rename GYP variable node_use_system_openssl to be consistent (Ryan Dahl)

## 2012.06.19, Version 0.7.12 (unstable)

https://github.com/iojs/io.js/commit/a72120190a8ffdbcd3d6ad2a2e6ceecd2087111e

* npm: Upgrade to 1.1.30
  - Improved 'npm init'
  - Fix the 'cb never called' error from 'outdated' and 'update'
  - Add --save-bundle|-B config
  - Fix isaacs/npm[#2465](https://github.com/joyent/node/issues/2465): Make npm script and windows shims cygwin-aware
  - Fix isaacs/npm[#2452](https://github.com/joyent/node/issues/2452) Use --save(-dev|-optional) in npm rm
  - `logstream` option to replace removed `logfd` (Rod Vagg)
  - Read default descriptions from README.md files
* Shims to support deprecated ev_* and eio_* methods (Ben Noordhuis)
* [#3118](https://github.com/joyent/node/issues/3118) net.Socket: Delay pause/resume until after connect (isaacs)
* [#3465](https://github.com/joyent/node/issues/3465) Add ./configure --no-ifaddrs flag (isaacs)
* child_process: add .stdin stream to forks (Fedor Indutny)
* build: fix `make install DESTDIR=/path` (Ben Noordhuis)
* tls: fix off-by-one error in renegotiation check (Ben Noordhuis)
* crypto: Fix diffie-hellman key generation UTF-8 errors (Fedor Indutny)
* node: change the constructor name of process from EventEmitter to process (Andreas Madsen)
* net: Prevent property access throws during close (Reid Burke)
* querystring: improved speed and code cleanup (Felix Böhm)
* sunos: fix assertion errors breaking fs.watch() (Fedor Indutny)
* unix: stat: detect sub-second changes (Ben Noordhuis)
* add stat() based file watcher (Ben Noordhuis)

## 2012.06.15, Version 0.7.11 (unstable)

https://github.com/iojs/io.js/commit/5cfe0b86d5be266ef51bbba369c39e412ee51944

* V8: Upgrade to v3.11.10
* npm: Upgrade to 1.1.26
* doc: Improve cross-linking in API docs markdown (Ben Kelly)
* Fix [#3425](https://github.com/joyent/node/issues/3425): removeAllListeners should delete array (Reid Burke)
* cluster: don't silently drop messages when the write queue gets big (Bert Belder)
* Add Buffer.concat method (isaacs)
* windows: make symlinks tolerant to forward slashes (Bert Belder)
* build: Add node.d and node.1 to installer (isaacs)
* cluster: rename worker.uniqueID to worker.id (Andreas Madsen)
* Windows: Enable ETW events on Windows for existing DTrace probes.
(Igor Zinkovsky) * test: bundle node-weak in test/gc so that it doesn't need to be downloaded (Nathan Rajlich) * Make many tests pass on Windows (Bert Belder) * Fix [#3388](https://github.com/joyent/node/issues/3388) Support listening on file descriptors (isaacs) * Fix [#3407](https://github.com/joyent/node/issues/3407) Add os.tmpDir() (isaacs) * Unbreak the snapshotted build on Windows (Bert Belder) * Clean up child_process.kill throws (Bert Belder) * crypto: make cipher/decipher accept buffer args (Ben Noordhuis) ## 2012.06.11, Version 0.7.10 (unstable) https://github.com/iojs/io.js/commit/12a32a48a30182621b3f8e9b9695d1946b53c131 * Roll V8 back to 3.9.24.31 * build: x64 target should always pass -m64 (Robert Mustacchi) * add NODE_EXTERN to node::Start (Joel Brandt) * repl: Warn about running npm commands (isaacs) * slab_allocator: fix crash in dtor if V8 is dead (Ben Noordhuis) * slab_allocator: fix leak of Persistent handles (Shigeki Ohtsu) * windows/msi: add node.js prompt to startmenu (Jeroen Janssen) * windows/msi: fix adding node to PATH (Jeroen Janssen) * windows/msi: add start menu links when installing (Jeroen Janssen) * windows: don't install x64 version into the 'program files (x86)' folder (Matt Gollob) * domain: Fix [#3379](https://github.com/joyent/node/issues/3379) domain.intercept no longer passes error arg to cb (Marc Harter) * fs: make callbacks run in global context (Ben Noordhuis) * fs: enable fs.realpath on windows (isaacs) * child_process: expose UV_PROCESS_DETACHED as options.detached (Charlie McConnell) * child_process: new stdio API for .spawn() method (Fedor Indutny) * child_process: spawn().ref() and spawn().unref() (Fedor Indutny) * Upgrade npm to 1.1.25 - Enable npm link on windows - Properly remove sh-shim on Windows - Abstract out registry client and logger ## 2012.05.28, Version 0.7.9 (unstable) https://github.com/iojs/io.js/commit/782277f11a753ded831439ed826448c06fc0f356 * Upgrade V8 to 3.11.1 * Upgrade npm to 1.1.23 * uv: rework reference counting scheme (Ben Noordhuis) * uv: add interface for joining external event loops (Bert Belder) * repl, readline: Handle Ctrl+Z and SIGCONT better (Nathan Rajlich) * fs: 64bit offsets for fs calls (Igor Zinkovsky) * fs: add sync open flags 'rs' and 'rs+' (Kevin Bowman) * windows: enable creating directory junctions with fs.symlink (Igor Zinkovsky, Bert Belder) * windows: fix fs.lstat to properly detect symlinks. (Igor Zinkovsky) * Fix [#3270](https://github.com/joyent/node/issues/3270) Escape url.parse delims (isaacs) * http: make http.get() accept a URL (Adam Malcontenti-Wilson) * Cleanup vm module memory leakage (Marcel Laverdet) * Optimize writing strings with Socket.write (Bert Belder) * add support for CESU-8 and UTF-16LE encodings (koichik) * path: add path.sep to get the path separator. 
(Yi, EungJun) * net, http: add backlog parameter to .listen() (Erik Dubbelboer) * debugger: support mirroring Date objects (Fedor Indutny) * addon: add AtExit() function (Ben Noordhuis) * net: signal localAddress bind failure in connect (Brian Schroeder) * util: handle non-string return value in .inspect() (Alex Kocharin) ## 2012.04.18, Version 0.7.8 (unstable) https://github.com/iojs/io.js/commit/c2b47097c0b483552efc1947c6766fa1128600b6 * Upgrade V8 to 3.9.24.9 * Upgrade OpenSSL to 1.0.0f * Upgrade npm to 1.1.18 * Show licenses in Binary installers * Domains (isaacs) * readline: rename "end" to "close" (Nathan Rajlich) * tcp: make getsockname() return address family as string (Shigeki Ohtsu) * http, https: fix .setTimeout() (ssuda) * os: add cross platform EOL character (Mustansir Golawala) * typed arrays: unexport SizeOfArrayElementForType() (Aaron Jacobs) * net: honor 'enable' flag in .setNoDelay() (Ben Noordhuis) * child_process: emit error when .kill fails (Andreas Madsen) * gyp: fix 'argument list too long' build error (Ben Noordhuis) * fs.WriteStream: Handle modifications to fs.open (isaacs) * repl, readline: Handle newlines better (Nathan Rajlich, Nathan Friedly) * build: target OSX 10.5 when building on darwin (Nathan Rajlich) * Fix [#3052](https://github.com/joyent/node/issues/3052) Handle errors properly in zlib (isaacs) * build: add support for DTrace and postmortem (Dave Pacheco) * core: add reusable Slab allocator (Ben Noordhuis) ## 2012.03.30, Version 0.7.7 (unstable) https://github.com/iojs/io.js/commit/5cda2542fdb086f9fe5de889bea435a65e377dea * Upgrade V8 to 3.9.24.7 * Upgrade npm to 1.1.15 * Handle Emoji characters properly (Erik Corry, Bert Belder) * readline: migrate ansi/vt100 logic from tty to readline (Nathan Rajlich) * readline: Fix multiline handling (Alex Kocharin) * add a -i/--interactive flag to force the REPL (Nathan Rajlich) * debugger: add breakOnException command (Fedor Indutny) * cluster: kill workers when master dies (Andreas Madsen) * cluster: add graceful disconnect support (Andreas Madsen) * child_process: Separate 'close' event from 'exit' (Charlie McConnell) * typed arrays: add Uint8ClampedArray (Mikael Bourges-Sevenier) * buffer: Fix byte alignment issues (Ben Noordhuis, Erik Lundin) * tls: fix CryptoStream.setKeepAlive() (Shigeki Ohtsu) * Expose http parse error codes (Felix Geisendörfer) * events: don't delete the listeners array (Ben Noordhuis, Nathan Rajlich) * process: add process.config to view node's ./configure settings (Nathan Rajlich) * process: process.execArgv to see node's arguments (Micheil Smith) * process: fix process.title setter (Ben Noordhuis) * timers: handle negative or non-numeric timeout values (Ben Noordhuis) ## 2012.03.13, Version 0.7.6 (unstable) https://github.com/iojs/io.js/commit/f06abda6f58e517349d1b63a2cbf5a8d04a03505 * Upgrade v8 to 3.9.17 * Upgrade npm to 1.1.8 - Add support for os/cpu fields in package.json (Adam Blackburn) - Automatically node-gyp packages containing a binding.gyp - Fix failures unpacking in UNC shares - Never create un-listable directories - Handle cases where an optionalDependency fails to build * events: newListener emit correct fn when using 'once' (Roly Fentanes) * url: Ignore empty port component (Łukasz Walukiewicz) * module: replace 'children' array (isaacs) * tls: parse multiple values of a key in ssl certificate (Sambasiva Suda) * cluster: support passing of named pipes (Ben Noordhuis) * Windows: include syscall in fs errors (Bert Belder) * http: 
[#2888](https://github.com/joyent/node/issues/2888) Emit end event only once (Igor Zinkovsky) * readline: add multiline support (Rlidwka) * process: add `process.hrtime()` (Nathan Rajlich) * net, http, https: add localAddress option (Dmitry Nizovtsev) * addon improvements (Nathan Rajlich) * build improvements (Ben Noordhuis, Sadique Ali, T.C. Hollingsworth, Nathan Rajlich) * add support for "SEARCH" request methods (Nathan Rajlich) * expose the zlib and http_parser version in process.versions (Nathan Rajlich) ## 2012.02.23, Version 0.7.5 (unstable) https://github.com/iojs/io.js/commit/d384b8b0d2ab7f05465f0a3e15fe20b4e25b5f86 * startup speed improvements (Maciej Małecki) * crypto: add function getDiffieHellman() (Tomasz Buchert) * buffer: support decoding of URL-safe base64 (Ben Noordhuis) * Make QueryString.parse() even faster (Brian White) * url: decode url entities in auth section (Ben Noordhuis) * http: support PURGE request method (Ben Noordhuis) * http: Generate Date headers on responses (Mark Nottingham) * Fix [#2762](https://github.com/joyent/node/issues/2762): Add callback to close function. (Mikeal Rogers) * dgram: fix out-of-bound memory read (Ben Noordhuis) * repl: add automatic loading of built-in libs (Brandon Benvie) * repl: remove double calls where possible (Fedor Indutny) * Readline improvements. Related: [#2737](https://github.com/joyent/node/issues/2737) [#2756](https://github.com/joyent/node/issues/2756) (Colton Baker) * build: disable -fomit-frame-pointer on solaris (Dave Pacheco) * build: arch detection improvements (Nathan Rajlich) * build: Make a fat binary for the OS X `make pkg`. (Nathan Rajlich) * jslint src/ and lib/ on 'make test' (isaacs) ## 2012.02.14, Version 0.7.4 (unstable) https://github.com/iojs/io.js/commit/de21de920cf93ec40736ada3792a7f85f3eadeda * Upgrade V8 to 3.9.5 * Upgrade npm to 1.1.1 * build: Detect host_arch better (Karl Skomski) * debugger: export `debug_port` to `process` (Fedor Indutny) * api docs: CSS bug fixes (isaacs) * build: use -fPIC for native addons on UNIX (Nathan Rajlich) * Re-add top-level v8::Locker (Marcel Laverdet) * Move images out of the dist tarballs (isaacs) * libuv: Remove uv_export and uv_import (Ben Noordhuis) * build: Support x64 build on Windows (Igor Zinkovsky) ## 2012.02.07, Version 0.7.3 (unstable) https://github.com/iojs/io.js/commit/99059aad8d654acda4abcfaa68df182b50f2ec90 * Upgrade V8 to 3.9.2 * Revert support for isolates. 
(Ben Noordhuis)
* cluster: Cleanup docs, event handling, and process.disconnect (Andreas Madsen)
* gyp_addon: link with node.lib on Windows (Nathan Rajlich)
* http: fix case where http-parser is freed twice (koichik)
* Windows: disable RTTI and exceptions (Bert Belder)

## 2012.02.01, Version 0.7.2 (unstable)

https://github.com/iojs/io.js/commit/ec79acb3a6166e30f0bf271fbbfda1fb575b3321

* Update V8 to 3.8.9
* Support for sharing streams across Isolates (Igor Zinkovsky)
* [#2636](https://github.com/joyent/node/issues/2636) - Fix case where http_parsers are freed too early (koichik)
* url: Support for IPv6 addresses in URLs (Łukasz Walukiewicz)
* child_process: Add disconnect() method to child processes (Andreas Madsen)
* fs: add O_EXCL support, exclusive open file (Ben Noordhuis)
* fs: more specific error messages (Tj Holowaychuk)
* tty: emit 'unknown' key event if key sequence not found (Dan VerWeire, Nathan Rajlich)
* build: compile release build too if BUILDTYPE=Debug (Ben Noordhuis)
* module: fix --debug-brk on symlinked scripts (Fedor Indutny)
* zlib: fix `Failed to set dictionary` issue (Fedor Indutny)
* waf: predict target arch for OS X (Fedor Indutny)

## 2012.01.23, Version 0.7.1 (unstable)

https://github.com/iojs/io.js/commit/a74354735ab5d5b0fa35a1e4ff7e653757d2069b

* Update V8 to 3.8.8
* Install node-waf by default (Fedor Indutny)
* crypto: Add ability to turn off PKCS padding (Ingmar Runge)
* v8: implement VirtualMemory class on SunOS (Ben Noordhuis)
* Add cluster.setupMaster (Andreas Madsen)
* move `path.exists*` to `fs.exists*` (Maciej Małecki)
* typed arrays: set class name (Ben Noordhuis)
* libuv bug fixes (Igor Zinkovsky, Ben Noordhuis, Dan VerWeire)

## 2012.01.16, Version 0.7.0 (unstable)

https://github.com/iojs/io.js/commit/9cc55dca6f67a6096c858b841c677b0593404321

* Upgrade V8 to 3.8.6
* Use GYP build system on unix (Ben Noordhuis)
* Experimental isolates support (Ben Noordhuis)
* Improvements to Cluster API (Andreas Madsen)
* Use isolates for internal debugger (Fedor Indutny)
* Bug fixes

## 2012.07.10 Version 0.6.20 (maintenance)

https://github.com/iojs/io.js/commit/952e513379169ec1b40909d1db056e9bf4294899

* npm: Upgrade to 1.1.37 (isaacs)
* benchmark: Backport improvements made in master (isaacs)
* build: always link with -lz (Trent Mick)
* core: use proper #include directives (Ben Noordhuis)
* cluster: don't silently drop messages when the write queue gets big (Bert Belder)
* windows: don't print error when GetConsoleTitleW returns an empty string (Bert Belder)

## 2012.06.06 Version 0.6.19 (stable)

https://github.com/iojs/io.js/commit/debf552ed2d4a53957446e82ff3c52a8182d5ff4

* npm: upgrade to 1.1.24
* fs: no end emit after createReadStream.pause() (Andreas Madsen)
* vm: cleanup module memory leakage (Marcel Laverdet)
* unix: fix loop starvation under high network load (Ben Noordhuis)
* unix: remove abort() in ev_unref() (Ben Noordhuis)
* windows/tty: never report error after forcibly aborting line-buffered read (Bert Belder)
* windows: skip GetFileAttributes call when opening a file (Bert Belder)

## 2012.05.15 Version 0.6.18 (stable)

https://github.com/iojs/io.js/commit/4bc1d395de6abed2cf1e4d0b7b3a1480a21c368f

* windows: skip GetFileAttributes call when opening a file (Bert Belder)
* crypto: add PKCS12/PFX support (Sambasiva Suda)
* [#3240](https://github.com/joyent/node/issues/3240): child_process: delete NODE_CHANNEL_FD from env in spawn (Ben Noordhuis)
* windows: add test for path.normalize with UNC paths (Bert Belder)
* windows: make path.normalize convert all
slashes to backslashes (Bert Belder) * fs: Automatically close FSWatcher on error (Bert Belder) * [#3258](https://github.com/joyent/node/issues/3258): fs.ReadStream.pause() emits duplicate data event (koichik) * pipe_wrap: don't assert() on pipe accept errors (Ben Noordhuis) * Better exception output for module load and process.nextTick (Felix Geisendörfer) * zlib: fix error reporting (Ben Noordhuis) * http: Don't destroy on timeout (isaacs) * [#3231](https://github.com/joyent/node/issues/3231): http: Don't try to emit error on a null'ed req object (isaacs) * [#3236](https://github.com/joyent/node/issues/3236): http: Refactor ClientRequest.onSocket (isaacs) ## 2012.05.04 Version 0.6.17 (stable) https://github.com/iojs/io.js/commit/4ced23deaf36493f4303a18f6fdce768c58becc0 * Upgrade npm to 1.1.21 * uv: Add support for EROFS errors (Ben Noordhuis, Maciej Małecki) * uv: Add support for EIO and ENOSPC errors (Fedor Indutny) * windows: Add support for EXDEV errors (Bert Belder) * http: Fix client memory leaks (isaacs, Vincent Voyer) * fs: fix file descriptor leak in sync functions (Ben Noordhuis) * fs: fix ReadStream / WriteStream double close bug (Ben Noordhuis) ## 2012.04.30 Version 0.6.16 (stable) https://github.com/iojs/io.js/commit/a1d193963ddc80a27da5da01b59751e14e33d1d6 * Upgrade V8 to 3.6.6.25 * Upgrade npm to 1.1.19 * Windows: add mappings for UV_ENOENT (Bert Belder) * linux: add IN_MOVE_SELF to inotify event mask (Ben Noordhuis) * unix: call pipe handle connection cb on accept() error (Ben Noordhuis) * unix: handle EWOULDBLOCK (Ben Noordhuis) * map EWOULDBLOCK to UV_EAGAIN (Ben Noordhuis) * Map ENOMEM to UV_ENOMEM (isaacs) * Child process: support the `gid` and `uid` options (Bert Belder) * test: cluster: add worker death event test (Ben Noordhuis) * typo in node_http_parser (isaacs) * http_parser: Eat CRLF between requests, even on connection:close. (Ben Noordhuis) * don't check return value of unsetenv (Ben Noordhuis) ## 2012.04.09 Version 0.6.15 (stable) https://github.com/iojs/io.js/commit/f160a45b254e591eb33716311c92be533c6d86c4 * Update npm to 1.1.16 * Show licenses in binary installers. * unix: add uv_fs_read64, uv_fs_write64 and uv_fs_ftruncate64 (Ben Noordhuis) * add 64bit offset fs functions (Igor Zinkovsky) * windows: don't report ENOTSOCK when attempting to bind an udp handle twice (Bert Belder) * windows: backport pipe-connect-to-file fixes from master (Bert Belder) * windows: never call fs event callbacks after closing the watcher (Bert Belder) * fs.readFile: don't make the callback before the fd is closed (Bert Belder) * windows: use 64bit offsets for uv_fs apis (Igor Zinkovsky) * Fix [#2061](https://github.com/joyent/node/issues/2061): segmentation fault on OS X due to stat size mismatch (Ben Noordhuis) ## 2012.03.22 Version 0.6.14 (stable) https://github.com/iojs/io.js/commit/e513ffef7549a56a5af728e1f0c2c0c8f290518a * net: don't crash when queued write fails (Igor Zinkovsky) * sunos: fix EMFILE on process.memoryUsage() (Bryan Cantrill) * crypto: fix compile-time error with openssl 0.9.7e (Ben Noordhuis) * unix: ignore ECONNABORTED errors from accept() (Ben Noordhuis) * Add UV_ENOSPC and mappings to it (Bert Belder) * http-parser: Fix response body is not read (koichik) * Upgrade npm to 1.1.12 - upgrade node-gyp to 0.3.7 - work around AV-locked directories on Windows - Fix isaacs/npm[#2293](https://github.com/joyent/node/issues/2293) Don't try to 'uninstall' / - Exclude symbolic links from packages. 
- Fix isaacs/npm[#2275](https://github.com/joyent/node/issues/2275) Spurious 'unresolvable cycle' error. - Exclude/include dot files as if they were normal files ## 2012.03.15 Version 0.6.13 (stable) https://github.com/iojs/io.js/commit/9f7f86b534f8556290eb8cad915984ff4ca54996 * Windows: Many libuv test fixes (Bert Belder) * Windows: avoid uv_guess_handle crash in when fd < 0 (Bert Belder) * Map EBUSY and ENOTEMPTY errors (Bert Belder) * Windows: include syscall in fs errors (Bert Belder) * Fix fs.watch ENOSYS on Linux kernel version mismatch (Ben Noordhuis) * Update npm to 1.1.9 - upgrade node-gyp to 0.3.5 (Nathan Rajlich) - Fix isaacs/npm[#2249](https://github.com/joyent/node/issues/2249) Add cache-max and cache-min configs - Properly redirect across https/http registry requests - log config usage if undefined key in set function (Kris Windham) - Add support for os/cpu fields in package.json (Adam Blackburn) - Automatically node-gyp packages containing a binding.gyp - Fix failures unpacking in UNC shares - Never create un-listable directories - Handle cases where an optionalDependency fails to build ## 2012.03.02 Version 0.6.12 (stable) https://github.com/iojs/io.js/commit/48a2d34cfe6b7e1c9d15202a4ef5e3c82d1fba35 * Upgrade V8 to 3.6.6.24 * dtrace ustack helper improvements (Dave Pacheco) * API Documentation refactor (isaacs) * [#2827](https://github.com/joyent/node/issues/2827) net: fix race write() before and after connect() (koichik) * [#2554](https://github.com/joyent/node/issues/2554) [#2567](https://github.com/joyent/node/issues/2567) throw if fs args for 'start' or 'end' are strings (AJ ONeal) * punycode: Update to v1.0.0 (Mathias Bynens) * Make a fat binary for the OS X pkg (isaacs) * Fix hang on accessing process.stdin (isaacs) * repl: make tab completion work on non-objects (Nathan Rajlich) * Fix fs.watch on OS X (Ben Noordhuis) * Fix [#2515](https://github.com/joyent/node/issues/2515) nested setTimeouts cause premature process exit (Ben Noordhuis) * windows: fix time conversion in stat (Igor Zinkovsky) * windows: fs: handle EOF in read (Brandon Philips) * windows: avoid IOCP short-circuit on non-ifs lsps (Igor Zinkovsky) * Upgrade npm to 1.1.4 (isaacs) - windows fixes - Bundle nested bundleDependencies properly - install: support --save with url install targets - shrinkwrap: behave properly with url-installed modules - support installing uncompressed tars or single file modules from urls etc. 
- don't run make clean on rebuild - support HTTPS-over-HTTP proxy tunneling ## 2012.02.17 Version 0.6.11 (stable) https://github.com/iojs/io.js/commit/1eb1fe32250fc88cb5b0a97cddf3e02be02e3f4a * http: allow multiple WebSocket RFC6455 headers (Einar Otto Stangvik) * http: allow multiple WWW-Authenticate headers (Ben Noordhuis) * windows: support unicode argv and environment variables (Bert Belder) * tls: mitigate session renegotiation attacks (Ben Noordhuis) * tcp, pipe: don't assert on uv_accept() errors (Ben Noordhuis) * tls: Allow establishing secure connection on the existing socket (koichik) * dgram: handle close of dgram socket before DNS lookup completes (Seth Fitzsimmons) * windows: Support half-duplex pipes (Igor Zinkovsky) * build: disable omit-frame-pointer on solaris systems (Dave Pacheco) * debugger: fix --debug-brk (Ben Noordhuis) * net: fix large file downloads failing (koichik) * fs: fix ReadStream failure to read from existing fd (Christopher Jeffrey) * net: destroy socket on DNS error (Stefan Rusu) * dtrace: add missing translator (Dave Pacheco) * unix: don't flush tty on switch to raw mode (Ben Noordhuis) * windows: reset brightness when reverting to default text color (Bert Belder) * npm: update to 1.1.1 - Update which, fstream, mkdirp, request, and rimraf - Fix [#2123](https://github.com/joyent/node/issues/2123) Set path properly for lifecycle scripts on windows - Mark the root as seen, so we don't recurse into it. Fixes [#1838](https://github.com/joyent/node/issues/1838). (Martin Cooper) ## 2012.02.02, Version 0.6.10 (stable) https://github.com/iojs/io.js/commit/051908e023f87894fa68f5b64d0b99a19a7db01e * Update V8 to 3.6.6.20 * Add npm msysgit bash shim to msi installer (isaacs) * buffers: fix intermittent out of bounds error (Ben Noordhuis) * buffers: honor length argument in base64 decoder (Ben Noordhuis) * windows: Fix path.exists regression (Bert Belder) * Make QueryString.parse run faster (Philip Tellis) * http: avoid freeing http-parser objects too early (koichik) * timers: add v0.4 compatibility hack (Ben Noordhuis) * Proper EPERM error code support (Igor Zinkovsky, Brandon Philips) * dgram: Implement udp multicast methods on windows (Bert Belder) ## 2012.01.27, Version 0.6.9 (stable) https://github.com/iojs/io.js/commit/f19e20d33f57c4d2853aaea7d2724d44f3b0012f * dgram: Bring back missing functionality for Unix (Dan VerWeire, Roman Shtylman, Ben Noordhuis) - Note: Windows UDP support not yet complete. * http: Fix parser memory leak (koichik) * zlib: Fix [#2365](https://github.com/joyent/node/issues/2365) crashes on invalid input (Nicolas LaCasse) * module: fix --debug-brk on symlinked scripts (Fedor Indutny) * Documentation Restyling (Matthew Fitzsimmons) * Update npm to 1.1.0-3 (isaacs) * Windows: fix regression in stat() calls to C:\ (Bert Belder) ## 2012.01.19, Version 0.6.8 (stable) https://github.com/iojs/io.js/commit/d18cebaf8a7ac701dabd71a3aa4eb0571db6a645 * Update V8 to 3.6.6.19 * Numeric key hash collision fix for V8 (Erik Corry, Fedor Indutny) * Add missing TTY key translations for F1-F5 on Windows (Brandon Benvie) * path.extname bugfix with . and .. 
paths (Bert Belder) * cluster: don't always kill the master on uncaughtException (Ben Noordhuis) * Update npm to 1.1.0-2 (isaacs) * typed arrays: set class name (Ben Noordhuis) * zlib binding cleanup (isaacs, Bert Belder) * dgram: use slab memory allocator (Michael Bernstein) * fix segfault [#2473](https://github.com/joyent/node/issues/2473) * [#2521](https://github.com/joyent/node/issues/2521) 60% improvement in fs.stat on Windows (Igor Zinkovsky) ## 2012.01.06, Version 0.6.7 (stable) https://github.com/iojs/io.js/commit/d5a189acef14a851287ee555f7a39431fe276e1c * V8 hash collision fix (Breaks MIPS) (Bert Belder, Erik Corry) * Upgrade V8 to 3.6.6.15 * Upgrade npm to 1.1.0-beta-10 (isaacs) * many doc updates (Ben Noordhuis, Jeremy Martin, koichik, Dave Irvine, Seong-Rak Choi, Shannen, Adam Malcontenti-Wilson, koichik) * Fix segfault in node_http_parser.cc * dgram, timers: fix memory leaks (Ben Noordhuis, Yoshihiro Kikuchi) * repl: fix repl.start not passing the `ignoreUndefined` arg (Damon Oehlman) * [#1980](https://github.com/joyent/node/issues/1980): Socket.pause null reference when called on a closed Stream (koichik) * [#2263](https://github.com/joyent/node/issues/2263): XMLHttpRequest piped in a writable file stream hang (koichik) * [#2069](https://github.com/joyent/node/issues/2069): http resource leak (koichik) * buffer.readInt global pollution fix (Phil Sung) * timers: fix performance regression (Ben Noordhuis) * [#2308](https://github.com/joyent/node/issues/2308), [#2246](https://github.com/joyent/node/issues/2246): node swallows openssl error on request (koichik) * [#2114](https://github.com/joyent/node/issues/2114): timers: remove _idleTimeout from item in .unenroll() (James Hartig) * [#2379](https://github.com/joyent/node/issues/2379): debugger: Request backtrace w/o refs (Fedor Indutny) * simple DTrace ustack helper (Dave Pacheco) * crypto: rewrite HexDecode without snprintf (Roman Shtylman) * crypto: don't ignore DH init errors (Ben Noordhuis) ## 2011.12.14, Version 0.6.6 (stable) https://github.com/iojs/io.js/commit/9a059ea69e1f6ebd8899246682d8ca257610b8ab * npm update to 1.1.0-beta-4 (Isaac Z. Schlueter) * cli: fix output of --help (Ben Noordhuis) * new website * pause/resume semantics for stdin (Isaac Z. Schlueter) * Travis CI integration (Maciej Małecki) * child_process: Fix bug regarding closed stdin (Ben Noordhuis) * Enable upgrades in MSI. 
(Igor Zinkovsky) * net: Fixes memory leak (Ben Noordhuis) * fs: handle fractional or NaN ReadStream buffer size (Ben Noordhuis) * crypto: fix memory leaks in PBKDF2 error path (Ben Noordhuis) ## 2011.12.04, Version 0.6.5 (stable) https://github.com/iojs/io.js/commit/6cc94db653a2739ab28e33b2d6a63c51bd986a9f * npm workaround Windows antivirus software (isaacs) * Upgrade V8 to 3.6.6.11 ## 2011.12.02, Version 0.6.4 (stable) https://github.com/iojs/io.js/commit/9170077f13e5e5475b23d1d3c2e7f69bfe139727 * doc improvements (Kyle Young, Tim Oxley, Roman Shtylman, Mathias Bynens) * upgrade bundled npm (Isaac Schlueter) * polish Windows installer (Igor Zinkovsky, Isaac Schlueter) * punycode: upgrade to v0.2.1 (Mathias Bynens) * build: add –without-npm flag to configure script * sys: deprecate module some more, print stack trace if NODE_DEBUG=sys * cli: add -p switch, prints result of –eval * [#1997](https://github.com/joyent/node/issues/1997): fix Blowfish ECB encryption and decryption (Ingmar Runge) * [#2223](https://github.com/joyent/node/issues/2223): fix socket ‘close’ event being emitted twice * [#2224](https://github.com/joyent/node/issues/2224): fix RSS memory usage > 4 GB reporting (Russ Bradberry) * [#2225](https://github.com/joyent/node/issues/2225): fix util.inspect() object stringification bug (Nathan Rajlich) ## 2011.11.25, Version 0.6.3 (stable) https://github.com/iojs/io.js/commit/b159c6d62e5756d3f8847419d29c6959ea288b56 * [#2083](https://github.com/joyent/node/issues/2083) Land NPM in Node. It is included in packages/installers and installed on `make install`. * [#2076](https://github.com/joyent/node/issues/2076) Add logos to windows installer. * [#1711](https://github.com/joyent/node/issues/1711) Correctly handle http requests without headers. (Ben Noordhuis, Felix Geisendörfer) * TLS: expose more openssl SSL context options and constants. (Ben Noordhuis) * [#2177](https://github.com/joyent/node/issues/2177) Windows: don't kill UDP socket when a packet fails to reach its destination. (Bert Belder) * Windows: support paths longer than 260 characters. (Igor Zinkovsky) * Windows: correctly resolve drive-relative paths. (Bert Belder) * [#2166](https://github.com/joyent/node/issues/2166) Don't leave file descriptor open after lchmod. (Isaac Schlueter) * [#2084](https://github.com/joyent/node/issues/2084) Add OS X .pkg build script to make file. * [#2160](https://github.com/joyent/node/issues/2160) Documentation improvements. 
(Ben Noordhuis) ## 2011.11.18, Version 0.6.2 (stable) https://github.com/iojs/io.js/commit/a4402f0b2e410b19375a1d5c5fb7fe7f66f3c7f8 * doc improvements (Artur Adib, Trevor Burnham, Ryan Emery, Trent Mick) * timers: remember extra setTimeout() arguments when timeout==0 * punycode: use Mathias Bynens's punycode library, it's more compliant * repl: improved tab completion (Ryan Emery) * buffer: fix range checks in .writeInt() functions (Lukasz Walukiewicz) * tls: make cipher list configurable * addons: make Buffer and ObjectWrap visible to Windows add-ons (Bert Belder) * crypto: add PKCS[#1](https://github.com/joyent/node/issues/1) a.k.a RSA public key verification support * windows: fix stdout writes when redirected to nul * sunos: fix build on Solaris and Illumos * Upgrade V8 to 3.6.6.8 ## 2011.11.11, Version 0.6.1 (stable) https://github.com/iojs/io.js/commit/170f2addb2dd0c625bc4a6d461e89a31ad68b79b * doc improvements (Eric Lovett, Ben Noordhuis, Scott Anderson, Yoji SHIDARA) * crypto: make thread-safe (Ben Noordhuis) * fix process.kill error object * debugger: correctly handle source with multi-byte characters (Shigeki Ohtsu) * make stdout and stderr non-destroyable (Igor Zinkovsky) * fs: don't close uninitialized fs.watch handle (Ben Noordhuis) * [#2026](https://github.com/joyent/node/issues/2026) fix man page install on BSDs (Ben Noordhuis) * [#2040](https://github.com/joyent/node/issues/2040) fix unrecognized errno assert in uv_err_name * [#2043](https://github.com/joyent/node/issues/2043) fs: mkdir() should call callback if mode is omitted * [#2045](https://github.com/joyent/node/issues/2045) fs: fix fs.realpath on windows to return on error (Benjamin Pasero) * [#2047](https://github.com/joyent/node/issues/2047) minor cluster improvements * [#2052](https://github.com/joyent/node/issues/2052) readline get window columns correctly * Upgrade V8 to 3.6.6.7 ## 2011.11.04, Version 0.6.0 (stable) https://github.com/iojs/io.js/commit/865b077819a9271a29f982faaef99dc635b57fbc * print undefined on undefined values in REPL (Nathan Rajlich) * doc improvements (koichik, seebees, bnoordhuis, Maciej Małecki, Jacob Kragh) * support native addon loading in windows (Bert Belder) * rename getNetworkInterfaces() to networkInterfaces() (bnoordhuis) * add pending accepts knob for windows (igorzi) * http.request(url.parse(x)) (seebees) * [#1929](https://github.com/joyent/node/issues/1929) zlib Respond to 'resume' events properly (isaacs) * stream.pipe: Remove resume and pause events * test fixes for windows (igorzi) * build system improvements (bnoordhuis) * [#1936](https://github.com/joyent/node/issues/1936) tls: does not emit 'end' from EncryptedStream (koichik) * [#758](https://github.com/joyent/node/issues/758) tls: add address(), remoteAddress/remotePort * [#1399](https://github.com/joyent/node/issues/1399) http: emit Error object after .abort() (bnoordhuis) * [#1999](https://github.com/joyent/node/issues/1999) fs: make mkdir() default to 0777 permissions (bnoordhuis) * [#2001](https://github.com/joyent/node/issues/2001) fix pipe error codes * [#2002](https://github.com/joyent/node/issues/2002) Socket.write should reset timeout timer * stdout and stderr are blocking when associated with file too. 
* remote debugger support on windows (Bert Belder) * convenience methods for zlib (Matt Robenolt) * process.kill support on windows (igorzi) * process.uptime() support on windows (igorzi) * Return IPv4 addresses before IPv6 addresses from getaddrinfo * util.inspect improvements (Nathan Rajlich) * cluster module api changes * Downgrade V8 to 3.6.6.6 ## 2011.10.21, Version 0.5.10 (unstable) https://github.com/iojs/io.js/commit/220e61c1f65bf4db09699fcf6399c0809c0bc446 * Remove cmake build system, support for Cygwin, legacy code base, process.ENV, process.ARGV, process.memoryUsage().vsize, os.openOSHandle * Documentation improvments (Igor Zinkovsky, Bert Belder, Ilya Dmitrichenko, koichik, Maciej Małecki, Guglielmo Ferri, isaacs) * Performance improvements (Daniel Ennis, Bert Belder, Ben Noordhuis) * Long process.title support (Ben Noordhuis) * net: register net.Server callback only once (Simen Brekken) * net: fix connect queue bugs (Ben Noordhuis) * debugger: fix backtrace err handling (Fedor Indutny) * Use getaddrinfo instead of c-ares for dns.lookup * Emit 'end' from crypto streams on close * [#1902](https://github.com/joyent/node/issues/1902) buffer: use NO_NULL_TERMINATION flag (koichik) * [#1907](https://github.com/joyent/node/issues/1907) http: Added support for HTTP PATCH verb (Thomas Parslow) * [#1644](https://github.com/joyent/node/issues/1644) add GetCPUInfo on windows (Karl Skomski) * [#1484](https://github.com/joyent/node/issues/1484), [#1834](https://github.com/joyent/node/issues/1834), [#1482](https://github.com/joyent/node/issues/1482), [#771](https://github.com/joyent/node/issues/771) Don't use a separate context for the repl. (isaacs) * [#1882](https://github.com/joyent/node/issues/1882) zlib Update 'availOutBefore' value, and test (isaacs) * [#1888](https://github.com/joyent/node/issues/1888) child_process.fork: don't modify args (koichik) * [#1516](https://github.com/joyent/node/issues/1516) tls: requestCert unusable with Firefox and Chrome (koichik) * [#1467](https://github.com/joyent/node/issues/1467) tls: The TLS API is inconsistent with the TCP API (koichik) * [#1894](https://github.com/joyent/node/issues/1894) net: fix error handling in listen() (koichik) * [#1860](https://github.com/joyent/node/issues/1860) console.error now goes through uv_tty_t * Upgrade V8 to 3.7.0 * Upgrade GYP to r1081 ## 2011.10.10, Version 0.5.9 (unstable) https://github.com/iojs/io.js/commit/3bd9b08fb125b606f97a4079b147accfdeebb07d * fs.watch interface backed by kqueue, inotify, and ReadDirectoryChangesW (Igor Zinkovsky, Ben Noordhuis) * add dns.resolveTxt (Christian Tellnes) * Remove legacy http library (Ben Noordhuis) * child_process.fork returns and works on Windows. Allows passing handles. (Igor Zinkovsky, Bert Belder) * [#1774](https://github.com/joyent/node/issues/1774) Lint and clean up for --harmony_block_scoping (Tyler Larson, Colton Baker) * [#1813](https://github.com/joyent/node/issues/1813) Fix ctrl+c on Windows (Bert Belder) * [#1844](https://github.com/joyent/node/issues/1844) unbreak --use-legacy (Ben Noordhuis) * process.stderr now goes through libuv. Both process.stdout and process.stderr are blocking when referencing a TTY. 
* net_uv performance improvements (Ben Noordhuis, Bert Belder) ## 2011.09.30, Version 0.5.8 (unstable) https://github.com/iojs/io.js/commit/7cc17a0cea1d25188c103745a7d0c24375e3a609 * zlib bindings (isaacs) * Windows supports TTY ANSI escape codes (Bert Belder) * Debugger improvements (Fedor Indutny) * crypto: look up SSL errors with ERR_print_errors() (Ben Noordhuis) * dns callbacks go through MakeCallback now * Raise an error when a malformed package.json file is found. (Ben Leslie) * buffers: handle bad length argument in constructor (Ben Noordhuis) * [#1726](https://github.com/joyent/node/issues/1726), unref process.stdout * Doc improvements (Ben Noordhuis, Fedor Indutny, koichik) * Upgrade libuv to fe18438 ## 2011.09.16, Version 0.5.7 (unstable) https://github.com/iojs/io.js/commit/558241166c4f3c516e5a448e676db0b57119212f * Upgrade V8 to 3.6.4 * Improve Windows compatibility * Documentation improvements * Debugger and REPL improvements (Fedor Indutny) * Add legacy API support: net.Stream(fd), process.stdout.writable, process.stdout.fd * Fix mkdir EEXIST handling (isaacs) * Use net_uv instead of net_legacy for stdio * Do not load readline from util.inspect * [#1673](https://github.com/joyent/node/issues/1673) Fix bug related to V8 context with accessors (Fedor Indutny) * [#1634](https://github.com/joyent/node/issues/1634) util: Fix inspection for Error (koichik) * [#1645](https://github.com/joyent/node/issues/1645) fs: Add positioned file writing feature to fs.WriteStream (Thomas Shinnick) * [#1637](https://github.com/joyent/node/issues/1637) fs: Unguarded fs.watchFile cache statWatchers checking fixed (Thomas Shinnick) * [#1695](https://github.com/joyent/node/issues/1695) Forward customFds to ChildProcess.spawn * [#1707](https://github.com/joyent/node/issues/1707) Fix hasOwnProperty security problem in querystring (isaacs) * [#1719](https://github.com/joyent/node/issues/1719) Drain OpenSSL error queue ## 2011.09.08, Version 0.5.6 (unstable) https://github.com/iojs/io.js/commit/b49bec55806574a47403771bce1ee379c2b09ca2 * [#345](https://github.com/joyent/node/issues/345), [#1635](https://github.com/joyent/node/issues/1635), [#1648](https://github.com/joyent/node/issues/1648) Documentation improvements (Thomas Shinnick, Abimanyu Raja, AJ ONeal, Koichi Kobayashi, Michael Jackson, Logan Smyth, Ben Noordhuis) * [#650](https://github.com/joyent/node/issues/650) Improve path parsing on windows (Bert Belder) * [#752](https://github.com/joyent/node/issues/752) Remove headers sent check in OutgoingMessage.getHeader() (Peter Lyons) * [#1236](https://github.com/joyent/node/issues/1236), [#1438](https://github.com/joyent/node/issues/1438), [#1506](https://github.com/joyent/node/issues/1506), [#1513](https://github.com/joyent/node/issues/1513), [#1621](https://github.com/joyent/node/issues/1621), [#1640](https://github.com/joyent/node/issues/1640), [#1647](https://github.com/joyent/node/issues/1647) Libuv-related bugs fixed (Jorge Chamorro Bieling, Peter Bright, Luis Lavena, Igor Zinkovsky) * [#1296](https://github.com/joyent/node/issues/1296), [#1612](https://github.com/joyent/node/issues/1612) crypto: Fix BIO's usage. 
(Koichi Kobayashi) * [#1345](https://github.com/joyent/node/issues/1345) Correctly set socket.remoteAddress with libuv backend (Bert Belder) * [#1429](https://github.com/joyent/node/issues/1429) Don't clobber quick edit mode on windows (Peter Bright) * [#1503](https://github.com/joyent/node/issues/1503) Make libuv backend default on unix, override with `node --use-legacy` * [#1565](https://github.com/joyent/node/issues/1565) Fix fs.stat for paths ending with \ on windows (Igor Zinkovsky) * [#1568](https://github.com/joyent/node/issues/1568) Fix x509 certificate subject parsing (Koichi Kobayashi) * [#1586](https://github.com/joyent/node/issues/1586) Make socket write encoding case-insensitive (Koichi Kobayashi) * [#1591](https://github.com/joyent/node/issues/1591), [#1656](https://github.com/joyent/node/issues/1656), [#1657](https://github.com/joyent/node/issues/1657) Implement fs in libuv, remove libeio and pthread-win32 dependency on windows (Igor Zinkovsky, Ben Noordhuis, Ryan Dahl, Isaac Schlueter) * [#1592](https://github.com/joyent/node/issues/1592) Don't load-time link against CreateSymbolicLink on windows (Peter Bright) * [#1601](https://github.com/joyent/node/issues/1601) Improve API consistency when dealing with the socket underlying a HTTP client request (Mikeal Rogers) * [#1610](https://github.com/joyent/node/issues/1610) Remove DigiNotar CA from trusted list (Isaac Schlueter) * [#1617](https://github.com/joyent/node/issues/1617) Added some win32 os functions (Karl Skomski) * [#1624](https://github.com/joyent/node/issues/1624) avoid buffer overrun with 'binary' encoding (Koichi Kobayashi) * [#1633](https://github.com/joyent/node/issues/1633) make Buffer.write() always set _charsWritten (Koichi Kobayashi) * [#1644](https://github.com/joyent/node/issues/1644) Windows: set executables to be console programs (Peter Bright) * [#1651](https://github.com/joyent/node/issues/1651) improve inspection for sparse array (Koichi Kobayashi) * [#1672](https://github.com/joyent/node/issues/1672) set .code='ECONNRESET' on socket hang up errors (Ben Noordhuis) * Add test case for foaf+ssl client certificate (Niclas Hoyer) * Added RPATH environment variable to override run-time library paths (Ashok Mudukutore) * Added TLS client-side session resumption support (Sean Cunningham) * Added additional properties to getPeerCertificate (Nathan Rixham, Niclas Hoyer) * Don't eval repl command twice when an error is thrown (Nathan Rajlich) * Improve util.isDate() (Nathan Rajlich) * Improvements in libuv backend and bindings, upgrade libuv to bd6066cb349a9b3a1b0d87b146ddaee06db31d10 * Show warning when using lib/sys.js (Maciej Malecki) * Support plus sign in url protocol (Maciej Malecki) * Upgrade V8 to 3.6.2 ## 2011.08.26, Version 0.5.5 (unstable) https://github.com/iojs/io.js/commit/d2d53d4bb262f517a227cc178a1648094ba54c20 * typed arrays, implementation from Plesk * fix IP multicast on SunOS * fix DNS lookup order: IPv4 first, IPv6 second (--use-uv only) * remove support for UNIX datagram sockets (--use-uv only) * UDP support for Windows (Bert Belder) * [#1572](https://github.com/joyent/node/issues/1572) improve tab completion for objects in the REPL (Nathan Rajlich) * [#1563](https://github.com/joyent/node/issues/1563) fix buffer overflow in child_process module (reported by Dean McNamee) * [#1546](https://github.com/joyent/node/issues/1546) fix performance regression in http module (reported by Brian Geffon) * [#1491](https://github.com/joyent/node/issues/1491) add PBKDF2 crypto support (Glen Low) * 
[#1447](https://github.com/joyent/node/issues/1447) remove deprecated http.cat() function (Mikeal Rogers)
* [#1140](https://github.com/joyent/node/issues/1140) fix incorrect dispatch of vm.runInContext's filename argument (Antranig Basman)
* [#1140](https://github.com/joyent/node/issues/1140) document vm.runInContext() and vm.createContext() (Antranig Basman)
* [#1428](https://github.com/joyent/node/issues/1428) fix os.freemem() on 64 bits freebsd (Artem Zaytsev)
* [#1164](https://github.com/joyent/node/issues/1164) make all DNS lookups async, fixes uncatchable exceptions (Koichi Kobayashi)
* fix incorrect ssl shutdown check (Tom Hughes)
* various cmake fixes (Tom Hughes)
* improved documentation (Koichi Kobayashi, Logan Smyth, Fedor Indutny, Mikeal Rogers, Maciej Małecki, Antranig Basman, Mickaël Delahaye)
* upgrade libuv to commit 835782a
* upgrade V8 to 3.5.8

## 2011.08.12, Version 0.5.4 (unstable)

https://github.com/iojs/io.js/commit/cfba1f59224ff8602c3fe9145181cad4c6df89a9

* libuv/Windows compatibility improvements
* Build on Microsoft Visual Studio via GYP. Use generate-projects.bat to build sln files. (Peter Bright, Igor Zinkovsky)
* Make Mikeal's HTTP agent client the default. Use old HTTP client with --use-http1
* Fixes https host header default port handling. (Mikeal Rogers)
* [#1440](https://github.com/joyent/node/issues/1440) strip byte order marker when loading *.js and *.json files (Ben Noordhuis)
* [#1434](https://github.com/joyent/node/issues/1434) Improve util.format() compatibility with browser. (Koichi Kobayashi)
* Provide unchecked uint entry points for integer Buffer.read/writeInt methods. (Robert Mustacchi)
* CMake improvements (Tom Hughes)
* Upgrade V8 to 3.5.4.

## 2011.08.01, Version 0.5.3 (unstable)

https://github.com/iojs/io.js/commit/4585330afef44ddfb6a4054bd9b0f190b352628b

* Fix crypto encryption/decryption with Base64. (SAWADA Tadashi)
* [#243](https://github.com/joyent/node/issues/243) Add an optional length argument to Buffer.write() (koichik)
* [#657](https://github.com/joyent/node/issues/657) convert nonbuffer data to string in fs.writeFile/Sync (Daniel Pihlström)
* Add process.features, remove process.useUV (Ben Noordhuis)
* [#324](https://github.com/joyent/node/issues/324) Fix crypto hmac to accept binary keys + add test cases from rfc 2202 and 4231 (Stefan Bühler)
* Add Socket::bytesRead, Socket::bytesWritten (Alexander Uvarov)
* [#572](https://github.com/joyent/node/issues/572) Don't print result of --eval in CLI (Ben Noordhuis)
* [#1223](https://github.com/joyent/node/issues/1223) Fix http.ClientRequest crashes if end() was called twice (koichik)
* [#1383](https://github.com/joyent/node/issues/1383) Emit 'close' after all connections have closed (Felix Geisendörfer)
* Add sprintf-like util.format() function (Ben Noordhuis)
* Add support for TLS SNI (Fedor Indutny)
* New http agent implementation. It is off by default; the command line flag --use-http2 will enable it. "make test-http2" will run the tests for the new implementation. (Mikeal Rogers)
* Revert AMD compatibility. (isaacs)
* Windows: improvements, child_process support.
* Remove pkg-config file.
* Fix startup time regressions.
* doc improvements ## 2011.07.22, Version 0.5.2 (unstable) https://github.com/iojs/io.js/commit/08ffce1a00dde1199174b390a64a90b60768ddf5 * libuv improvements; named pipe support * [#1242](https://github.com/joyent/node/issues/1242) check for SSL_COMP_get_compression_methods() (Ben Noordhuis) * [#1348](https://github.com/joyent/node/issues/1348) remove require.paths (isaacs) * [#1349](https://github.com/joyent/node/issues/1349) Delimit NODE_PATH with ; on Windows (isaacs) * [#1335](https://github.com/joyent/node/issues/1335) Remove EventEmitter from C++ * [#1357](https://github.com/joyent/node/issues/1357) Load json files with require() (isaacs) * [#1374](https://github.com/joyent/node/issues/1374) fix setting ServerResponse.statusCode in writeHead (Trent Mick) * Fixed: GC was being run too often. * Upgrade V8 to 3.4.14 * doc improvements ## 2011.07.14, Version 0.5.1 (unstable) https://github.com/iojs/io.js/commit/f8bfa54d0fa509f9242637bef2869a1b1e842ec8 * [#1233](https://github.com/joyent/node/issues/1233) Fix os.totalmem on FreeBSD amd64 (Artem Zaytsev) * [#1149](https://github.com/joyent/node/issues/1149) IDNA and Punycode support in url.parse (Jeremy Selier, Ben Noordhuis, isaacs) * Export $CC and $CXX to uv and V8's build systems * Include pthread-win32 static libraries in build (Igor Zinkovsky) * [#1199](https://github.com/joyent/node/issues/1199), [#1094](https://github.com/joyent/node/issues/1094) Fix fs can't handle large file on 64bit platform (koichik) * [#1281](https://github.com/joyent/node/issues/1281) Make require a public member of module (isaacs) * [#1303](https://github.com/joyent/node/issues/1303) Stream.pipe returns the destination (Elijah Insua) * [#1229](https://github.com/joyent/node/issues/1229) Addons should not -DEV_MULTIPLICITY=0 (Brian White) * libuv backend improvements * Upgrade V8 to 3.4.10 ## 2011.07.05, Version 0.5.0 (unstable) https://github.com/iojs/io.js/commit/ae7ed8482ea7e53c59acbdf3cf0e0a0ae9d792cd * New non-default libuv backend to support IOCP on Windows. Use --use-uv to enable. * deprecate http.cat * docs improved. 
* add child_process.fork * add fs.utimes() and fs.futimes() support (Ben Noordhuis) * add process.uptime() (Tom Huges) * add path.relative (Tony Huang) * add os.getNetworkInterfaces() * add remoteAddress and remotePort for client TCP connections (Brian White) * add secureOptions flag, setting ciphers, SSL_OP_CRYPTOPRO_TLSEXT_BUG to TLS (Theo Schlossnagle) * add process.arch (Nathan Rajlich) * add reading/writing of floats and doubles from/to buffers (Brian White) * Allow script to be read from stdin * [#477](https://github.com/joyent/node/issues/477) add Buffer::fill method to do memset (Konstantin Käfer) * [#573](https://github.com/joyent/node/issues/573) Diffie-Hellman support to crypto module (Håvard Stranden) * [#695](https://github.com/joyent/node/issues/695) add 'hex' encoding to buffer (isaacs) * [#851](https://github.com/joyent/node/issues/851) Update how REPLServer uses contexts (Ben Weaver) * [#853](https://github.com/joyent/node/issues/853) add fs.lchow, fs.lchmod, fs.fchmod, fs.fchown (isaacs) * [#889](https://github.com/joyent/node/issues/889) Allow to remove all EventEmitter listeners at once (Felix Geisendörfer) * [#926](https://github.com/joyent/node/issues/926) OpenSSL NPN support (Fedor Indutny) * [#955](https://github.com/joyent/node/issues/955) Change ^C handling in REPL (isaacs) * [#979](https://github.com/joyent/node/issues/979) add support for Unix Domain Sockets to HTTP (Mark Cavage) * [#1173](https://github.com/joyent/node/issues/1173) [#1170](https://github.com/joyent/node/issues/1170) add AMD, asynchronous module definition (isaacs) * DTrace probes: support X-Forwarded-For (Dave Pacheco) ## 2011.09.15, Version 0.4.12 (stable) https://github.com/iojs/io.js/commit/771ba34ca7b839add2ef96879e1ffc684813cf7c * Improve docs * [#1563](https://github.com/joyent/node/issues/1563) overflow in ChildProcess custom_fd. * [#1569](https://github.com/joyent/node/issues/1569), parse error on multi-line HTTP headers. (Ben Noordhuis) * [#1586](https://github.com/joyent/node/issues/1586) net: Socket write encoding case sensitivity (koichik) * [#1610](https://github.com/joyent/node/issues/1610) Remove DigiNotar CA from trusted list (isaacs) * [#1624](https://github.com/joyent/node/issues/1624) buffer: Avoid overrun with 'binary' encoding. (koichik) * [#1633](https://github.com/joyent/node/issues/1633) buffer: write() should always set _charsWritten. (koichik) * [#1707](https://github.com/joyent/node/issues/1707) hasOwnProperty usage security hole in querystring (isaacs) * [#1719](https://github.com/joyent/node/issues/1719) Drain OpenSSL error queue * Fix error reporting in net.Server.listen ## 2011.08.17, Version 0.4.11 (stable) https://github.com/iojs/io.js/commit/a745d19ce7d1c0e3778371af4f0346be70cf2c8e * [#738](https://github.com/joyent/node/issues/738) Fix crypto encryption/decryption with Base64. 
(SAWADA Tadashi) * [#1202](https://github.com/joyent/node/issues/1202) net.createConnection defer DNS lookup error events to next tick (Ben Noordhuis) * [#1374](https://github.com/joyent/node/issues/1374) fix setting ServerResponse.statusCode in writeHead (Trent Mick) * [#1417](https://github.com/joyent/node/issues/1417) Fix http.ClientRequest crashes if end() was called twice * [#1497](https://github.com/joyent/node/issues/1497) querystring: Replace 'in' test with 'hasOwnProperty' (isaacs) * [#1546](https://github.com/joyent/node/issues/1546) http perf improvement * fix memleak in libeio (Tom Hughes) * cmake improvements (Tom Hughes) * node_net.cc: fix incorrect sizeof() (Tom Hughes) * Windows/cygwin: no more GetConsoleTitleW errors on XP (Bert Belder) * Doc improvments (koichik, Logan Smyth, Ben Noordhuis, Arnout Kazemier) ## 2011.07.19, Version 0.4.10 (stable) https://github.com/iojs/io.js/commit/1b8dd65d6e3b82b6863ef38835cc436c5d30c1d5 * [#394](https://github.com/joyent/node/issues/394) Fix Buffer drops last null character in UTF-8 * [#829](https://github.com/joyent/node/issues/829) Backport r8577 from V8 (Ben Noordhuis) * [#877](https://github.com/joyent/node/issues/877) Don't wait for HTTP Agent socket pool to establish connections. * [#915](https://github.com/joyent/node/issues/915) Find kqueue on FreeBSD correctly (Brett Kiefer) * [#1085](https://github.com/joyent/node/issues/1085) HTTP: Fix race in abort/dispatch code (Stefan Rusu) * [#1274](https://github.com/joyent/node/issues/1274) debugger improvement (Yoshihiro Kikuchi) * [#1291](https://github.com/joyent/node/issues/1291) Properly respond to HEAD during end(body) hot path (Reid Burke) * [#1304](https://github.com/joyent/node/issues/1304) TLS: Fix race in abort/connection code (Stefan Rusu) * [#1360](https://github.com/joyent/node/issues/1360) Allow _ in url hostnames. * Revert 37d529f8 - unbreaks debugger command parsing. * Bring back global execScript * Doc improvements ## 2011.06.29, Version 0.4.9 (stable) https://github.com/iojs/io.js/commit/de44eafd7854d06cd85006f509b7051e8540589b * Improve documentation * [#1095](https://github.com/joyent/node/issues/1095) error handling bug in stream.pipe() (Felix Geisendörfer) * [#1097](https://github.com/joyent/node/issues/1097) Fix a few leaks in node_crypto.cc (Ben Noordhuis) * [#562](https://github.com/joyent/node/issues/562) [#1078](https://github.com/joyent/node/issues/1078) Parse file:// urls properly (Ryan Petrello) * [#880](https://github.com/joyent/node/issues/880) Option to disable SSLv2 (Jérémy Lal) * [#1087](https://github.com/joyent/node/issues/1087) Disabling SSL compression disabled with early OpenSSLs. * [#1144](https://github.com/joyent/node/issues/1144) debugger: don't allow users to input non-valid commands (Siddharth Mahendraker) * Perf improvement for util.inherits * [#1166](https://github.com/joyent/node/issues/1166) Support for signature verification with RSA/DSA public keys (Mark Cavage) * [#1177](https://github.com/joyent/node/issues/1177) Remove node_modules lookup optimization to better support nested project structures (Mathias Buus) * [#1203](https://github.com/joyent/node/issues/1203) Add missing scope.Close to fs.sendfileSync * [#1187](https://github.com/joyent/node/issues/1187) Support multiple 'link' headers * [#1196](https://github.com/joyent/node/issues/1196) Fix -e/--eval can't load module from node_modules (Koichi Kobayashi) * Upgrade V8 to 3.1.8.25, upgrade http-parser. 
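
The signature verification added in 0.4.9 (#1166 above) is available through `crypto.createVerify()`. A minimal sketch pairing it with `crypto.createSign()`, assuming `private.pem` and `public.pem` are placeholder files holding a PEM-encoded key pair you supply yourself:

    var crypto = require('crypto');
    var fs = require('fs');

    // Placeholder key files -- substitute your own PEM-encoded key pair.
    var privateKey = fs.readFileSync('private.pem', 'ascii');
    var publicKey = fs.readFileSync('public.pem', 'ascii');

    var data = 'message to be signed';

    // Sign with the private key...
    var signer = crypto.createSign('RSA-SHA256');
    signer.update(data);
    var signature = signer.sign(privateKey, 'hex');

    // ...then verify with the corresponding public key.
    var verifier = crypto.createVerify('RSA-SHA256');
    verifier.update(data);
    console.log(verifier.verify(publicKey, signature, 'hex')); // true if untampered
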
## 2011.05.20, Version 0.4.8 (stable) https://github.com/iojs/io.js/commit/7dd22c26e4365698dc3efddf138c4d399cb912c8 * [#974](https://github.com/joyent/node/issues/974) Properly report traceless errors (isaacs) * [#983](https://github.com/joyent/node/issues/983) Better JSON.parse error detection in REPL (isaacs) * [#836](https://github.com/joyent/node/issues/836) Agent socket errors bubble up to req only if req exists * [#1041](https://github.com/joyent/node/issues/1041) Fix event listener leak check timing (koichik) * [#1038](https://github.com/joyent/node/issues/1038) Fix dns.resolve() with 'PTR' throws Error: Unknown type "PTR" (koichik) * [#1073](https://github.com/joyent/node/issues/1073) Share SSL context between server connections (Fedor Indutny) * Disable compression with OpenSSL. Improves memory perf. * Implement os.totalmem() and os.freemem() for SunOS (Alexandre Marangone) * Fix a special characters in URL regression (isaacs) * Fix idle timeouts in HTTPS (Felix Geisendörfer) * SlowBuffer.write() with 'ucs2' throws ReferenceError. (koichik) * http.ServerRequest 'close' sometimes gets an error argument (Felix Geisendörfer) * Doc improvements * cleartextstream.destroy() should close(2) the socket. Previously was being mapped to a shutdown(2) syscall. * No longer compile out asserts and debug statements in normal build. * Debugger improvements. * Upgrade V8 to 3.1.8.16. ## 2011.04.22, Version 0.4.7 (stable) https://github.com/iojs/io.js/commit/c85455a954411b38232e79752d4abb61bb75031b * Don't emit error on ECONNRESET from read() [#670](https://github.com/joyent/node/issues/670) * Fix: Multiple pipes to the same stream were broken [#929](https://github.com/joyent/node/issues/929) (Felix Geisendörfer) * URL parsing/formatting corrections [#954](https://github.com/joyent/node/issues/954) (isaacs) * make it possible to do repl.start('', stream) (Wade Simmons) * Add os.loadavg for SunOS (Robert Mustacchi) * Fix timeouts with floating point numbers [#897](https://github.com/joyent/node/issues/897) (Jorge Chamorro Bieling) * Improve docs. ## 2011.04.13, Version 0.4.6 (stable) https://github.com/iojs/io.js/commit/58002d56bc79410c5ff397fc0e1ffec0665db38a * Don't error on ENOTCONN from shutdown() [#670](https://github.com/joyent/node/issues/670) * Auto completion of built-in debugger suggests prefix match rather than partial match. (koichik) * circular reference in vm modules. 
[#822](https://github.com/joyent/node/issues/822) (Jakub Lekstan) * http response.readable should be false after 'end' [#867](https://github.com/joyent/node/issues/867) (Abe Fettig) * Implement os.cpus() and os.uptime() on Solaris (Scott McWhirter) * fs.ReadStream: Allow omission of end option for range reads [#801](https://github.com/joyent/node/issues/801) (Felix Geisendörfer) * Buffer.write() with UCS-2 should not be write partial char [#916](https://github.com/joyent/node/issues/916) (koichik) * Pass secureProtocol through on tls.Server creation (Theo Schlossnagle) * TLS use RC4-SHA by default * Don't strangely drop out of event loop on HTTPS client uploads [#892](https://github.com/joyent/node/issues/892) * Doc improvements * Upgrade v8 to 3.1.8.10 ## 2011.04.01, Version 0.4.5 (stable) https://github.com/iojs/io.js/commit/787a343b588de26784fef97f953420b53a6e1d73 * Fix listener leak in stream.pipe() (Mikeal Rogers) * Retain buffers in fs.read/write() GH-814 (Jorge Chamorro Bieling) * TLS performance improvements * SlowBuffer.prototype.slice bug GH-843 * process.stderr.write should return true * Immediate pause/resume race condition GH-535 (isaacs) * Set default host header properly GH-721 (isaacs) * Upgrade V8 to 3.1.8.8 ## 2011.03.26, Version 0.4.4 (stable) https://github.com/iojs/io.js/commit/25122b986a90ba0982697b7abcb0158c302a1019 * CryptoStream.end shouldn't throw if not writable GH-820 * Drop out if connection destroyed before connect() GH-819 * expose https.Agent * Correctly setsid in tty.open GH-815 * Bug fix for failed buffer construction * Added support for removing .once listeners (GH-806) * Upgrade V8 to 3.1.8.5 ## 2011.03.18, Version 0.4.3 (stable) https://github.com/iojs/io.js/commit/c095ce1a1b41ca015758a713283bf1f0bd41e4c4 * Don't decrease server connection counter again if destroy() is called more than once GH-431 (Andreas Reich, Anders Conbere) * Documentation improvements (koichik) * Fix bug with setMaxListeners GH-682 * Start up memory footprint improvement. (Tom Hughes) * Solaris improvements. * Buffer::Length(Buffer*) should not invoke itself recursively GH-759 (Ben Noordhuis) * TLS: Advertise support for client certs GH-774 (Theo Schlossnagle) * HTTP Agent bugs: GH-787, GH-784, GH-803. * Don't call GetMemoryUsage every 5 seconds. * Upgrade V8 to 3.1.8.3 ## 2011.03.02, Version 0.4.2 (stable) https://github.com/iojs/io.js/commit/39280e1b5731f3fcd8cc42ad41b86cdfdcb6d58b * Improve docs. * Fix process.on edge case with signal event (Alexis Sellier) * Pragma HTTP header comma separation * In addition to 'aborted' emit 'close' from incoming requests (Felix Geisendörfer) * Fix memleak in vm.runInNewContext * Do not cache modules that throw exceptions (Felix Geisendörfer) * Build system changes for libnode (Aria Stewart) * Read up the prototype of the 'env' object. (Nathan Rajlich) * Add 'close' and 'aborted' events to Agent responses * http: fix missing 'drain' events (Russell Haering) * Fix process.stdout.end() throws ENOTSOCK error. (Koichi Kobayashi) * REPL bug fixes (isaacs) * node_modules folders should be highest priority (isaacs) * URL parse more safely (isaacs) * Expose errno with a string for dns/cares (Felix Geisendörfer) * Fix tty.setWindowSize * spawn: setuid after chdir (isaacs) * SIGUSR1 should break the VM without delay * Upgrade V8 to 3.1.8. 
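
The range-read change in 0.4.6 (#801 above) means `fs.createReadStream()` takes `start`/`end` byte offsets, and `end` may now be omitted to read from `start` through the end of the file. A minimal sketch, with a placeholder file name:

    var fs = require('fs');

    // Read bytes 100 through 199 (offsets are inclusive).
    var ranged = fs.createReadStream('data.bin', { start: 100, end: 199 });

    // With `end` omitted, read from byte 100 to the end of the file.
    var tail = fs.createReadStream('data.bin', { start: 100 });

    tail.on('data', function (chunk) {
      console.log('read ' + chunk.length + ' bytes');
    });
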
## 2011.02.19, Version 0.4.1 (stable) https://github.com/iojs/io.js/commit/e8aef84191bc2c1ba2bcaa54f30aabde7f03769b * Fixed field merging with progressive fields on writeHead() (TJ Holowaychuk) * Make the repl respect node_modules folders (isaacs) * Fix for DNS fail in HTTP request (Richard Rodger) * Default to port 80 for http.request and http.get. * Improve V8 support for Cygwin (Bert Belder) * Fix fs.open param parsing. (Felix Geisendörfer) * Fixed null signal. * Fix various HTTP and HTTPS bugs * cmake improvements (Tom Hughes) * Fix: TLS sockets should not be writable after 'end' * Fix os.cpus() on cygwin (Brian White) * MinGW: OpenSSL support (Bert Belder) * Upgrade V8 to 3.1.5, libev to 4.4. ## 2011.02.10, Version 0.4.0 (stable) https://github.com/iojs/io.js/commit/eb155ea6f6a6aa341aa8c731dca8da545c6a4008 * require() improvements (isaacs) - understand package.json (isaacs) - look for 'node_modules' dir * cmake fixes (Daniel Gröber) * http: fix buffer writes to outgoing messages (Russell Haering) * Expose UCS-2 Encoding (Konstantin Käfer) * Support strings for octal modes (isaacs) * Support array-ish args to Buffer ctor (isaacs) * cygwin and mingw improvements (Bert Belder) * TLS improvements * Fewer syscalls during require (Bert Belder, isaacs) * More DTrace probes (Bryan Cantrill, Robert Mustacchi) * 'pipe' event on pipe() (Mikeal Rogers) * CRL support in TLS (Theo Schlossnagle) * HTTP header manipulation methods (Tim Caswell, Charlie Robbins) * Upgrade V8 to 3.1.2 ## 2011.02.04, Version 0.3.8 (unstable) https://github.com/iojs/io.js/commit/9493b7563bff31525b4080df5aeef09747782d5e * Add req.abort() for client side requests. * Add exception.code for easy testing: Example: if (err.code == 'EADDRINUSE'); * Add process.stderr. * require.main is the main module. (Isaac Schlueter) * dgram: setMulticastTTL, setMulticastLoopback and addMembership. (Joe Walnes) * Fix throttling in TLS connections * Add socket.bufferSize * MinGW improvements (Bert Belder) * Upgrade V8 to 3.1.1 ## 2011.01.27, Version 0.3.7 (unstable) https://github.com/iojs/io.js/commit/d8579c6afdbe868de6dffa8db78bbe4ba2d03e0e * Expose agent in http and https client. (Mikeal Rogers) * Fix bug in http request's end method. (Ali Farhadi) * MinGW: better net support (Bert Belder) * fs.open should set FD_CLOEXEC * DTrace probes (Bryan Cantrill) * REPL fixes and improvements (isaacs, Bert Belder) * Fix many bugs with legacy http.Client interface * Deprecate process.assert. Use require('assert').ok * Add callback parameter to socket.setTimeout(). (Ali Farhadi) * Fixing bug in http request default encoding (Ali Farhadi) * require: A module ID with a trailing slash must be a dir. (isaacs) * Add ext_key_usage to getPeerCertificate (Greg Hughes) * Error when child_process.exec hits maxBuffer. * Fix option parsing in tls.connect() * Upgrade to V8 3.0.10 ## 2011.01.21, Version 0.3.6 (unstable) https://github.com/iojs/io.js/commit/bb3e71466e5240626d9d21cf791fe43e87d90011 * REPL and other improvements on MinGW (Bert Belder) * listen/bind errors should close net.Server * New HTTP and HTTPS client APIs * Upgrade V8 to 3.0.9 ## 2011.01.16, Version 0.3.5 (unstable) https://github.com/iojs/io.js/commit/b622bc6305e3c675e0edfcdbaa387d849ad0bba0 * Built-in debugger improvements. * Add setsid, setuid, setgid options to child_process.spawn (Isaac Schlueter) * tty module improvements. * Upgrade libev to 4.3, libeio to latest, c-ares to 1.7.4 * Allow third party hooks before main module load. 
(See 496be457b6a2bc5b01ec13644b9c9783976159b2) * Don't stat() on cached modules. (Felix Geisendörfer) ## 2011.01.08, Version 0.3.4 (unstable) https://github.com/iojs/io.js/commit/73f53e12e4a5b9ef7dbb4792bd5f8ad403094441 * Primordial mingw build (Bert Belder) * HTTPS server * Built in debugger 'node debug script.js' * realpath files during module load (Mihai Călin Bazon) * Rename net.Stream to net.Socket (existing name will continue to be supported) * Fix process.platform ## 2011.01.02, Version 0.3.3 (unstable) https://github.com/iojs/io.js/commit/57544ba1c54c7d0da890317deeb73076350c5647 * TLS improvements. * url.parse(url, true) defaults query field to {} (Jeremy Martin) * Upgrade V8 to 3.0.4 * Handle ECONNABORT properly (Theo Schlossnagle) * Fix memory leaks (Tom Hughes) * Add os.cpus(), os.freemem(), os.totalmem(), os.loadavg() and other functions for OSX, Linux, and Cygwin. (Brian White) * Fix REPL syntax error bug (GH-543), improve how REPL commands are evaluated. * Use process.stdin instead of process.openStdin(). * Disable TLS tests when node doesn't have OpenSSL. ## 2010.12.16, Version 0.3.2 (unstable) https://github.com/iojs/io.js/commit/4bb914bde9f3c2d6de00853353b6b8fc9c66143a * Rip out the old (broken) TLS implementation introduce new tested implementation and API. See docs. HTTPS not supported in this release. * Introduce 'os' and 'tty' modules. * Callback parameters for socket.write() and socket.connect(). * Support CNAME lookups in DNS module. (Ben Noordhuis) * cmake support (Tom Hughes) * 'make lint' * oprofile support (./configure --oprofile) * Lots of bug fixes, including: - Memory leak in ChildProcess:Spawn(). (Tom Hughes) - buffer.slice(0, 0) - Global variable leaks - clearTimeouts calling multiple times (Michael W) - utils.inspect's detection of circular structures (Tim Cooijmans) - Apple's threaded write()s bug (Jorge Chamorro Bieling) - Make sure raw mode is disabled when exiting a terminal-based REPL. (Brian White) * Deprecate process.compile, process.ENV * Upgrade V8 to 3.0.3, upgrade http-parser. ## 2010.11.16, Version 0.3.1 (unstable) https://github.com/iojs/io.js/commit/ce9a54aa1fbf709dd30316af8a2f14d83150e947 * TLS improvements (Paul Querna) - Centralize error handling in SecureStream - Add SecurePair for handling of a ssl/tls stream. * New documentation organization (Micheil Smith) * allowHalfOpen TCP connections disabled by default. * Add C++ API for constructing fast buffer from string * Move idle timers into its own module * Gracefully handle EMFILE and server.maxConnections * make "node --eval" eval in the global scope. (Jorge Chamorro Bieling) * Let exit listeners know the exit code (isaacs) * Handle cyclic links smarter in fs.realpath (isaacs) * Remove node-repl (just use 'node' without args) * Rewrite libeio After callback to use req->result instead of req->errorno for error checking (Micheil Smith) * Remove warning about deprecating 'sys' - too aggressive * Make writes to process.env update the real environment. (Ben Noordhuis) * Set FD_CLOEXEC flag on stdio FDs before spawning. (Guillaume Tuton) * Move ev_loop out of javascript * Switch \n with \r\n for all strings printed out. * Added support for cross compilation (Rasmus Andersson) * Add --profile flag to configure script, enables gprof profiling. (Ben Noordhuis) * writeFileSync could exhibit pathological behavior when a buffer could not be written to the file in a single write() call. * new path.join behavior (isaacs) - Express desired path.join behavior in tests. 
- Update fs.realpath to reflect new path.join behavior - Update url.resolve() to use new path.join behavior. * API: Move process.binding('evals') to require('vm') * Fix V8 build on Cygwin (Bert Belder) * Add ref to buffer during fs.write and fs.read * Fix segfault on test-crypto * Upgrade http-parser to latest and V8 to 2.5.3 ## 2010.10.23, Version 0.3.0 (unstable) https://github.com/iojs/io.js/commit/1582cfebd6719b2d2373547994b3dca5c8c569c0 * Bugfix: Do not spin on accept() with EMFILE * Improvements to readline.js (Trent Mick, Johan Euphrosine, Brian White) * Safe constructors (missing 'new' doesn't segfault) * Fix process.nextTick so thrown errors don't confuse it. (Benjamin Thomas) * Allow Strings for ports on net.Server.listen (Bradley Meck) * fs bugfixes (Tj Holowaychuk, Tobie Langel, Marco Rogers, isaacs) * http bug fixes (Fedor Indutny, Mikeal Rogers) * Faster buffers; breaks C++ API (Tim-Smart, Stéphan Kochen) * crypto, tls improvements (Paul Querna) * Add lfs flags to node addon script * Simpler querystring parsing; breaks API (Peter Griess) * HTTP trailers (Mark Nottingham) * http 100-continue support (Mark Nottingham) * Module system simplifications (Herbert Vojčík, isaacs, Tim-Smart) - remove require.async - remove registerExtension, add .extensions - expose require.resolve - expose require.cache - require looks in node_modules folders * Add --eval command line option (TJ Holowaychuk) * Commas last in sys.inspect * Constants moved from process object to require('constants') * Fix parsing of linux memory (Vitali Lovich) * inspect shows function names (Jorge Chamorro Bieling) * uncaughtException corner cases (Felix Geisendörfer) * TCP clients now buffer writes before connection * Rename sys module to 'util' (Micheil Smith) * Properly set stdio handlers to blocking on SIGTERM and SIGINT (Tom Hughes) * Add destroy methods to HTTP messages * base64 improvements (isaacs, Jorge Chamorro Bieling) * API for defining REPL commands (Sami Samhuri) * child_process.exec timeout fix (Aaron Heckmann) * Upgrade V8 to 2.5.1, Libev to 4.00, libeio, http-parser ## 2010.08.20, Version 0.2.0 https://github.com/iojs/io.js/commit/9283e134e558900ba89d9a33c18a9bdedab07cb9 * process.title support for FreeBSD, Macintosh, Linux * Fix OpenSSL 100% CPU usage on error (Illarionov Oleg) * Implement net.Server.maxConnections. * Fix process.platform, add process.version. * Add --without-snapshot configure option. * Readline REPL improvements (Trent Mick) * Bug fixes. * Upgrade V8 to 2.3.8 ## 2010.08.13, Version 0.1.104 https://github.com/iojs/io.js/commit/b14dd49222687c12f3e8eac597cff4f2674f84e8 * Various bug fixes (console, querystring, require) * Set cwd for child processes (Bert Belder) * Tab completion for readline (Trent Mick) * process.title getter/setter for OSX, Linux, Cygwin. (Rasmus Andersson, Bert Belder) * Upgrade V8 to 2.3.6 ## 2010.08.04, Version 0.1.103 https://github.com/iojs/io.js/commit/0b925d075d359d03426f0b32bb58a5e05825b4ea * Implement keep-alive for http.Client (Mikeal Rogers) * base64 fixes. (Ben Noordhuis) * Fix --debug-brk (Danny Coates) * Don't let path.normalize get above the root. (Isaac Schlueter) * Allow signals to be used with process.on in addition to process.addListener. 
(Brian White) * Globalize the Buffer object * Use kqueue on recent macintosh builds * Fix addrlen for unix_dgram sockets (Benjamin Kramer) * Fix stats.isDirectory() and friends (Benjamin Kramer) * Upgrade http-parser, V8 to 2.3.5 ## 2010.07.25, Version 0.1.102 https://github.com/iojs/io.js/commit/2a4568c85f33869c75ff43ccd30f0ec188b43eab * base64 encoding for Buffers. * Buffer support for Cipher, Decipher, Hmac, Sign and Verify (Andrew Naylor) * Support for reading byte ranges from files using fs.createReadStream. (Chandra Sekar) * Fix Buffer.toString() on 0-length slices. (Peter Griess) * Cache modules based on filename rather than ID (Isaac Schlueter) * querystring improvments (Jan Kassens, Micheil Smith) * Support DEL in the REPL. (Jérémy Lal) * Upgrade http-parser, upgrade V8 to 2.3.2 ## 2010.07.16, Version 0.1.101 https://github.com/iojs/io.js/commit/0174ceb6b24caa0bdfc523934c56af9600fa9b58 * Added env to child_process.exec (Сергей Крыжановский) * Allow modules to optionally be loaded in separate contexts with env var NODE_MODULE_CONTEXTS=1. * setTTL and setBroadcast for dgram (Matt Ranney) * Use execPath for default NODE_PATH, not installPrefix (Isaac Schlueter) * Support of console.dir + console.assert (Jerome Etienne) * on() as alias to addListener() * Use javascript port of Ronn to build docs (Jérémy Lal) * Upgrade V8 to 2.3.0 ## 2010.07.03, Version 0.1.100 https://github.com/iojs/io.js/commit/a6b8586e947f9c3ced180fe68c233d0c252add8b * process.execPath (Marshall Culpepper) * sys.pump (Mikeal Rogers) * Remove ini and mjsunit libraries. * Introduce console.log() and friends. * Switch order of arguments for Buffer.write (Blake Mizerany) * On overlapping buffers use memmove (Matt Ranney) * Resolve .local domains with getaddrinfo() * Upgrade http-parser, V8 to 2.2.21 ## 2010.06.21, Version 0.1.99 https://github.com/iojs/io.js/commit/a620b7298f68f68a855306437a3b60b650d61d78 * Datagram sockets (Paul Querna) * fs.writeFile could not handle utf8 (Felix Geisendörfer) and now accepts Buffers (Aaron Heckmann) * Fix crypto memory leaks. * A replacement for decodeURIComponent that doesn't throw. (Isaac Schlueter) * Only concatenate some incoming HTTP headers. (Peter Griess) * Upgrade V8 to 2.2.18 ## 2010.06.11, Version 0.1.98 https://github.com/iojs/io.js/commit/10d8adb08933d1d4cea60192c2a31c56d896733d * Port to Windows/Cygwin (Raffaele Sena) * File descriptor passing on unix sockets. (Peter Griess) * Simple, builtin readline library. REPL is now entered by executing "node" without arguments. * Add a parameter to spawn() that sets the child's stdio file descriptors. (Orlando Vazquez) * Upgrade V8 to 2.2.16, http-parser fixes, upgrade c-ares to 1.7.3. ## 2010.05.29, Version 0.1.97 https://github.com/iojs/io.js/commit/0c1aa36835fa6a3557843dcbc6ed6714d353a783 * HTTP throttling: outgoing messages emit 'drain' and write() returns false when send buffer is full. * API: readFileSync without encoding argument now returns a Buffer * Improve Buffer C++ API; addons now compile with debugging symbols. * Improvements to path.extname() and REPL; add fs.chown(). * fs.ReadStream now emits buffers, fs.readFileSync returns buffers. * Bugfix: parsing HTTP responses to HEAD requests. * Port to OpenBSD. * Upgrade V8 to 2.2.12, libeio, http-parser. ## 2010.05.21, Version 0.1.96 https://github.com/iojs/io.js/commit/9514a4d5476225e8c8310ce5acae2857033bcaaa * Thrown errors in http and socket call back get bubbled up. 
* Add fs.fsync (Andrew Johnston)
* Bugfix: signal unregistering (Jonas Pfenniger)
* Added better error messages for async and sync fs calls with paths (TJ Holowaychuk)
* Support arrays and strings in buffer constructor. (Felix Geisendörfer)
* Fix errno reporting in DNS exceptions.
* Support buffers in fs.WriteStream.write.
* Bugfix: Safely decode utf8 streams that are broken on a multibyte character (http and net). (Felix Geisendörfer)
* Make Buffer's C++ constructor public.
* Deprecate sys.p()
* FIX path.dirname('/tmp') => '/'. (Jonathan Rentzsch)

## 2010.05.13, Version 0.1.95

https://github.com/iojs/io.js/commit/0914d33842976c2c870df06573b68f9192a1fb7a

* Change GC idle notify so that it runs alongside setInterval
* Install node_buffer.h on make install
* fs.readFile returns Buffer by default (Tim Caswell)
* Fix error reporting in child_process callbacks
* Better logic for testing if an argument is a port
* Improve error reporting (single line "node.js:176:9" errors)
* Bugfix: Some http responses being truncated (appeared in 0.1.94)
* Fix long standing net idle timeout bugs. Enable 2 minute timeout by default in HTTP servers.
* Add fs.fstat (Ben Noordhuis)
* Upgrade to V8 2.2.9

## 2010.05.06, Version 0.1.94

https://github.com/iojs/io.js/commit/f711d5343b29d1e72e87107315708e40951a7826

* Look in /usr/local/lib/node for modules, so that there's a way to install modules globally (Isaac Schlueter)
* SSL improvements (Rhys Jones, Paulo Matias)
* Added c-ares headers for linux-arm (Jonathan Knezek)
* Add symbols to release build
* HTTP upgrade improvements, docs (Micheil Smith)
* HTTP server emits 'clientError' instead of printing message
* Bugfix: Don't emit 'error' twice from http.Client
* Bugfix: Ignore SIGPIPE
* Bugfix: destroy() instead of end() http connection at end of pipeline
* Bugfix: http.Client may be prematurely released back to the free pool. (Thomas Lee)
* Upgrade V8 to 2.2.8

## 2010.04.29, Version 0.1.93

https://github.com/iojs/io.js/commit/557ba6bd97bad3afe0f9bd3ac07efac0a39978c1

* Fixed no 'end' event on long chunked HTTP messages https://github.com/joyent/node/issues/77
* Remove legacy modules http_old and tcp_old
* Support DNS MX queries (Jérémy Lal)
* Fix large socket write (tlb@tlb.org)
* Fix child process exit codes (Felix Geisendörfer)
* Allow callers to disable PHP/Rails style parameter munging in querystring.stringify (Thomas Lee)
* Upgrade V8 to 2.2.6

## 2010.04.23, Version 0.1.92

https://github.com/iojs/io.js/commit/caa828a242f39b6158084ef4376355161c14fe34

* OpenSSL support. Still undocumented (see tests). (Rhys Jones)
* API: Unhandled 'error' events throw.
* Script class with eval-function-family in binding('evals') plus tests. (Herbert Vojcik)
* stream.setKeepAlive (Julian Lamb)
* Bugfix: Force no body on http 204 and 304
* Upgrade Waf to 1.5.16, V8 to 2.2.4.2

## 2010.04.15, Version 0.1.91

https://github.com/iojs/io.js/commit/311d7dee19034ff1c6bc9098c36973b8d687eaba

* Add incoming.httpVersion
* Object.prototype problem with C-Ares binding
* REPL can be run from multiple different streams. (Matt Ranney)
* After V8 heap is compact, don't use a timer every 2 seconds.
* Improve nextTick implementation.
* Add primitive support for Upgrading HTTP connections. (See commit log for docs 760bba5)
* Add timeout and maxBuffer options to child_process.exec
* Fix bugs.
* Upgrade V8 to 2.2.3.1

## 2010.04.09, Version 0.1.90

https://github.com/iojs/io.js/commit/07e64d45ffa1856e824c4fa6afd0442ba61d6fd8

* Merge writing of networking system (net2)
  - New Buffer object for binary data.
  - Support UNIX sockets, Pipes
  - Uniform stream API
  - Currently no SSL
  - Legacy modules can be accessed at 'http_old' and 'tcp_old'
* Replace udns with c-ares. (Krishna Rajendran)
* New documentation system using Markdown and Ronn (Tim Caswell, Micheil Smith)
* Better idle-time GC
* Countless small bug fixes.
* Upgrade V8 to 2.2.X, WAF 1.5.15

## 2010.03.19, Version 0.1.33

https://github.com/iojs/io.js/commit/618296ef571e873976f608d91a3d6b9e65fe8284

* Include lib/ directory in node executable. Compile on demand.
* evalcx clean ups (Isaac Z. Schlueter, Tim-Smart)
* Various fixes, clean ups
* V8 upgraded to 2.1.5

## 2010.03.12, Version 0.1.32

https://github.com/iojs/io.js/commit/61c801413544a50000faa7f58376e9b33ba6254f

* Optimize event emitter for single listener
* Add process.evalcx, require.registerExtension (Tim Smart)
* Replace --cflags with --vars
* Fix bugs in fs.create*Stream (Felix Geisendörfer)
* Deprecate process.mixin, process.unloop
* Remove the 'Error: (no message)' exceptions, print stack trace instead
* INI parser bug fixes (Isaac Schlueter)
* FreeBSD fixes (Vanilla Hsu)
* Upgrade to V8 2.1.3, WAF 1.5.14a, libev

## 2010.03.05, Version 0.1.31

https://github.com/iojs/io.js/commit/39b63dfe1737d46a8c8818c92773ef181fd174b3

* API:
  - Move process.watchFile into fs module
  - Move process.inherits to sys
* Improve Solaris port
* tcp.Connection.prototype.write now returns boolean to indicate if argument was flushed to the kernel buffer.
* Added fs.link, fs.symlink, fs.readlink, fs.realpath (Rasmus Andersson)
* Add setgid, getgid (James Duncan)
* Improve sys.inspect (Benjamin Thomas)
* Allow passing env to child process (Isaac Schlueter)
* fs.createWriteStream, fs.createReadStream (Felix Geisendörfer)
* Add INI parser (Rob Ellis)
* Bugfix: fs.readFile handling encoding (Jacek Becela)
* Upgrade V8 to 2.1.2

## 2010.02.22, Version 0.1.30

https://github.com/iojs/io.js/commit/bb0d1e65e1671aaeb21fac186b066701da0bc33b

* Major API Changes
  - Promises removed. See
    http://groups.google.com/group/nodejs/msg/426f3071f3eec16b
    http://groups.google.com/group/nodejs/msg/df199d233ff17efa
    The API for fs was

        fs.readdir("/usr").addCallback(function (files) {
          puts("/usr files: " + files);
        });

    It is now

        fs.readdir("/usr", function (err, files) {
          if (err) throw err;
          puts("/usr files: " + files);
        });

  - Synchronous fs operations exposed, use with care.
  - tcp.Connection.prototype.readPause() and readResume() renamed to pause() and resume()
  - http.ServerResponse.prototype.sendHeader() renamed to writeHeader(). Now accepts reasonPhrase.
* Compact garbage on idle.
* Configurable debug ports, and --debug-brk (Zoran Tomicic)
* Better command line option parsing (Jeremy Ashkenas)
* Add fs.chmod (Micheil Smith), fs.lstat (Isaac Z. Schlueter)
* Fixes to process.mixin (Rasmus Andersson, Benjamin Thomas)
* Upgrade V8 to 2.1.1

## 2010.02.17, Version 0.1.29

https://github.com/iojs/io.js/commit/87d5e5b316a4276bcf881f176971c1a237dcdc7a

* Major API Changes
  - Remove 'file' module
  - require('posix') -----------------> require('fs')
  - fs.cat ---------------------------> fs.readFile
  - file.write -----------------------> fs.writeFile
  - TCP 'receive' event --------------> 'data'
  - TCP 'eof' event ------------------> 'end'
  - TCP send() -----------------------> write()
  - HTTP sendBody() ------------------> write()
  - HTTP finish() --------------------> close()
  - HTTP 'body' event ----------------> 'data'
  - HTTP 'complete' event ------------> 'end'
  - http.Client.prototype.close() (formerly finish()) no longer takes an argument.
    Add the 'response' listener manually.
  - Allow strings for the flag argument to fs.open ("r", "r+", "w", "w+", "a", "a+")
* Added multiple arg support for sys.puts(), print(), etc. (tj@vision-media.ca)
* sys.inspect(Date) now shows the date value (Mark Hansen)
* Calculate page size with getpagesize for armel (Jérémy Lal)
* Bugfix: stderr flushing.
* Bugfix: Promise late chain (Yuichiro MASUI)
* Bugfix: wait() on fired promises (Felix Geisendörfer, Jonas Pfenniger)
* Bugfix: Use InstanceTemplate() instead of PrototypeTemplate() for accessor methods. Was causing a crash with Eclipse debugger. (Zoran Tomicic)
* Bugfix: Throw from connection.connect if resolving. (Reported by James Golick)

## 2010.02.09, Version 0.1.28

https://github.com/iojs/io.js/commit/49de41ef463292988ddacfb01a20543b963d9669

* Use Google's jsmin.py which can be used for evil.
* Add posix.truncate()
* Throw errors from server.listen()
* stdio bugfix (test by Mikeal Rogers)
* Module system refactor (Felix Geisendörfer, Blaine Cook)
* Add process.setuid(), getuid() (Michael Carter)
* sys.inspect refactor (Tim Caswell)
* Multipart library rewrite (isaacs)

## 2010.02.03, Version 0.1.27

https://github.com/iojs/io.js/commit/0cfa789cc530848725a8cb5595224e78ae7b9dd0

* Implemented __dirname (Felix Geisendörfer)
* Downcase process.ARGV, process.ENV, GLOBAL (now process.argv, process.env, global)
* Bug Fix: Late promise callbacks firing (Felix Geisendörfer, Jonas Pfenniger)
* Make assert.AssertionError instance of Error
* Removed inline require call for querystring (self@cloudhead.net)
* Add support for MX, TXT, and SRV records in DNS module. (Blaine Cook)
* Bugfix: HTTP client automatically reconnecting
* Adding OS X .dmg build scripts. (Standa Opichal)
* Bugfix: ObjectWrap memory leak
* Bugfix: Multipart handle Content-Type headers with charset (Felix Geisendörfer)
* Upgrade http-parser to fix header overflow attack.
* Upgrade V8 to 2.1.0
* Various other bug fixes, performance improvements.

## 2010.01.20, Version 0.1.26

https://github.com/iojs/io.js/commit/da00413196e432247346d9e587f8c78ce5ceb087

* Bugfix: HTTP eof causing crash (Ben Williamson)
* Better error message on SyntaxError
* API: Move Promise and EventEmitter into 'events' module
* API: Add process.nextTick()
* Allow optional params to setTimeout, setInterval (Micheil Smith)
* API: change some Promise behavior (Felix Geisendörfer)
  - Removed Promise.cancel()
  - Support late callback binding
  - Make unhandled Promise errors throw an exception
* Upgrade V8 to 2.0.6.1
* Solaris port (Erich Ocean)

## 2010.01.09, Version 0.1.25

https://github.com/iojs/io.js/commit/39ca93549af91575ca9d4cbafd1e170fbcef3dfa

* sys.inspect() improvements (Tim Caswell)
* path module improvements (isaacs, Benjamin Thomas)
* API: request.uri -> request.url
  It is no longer an object, but a string. The 'url' module was added to parse that string.
  That is, node no longer parses the request URL automatically.
  require('url').parse(request.url) is roughly equivalent to the old request.uri object. (isaacs)
* Bugfix: Several libeio related race conditions.
* Better errors for multipart library (Felix Geisendörfer)
* Bugfix: Update node-waf version to 1.5.10
* getmem for freebsd (Vanilla Hsu)

## 2009.12.31, Version 0.1.24

https://github.com/iojs/io.js/commit/642c2773a7eb2034f597af1cd404b9e086b59632

* Bugfix: don't chunk responses to HTTP/1.0 clients, even if they send Connection: Keep-Alive (e.g.
wget) * Bugfix: libeio race condition * Bugfix: Don't segfault on unknown http method * Simplify exception reporting * Upgrade V8 to 2.0.5.4 ## 2009.12.22, Version 0.1.23 https://github.com/iojs/io.js/commit/f91e347eeeeac1a8bd6a7b462df0321b60f3affc * Bugfix: require("../blah") issues (isaacs) * Bugfix: posix.cat (Jonas Pfenniger) * Do not pause request for multipart parsing (Felix Geisendörfer) ## 2009.12.19, Version 0.1.22 https://github.com/iojs/io.js/commit/a2d809fe902f6c4102dba8f2e3e9551aad137c0f * Bugfix: child modules get wrong id with "index.js" (isaacs) * Bugfix: require("../foo") cycles (isaacs) * Bugfix: require() should throw error if module does. * New URI parser stolen from Narwhal (isaacs) * Bugfix: correctly check kqueue and epoll. (Rasmus Andersson) * Upgrade WAF to 1.5.10 * Bugfix: posix.statSync() was crashing * Statically define string symbols for performance improvement * Bugfix: ARGV[0] weirdness * Added superCtor to ctor.super_ instead of superCtor.prototype. (Johan Dahlberg) * http-parser supports webdav methods * API: http.Client.prototype.request() (Christopher Lenz) ## 2009.12.06, Version 0.1.21 https://github.com/iojs/io.js/commit/c6affb64f96a403a14d20035e7fbd6d0ce089db5 * Feature: Add HTTP client TLS support (Rhys Jones) * Bugfix: use --jobs=1 with WAF * Bugfix: Don't use chunked encoding for 1.0 requests * Bugfix: Duplicated headers weren't handled correctly * Improve sys.inspect (Xavier Shay) * Upgrade v8 to 2.0.3 * Use CommonJS assert API (Felix Geisendörfer, Karl Guertin) ## 2009.11.28, Version 0.1.20 https://github.com/iojs/io.js/commit/aa42c6790da8ed2cd2b72051c07f6251fe1724d8 * Add gnutls version to configure script * Add V8 heap info to process.memoryUsage() * process.watchFile callback has 2 arguments with the stat object (choonkeat@gmail.com) ## 2009.11.28, Version 0.1.19 https://github.com/iojs/io.js/commit/633d6be328708055897b72327b88ac88e158935f * Feature: Initial TLS support for TCP servers and clients. (Rhys Jones) * Add options to process.watchFile() * Add process.umask() (Friedemann Altrock) * Bugfix: only detach timers when active. * Bugfix: lib/file.js write(), shouldn't always emit errors or success (onne@onnlucky.com) * Bugfix: Memory leak in fs.write (Reported by onne@onnlucky.com) * Bugfix: Fix regular expressions detecting outgoing message headers. (Reported by Elliott Cable) * Improvements to Multipart parser (Felix Geisendörfer) * New HTTP parser * Upgrade v8 to 2.0.2 ## 2009.11.17, Version 0.1.18 https://github.com/iojs/io.js/commit/027829d2853a14490e6de9fc5f7094652d045ab8 * Feature: process.watchFile() process.unwatchFile() * Feature: "uncaughtException" event on process (Felix Geisendörfer) * Feature: 'drain' event to tcp.Connection * Bugfix: Promise.timeout() blocked the event loop (Felix Geisendörfer) * Bugfix: sendBody() and chunked utf8 strings (Felix Geisendörfer) * Supply the strerror as a second arg to the tcp.Connection close event (Johan Sørensen) * Add EventEmitter.removeListener (frodenius@gmail.com) * Format JSON for inspecting objects (Felix Geisendörfer) * Upgrade libev to latest CVS ## 2009.11.07, Version 0.1.17 https://github.com/iojs/io.js/commit/d1f69ef35dac810530df8249d523add168e09f03 * Feature: process.chdir() (Brandon Beacher) * Revert http parser upgrade. (b893859c34f05db5c45f416949ebc0eee665cca6) Broke keep-alive.
* API: rename process.inherits to sys.inherits ## 2009.11.03, Version 0.1.16 https://github.com/iojs/io.js/commit/726865af7bbafe58435986f4a193ff11c84e4bfe * API: Use CommonJS-style module requiring - require("/sys.js") becomes require("sys") - require("circle.js") becomes require("./circle") - process.path.join() becomes require("path").join() - __module becomes module * API: Many namespacing changes - Move node.* into process.* - Move node.dns into module "dns" - Move node.fs into module "posix" - process is no longer the global object. GLOBAL is. For more information on the API changes see: http://thread.gmane.org/gmane.comp.lang.javascript.nodejs/6 http://thread.gmane.org/gmane.comp.lang.javascript.nodejs/14 * Feature: process.platform, process.memoryUsage() * Feature: promise.cancel() (Felix Geisendörfer) * Upgrade V8 to 1.3.18 ## 2009.10.28, Version 0.1.15 https://github.com/iojs/io.js/commit/eca2de73ed786b935507fd1c6faccd8df9938fd3 * Many build system fixes (esp. for OSX users) * Feature: promise.timeout() (Felix Geisendörfer) * Feature: Added external interface for signal handlers, process.pid, and process.kill() (Brandon Beacher) * API: Rename node.libraryPaths to require.paths * Bugfix: 'data' event for stdio should emit a string * Large file support * Upgrade http_parser * Upgrade v8 to 1.3.16 ## 2009.10.09, Version 0.1.14 https://github.com/iojs/io.js/commit/b12c809bb84d1265b6a4d970a5b54ee8a4890513 * Feature: Improved addon builds with node-waf * Feature: node.SignalHandler (Brandon Beacher) * Feature: Enable V8 debugging (but still need to make a debugger) * API: Rename library /utils.js to /sys.js * Clean up Node's build system * Don't use parseUri for HTTP server * Remove node.pc * Don't use /bin/sh to create child process except with exec() * API: Add __module to reference current module * API: Remove include(), add node.mixin() * Normalize http headers; "Content-Length" becomes "content-length" * Upgrade V8 to 1.3.15 ## 2009.09.30, Version 0.1.13 https://github.com/iojs/io.js/commit/58493bb05b3da3dc8051fabc0bdea9e575c1a107 * Feature: Multipart stream parser (Felix Geisendörfer) * API: Move node.puts(), node.exec() and others to /utils.js * API: Move http, tcp libraries to /http.js and /tcp.js * API: Rename node.exit() to process.exit() * Bugfix: require() and include() should work in callbacks. * Pass the Host header in http.cat calls * Add warning when coroutine stack size grows too large. * Enhance repl library (Ray Morgan) * Bugfix: build script for GCC 4.4 (removed -Werror in V8), on Linux 2.4, and with Python 2.4.4. * Add read() and write() to /file.js to read and write whole files at once. ## 2009.09.24, Version 0.1.12 https://github.com/iojs/io.js/commit/2f56ccb45e87510de712f56705598b3b4e3548ec * Feature: System modules, node.libraryPaths * API: Remove "raw" encoding, rename "raws" to "binary". * API: Added connection.setNoDelay() to disable Nagle algorithm. * Decrease default TCP server backlog to 128 * Bugfix: memory leak involving node.fs.* methods. * Upgrade v8 to 1.3.13 ## 2009.09.18, Version 0.1.11 https://github.com/iojs/io.js/commit/5ddc4f5d0c002bac0ae3d62fc0dc58f0d2d83ec4 * API: default to utf8 encoding for node.fs.cat() * API: add node.exec() * API: node.fs.read() takes a normal encoding parameter. * API: Change arguments of emit(), emitSuccess(), emitError() * Bugfix: node.fs.write() was stack allocating buffer. * Bugfix: ReportException shouldn't forget the top frame. * Improve buffering for HTTP outgoing messages * Fix and reenable x64 macintosh build.
* Upgrade v8 to 1.3.11 ## 2009.09.11, Version 0.1.10 https://github.com/iojs/io.js/commit/12bb0d46ce761e3d00a27170e63b40408c15b558 * Feature: raw string encoding "raws" * Feature: access to environ through "ENV" * Feature: add isDirectory, isFile, isSocket, ... methods to stats object. * Bugfix: Internally use full paths when loading modules; this fixes a shebang loading problem. * Bugfix: Add '--' command line argument for separating v8 args from program args. * Add man page. * Add node-repl * Upgrade v8 to 1.3.10 ## 2009.09.05, Version 0.1.9 https://github.com/iojs/io.js/commit/d029764bb32058389ecb31ed54a5d24d2915ad4c * Bugfix: Compile on Snow Leopard. * Bugfix: Malformed URIs raising exceptions. ## 2009.09.04, Version 0.1.8 https://github.com/iojs/io.js/commit/e6d712a937b61567e81b15085edba863be16ba96 * Feature: External modules * Feature: setTimeout() for node.tcp.Connection * Feature: add node.cwd(), node.fs.readdir(), node.fs.mkdir() * Bugfix: promise.wait() releasing out of order. * Bugfix: Asynchronously do getaddrinfo() on Apple. * Disable useless evcom error messages. * Better stack traces. * Built natively on x64. * Upgrade v8 to 1.3.9 ## 2009.08.27, Version 0.1.7 https://github.com/iojs/io.js/commit/f7acef9acf8ba8433d697ad5ed99d2e857387e4b * Feature: global 'process' object. Emits "exit". * Feature: promise.wait() * Feature: node.stdio * Feature: EventEmitters emit "newListener" when listeners are added * API: Use flat object instead of array-of-arrays for HTTP headers. * API: Remove buffered file object (node.File) * API: require(), include() are synchronous. (Uses continuations.) * API: Deprecate onLoad and onExit. * API: Rename node.Process to node.ChildProcess * Refactor node.Process to take advantage of evcom_reader/writer. * Upgrade v8 to 1.3.7 ## 2009.08.22, Version 0.1.6 https://github.com/iojs/io.js/commit/9c97b1db3099d61cd292aa59ec2227a619f3a7ab * Bugfix: Ignore SIGPIPE. ## 2009.08.21, Version 0.1.5 https://github.com/iojs/io.js/commit/b0fd3e281cb5f7cd8d3a26bd2b89e1b59998e5ed * Bugfix: Buggy connections could crash node.js. Now check connection before sending data every time (Kevin van Zonneveld) * Bugfix: stdin fd (0) being ignored by node.File. (Abe Fettig) * API: Remove connection.fullClose() * API: Return the EventEmitter from addListener for chaining. * API: tcp.Connection "disconnect" event renamed to "close" * Upgrade evcom Upgrade v8 to 1.3.6 ## 2009.08.13, Version 0.1.4 https://github.com/iojs/io.js/commit/0f888ed6de153f68c17005211d7e0f960a5e34f3 * Major refactor to evcom. * Enable test-tcp-many-clients. * Add -m32 gcc flag to udns. * Add connection.readPause() and connection.readResume() Add IncomingMessage.prototype.pause() and resume(). * Fix http benchmark. Wasn't correctly dispatching. * Bugfix: response.setBodyEncoding("ascii") not working. * Bugfix: Negative ints in HTTP's on_body and node.fs.read() * Upgrade v8 to 1.3.4 Upgrade libev to 3.8 Upgrade http_parser to v0.2 ## 2009.08.06, Version 0.1.3 https://github.com/iojs/io.js/commit/695f0296e35b30cf8322fd1bd934810403cca9f3 * Upgrade v8 to 1.3.2 * Bugfix: node.http.ServerRequest.setBodyEncoding('ascii') not working * Bugfix: node.encodeUtf8 was broken. (Connor Dunn) * Add ranlib to udns Makefile. * Upgrade evcom - fix accepting too many connections issue.
* Initial support for shebang * Add simple command line switches * Add node.version API ## 2009.08.01, Version 0.1.2 https://github.com/iojs/io.js/commit/025a34244d1cea94d6d40ad7bf92671cb909a96c * Add DNS API * node.tcp.Server's backlog option is now an argument to listen() * Upgrade V8 to 1.3.1 * Bugfix: Default to chunked for client requests without Content-Length. * Bugfix: Line numbers in stack traces. * Bugfix: negative integers in raw encoding stream * Bugfix: node.fs.File was not passing args to promise callbacks. ## 2009.07.27, Version 0.1.1 https://github.com/iojs/io.js/commit/77d407df2826b20e9177c26c0d2bb4481e497937 * Simplify and clean up ObjectWrap. * Upgrade liboi (which is now called evcom) Upgrade libev to 3.7 Upgrade V8 to 1.2.14 * Array.prototype.encodeUtf8 renamed to node.encodeUtf8(array) * Move EventEmitter.prototype.emit() completely into C++. * Bugfix: Fix memory leak in event emitters. http://groups.google.com/group/nodejs/browse_thread/thread/a8d1dfc2fd57a6d1 * Bugfix: Had problems reading scripts with non-ascii characters. * Bugfix: Fix Detach() in node::Server * Bugfix: Sockets not properly reattached if reconnected during disconnect event. * Bugfix: Server-side clients not attached between creation and on_connect. * Add 'close' event to node.tcp.Server * Simplify and clean up http.js. (Takes more advantage of event infrastructure.) * Add benchmark scripts. Run with "make benchmark". ## 2009.06.30, Version 0.1.0 https://github.com/iojs/io.js/commit/0fe44d52fe75f151bceb59534394658aae6ac328 * Update documentation, use asciidoc. * EventEmitter and Promise interfaces. (Breaks previous API.) * Remove node.Process constructor in favor of node.createProcess * Add -m32 flags for compiling on x64 platforms. (Thanks to András Bártházi) * Upgrade v8 to 1.2.10 and libev to 3.6 * Bugfix: Timer::RepeatSetter wasn't working. * Bugfix: Spawning many processes in a loop (reported by Felix Geisendörfer) ## 2009.06.24, Version 0.0.6 https://github.com/iojs/io.js/commit/fbe0be19ebfb422d8fa20ea5204c1713e9214d5f * Load modules via HTTP URLs (Urban Hafner) * Bugfix: Add HTTPConnection->size() and HTTPServer->size() * New node.Process API * Clean up build tools, use v8's test runner. * Use ev_unref() instead of starting/stopping the eio thread pool watcher. ## 2009.06.18, Version 0.0.5 https://github.com/iojs/io.js/commit/3a2b41de74b6c343b8464a68eff04c4bfd9aebea * Support for IPv6 * Remove namespace node.constants * Upgrade v8 to 1.2.8.1 * Accept ports as strings in the TCP client and server. * Bugfix: HTTP Client race * Bugfix: freeaddrinfo() wasn't getting called after getaddrinfo() for TCP servers * Add "opening" to TCP client readyState * Add remoteAddress to TCP client * Add global print() function. ## 2009.06.13, Version 0.0.4 https://github.com/iojs/io.js/commit/916b9ca715b229b0703f0ed6c2fc065410fb189c * Add interrupt() method to server-side HTTP requests. * Bugfix: onBodyComplete was not getting called on server-side HTTP ## 2009.06.11, Version 0.0.3 https://github.com/iojs/io.js/commit/6e0dfe50006ae4f5dac987f055e0c9338662f40a * Many bug fixes including the problem with http.Client on macintosh * Upgrades v8 to 1.2.7 * Adds onExit hook * Guard against buffer overflow in http parser * require() and include() now need the ".js" extension * http.Client uses identity transfer encoding by default. 
iojs-v1.0.2-darwin-x64/include/000755 000766 000024 00000000000 12456115120 016275 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/000755 000766 000024 00000000000 12456115116 015425 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/LICENSE000644 000766 000024 00000120261 12456115120 015661 0ustar00iojsstaff000000 000000 io.js is licensed for use as follows: """ Copyright io.js contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ This license applies to parts of io.js originating from the https://github.com/joyent/node repository: """ Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ The io.js license applies to all parts of io.js that are not externally maintained libraries. The externally maintained libraries used by io.js are: - V8, located at deps/v8. V8's license follows: """ This license applies to all parts of V8 that are not externally maintained libraries. The externally maintained libraries used by V8 are: - PCRE test suite, located in test/mjsunit/third_party/regexp-pcre.js. This is based on the test suite from PCRE-7.3, which is copyrighted by the University of Cambridge and Google, Inc. The copyright notice and license are embedded in regexp-pcre.js. - Layout tests, located in test/mjsunit/third_party. These are based on layout tests from webkit.org which are copyrighted by Apple Computer, Inc. and released under a 3-clause BSD license. 
- Strongtalk assembler, the basis of the files assembler-arm-inl.h, assembler-arm.cc, assembler-arm.h, assembler-ia32-inl.h, assembler-ia32.cc, assembler-ia32.h, assembler-x64-inl.h, assembler-x64.cc, assembler-x64.h, assembler-mips-inl.h, assembler-mips.cc, assembler-mips.h, assembler.cc and assembler.h. This code is copyrighted by Sun Microsystems Inc. and released under a 3-clause BSD license. - Valgrind client API header, located at third_party/valgrind/valgrind.h This is release under the BSD license. These libraries have their own licenses; we recommend you read them, as their terms may differ from the terms below. Copyright 2006-2012, the V8 project authors. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Google Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. """ - C-Ares, an asynchronous DNS client, located at deps/cares. C-Ares license follows: """ /* Copyright 1998 by the Massachusetts Institute of Technology. * * Permission to use, copy, modify, and distribute this * software and its documentation for any purpose and without * fee is hereby granted, provided that the above copyright * notice appear in all copies and that both that copyright * notice and this permission notice appear in supporting * documentation, and that the name of M.I.T. not be used in * advertising or publicity pertaining to distribution of the * software without specific, written prior permission. * M.I.T. makes no representations about the suitability of * this software for any purpose. It is provided "as is" * without express or implied warranty. """ - OpenSSL located at deps/openssl. OpenSSL is cryptographic software written by Eric Young (eay@cryptsoft.com) to provide SSL/TLS encryption. OpenSSL's license follows: """ /* ==================================================================== * Copyright (c) 1998-2011 The OpenSSL Project. All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * * 2. 
Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in * the documentation and/or other materials provided with the * distribution. * * 3. All advertising materials mentioning features or use of this * software must display the following acknowledgment: * "This product includes software developed by the OpenSSL Project * for use in the OpenSSL Toolkit. (http://www.openssl.org/)" * * 4. The names "OpenSSL Toolkit" and "OpenSSL Project" must not be used to * endorse or promote products derived from this software without * prior written permission. For written permission, please contact * openssl-core@openssl.org. * * 5. Products derived from this software may not be called "OpenSSL" * nor may "OpenSSL" appear in their names without prior written * permission of the OpenSSL Project. * * 6. Redistributions of any form whatsoever must retain the following * acknowledgment: * "This product includes software developed by the OpenSSL Project * for use in the OpenSSL Toolkit (http://www.openssl.org/)" * * THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY * EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR * ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED * OF THE POSSIBILITY OF SUCH DAMAGE. * ==================================================================== * * This product includes cryptographic software written by Eric Young * (eay@cryptsoft.com). This product includes software written by Tim * Hudson (tjh@cryptsoft.com). * */ """ - HTTP Parser, located at deps/http_parser. HTTP Parser's license follows: """ http_parser.c is based on src/http/ngx_http_parse.c from NGINX copyright Igor Sysoev. Additional changes are licensed under the same terms as NGINX and copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ - Closure Linter is located at tools/closure_linter. Closure's license follows: """ # Copyright (c) 2007, Google Inc. 
# All rights reserved. # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are # met: # # * Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following disclaimer # in the documentation and/or other materials provided with the # distribution. # * Neither the name of Google Inc. nor the names of its # contributors may be used to endorse or promote products derived from # this software without specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. """ - tools/cpplint.py is a C++ linter. Its license follows: """ # Copyright (c) 2009 Google Inc. All rights reserved. # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are # met: # # * Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # * Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following disclaimer # in the documentation and/or other materials provided with the # distribution. # * Neither the name of Google Inc. nor the names of its # contributors may be used to endorse or promote products derived from # this software without specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR # A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT # OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. """ - lib/punycode.js is copyright 2011 Mathias Bynens and released under the MIT license. """ * Punycode.js * Copyright 2011 Mathias Bynens * Available under MIT license """ - tools/gyp. GYP is a meta-build system. GYP's license follows: """ Copyright (c) 2009 Google Inc. All rights reserved. 
Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Google Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. """ - Zlib at deps/zlib. zlib's license follows: """ /* zlib.h -- interface of the 'zlib' general purpose compression library version 1.2.8, April 28th, 2013 Copyright (C) 1995-2013 Jean-loup Gailly and Mark Adler This software is provided 'as-is', without any express or implied warranty. In no event will the authors be held liable for any damages arising from the use of this software. Permission is granted to anyone to use this software for any purpose, including commercial applications, and to alter it and redistribute it freely, subject to the following restrictions: 1. The origin of this software must not be misrepresented; you must not claim that you wrote the original software. If you use this software in a product, an acknowledgment in the product documentation would be appreciated but is not required. 2. Altered source versions must be plainly marked as such, and must not be misrepresented as being the original software. 3. This notice may not be removed or altered from any source distribution. Jean-loup Gailly Mark Adler jloup@gzip.org madler@alumni.caltech.edu */ """ - npm is a package manager program located at deps/npm. npm's license follows: """ Copyright (c) Isaac Z. Schlueter All rights reserved. npm is released under the Artistic 2.0 License. The text of the License follows: -------- The Artistic License 2.0 Copyright (c) 2000-2006, The Perl Foundation. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble This license establishes the terms under which a given free software Package may be copied, modified, distributed, and/or redistributed. The intent is that the Copyright Holder maintains some artistic control over the development of that Package while still keeping the Package available as open source and free software. You are always permitted to make arrangements wholly outside of this license directly with the Copyright Holder of a given Package. 
If the terms of this license do not permit the full use that you propose to make of the Package, you should contact the Copyright Holder and seek a different licensing arrangement. Definitions "Copyright Holder" means the individual(s) or organization(s) named in the copyright notice for the entire Package. "Contributor" means any party that has contributed code or other material to the Package, in accordance with the Copyright Holder's procedures. "You" and "your" means any person who would like to copy, distribute, or modify the Package. "Package" means the collection of files distributed by the Copyright Holder, and derivatives of that collection and/or of those files. A given Package may consist of either the Standard Version, or a Modified Version. "Distribute" means providing a copy of the Package or making it accessible to anyone else, or in the case of a company or organization, to others outside of your company or organization. "Distributor Fee" means any fee that you charge for Distributing this Package or providing support for this Package to another party. It does not mean licensing fees. "Standard Version" refers to the Package if it has not been modified, or has been modified only in ways explicitly requested by the Copyright Holder. "Modified Version" means the Package, if it has been changed, and such changes were not explicitly requested by the Copyright Holder. "Original License" means this Artistic License as Distributed with the Standard Version of the Package, in its current version or as it may be modified by The Perl Foundation in the future. "Source" form means the source code, documentation source, and configuration files for the Package. "Compiled" form means the compiled bytecode, object code, binary, or any other form resulting from mechanical transformation or translation of the Source form. Permission for Use and Modification Without Distribution (1) You are permitted to use the Standard Version and create and use Modified Versions for any purpose without restriction, provided that you do not Distribute the Modified Version. Permissions for Redistribution of the Standard Version (2) You may Distribute verbatim copies of the Source form of the Standard Version of this Package in any medium without restriction, either gratis or for a Distributor Fee, provided that you duplicate all of the original copyright notices and associated disclaimers. At your discretion, such verbatim copies may or may not include a Compiled form of the Package. (3) You may apply any bug fixes, portability changes, and other modifications made available from the Copyright Holder. The resulting Package will still be considered the Standard Version, and as such will be subject to the Original License. Distribution of Modified Versions of the Package as Source (4) You may Distribute your Modified Version as Source (either gratis or for a Distributor Fee, and with or without a Compiled form of the Modified Version) provided that you clearly document how it differs from the Standard Version, including, but not limited to, documenting any non-standard features, executables, or modules, and provided that you do at least ONE of the following: (a) make the Modified Version available to the Copyright Holder of the Standard Version, under the Original License, so that the Copyright Holder may include your modifications in the Standard Version. (b) ensure that installation of your Modified Version does not prevent the user installing or running the Standard Version. 
In addition, the Modified Version must bear a name that is different from the name of the Standard Version. (c) allow anyone who receives a copy of the Modified Version to make the Source form of the Modified Version available to others under (i) the Original License or (ii) a license that permits the licensee to freely copy, modify and redistribute the Modified Version using the same licensing terms that apply to the copy that the licensee received, and requires that the Source form of the Modified Version, and of any works derived from it, be made freely available in that license fees are prohibited but Distributor Fees are allowed. Distribution of Compiled Forms of the Standard Version or Modified Versions without the Source (5) You may Distribute Compiled forms of the Standard Version without the Source, provided that you include complete instructions on how to get the Source of the Standard Version. Such instructions must be valid at the time of your distribution. If these instructions, at any time while you are carrying out such distribution, become invalid, you must provide new instructions on demand or cease further distribution. If you provide valid instructions or cease distribution within thirty days after you become aware that the instructions are invalid, then you do not forfeit any of your rights under this license. (6) You may Distribute a Modified Version in Compiled form without the Source, provided that you comply with Section 4 with respect to the Source of the Modified Version. Aggregating or Linking the Package (7) You may aggregate the Package (either the Standard Version or Modified Version) with other packages and Distribute the resulting aggregation provided that you do not charge a licensing fee for the Package. Distributor Fees are permitted, and licensing fees for other components in the aggregation are permitted. The terms of this license apply to the use and Distribution of the Standard or Modified Versions as included in the aggregation. (8) You are permitted to link Modified and Standard Versions with other works, to embed the Package in a larger work of your own, or to build stand-alone binary or bytecode versions of applications that include the Package, and Distribute the result without restriction, provided the result does not expose a direct interface to the Package. Items That are Not Considered Part of a Modified Version (9) Works (including, but not limited to, modules and scripts) that merely extend or make use of the Package, do not, by themselves, cause the Package to be a Modified Version. In addition, such works are not considered parts of the Package itself, and are not subject to the terms of this license. General Provisions (10) Any use, modification, and distribution of the Standard or Modified Versions is governed by this Artistic License. By using, modifying or distributing the Package, you accept this license. Do not use, modify, or distribute the Package, if you do not accept this license. (11) If your Modified Version has been derived from a Modified Version made by someone other than you, you are nevertheless required to ensure that your Modified Version complies with the requirements of this license. (12) This license does not grant you the right to use any trademark, service mark, tradename, or logo of the Copyright Holder. 
(13) This license includes the non-exclusive, worldwide, free-of-charge patent license to make, have made, use, offer to sell, sell, import and otherwise transfer the Package with respect to any patent claims licensable by the Copyright Holder that are necessarily infringed by the Package. If you institute patent litigation (including a cross-claim or counterclaim) against any party alleging that the Package constitutes direct or contributory patent infringement, then this Artistic License to you shall terminate on the date that such litigation is filed. (14) Disclaimer of Warranty: THE PACKAGE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS "AS IS' AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES. THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT ARE DISCLAIMED TO THE EXTENT PERMITTED BY YOUR LOCAL LAW. UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -------- "Node.js" and "node" trademark Joyent, Inc. npm is not officially part of the Node.js project, and is neither owned by nor officially affiliated with Joyent, Inc. Packages published in the npm registry (other than the Software and its included dependencies) are not part of npm itself, are the sole property of their respective maintainers, and are not covered by this license. "npm Logo" created by Mathias Pettersson and Brian Hammond, used with permission. "Gubblebum Blocky" font Copyright (c) by Tjarda Koster, http://jelloween.deviantart.com included for use in the npm website and documentation, used with permission. This program uses several io.js modules contained in the node_modules/ subdirectory, according to the terms of their respective licenses. """ - tools/doc/node_modules/marked. Marked is a Markdown parser. Marked's license follows: """ Copyright (c) 2011-2012, Christopher Jeffrey (https://github.com/chjj/) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. """ - test/gc/node_modules/weak. Node-weak is a node.js addon that provides garbage collector notifications. Node-weak's license follows: """ Copyright (c) 2011, Ben Noordhuis Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. 
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. """ - wrk is located at tools/wrk. wrk's license follows: """ Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." 
"Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. 
Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS """ iojs-v1.0.2-darwin-x64/README.md000644 000766 000024 00000012614 12456115120 016135 0ustar00iojsstaff000000 000000 io.js === [![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/iojs/io.js?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) This repository began as a GitHub fork of [joyent/node](https://github.com/joyent/node). io.js contributions, releases, and contributorship are under an [open governance model](./GOVERNANCE.md). We intend to land, with increasing regularity, releases which are compatible with the npm ecosystem that has been built to date for Node.js. ## Is it io.js or IO.js or iojs or IOjs or iOjS? The official name is **io.js**, which should never be capitalized, especially not at the start of a sentence, unless it is being displayed in a location that is customarily all-caps (such as the title of man pages.) 
## To build: ### Unix / Macintosh Prerequisites: * `gcc` and `g++` 4.8 or newer, or * `clang` and `clang++` 3.3 or newer * Python 2.6 or 2.7 * GNU Make 3.81 or newer * libexecinfo (FreeBSD and OpenBSD only) ```text $ ./configure $ make $ make install ``` If your Python binary is in a non-standard location or has a non-standard name, run the following instead: ```text $ export PYTHON=/path/to/python $ $PYTHON ./configure $ make $ make install ``` To run the tests: ```text $ make test ``` To build the documentation: ```text $ make doc ``` To read the documentation: ```text $ man doc/iojs.1 ``` ### Windows Prerequisites: * [Python 2.6 or 2.7](https://www.python.org/downloads/) * Visual Studio 2013 for Windows Desktop, or * Visual Studio Express 2013 for Windows Desktop * Basic Unix tools required for some tests, [Git for Windows](http://git-scm.com/download/win) includes Git Bash and tools which can be included in the global `PATH`. ```text > vcbuild nosign ``` To run the tests: ```text > vcbuild test ``` ### `Intl` (ECMA-402) support: [Intl](https://github.com/joyent/node/wiki/Intl) support is not enabled by default. #### "small" (English only) support This option will build with "small" (English only) support, but the full `Intl` (ECMA-402) APIs. With `--download=all` it will download the ICU library as needed. Unix / Macintosh: ```text $ ./configure --with-intl=small-icu --download=all ``` Windows: ```text > vcbuild small-icu download-all ``` The `small-icu` mode builds with English-only data. You can add full data at runtime. *Note:* more docs are on [the joyent/node wiki](https://github.com/joyent/node/wiki/Intl). #### Build with full ICU support (all locales supported by ICU): With the `--download=all`, this may download ICU if you don't have an ICU in `deps/icu`. Unix / Macintosh: ```text $ ./configure --with-intl=full-icu --download=all ``` Windows: ```text > vcbuild full-icu download-all ``` #### Build with no Intl support `:-(` The `Intl` object will not be available. This is the default at present, so this option is not normally needed. Unix / Macintosh: ```text $ ./configure --with-intl=none ``` Windows: ```text > vcbuild intl-none ``` #### Use existing installed ICU (Unix / Macintosh only): ```text $ pkg-config --modversion icu-i18n && ./configure --with-intl=system-icu ``` #### Build with a specific ICU: You can find other ICU releases at [the ICU homepage](http://icu-project.org/download). Download the file named something like `icu4c-**##.#**-src.tgz` (or `.zip`). Unix / Macintosh ```text # from an already-unpacked ICU: $ ./configure --with-intl=[small-icu,full-icu] --with-icu-source=/path/to/icu # from a local ICU tarball $ ./configure --with-intl=[small-icu,full-icu] --with-icu-source=/path/to/icu.tgz # from a tarball URL $ ./configure --with-intl=full-icu --with-icu-source=http://url/to/icu.tgz ``` Windows First unpack latest ICU to `deps/icu` [icu4c-**##.#**-src.tgz](http://icu-project.org/download) (or `.zip`) as `deps/icu` (You'll have: `deps/icu/source/...`) ```text > vcbuild full-icu ``` ## Resources for Newcomers * [CONTRIBUTING.md](./CONTRIBUTING.md) * [GOVERNANCE.md](./GOVERNANCE.md) * IRC: [#io.js on Freenode.net](http://webchat.freenode.net?channels=io.js&uio=d4) * [iojs/io.js on Gitter](https://gitter.im/iojs/io.js) ## Current Project Team Members The io.js project team comprises a group of core collaborators and a sub-group that forms the _Technical Committee_ (TC) which governs the project. 
For more information about the governance of the io.js project, see [GOVERNANCE.md](./GOVERNANCE.md). * **Isaac Z. Schlueter** ([@isaacs](https://github.com/isaacs)) <i@izs.me> (Technical Committee) * **Ben Noordhuis** ([@bnoordhuis](https://github.com/bnoordhuis)) <info@bnoordhuis.nl> (Technical Committee) * **Bert Belder** ([@piscisaureus](https://github.com/piscisaureus)) <bertbelder@gmail.com> (Technical Committee) * **Fedor Indutny** ([@indutny](https://github.com/indutny)) <fedor.indutny@gmail.com> (Technical Committee) * **Trevor Norris** ([@trevnorris](https://github.com/trevnorris)) <trev.norris@gmail.com> (Technical Committee) * **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com> (Technical Committee) * **Colin Ihrig** ([@cjihrig](https://github.com/cjihrig)) <cjihrig@gmail.com> (Technical Committee) * **Mikeal Rogers** ([@mikeal](https://github.com/mikeal)) <mikeal.rogers@gmail.com> * **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org> Collaborators follow the [COLLABORATOR_GUIDE.md](./COLLABORATOR_GUIDE.md) in maintaining the io.js project. iojs-v1.0.2-darwin-x64/share/000755 000766 000024 00000000000 12456115116 015761 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/share/man/000755 000766 000024 00000000000 12456115116 016534 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/share/systemtap/000755 000766 000024 00000000000 12456115116 020012 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/share/systemtap/tapset/000755 000766 000024 00000000000 12456115116 021312 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/share/systemtap/tapset/node.stp000644 000766 000024 00000005535 12455173734 023010 0ustar00iojsstaff000000 000000 probe node_net_server_connection = process("node").mark("net__server__connection") { remote = user_string($arg2); port = $arg3; fd = $arg4; probestr = sprintf("%s(remote=%s, port=%d, fd=%d)", $$name, remote, port, fd); } probe node_net_stream_end = process("node").mark("net__stream__end") { remote = user_string($arg2); port = $arg3; fd = $arg4; probestr = sprintf("%s(remote=%s, port=%d, fd=%d)", $$name, remote, port, fd); } probe node_net_socket_write = process("node").mark("net__socket__write") { bytes = $arg2; remote = user_string($arg3); port = $arg4; fd = $arg5; probestr = sprintf("%s(bytes=%d, remote=%s, port=%d, fd=%d)", $$name, bytes, remote, port, fd); } probe node_net_socket_read = process("node").mark("net__socket__read") { bytes = $arg2; remote = user_string($arg3); port = $arg4; fd = $arg5; probestr = sprintf("%s(bytes=%d, remote=%s, port=%d, fd=%d)", $$name, bytes, remote, port, fd); } probe node_http_server_request = process("node").mark("http__server__request") { remote = user_string($arg3); port = $arg4; method = user_string($arg5); url = user_string($arg6); fd = $arg7; probestr = sprintf("%s(remote=%s, port=%d, method=%s, url=%s, fd=%d)", $$name, remote, port, method, url, fd); } probe node_http_server_response = process("node").mark("http__server__response") { remote = user_string($arg2); port = $arg3; fd = $arg4; probestr = sprintf("%s(remote=%s, port=%d, fd=%d)", $$name, remote, port, fd); } probe node_http_client_request = process("node").mark("http__client__request") { remote = user_string($arg3); port = $arg4; method = user_string($arg5); url = user_string($arg6); fd = $arg7; probestr = sprintf("%s(remote=%s, port=%d, method=%s, url=%s, fd=%d)", $$name, remote, port, method, url, fd); } probe node_http_client_response = 
process("node").mark("http__client__response") { remote = user_string($arg2); port = $arg3; fd = $arg4; probestr = sprintf("%s(remote=%s, port=%d, fd=%d)", $$name, remote, port, fd); } probe node_gc_start = process("node").mark("gc__start") { scavenge = 1 << 0; compact = 1 << 1; if ($arg1 == scavenge) type = "kGCTypeScavenge"; else if ($arg1 == compact) type = "kGCTypeMarkSweepCompact"; else type = "kGCTypeAll"; flags = $arg2; probestr = sprintf("%s(type=%s,flags=%d)", $$name, type, flags); } probe node_gc_stop = process("node").mark("gc__stop") { scavenge = 1 << 0; compact = 1 << 1; if ($arg1 == scavenge) type = "kGCTypeScavenge"; else if ($arg1 == compact) type = "kGCTypeMarkSweepCompact"; else type = "kGCTypeAll"; flags = $arg2; probestr = sprintf("%s(type=%s,flags=%d)", $$name, type, flags); } iojs-v1.0.2-darwin-x64/share/man/man1/000755 000766 000024 00000000000 12456115116 017370 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/share/man/man1/iojs.1000644 000766 000024 00000102752 12455173734 020436 0ustar00iojsstaff000000 000000 .TH IO.JS "1" "2010" "" "" .SH "NAME" iojs \- Server-side JavaScript .SH SYNOPSIS .B iojs [ .B \-v ] [ .B \-\-debug | .B \-\-debug-brk ] [ .B \-\-v8-options ] .br [ .B \-e .I command | .I script.js ] [ .I arguments ] Execute without arguments to start the REPL. .SH DESCRIPTION io.js is a set of libraries for javascript which allows it to be used outside of the browser. It is primarily focused on creating simple, easy to build network clients and servers. .SH OPTIONS -v, --version print iojs's version -e, --eval script evaluate script -p, --print print result of --eval -i, --interactive always enter the REPL even if stdin does not appear to be a terminal --no-deprecation silence deprecation warnings --trace-deprecation show stack traces on deprecations --throw-deprecation throw errors on deprecations --v8-options print v8 command line options --max-stack-size=val set max v8 stack size (bytes) .SH ENVIRONMENT VARIABLES .IP NODE_PATH \':\'\-separated list of directories prefixed to the module search path. .IP NODE_MODULE_CONTEXTS If set to 1 then modules will load in their own global contexts. .IP NODE_DISABLE_COLORS If set to 1 then colors will not be used in the REPL. 
.SH V8 OPTIONS --use_strict (enforce strict mode) type: bool default: false --es_staging (enable all completed harmony features) type: bool default: false --harmony (enable all completed harmony features) type: bool default: false --harmony_shipping (enable all shipped harmony fetaures) type: bool default: true --harmony_modules (enable "harmony modules (implies block scoping)" (in progress)) type: bool default: false --harmony_arrays (enable "harmony array methods" (in progress)) type: bool default: false --harmony_array_includes (enable "harmony Array.prototype.includes" (in progress)) type: bool default: false --harmony_regexps (enable "harmony regular expression extensions" (in progress)) type: bool default: false --harmony_arrow_functions (enable "harmony arrow functions" (in progress)) type: bool default: false --harmony_proxies (enable "harmony proxies" (in progress)) type: bool default: false --harmony_sloppy (enable "harmony features in sloppy mode" (in progress)) type: bool default: false --harmony_unicode (enable "harmony unicode escapes" (in progress)) type: bool default: false --harmony_tostring (enable "harmony toString") type: bool default: false --harmony_numeric_literals (enable "harmony numeric literals") type: bool default: true --harmony_strings (enable "harmony string methods") type: bool default: true --harmony_scoping (enable "harmony block scoping") type: bool default: true --harmony_classes (enable "harmony classes (implies block scoping & object literal extension)") type: bool default: true --harmony_object_literals (enable "harmony object literal extensions") type: bool default: true --harmony_templates (enable "harmony template literals") type: bool default: true --compiled_keyed_generic_loads (use optimizing compiler to generate keyed generic load stubs) type: bool default: false --pretenuring_call_new (pretenure call new) type: bool default: false --allocation_site_pretenuring (pretenure with allocation sites) type: bool default: true --trace_pretenuring (trace pretenuring decisions of HAllocate instructions) type: bool default: false --trace_pretenuring_statistics (trace allocation site pretenuring statistics) type: bool default: false --track_fields (track fields with only smi values) type: bool default: true --track_double_fields (track fields with double values) type: bool default: true --track_heap_object_fields (track fields with heap values) type: bool default: true --track_computed_fields (track computed boilerplate fields) type: bool default: true --track_field_types (track field types) type: bool default: true --smi_binop (support smi representation in binary operations) type: bool default: true --vector_ics (support vector-based ics) type: bool default: false --optimize_for_size (Enables optimizations which favor memory size over execution speed.) 
type: bool default: false --unbox_double_arrays (automatically unbox arrays of doubles) type: bool default: true --string_slices (use string slices) type: bool default: true --crankshaft (use crankshaft) type: bool default: true --hydrogen_filter (optimization filter) type: string default: * --use_gvn (use hydrogen global value numbering) type: bool default: true --gvn_iterations (maximum number of GVN fix-point iterations) type: int default: 3 --use_canonicalizing (use hydrogen instruction canonicalizing) type: bool default: true --use_inlining (use function inlining) type: bool default: true --use_escape_analysis (use hydrogen escape analysis) type: bool default: true --use_allocation_folding (use allocation folding) type: bool default: true --use_local_allocation_folding (only fold in basic blocks) type: bool default: false --use_write_barrier_elimination (eliminate write barriers targeting allocations in optimized code) type: bool default: true --max_inlining_levels (maximum number of inlining levels) type: int default: 5 --max_inlined_source_size (maximum source size in bytes considered for a single inlining) type: int default: 600 --max_inlined_nodes (maximum number of AST nodes considered for a single inlining) type: int default: 196 --max_inlined_nodes_cumulative (maximum cumulative number of AST nodes considered for inlining) type: int default: 400 --loop_invariant_code_motion (loop invariant code motion) type: bool default: true --fast_math (faster (but maybe less accurate) math functions) type: bool default: true --collect_megamorphic_maps_from_stub_cache (crankshaft harvests type feedback from stub cache) type: bool default: true --hydrogen_stats (print statistics for hydrogen) type: bool default: false --trace_check_elimination (trace check elimination phase) type: bool default: false --trace_hydrogen (trace generated hydrogen to file) type: bool default: false --trace_hydrogen_filter (hydrogen tracing filter) type: string default: * --trace_hydrogen_stubs (trace generated hydrogen for stubs) type: bool default: false --trace_hydrogen_file (trace hydrogen to given file name) type: string default: NULL --trace_phase (trace generated IR for specified phases) type: string default: HLZ --trace_inlining (trace inlining decisions) type: bool default: false --trace_load_elimination (trace load elimination) type: bool default: false --trace_store_elimination (trace store elimination) type: bool default: false --trace_alloc (trace register allocator) type: bool default: false --trace_all_uses (trace all use positions) type: bool default: false --trace_range (trace range analysis) type: bool default: false --trace_gvn (trace global value numbering) type: bool default: false --trace_representation (trace representation types) type: bool default: false --trace_removable_simulates (trace removable simulates) type: bool default: false --trace_escape_analysis (trace hydrogen escape analysis) type: bool default: false --trace_allocation_folding (trace allocation folding) type: bool default: false --trace_track_allocation_sites (trace the tracking of allocation sites) type: bool default: false --trace_migration (trace object migration) type: bool default: false --trace_generalization (trace map generalization) type: bool default: false --stress_pointer_maps (pointer map for every instruction) type: bool default: false --stress_environments (environment for every instruction) type: bool default: false --deopt_every_n_times (deoptimize every n times a deopt point is passed) type: int default: 0 
--deopt_every_n_garbage_collections (deoptimize every n garbage collections) type: int default: 0 --print_deopt_stress (print number of possible deopt points) type: bool default: false --trap_on_deopt (put a break point before deoptimizing) type: bool default: false --trap_on_stub_deopt (put a break point before deoptimizing a stub) type: bool default: false --deoptimize_uncommon_cases (deoptimize uncommon cases) type: bool default: true --polymorphic_inlining (polymorphic inlining) type: bool default: true --use_osr (use on-stack replacement) type: bool default: true --array_bounds_checks_elimination (perform array bounds checks elimination) type: bool default: true --trace_bce (trace array bounds check elimination) type: bool default: false --array_bounds_checks_hoisting (perform array bounds checks hoisting) type: bool default: false --array_index_dehoisting (perform array index dehoisting) type: bool default: true --analyze_environment_liveness (analyze liveness of environment slots and zap dead values) type: bool default: true --load_elimination (use load elimination) type: bool default: true --check_elimination (use check elimination) type: bool default: true --store_elimination (use store elimination) type: bool default: false --dead_code_elimination (use dead code elimination) type: bool default: true --fold_constants (use constant folding) type: bool default: true --trace_dead_code_elimination (trace dead code elimination) type: bool default: false --unreachable_code_elimination (eliminate unreachable code) type: bool default: true --trace_osr (trace on-stack replacement) type: bool default: false --stress_runs (number of stress runs) type: int default: 0 --lookup_sample_by_shared (when picking a function to optimize, watch for shared function info, not JSFunction itself) type: bool default: true --cache_optimized_code (cache optimized code for closures) type: bool default: true --flush_optimized_code_cache (flushes the cache of optimized code for closures on every GC) type: bool default: true --inline_construct (inline constructor calls) type: bool default: true --inline_arguments (inline functions with arguments object) type: bool default: true --inline_accessors (inline JavaScript accessors) type: bool default: true --escape_analysis_iterations (maximum number of escape analysis fix-point iterations) type: int default: 2 --optimize_for_in (optimize functions containing for-in loops) type: bool default: true --concurrent_recompilation (optimizing hot functions asynchronously on a separate thread) type: bool default: true --job_based_recompilation (post tasks to v8::Platform instead of using a thread for concurrent recompilation) type: bool default: false --trace_concurrent_recompilation (track concurrent recompilation) type: bool default: false --concurrent_recompilation_queue_length (the length of the concurrent compilation queue) type: int default: 8 --concurrent_recompilation_delay (artificial compilation delay in ms) type: int default: 0 --block_concurrent_recompilation (block queued jobs until released) type: bool default: false --concurrent_osr (concurrent on-stack replacement) type: bool default: true --omit_map_checks_for_leaf_maps (do not emit check maps for constant values that have a leaf map, deoptimize the optimized code if the layout of the maps changes.) 
type: bool default: true --turbo_filter (optimization filter for TurboFan compiler) type: string default: ~ --trace_turbo (trace generated TurboFan IR) type: bool default: false --trace_turbo_graph (trace generated TurboFan graphs) type: bool default: false --trace_turbo_cfg_file (trace turbo cfg graph (for C1 visualizer) to a given file name) type: string default: NULL --trace_turbo_types (trace TurboFan's types) type: bool default: true --trace_turbo_scheduler (trace TurboFan's scheduler) type: bool default: false --trace_turbo_reduction (trace TurboFan's various reducers) type: bool default: false --trace_turbo_jt (trace TurboFan's jump threading) type: bool default: false --turbo_asm (enable TurboFan for asm.js code) type: bool default: true --turbo_verify (verify TurboFan graphs at each phase) type: bool default: false --turbo_stats (print TurboFan statistics) type: bool default: false --turbo_types (use typed lowering in TurboFan) type: bool default: true --turbo_source_positions (track source code positions when building TurboFan IR) type: bool default: false --context_specialization (enable context specialization in TurboFan) type: bool default: false --turbo_deoptimization (enable deoptimization in TurboFan) type: bool default: false --turbo_inlining (enable inlining in TurboFan) type: bool default: false --turbo_inlining_intrinsics (enable inlining of intrinsics in TurboFan) type: bool default: false --trace_turbo_inlining (trace TurboFan inlining) type: bool default: false --loop_assignment_analysis (perform loop assignment analysis) type: bool default: true --turbo_profiling (enable profiling in TurboFan) type: bool default: false --turbo_reuse_spill_slots (reuse spill slots in TurboFan) type: bool default: true --turbo_delay_ssa_decon (delay ssa deconstruction in TurboFan register allocator) type: bool default: false --turbo_move_optimization (optimize gap moves in TurboFan) type: bool default: true --turbo_jt (enable jump threading) type: bool default: true --typed_array_max_size_in_heap (threshold for in-heap typed array) type: int default: 64 --frame_count (number of stack frames inspected by the profiler) type: int default: 1 --interrupt_budget (execution budget before interrupt is triggered) type: int default: 6144 --type_info_threshold (percentage of ICs that must have type info to allow optimization) type: int default: 25 --generic_ic_threshold (max percentage of megamorphic/generic ICs to allow optimization) type: int default: 30 --self_opt_count (call count before self-optimization) type: int default: 130 --trace_opt_verbose (extra verbose compilation tracing) type: bool default: false --debug_code (generate extra code (assertions) for debugging) type: bool default: false --code_comments (emit comments in code disassembly) type: bool default: false --enable_sse3 (enable use of SSE3 instructions if available) type: bool default: true --enable_sse4_1 (enable use of SSE4.1 instructions if available) type: bool default: true --enable_sahf (enable use of SAHF instruction if available (X64 only)) type: bool default: true --enable_avx (enable use of AVX instructions if available) type: bool default: true --enable_fma3 (enable use of FMA3 instructions if available) type: bool default: true --enable_vfp3 (enable use of VFP3 instructions if available) type: bool default: true --enable_armv7 (enable use of ARMv7 instructions if available (ARM only)) type: bool default: true --enable_armv8 (enable use of ARMv8 instructions if available (ARM 32-bit only)) type: bool default: true 
--enable_neon (enable use of NEON instructions if available (ARM only)) type: bool default: true --enable_sudiv (enable use of SDIV and UDIV instructions if available (ARM only)) type: bool default: true --enable_mls (enable use of MLS instructions if available (ARM only)) type: bool default: true --enable_movw_movt (enable loading 32-bit constant by means of movw/movt instruction pairs (ARM only)) type: bool default: false --enable_unaligned_accesses (enable unaligned accesses for ARMv7 (ARM only)) type: bool default: true --enable_32dregs (enable use of d16-d31 registers on ARM - this requires VFP3) type: bool default: true --enable_vldr_imm (enable use of constant pools for double immediate (ARM only)) type: bool default: false --force_long_branches (force all emitted branches to be in long mode (MIPS only)) type: bool default: false --expose_natives_as (expose natives in global object) type: string default: NULL --expose_debug_as (expose debug in global object) type: string default: NULL --expose_free_buffer (expose freeBuffer extension) type: bool default: false --expose_gc (expose gc extension) type: bool default: false --expose_gc_as (expose gc extension under the specified name) type: string default: NULL --expose_externalize_string (expose externalize string extension) type: bool default: false --expose_trigger_failure (expose trigger-failure extension) type: bool default: false --stack_trace_limit (number of stack frames to capture) type: int default: 10 --builtins_in_stack_traces (show built-in functions in stack traces) type: bool default: false --disable_native_files (disable builtin natives files) type: bool default: false --inline_new (use fast inline allocation) type: bool default: true --trace_codegen (print name of functions for which code is generated) type: bool default: false --trace (trace function calls) type: bool default: false --mask_constants_with_cookie (use random jit cookie to mask large constants) type: bool default: true --lazy (use lazy compilation) type: bool default: true --trace_opt (trace lazy optimization) type: bool default: false --trace_opt_stats (trace lazy optimization statistics) type: bool default: false --opt (use adaptive optimizations) type: bool default: true --always_opt (always try to optimize functions) type: bool default: false --always_osr (always try to OSR functions) type: bool default: false --prepare_always_opt (prepare for turning on always opt) type: bool default: false --trace_deopt (trace optimize function deoptimization) type: bool default: false --trace_stub_failures (trace deoptimization of generated code stubs) type: bool default: false --serialize_toplevel (enable caching of toplevel scripts) type: bool default: true --serialize_inner (enable caching of inner functions) type: bool default: false --trace_serializer (print code serializer trace) type: bool default: false --min_preparse_length (minimum length for automatic enable preparsing) type: int default: 1024 --max_opt_count (maximum number of optimization attempts before giving up.) 
type: int default: 10 --compilation_cache (enable compilation cache) type: bool default: true --cache_prototype_transitions (cache prototype transitions) type: bool default: true --cpu_profiler_sampling_interval (CPU profiler sampling interval in microseconds) type: int default: 1000 --trace_debug_json (trace debugging JSON request/response) type: bool default: false --trace_js_array_abuse (trace out-of-bounds accesses to JS arrays) type: bool default: false --trace_external_array_abuse (trace out-of-bounds-accesses to external arrays) type: bool default: false --trace_array_abuse (trace out-of-bounds accesses to all arrays) type: bool default: false --enable_liveedit (enable liveedit experimental feature) type: bool default: true --hard_abort (abort by crashing) type: bool default: true --stack_size (default size of stack region v8 is allowed to use (in kBytes)) type: int default: 984 --max_stack_trace_source_length (maximum length of function source code printed in a stack trace.) type: int default: 300 --always_inline_smi_code (always inline smi code in non-opt code) type: bool default: false --min_semi_space_size (min size of a semi-space (in MBytes), the new space consists of twosemi-spaces) type: int default: 0 --target_semi_space_size (target size of a semi-space (in MBytes) before triggering a GC) type: int default: 0 --max_semi_space_size (max size of a semi-space (in MBytes), the new space consists of twosemi-spaces) type: int default: 0 --semi_space_growth_factor (factor by which to grow the new space) type: int default: 2 --experimental_new_space_growth_heuristic (Grow the new space based on the percentage of survivors instead of their absolute value.) type: bool default: false --max_old_space_size (max size of the old space (in Mbytes)) type: int default: 0 --initial_old_space_size (initial old space size (in Mbytes)) type: int default: 0 --max_executable_size (max size of executable memory (in Mbytes)) type: int default: 0 --gc_global (always perform global GCs) type: bool default: false --gc_interval (garbage collect after allocations) type: int default: -1 --trace_gc (print one trace line following each garbage collection) type: bool default: false --trace_gc_nvp (print one detailed trace line in name=value format after each garbage collection) type: bool default: false --trace_gc_ignore_scavenger (do not print trace line after scavenger collection) type: bool default: false --trace_idle_notification (print one trace line following each idle notification) type: bool default: false --trace_idle_notification_verbose (prints the heap state used by the idle notification) type: bool default: false --print_cumulative_gc_stat (print cumulative GC statistics in name=value format on exit) type: bool default: false --print_max_heap_committed (print statistics of the maximum memory committed for the heap in name=value format on exit) type: bool default: false --trace_gc_verbose (print more details following each garbage collection) type: bool default: false --trace_fragmentation (report fragmentation for old pointer and data pages) type: bool default: false --collect_maps (garbage collect maps from which no objects can be reached) type: bool default: true --weak_embedded_maps_in_optimized_code (make maps embedded in optimized code weak) type: bool default: true --weak_embedded_objects_in_optimized_code (make objects embedded in optimized code weak) type: bool default: true --flush_code (flush code that we expect not to use again (during full gc)) type: bool default: true 
--flush_code_incrementally (flush code that we expect not to use again (incrementally)) type: bool default: true --trace_code_flushing (trace code flushing progress) type: bool default: false --age_code (track un-executed functions to age code and flush only old code (required for code flushing)) type: bool default: true --incremental_marking (use incremental marking) type: bool default: true --incremental_marking_steps (do incremental marking steps) type: bool default: true --concurrent_sweeping (use concurrent sweeping) type: bool default: true --trace_incremental_marking (trace progress of the incremental marking) type: bool default: false --track_gc_object_stats (track object counts and memory usage) type: bool default: false --heap_profiler_trace_objects (Dump heap object allocations/movements/size_updates) type: bool default: false --use_idle_notification (Use idle notification to reduce memory footprint.) type: bool default: true --use_ic (use inline caching) type: bool default: true --trace_ic (trace inline cache state transitions) type: bool default: false --native_code_counters (generate extra code for manipulating stats counters) type: bool default: false --always_compact (Perform compaction on every full GC) type: bool default: false --never_compact (Never perform compaction on full GC - testing only) type: bool default: false --compact_code_space (Compact code space on full non-incremental collections) type: bool default: true --incremental_code_compaction (Compact code space on full incremental collections) type: bool default: true --cleanup_code_caches_at_gc (Flush inline caches prior to mark compact collection and flush code caches in maps during mark compact cycle.) type: bool default: true --use_marking_progress_bar (Use a progress bar to scan large objects in increments when incremental marking is active.) type: bool default: true --zap_code_space (Zap free memory in code space with 0xCC while sweeping.) type: bool default: true --random_seed (Default seed for initializing random generator (0, the default, means to use system random).) type: int default: 0 --trace_weak_arrays (trace WeakFixedArray usage) type: bool default: false --track_prototype_users (keep track of which maps refer to a given prototype object) type: bool default: false --use_verbose_printer (allows verbose printing) type: bool default: true --allow_natives_syntax (allow natives syntax) type: bool default: false --trace_parse (trace parsing and preparsing) type: bool default: false --trace_sim (Trace simulator execution) type: bool default: false --debug_sim (Enable debugging the simulator) type: bool default: false --check_icache (Check icache flushes in ARM and MIPS simulator) type: bool default: false --stop_sim_at (Simulator stop after x number of instructions) type: int default: 0 --sim_stack_alignment (Stack alingment in bytes in simulator (4 or 8, 8 is default)) type: int default: 8 --sim_stack_size (Stack size of the ARM64 and MIPS64 simulator in kBytes (default is 2 MB)) type: int default: 2048 --log_regs_modified (When logging register values, only print modified registers.) type: bool default: true --log_colour (When logging, try to use coloured output.) type: bool default: true --ignore_asm_unimplemented_break (Don't break for ASM_UNIMPLEMENTED_BREAK macros.) type: bool default: false --trace_sim_messages (Trace simulator debug messages. Implied by --trace-sim.) 
type: bool default: false --stack_trace_on_illegal (print stack trace when an illegal exception is thrown) type: bool default: false --abort_on_uncaught_exception (abort program (dump core) when an uncaught exception is thrown) type: bool default: false --randomize_hashes (randomize hashes to avoid predictable hash collisions (with snapshots this option cannot override the baked-in seed)) type: bool default: true --hash_seed (Fixed seed to use to hash property keys (0 means random)(with snapshots this option cannot override the baked-in seed)) type: int default: 0 --profile_deserialization (Print the time it takes to deserialize the snapshot.) type: bool default: false --regexp_optimization (generate optimized regexp code) type: bool default: true --testing_bool_flag (testing_bool_flag) type: bool default: true --testing_maybe_bool_flag (testing_maybe_bool_flag) type: maybe_bool default: unset --testing_int_flag (testing_int_flag) type: int default: 13 --testing_float_flag (float-flag) type: float default: 2.5 --testing_string_flag (string-flag) type: string default: Hello, world! --testing_prng_seed (Seed used for threading test randomness) type: int default: 42 --testing_serialization_file (file in which to serialize heap) type: string default: /tmp/serdes --startup_blob (Write V8 startup blob file. (mksnapshot only)) type: string default: NULL --profile_hydrogen_code_stub_compilation (Print the time it takes to lazily compile hydrogen code stubs.) type: bool default: false --predictable (enable predictable mode) type: bool default: false --help (Print usage message, including flags, on console) type: bool default: true --dump_counters (Dump counters on exit) type: bool default: false --debugger (Enable JavaScript debugger) type: bool default: false --map_counters (Map counters to a file) type: string default: --js_arguments (Pass all remaining arguments to the script. Alias for "--".) type: arguments default: --gdbjit (enable GDBJIT interface (disables compacting GC)) type: bool default: false --gdbjit_full (enable GDBJIT interface for all code objects) type: bool default: false --gdbjit_dump (dump elf objects with debug info to disk) type: bool default: false --gdbjit_dump_filter (dump only objects containing this substring) type: string default: --force_marking_deque_overflows (force overflows of marking deque by reducing it's size to 64 words) type: bool default: false --stress_compaction (stress the GC compactor to flush out bugs (implies --force_marking_deque_overflows)) type: bool default: false --log (Minimal logging (no API, code, GC, suspect, or handles samples).) type: bool default: false --log_all (Log all events to the log file.) type: bool default: false --log_api (Log API events to the log file.) type: bool default: false --log_code (Log code events to the log file without profiling.) type: bool default: false --log_gc (Log heap samples on garbage collection for the hp2ps tool.) type: bool default: false --log_handles (Log global handle events.) type: bool default: false --log_snapshot_positions (log positions of (de)serialized objects in the snapshot.) type: bool default: false --log_suspect (Log suspect operations.) type: bool default: false --prof (Log statistical profiling information (implies --log-code).) type: bool default: false --prof_browser_mode (Used with --prof, turns on browser-compatible mode for profiling.) type: bool default: true --log_regexp (Log regular expression execution.) type: bool default: false --logfile (Specify the name of the log file.) 
type: string default: v8.log --logfile_per_isolate (Separate log files for each isolate.) type: bool default: true --ll_prof (Enable low-level linux profiler.) type: bool default: false --perf_basic_prof (Enable perf linux profiler (basic support).) type: bool default: false --perf_jit_prof (Enable perf linux profiler (experimental annotate support).) type: bool default: false --gc_fake_mmap (Specify the name of the file for fake gc mmap used in ll_prof) type: string default: /tmp/__v8_gc__ --log_internal_timer_events (Time internal events.) type: bool default: false --log_timer_events (Time events including external callbacks.) type: bool default: false --log_instruction_stats (Log AArch64 instruction statistics.) type: bool default: false --log_instruction_file (AArch64 instruction statistics log file.) type: string default: arm64_inst.csv --log_instruction_period (AArch64 instruction statistics logging period.) type: int default: 4194304 --redirect_code_traces (output deopt information and disassembly into file code--.asm) type: bool default: false --redirect_code_traces_to (output deopt information and disassembly into the given file) type: string default: NULL --hydrogen_track_positions (track source code positions when building IR) type: bool default: false --trace_elements_transitions (trace elements transitions) type: bool default: false --trace_creation_allocation_sites (trace the creation of allocation sites) type: bool default: false --print_code_stubs (print code stubs) type: bool default: false --test_secondary_stub_cache (test secondary stub cache by disabling the primary one) type: bool default: false --test_primary_stub_cache (test primary stub cache by disabling the secondary one) type: bool default: false --print_code (print generated code) type: bool default: false --print_opt_code (print optimized code) type: bool default: false --print_unopt_code (print unoptimized code before printing optimized code based on it) type: bool default: false --print_code_verbose (print more information for code) type: bool default: false --print_builtin_code (print generated code for builtins) type: bool default: false --sodium (print generated code output suitable for use with the Sodium code viewer) type: bool default: false --print_all_code (enable all flags related to printing code) type: bool default: false .SH RESOURCES AND DOCUMENTATION See the website for documentation http://iojs.org/ Mailing list: http://groups.google.com/group/nodejs IRC: irc.freenode.net #io.js iojs-v1.0.2-darwin-x64/lib/dtrace/000755 000766 000024 00000000000 12456115116 016667 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/000755 000766 000024 00000000000 12456115116 020102 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/000755 000766 000024 00000000000 12456115120 020667 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/.eslintrc000644 000766 000024 00000000637 12455173731 022534 0ustar00iojsstaff000000 000000 { "env" : { "node" : true }, "rules" : { "semi": [2, "never"], "strict": 0, "quotes": [1, "double", "avoid-escape"], "no-use-before-define": 0, "curly": 0, "no-underscore-dangle": 0, "no-lonely-if": 1, "no-unused-vars": [2, {"vars" : "all", "args" : "after-used"}], "no-mixed-requires": 0, "space-infix-ops": 0, "key-spacing": 0, "no-multi-spaces": 0 } } iojs-v1.0.2-darwin-x64/lib/node_modules/npm/.npmignore000644 000766 000024 00000000752 12455173731 022705 0ustar00iojsstaff000000 000000 *.swp .*.swp npm-debug.log /test/bin 
/test/output.log /test/packages/*/node_modules /test/packages/npm-test-depends-on-spark/which-spark.log /test/packages/test-package/random-data.txt /test/root node_modules/marked node_modules/ronn node_modules/tap node_modules/.bin node_modules/npm-registry-mock /npmrc /release/ # don't need these in the npm package. html/*.png # don't ignore .npmignore files # these are used in some tests. !.npmignore /npm-*.tgz *.pyc /test/tap/builtin-config iojs-v1.0.2-darwin-x64/lib/node_modules/npm/.npmrc000644 000766 000024 00000000054 12455173731 022021 0ustar00iojsstaff000000 000000 save-prefix = ~ proprietary-attribs = false iojs-v1.0.2-darwin-x64/lib/node_modules/npm/.travis.yml000644 000766 000024 00000000344 12455173731 023014 0ustar00iojsstaff000000 000000 language: node_js node_js: - "0.11" - "0.10" env: - DEPLOY_VERSION=testing before_install: - "npm config set spin false" - "npm install -g npm@^2" - "sudo mkdir -p /var/run/couchdb" script: "npm run-script test-all" iojs-v1.0.2-darwin-x64/lib/node_modules/npm/AUTHORS000644 000766 000024 00000013177 12455173731 021763 0ustar00iojsstaff000000 000000 # Authors sorted by whether or not they're me Isaac Z. Schlueter Steve Steiner Mikeal Rogers Aaron Blohowiak Martyn Smith Mathias Pettersson Brian Hammond Charlie Robbins Francisco Treacy Cliffano Subagio Christian Eager Dav Glass Alex K. Wolfe James Sanders Reid Burke Arlo Breault Timo Derstappen Bradley Meck Bart Teeuwisse Ben Noordhuis Tor Valamo Whyme.Lyu <5longluna@gmail.com> Olivier Melcher Tomaž Muraus Evan Meagher Orlando Vazquez George Miroshnykov Geoff Flarity Pete Kruckenberg Laurie Harper Chris Wong Max Goodman Scott Bronson Federico Romero Visnu Pitiyanuvath Irakli Gozalishvili Mark Cahill Zearin Iain Sproat Trent Mick Felix Geisendörfer Conny Brunnkvist Will Elwood Oleg Efimov Martin Cooper Jameson Little cspotcode Maciej Małecki Stephen Sugden Gautham Pai David Trejo Paul Vorbach George Ornbo Tim Oxley Tyler Green atomizer Rod Vagg Christian Howe Andrew Lunny Henrik Hodne Adam Blackburn Kris Windham Jens Grunert Joost-Wim Boekesteijn Dalmais Maxence Marcus Ekwall Aaron Stacy Phillip Howell Domenic Denicola James Halliday Jeremy Cantrell Ribettes Einar Otto Stangvik Don Park Kei Son Nicolas Morel Mark Dube Nathan Rajlich Maxim Bogushevich Justin Beckwith Meaglin Ben Evans Nathan Zadoks Brian White Jed Schmidt Ian Livingstone Patrick Pfeiffer Paul Miller seebees Carl Lange Jan Lehnardt Alexey Kreschuk Di Wu Florian Margaine Forbes Lindesay Ian Babrou Jaakko Manninen Johan Nordberg Johan Sköld Larz Conwell Luke Arduini Marcel Klehr Mathias Bynens Matt Lunn Matt McClure Nirk Niggler Paolo Fragomeni Jake Verbaten (Raynos) Robert Kowalski Schabse Laks Stuart Knightley Stuart P. 
Bentley Vaz Allen elisee Evan You Wil Moore III Dylan Greene zeke Andrew Horton Denis Gladkikh Daniel Santiago Alex Kocharin Evan Lucas Steve Mason Quinn Slack Sébastien Santoro CamilleM Tom Huang Sergey Belov Younghoon Park Yazhong Liu Mikola Lysenko Rafael de Oleza Yeonghoon Park Franck Cuny Alan Shaw Alex Rodionov Alexej Yaroshevich Elan Shanker François Frisch Gabriel Falkenberg Jason Diamond Jess Martin Jon Spencer Matt Colyer Matt McClure Maximilian Antoni Nicholas Kinsey Paulo Cesar Quim Calpe Robert Gieseke Spain Train TJ Holowaychuk Thom Blake Trevor Burnham bitspill Neil Gentleman iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/000755 000766 000024 00000000000 12456115117 021445 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/CHANGELOG.md000644 000766 000024 00000265641 12455173731 022531 0ustar00iojsstaff000000 000000 ### v2.1.18 (2015-01-01): * [`bf8640b`](https://github.com/npm/npm/commit/bf8640b0395b5dff71260a0cede7efc699a7bcf5) [#7044](https://github.com/npm/npm/issues/7044) Document `.npmignore` syntax. ([@zeke](https://github.com/zeke)) ### v2.1.17 (2014-12-25): merry npm xmas Working with [@phated](https://github.com/phated), I discovered that npm still had some lingering race conditions around how it handles Git dependencies. The following changes were intended to remedy to these issues. Thanks to [@phated](https://github.com/phated) for all his help getting to the bottom of these. * [`bdf1c84`](https://github.com/npm/npm/commit/bdf1c8483f5c4ad79b712db12d73276e15883923) [#7006](https://github.com/npm/npm/issues/7006) Only `chown` template and top-level Git cache directories. ([@othiym23](https://github.com/othiym23)) * [`581a72d`](https://github.com/npm/npm/commit/581a72da18f35ec87edef6255adf4ef4714a478c) [#7006](https://github.com/npm/npm/issues/7006) Map Git remote inflighting to clone paths rather than Git URLs. ([@othiym23](https://github.com/othiym23)) * [`1c48d08`](https://github.com/npm/npm/commit/1c48d08dea31a11ac11a285cac598a482481cade) [#7009](https://github.com/npm/npm/issues/7009) `normalize-git-url@1.0.0`: Normalize Git URLs while caching. ([@othiym23](https://github.com/othiym23)) * [`5423cf0`](https://github.com/npm/npm/commit/5423cf0be8ff2b76bfff7c8e780e5f261235a86a) [#7009](https://github.com/npm/npm/issues/7009) Pack tarballs to their final locations atomically. ([@othiym23](https://github.com/othiym23)) * [`7f6557f`](https://github.com/npm/npm/commit/7f6557ff317469ee4a87c542ff9a991e74ce9f38) [#7009](https://github.com/npm/npm/issues/7009) Inflight local directory packing, just to be safe. 
([@othiym23](https://github.com/othiym23)) Other changes: * [`1c491e6`](https://github.com/npm/npm/commit/1c491e65d70af013e8d5ac008d6d9762d6d91793) [#6991](https://github.com/npm/npm/issues/6991) `npm version`: fix regression in dirty-checking behavior ([@rlidwka](https://github.com/rlidwka)) * [`55ceb2b`](https://github.com/npm/npm/commit/55ceb2b08ff8a0f56b94cc972ca15d7862e8733c) [#1991](https://github.com/npm/npm/issues/1991) modify docs to reflect actual `npm restart` behavior ([@smikes](https://github.com/smikes)) * [`fb8e31b`](https://github.com/npm/npm/commit/fb8e31b95476a50bda35a665a99eec8a5d25a4db) [#6982](https://github.com/npm/npm/issues/6982) when doing registry operations, ensure registry URL always ends with `/` ([@othiym23](https://github.com/othiym23)) * [`5bcba65`](https://github.com/npm/npm/commit/5bcba65bed2678ffe80fb596f72abe9871d131c8) pull whitelisted Git environment variables out into a named constant ([@othiym23](https://github.com/othiym23)) * [`be04bbd`](https://github.com/npm/npm/commit/be04bbdc52ebfc820cd939df2f7d79fe87067747) [#7000](https://github.com/npm/npm/issues/7000) No longer install badly-named manpage files, and log an error when trying to uninstall them. ([@othiym23](https://github.com/othiym23)) * [`6b7c5ec`](https://github.com/npm/npm/commit/6b7c5eca6b65e1247d0e51f6400cf2637ac880ce) [#7011](https://github.com/npm/npm/issues/7011) Send auth for tarball fetches for packages in `npm-shrinkwrap.json` from private registries. ([@othiym23](https://github.com/othiym23)) * [`9b9de06`](https://github.com/npm/npm/commit/9b9de06a99893b40aa23f0335726dec6df7979db) `glob@4.3.2`: Better handling of trailing slashes. ([@isaacs](https://github.com/isaacs)) * [`030f3c7`](https://github.com/npm/npm/commit/030f3c7450b8ce124a19073bfbae0948a0a1a02c) `semver@4.2.0`: Diffing between version strings. ([@isaacs](https://github.com/isaacs)) ### v2.1.16 (2014-12-22): * [`a4e4e33`](https://github.com/npm/npm/commit/a4e4e33edb35c68813f04bf42bdf933a6f727bcd) [#6987](https://github.com/npm/npm/issues/6987) `read-installed@3.1.5`: fixed a regression where a new / empty package would cause read-installed to throw. ([@othiym23](https://github.com/othiym23) / [@pgilad](https://github.com/pgilad)) ### v2.1.15 (2014-12-18): * [`e5a2dee`](https://github.com/npm/npm/commit/e5a2dee47c74f26c56fee5998545b97497e830c8) [#6951](https://github.com/npm/npm/issues/6951) `fs-vacuum@1.2.5`: Use `path-is-inside` for better Windows normalization. ([@othiym23](https://github.com/othiym23)) * [`ac6167c`](https://github.com/npm/npm/commit/ac6167c2b9432939c57296f7ddd11ad5f8f918b2) [#6955](https://github.com/npm/npm/issues/6955) Call `path.normalize` in `lib/utils/gently-rm.js` for better Windows normalization. ([@ben-page](https://github.com/ben-page)) * [`c625d71`](https://github.com/npm/npm/commit/c625d714795e3b5badd847945e2401adfad5a196) [#6964](https://github.com/npm/npm/issues/6964) Clarify CA configuration docs. ([@jeffjo](https://github.com/jeffjo)) * [`58b8cb5`](https://github.com/npm/npm/commit/58b8cb5cdf26a854358b7c2ab636572dba9bac16) [#6950](https://github.com/npm/npm/issues/6950) Fix documentation typos. ([@martinvd](https://github.com/martinvd)) * [`7c1299d`](https://github.com/npm/npm/commit/7c1299d00538ea998684a1903a4091eafc63b7f1) [#6909](https://github.com/npm/npm/issues/6909) Remove confusing mention of rubygems `~>` semver operator. 
([@mjtko](https://github.com/mjtko)) * [`7dfdcc6`](https://github.com/npm/npm/commit/7dfdcc6debd8ef1fc52a2b508997d15887aad824) [#6909](https://github.com/npm/npm/issues/6909) `semver@4.1.1`: Synchronize documentation with PR [#6909](https://github.com/npm/npm/issues/6909) ([@othiym23](https://github.com/othiym23)) * [`adfddf3`](https://github.com/npm/npm/commit/adfddf3b682e0ae08e4b59d87c1b380dd651c572) [#6925](https://github.com/npm/npm/issues/6925) Correct typo in `doc/api/npm-ls.md` ([@oddurs](https://github.com/oddurs)) * [`f5c534b`](https://github.com/npm/npm/commit/f5c534b711ab173129baf366c4f08d68f6117333) [#6920](https://github.com/npm/npm/issues/6920) Remove recommendation to run as root from `README.md`. ([@robertkowalski](https://github.com/robertkowalski)) * [`3ef4459`](https://github.com/npm/npm/commit/3ef445922cd39f25b992d91bd22c4d367882ea22) [#6920](https://github.com/npm/npm/issues/6920) `npm-@googlegroups.com` has gone the way of all things. That means it's gone. ([@robertkowalski](https://github.com/robertkowalski)) ### v2.1.14 (2014-12-13): * [`cf7aeae`](https://github.com/npm/npm/commit/cf7aeae3c3a24e48d3de4006fa082f0c6040922a) [#6923](https://github.com/npm/npm/issues/6923) Overaggressive link update for new website broke node-gyp. ([@othiym23](https://github.com/othiym23)) ### v2.1.13 (2014-12-11): * [`cbb890e`](https://github.com/npm/npm/commit/cbb890eeacc0501ba1b8c6955f1c829c8af9f486) [#6897](https://github.com/npm/npm/issues/6897) npm is a nice package manager that runs server-side JavaScript. ([@othiym23](https://github.com/othiym23)) * [`d9043c3`](https://github.com/npm/npm/commit/d9043c3b8d7450c3cb9ca795028c0e1c05377820) [#6893](https://github.com/npm/npm/issues/6893) Remove erroneous docs about preupdate / update / postupdate lifecycle scripts, which have never existed. ([@devTristan](https://github.com/devTristan)) * [`c5df4d0`](https://github.com/npm/npm/commit/c5df4d0d683cd3506808d1cd1acebff02a8b82db) [#6884](https://github.com/npm/npm/issues/6884) Update npmjs.org to npmjs.com in docs. ([@linclark](https://github.com/linclark)) * [`cb6ff8d`](https://github.com/npm/npm/commit/cb6ff8dace1b439851701d4784d2d719c22ca7a7) [#6879](https://github.com/npm/npm/issues/6879) npm version: Update shrinkwrap post-check. ([@othiym23](https://github.com/othiym23)) * [`2a340bd`](https://github.com/npm/npm/commit/2a340bdd548c6449468281e1444a032812bff677) [#6868](https://github.com/npm/npm/issues/6868) Use magic numbers instead of regexps to distinguish tarballs from other things. ([@daxxog](https://github.com/daxxog)) * [`f1c8bdb`](https://github.com/npm/npm/commit/f1c8bdb3f6b753d0600597e12346bdc3a34cb9c1) [#6861](https://github.com/npm/npm/issues/6861) `npm-registry-client@4.0.5`: Distinguish between error properties that are part of the response and error strings that should be returned to the user. ([@disrvptor](https://github.com/disrvptor)) * [`d3a1b63`](https://github.com/npm/npm/commit/d3a1b6397fddef04b5198ca89d36d720aeb05eb6) [#6762](https://github.com/npm/npm/issues/6762) Make `npm outdated` ignore private packages. ([@KenanY](https://github.com/KenanY)) * [`16d8542`](https://github.com/npm/npm/commit/16d854283ca5bcdb0cb2812fc5745d841652b952) install.sh: Drop support for node < 0.8, remove engines bits. ([@isaacs](https://github.com/isaacs)) * [`b9c6046`](https://github.com/npm/npm/commit/b9c60466d5b713b1dc2947da14a5dfe42352e029) `init-package-json@1.1.3`: ([@terinstock](https://github.com/terinstock)) noticed that `init.license` configuration doesn't stick. 
Make sure that dashed defaults don't trump dotted parameters. ([@othiym23](https://github.com/othiym23)) * [`b6d6acf`](https://github.com/npm/npm/commit/b6d6acfc02c8887f78067931babab8f7c5180fed) `which@1.0.8`: No longer use graceful-fs for some reason. ([@isaacs](https://github.com/isaacs)) * [`d39f673`](https://github.com/npm/npm/commit/d39f673caf08a90fb2bb001d79c98062d2cd05f4) `request@2.51.0`: Incorporate bug fixes. ([@nylen](https://github.com/nylen)) * [`c7ad727`](https://github.com/npm/npm/commit/c7ad7279cc879930ec58ccc62fa642e621ecb65c) `columnify@1.3.2`: Incorporate bug fixes. ([@timoxley](https://github.com/timoxley)) ### v2.1.12 (2014-12-04): * [`e5b1e44`](https://github.com/npm/npm/commit/e5b1e448bb4a9d6eae4ba0f67b1d3c2cea8ed383) add alias verison=version ([@isaacs](https://github.com/isaacs)) * [`5eed7bd`](https://github.com/npm/npm/commit/5eed7bddbd7bb92a44c4193c93e8529500c558e6) `request@2.49.0` ([@nylen](https://github.com/nylen)) * [`e72f81d`](https://github.com/npm/npm/commit/e72f81d8412540ae7d1e0edcc37c11bcb8169051) `glob@4.3.1` / `minimatch@2.0.1` ([@isaacs](https://github.com/isaacs)) * [`b8dcc36`](https://github.com/npm/npm/commit/b8dcc3637b5b71933b97162b7aff1b1a622c13e2) `graceful-fs@3.0.5` ([@isaacs](https://github.com/isaacs)) ### v2.1.11 (2014-11-27): * [`4861d28`](https://github.com/npm/npm/commit/4861d28ad0ebd959fe6bc15b9c9a50fcabe57f55) `which@1.0.7`: License update. ([@isaacs](https://github.com/isaacs)) * [`30a2ea8`](https://github.com/npm/npm/commit/30a2ea80c891d384b31a1cf28665bba4271915bd) `ini@1.3.2`: License update. ([@isaacs](https://github.com/isaacs)) * [`6a4ea05`](https://github.com/npm/npm/commit/6a4ea054f6ddf52fc58842ba2046564b04c5c0e2) `fstream@1.0.3`: Propagate error events to downstream streams. ([@gfxmonk](https://github.com/gfxmonk)) * [`a558695`](https://github.com/npm/npm/commit/a5586954f1c18df7c96137e0a79f41a69e7a884e) `tar@1.0.3`: Don't extract broken files, propagate `drain` event. ([@gfxmonk](https://github.com/gfxmonk)) * [`989624e`](https://github.com/npm/npm/commit/989624e8321f87734c1b1272fc2f646e7af1f81c) [#6767](https://github.com/npm/npm/issues/6767) Actually pass parameters when adding git repo to cach under Windows. ([@othiym23](https://github.com/othiym23)) * [`657af73`](https://github.com/npm/npm/commit/657af7308f7d6cd2f81389fcf0d762252acaf1ce) [#6774](https://github.com/npm/npm/issues/6774) When verifying paths on unbuild, resolve both source and target as symlinks. ([@hokaccha](https://github.com/hokaccha)) * [`fd19c40`](https://github.com/npm/npm/commit/fd19c4046414494f9647a6991c00f8406a939929) [#6713](https://github.com/npm/npm/issues/6713) `realize-package-specifier@1.3.0`: Make it so that `npm install foo@1` work when a file named `1` exists. ([@iarna](https://github.com/iarna)) * [`c8ac37a`](https://github.com/npm/npm/commit/c8ac37a470491b2ed28514536e2e198494638c79) `npm-registry-client@4.0.4`: Fix regression in failed fetch retries. ([@othiym23](https://github.com/othiym23)) ### v2.1.10 (2014-11-20): * [`756f3d4`](https://github.com/npm/npm/commit/756f3d40fe18bc02bc93afe17016dfcc266c4b6b) [#6735](https://github.com/npm/npm/issues/6735) Log "already built" messages at info, not error. ([@smikes](https://github.com/smikes)) * [`1b7330d`](https://github.com/npm/npm/commit/1b7330dafba3bbba171f74f1e58b261cb1b9301e) [#6729](https://github.com/npm/npm/issues/6729) `npm-registry-client@4.0.3`: GitHub won't redirect you through an HTML page to a compressed tarball if you don't tell it you accept JSON responses. 
([@KenanY](https://github.com/KenanY)) * [`d9c7857`](https://github.com/npm/npm/commit/d9c7857be02dacd274e55bf6d430d90d91509d53) [#6506](https://github.com/npm/npm/issues/6506) `readdir-scoped-modules@1.0.1`: Use `graceful-fs` so the whole dependency tree gets read, even in case of `EMFILE`. ([@sakana](https://github.com/sakana)) * [`3a085be`](https://github.com/npm/npm/commit/3a085be158ace8f1e4395e69f8c102d3dea00c5f) Grammar fix in docs. ([@icylace](https://github.com/icylace)) * [`3f8e2ff`](https://github.com/npm/npm/commit/3f8e2ff8342d327d6f1375437ecf4bd945dc360f) Did you know that npm has a Code of Conduct? Add a link to it to CONTRIBUTING.md. ([@isaacs](https://github.com/isaacs)) * [`319ccf6`](https://github.com/npm/npm/commit/319ccf633289e06e57a80d74c39706899348674c) `glob@4.2.1`: Performance tuning. ([@isaacs](https://github.com/isaacs)) * [`835f046`](https://github.com/npm/npm/commit/835f046e7568c33e81a0b48c84cff965024d8b8a) `readable-stream@1.0.33`: Bug fixes. ([@rvagg](https://github.com/rvagg)) * [`a34c38d`](https://github.com/npm/npm/commit/a34c38d0732fb246d11f2a776d2ad0d8db654338) `request@2.48.0`: Bug fixes. ([@nylen](https://github.com/nylen)) ### v2.1.9 (2014-11-13): * [`eed9f61`](https://github.com/npm/npm/commit/eed9f6101963364acffc59d7194fc1655180e80c) [#6542](https://github.com/npm/npm/issues/6542) `npm owner add / remove` now works properly with scoped packages ([@othiym23](https://github.com/othiym23)) * [`cd25973`](https://github.com/npm/npm/commit/cd25973825aa5315b7ebf26227bd32bd6be5533f) [#6548](https://github.com/npm/npm/issues/6548) using sudo won't leave the cache's git directories with bad permissions ([@othiym23](https://github.com/othiym23)) * [`56930ab`](https://github.com/npm/npm/commit/56930abcae6a6ea41f1b75e23765c61259cef2dd) fixed irregular `npm cache ls` output (yes, that's a thing) ([@othiym23](https://github.com/othiym23)) * [`740f483`](https://github.com/npm/npm/commit/740f483db6ec872b453065842da080a646c3600a) legacy tests no longer poison user's own cache ([@othiym23](https://github.com/othiym23)) * [`ce37f14`](https://github.com/npm/npm/commit/ce37f142a487023747a9086335618638ebca4372) [#6169](https://github.com/npm/npm/issues/6169) add terse output similar to `npm publish / unpublish` for `npm owner add / remove` ([@KenanY](https://github.com/KenanY)) * [`bf2b8a6`](https://github.com/npm/npm/commit/bf2b8a66d7188900bf1e957c052b893948b67e0e) [#6680](https://github.com/npm/npm/issues/6680) pass auth credentials to registry when downloading search index ([@terinjokes](https://github.com/terinjokes)) * [`00ecb61`](https://github.com/npm/npm/commit/00ecb6101422984696929f602e14da186f9f669c) [#6400](https://github.com/npm/npm/issues/6400) `.npmignore` is respected for git repos on cache / pack / publish ([@othiym23](https://github.com/othiym23)) * [`d1b3a9e`](https://github.com/npm/npm/commit/d1b3a9ec5e2b6d52765ba5da5afb08dba41c49c1) [#6311](https://github.com/npm/npm/issues/6311) `npm ls -l --depth=0` no longer prints phantom duplicate children ([@othiym23](https://github.com/othiym23)) * [`07c5f34`](https://github.com/npm/npm/commit/07c5f34e45c9b18c348ed53b5763b1c5d4325740) [#6690](https://github.com/npm/npm/issues/6690) `uid-number@0.0.6`: clarify confusing names in error-handling code ([@isaacs](https://github.com/isaacs)) * [`1ac9be9`](https://github.com/npm/npm/commit/1ac9be9f3bab816211d72d13cb05b5587878a586) [#6684](https://github.com/npm/npm/issues/6684) `npm init`: don't report write if canceled ([@smikes](https://github.com/smikes)) * 
[`7bb207d`](https://github.com/npm/npm/commit/7bb207d1d6592a9cffc986871e4b671575363c2f) [#5754](https://github.com/npm/npm/issues/5754) never remove app directories on failed install ([@othiym23](https://github.com/othiym23)) * [`705ce60`](https://github.com/npm/npm/commit/705ce601e7b9c5428353e02ebb30cb76c1991fdd) [#5754](https://github.com/npm/npm/issues/5754) `fs-vacuum@1.2.2`: don't throw when another fs task writes to a directory being vacuumed ([@othiym23](https://github.com/othiym23)) * [`1b650f4`](https://github.com/npm/npm/commit/1b650f4f217c413a2ffb96e1701beb5aa67a0de2) [#6255](https://github.com/npm/npm/issues/6255) ensure that order credentials are used from `.npmrc` doesn't regress ([@othiym23](https://github.com/othiym23)) * [`9bb2c34`](https://github.com/npm/npm/commit/9bb2c3435cedef40b45d3e9bd7a8edfb8cbe7209) [#6644](https://github.com/npm/npm/issues/6644) `warn` rather than `info` on fetch failure ([@othiym23](https://github.com/othiym23)) * [`e34a7b6`](https://github.com/npm/npm/commit/e34a7b6b7371b1893a062f627ae8e168546d7264) [#6524](https://github.com/npm/npm/issues/6524) `npm-registry-client@4.0.2`: proxy via `request` more transparently ([@othiym23](https://github.com/othiym23)) * [`40afd6a`](https://github.com/npm/npm/commit/40afd6aaf34c11a10e80ec87b115fb2bb907e3bd) [#6524](https://github.com/npm/npm/issues/6524) push proxy settings into `request` ([@tauren](https://github.com/tauren)) ### v2.1.8 (2014-11-06): * [`063d843`](https://github.com/npm/npm/commit/063d843965f9f0bfa5732d7c2d6f5aa37a8260a2) npm version now updates version in npm-shrinkwrap.json ([@faiq](https://github.com/faiq)) * [`3f53cd7`](https://github.com/npm/npm/commit/3f53cd795f8a600e904a97f215ba5b5a9989d9dd) [#6559](https://github.com/npm/npm/issues/6559) save local dependencies in npm-shrinkwrap.json ([@Torsph](https://github.com/Torsph)) * [`e249262`](https://github.com/npm/npm/commit/e24926268b2d2220910bc81cce6d3b2e08d94eb1) npm-faq.md: mention scoped pkgs in namespace Q ([@smikes](https://github.com/smikes)) * [`6b06ec4`](https://github.com/npm/npm/commit/6b06ec4ef5da490bdca1512fa7f12490245c192b) [#6642](https://github.com/npm/npm/issues/6642) `init-package-json@1.1.2`: Handle both `init-author-name` and `init.author.name`. 
([@othiym23](https://github.com/othiym23)) * [`9cb334c`](https://github.com/npm/npm/commit/9cb334c8a895a55461aac18791babae779309a0e) [#6409](https://github.com/npm/npm/issues/6409) document commit-ish with GitHub URLs ([@smikes](https://github.com/smikes)) * [`0aefae9`](https://github.com/npm/npm/commit/0aefae9bc2598a4b7a3ee7bb2306b42e3e12bb28) [#2959](https://github.com/npm/npm/issues/2959) npm run no longer fails silently ([@flipside](https://github.com/flipside)) * [`e007a2c`](https://github.com/npm/npm/commit/e007a2c1e4fac1759fa61ac6e78c6b83b2417d11) [#3908](https://github.com/npm/npm/issues/3908) include command in spawn errors ([@smikes](https://github.com/smikes)) ### v2.1.7 (2014-10-30): * [`6750b05`](https://github.com/npm/npm/commit/6750b05dcba20d8990a672957ec56c48f97e241a) [#6398](https://github.com/npm/npm/issues/6398) `npm-registry-client@4.0.0`: consistent API, handle relative registry paths, use auth more consistently ([@othiym23](https://github.com/othiym23)) * [`7719cfd`](https://github.com/npm/npm/commit/7719cfdd8b204dfeccc41289707ea58b4d608905) [#6560](https://github.com/npm/npm/issues/6560) use new npm-registry-client API ([@othiym23](https://github.com/othiym23)) * [`ed61971`](https://github.com/npm/npm/commit/ed619714c93718b6c1922b8c286f4b6cd2b97c80) move caching of search metadata from `npm-registry-client` to npm itself ([@othiym23](https://github.com/othiym23)) * [`3457041`](https://github.com/npm/npm/commit/34570414cd528debeb22943873440594d7f47abf) handle caching of metadata independently from `npm-registry-client` ([@othiym23](https://github.com/othiym23)) * [`20a331c`](https://github.com/npm/npm/commit/20a331ced6a52faac6ec242e3ffdf28bcd447c40) [#6538](https://github.com/npm/npm/issues/6538) map registry URLs to credentials more safely ([@indexzero](https://github.com/indexzero)) * [`4072e97`](https://github.com/npm/npm/commit/4072e97856bf1e7affb38333d080c172767eea27) [#6589](https://github.com/npm/npm/issues/6589) `npm-registry-client@4.0.1`: allow publishing of packages with names identical to built-in Node modules ([@feross](https://github.com/feross)) * [`254f0e4`](https://github.com/npm/npm/commit/254f0e4adaf2c56e9df25c7343c43b0b0804a3b5) `tar@1.0.2`: better error-handling ([@runk](https://github.com/runk)) * [`73ee2aa`](https://github.com/npm/npm/commit/73ee2aa4f1a47e43fe7cf4317a5446875f7521fa) `request@2.47.0` ([@mikeal](https://github.com/mikeal)) ### v2.1.6 (2014-10-23): * [`681b398`](https://github.com/npm/npm/commit/681b3987a18e7aba0aaf78c91a23c7cc0ab82ce8) [#6523](https://github.com/npm/npm/issues/6523) fix default `logelevel` doc ([@KenanY](https://github.com/KenanY)) * [`80b368f`](https://github.com/npm/npm/commit/80b368ffd786d4d008734b56c4a6fe12d2cb2926) [#6528](https://github.com/npm/npm/issues/6528) `npm version` should work in a git directory without git ([@terinjokes](https://github.com/terinjokes)) * [`5f5f9e4`](https://github.com/npm/npm/commit/5f5f9e4ddf544c2da6adf3f8c885238b0e745076) [#6483](https://github.com/npm/npm/issues/6483) `init-package-json@1.1.1`: Properly pick up default values from environment variables. ([@othiym23](https://github.com/othiym23)) * [`a114870`](https://github.com/npm/npm/commit/a1148702f53f82d49606b2e4dac7581261fff442) perl 5.18.x doesn't like -pi without filenames ([@othiym23](https://github.com/othiym23)) * [`de5ba00`](https://github.com/npm/npm/commit/de5ba007a48db876eb5bfb6156435f3512d58977) `request@2.46.0`: Tests and cleanup. 
([@othiym23](https://github.com/othiym23)) * [`76933f1`](https://github.com/npm/npm/commit/76933f169f17b5273b32e924a7b392d5729931a7) `fstream-npm@1.0.1`: Always include `LICENSE[.*]`, `LICENCE[.*]`, `CHANGES[.*]`, `CHANGELOG[.*]`, and `HISTORY[.*]`. ([@jonathanong](https://github.com/jonathanong)) ### v2.1.5 (2014-10-16): * [`6a14b23`](https://github.com/npm/npm/commit/6a14b232a0e34158bd95bb25c607167be995c204) [#6397](https://github.com/npm/npm/issues/6397) Defactor npmconf back into npm. ([@othiym23](https://github.com/othiym23)) * [`4000e33`](https://github.com/npm/npm/commit/4000e3333a76ca4844681efa8737cfac24b7c2c8) [#6323](https://github.com/npm/npm/issues/6323) Install `peerDependencies` from top. ([@othiym23](https://github.com/othiym23)) * [`5d119ae`](https://github.com/npm/npm/commit/5d119ae246f27353b14ff063559d1ba8c616bb89) [#6498](https://github.com/npm/npm/issues/6498) Better error messages on malformed `.npmrc` properties. ([@nicks](https://github.com/nicks)) * [`ae18efb`](https://github.com/npm/npm/commit/ae18efb65fed427b1ef18e4862885bf60b87b92e) [#6093](https://github.com/npm/npm/issues/6093) Replace instances of 'hash' with 'object' in documentation. ([@zeke](https://github.com/zeke)) * [`53108b2`](https://github.com/npm/npm/commit/53108b276fec5f97a38250933a2768d58b6928da) [#1558](https://github.com/npm/npm/issues/1558) Clarify how local paths should be used. ([@KenanY](https://github.com/KenanY)) * [`344fa1a`](https://github.com/npm/npm/commit/344fa1a219ac8867022df3dc58a47636dde8a242) [#6488](https://github.com/npm/npm/issues/6488) Work around bug in marked. ([@othiym23](https://github.com/othiym23)) OUTDATED DEPENDENCY CLEANUP JAMBOREE * [`60c2942`](https://github.com/npm/npm/commit/60c2942e13655d9ecdf6e0f1f97f10cb71a75255) `realize-package-specifier@1.2.0`: Handle names and rawSpecs more consistently. ([@iarna](https://github.com/iarna)) * [`1b5c95f`](https://github.com/npm/npm/commit/1b5c95fbda77b87342bd48c5ecac5b1fd571ccfe) `sha@1.3.0`: Change line endings? ([@ForbesLindesay](https://github.com/ForbesLindesay)) * [`d7dee3f`](https://github.com/npm/npm/commit/d7dee3f3f7d9e7c2061a4ecb4dd93e3e4bfe4f2e) `request@2.45.0`: Dependency updates, better proxy support, better compressed response handling, lots of 'use strict'. ([@mikeal](https://github.com/mikeal)) * [`3d75180`](https://github.com/npm/npm/commit/3d75180c2cc79fa3adfa0e4cb783a27192189a65) `opener@1.4.0`: Added gratuitous return. ([@Domenic](https://github.com/Domenic)) * [`8e2703f`](https://github.com/npm/npm/commit/8e2703f78d280d1edeb749e257dda1f288bad6e3) `retry@0.6.1` / `npm-registry-client@3.2.4`: Change of ownership. ([@tim-kos](https://github.com/tim-kos)) * [`c87b00f`](https://github.com/npm/npm/commit/c87b00f82f92434ee77831915012c77a6c244c39) `once@1.3.1`: Wrap once with wrappy. ([@isaacs](https://github.com/isaacs)) * [`01ec790`](https://github.com/npm/npm/commit/01ec790fd47def56eda6abb3b8d809093e8f493f) `npm-user-validate@0.1.1`: Correct repository URL. ([@robertkowalski](https://github.com/robertkowalski)) * [`389e52c`](https://github.com/npm/npm/commit/389e52c2d94c818ca8935ccdcf392994fec564a2) `glob@4.0.6`: Now absolutely requires `graceful-fs`. ([@isaacs](https://github.com/isaacs)) * [`e15ab15`](https://github.com/npm/npm/commit/e15ab15a27a8f14cf0d9dc6f11dee452080378a0) `ini@1.3.0`: Tighten up whitespace handling. 
([@isaacs](https://github.com/isaacs))
* [`7610f3e`](https://github.com/npm/npm/commit/7610f3e62e699292ece081bfd33084d436e3246d) `archy@1.0.0` ([@substack](https://github.com/substack))
* [`9c13149`](https://github.com/npm/npm/commit/9c1314985e513e20ffa3ea0ca333ba2ab78299c9) `semver@4.1.0`: Add support for prerelease identifiers. ([@bromanko](https://github.com/bromanko))
* [`f096c25`](https://github.com/npm/npm/commit/f096c250441b031d758f03afbe8d2321f94c7703) `graceful-fs@3.0.4`: Add a bunch of additional tests, skip the unfortunate complications of `graceful-fs@3.0.3`. ([@isaacs](https://github.com/isaacs))

### v2.1.4 (2014-10-09):

* [`3aeb440`](https://github.com/npm/npm/commit/3aeb4401444fad83cc7a8d11bf2507658afa5248) [#6442](https://github.com/npm/npm/issues/6442) proxying git needs `GIT_SSL_CAINFO` ([@wmertens](https://github.com/wmertens))
* [`a8da8d6`](https://github.com/npm/npm/commit/a8da8d6e0cd56d97728c0b76b51604ee06ef6264) [#6413](https://github.com/npm/npm/issues/6413) write builtin config on any global npm install ([@isaacs](https://github.com/isaacs))
* [`9e4d632`](https://github.com/npm/npm/commit/9e4d632c0142ba55df07d624667738b8727336fc) [#6343](https://github.com/npm/npm/issues/6343) don't pass run arguments to pre & post scripts ([@TheLudd](https://github.com/TheLudd))
* [`d831b1f`](https://github.com/npm/npm/commit/d831b1f7ca1a9921ea5b394e39b7130ecbc6d7b4) [#6399](https://github.com/npm/npm/issues/6399) race condition: inflight installs, prevent `peerDependency` problems ([@othiym23](https://github.com/othiym23))
* [`82b775d`](https://github.com/npm/npm/commit/82b775d6ff34c4beb6c70b2344d491a9f2026577) [#6384](https://github.com/npm/npm/issues/6384) race condition: inflight caching by URL rather than semver range ([@othiym23](https://github.com/othiym23))
* [`7bee042`](https://github.com/npm/npm/commit/7bee0429066fedcc9e6e962c043eb740b3792809) `inflight@1.0.4`: callback can take arbitrary number of parameters ([@othiym23](https://github.com/othiym23))
* [`3bff494`](https://github.com/npm/npm/commit/3bff494f4abf17d6d7e0e4a3a76cf7421ecec35a) [#5195](https://github.com/npm/npm/issues/5195) fixed regex color regression for `npm search` ([@chrismeyersfsu](https://github.com/chrismeyersfsu))
* [`33ba2d5`](https://github.com/npm/npm/commit/33ba2d585160a0a2a322cb76c4cd989acadcc984) [#6387](https://github.com/npm/npm/issues/6387) allow `npm view global` if package is specified ([@evanlucas](https://github.com/evanlucas))
* [`99c4cfc`](https://github.com/npm/npm/commit/99c4cfceed413396d952cf05f4e3c710f9682c23) [#6388](https://github.com/npm/npm/issues/6388) npm-publish → npm-developers(7) ([@kennydude](https://github.com/kennydude))

TEST CLEANUP EXTRAVAGANZA:

* [`8d6bfcb`](https://github.com/npm/npm/commit/8d6bfcb88408f5885a2a67409854c43e5c3a23f6) tap tests run with no system-wide side effects ([@chrismeyersfsu](https://github.com/chrismeyersfsu))
* [`7a1472f`](https://github.com/npm/npm/commit/7a1472fbdbe99956ad19f629e7eb1cc07ba026ef) added npm cache cleanup script ([@chrismeyersfsu](https://github.com/chrismeyersfsu))
* [`0ce6a37`](https://github.com/npm/npm/commit/0ce6a3752fa9119298df15671254db6bc1d8e64c) stripped out dead test code ([@othiym23](https://github.com/othiym23))
* replace spawn with common.npm ([@chrismeyersfsu](https://github.com/chrismeyersfsu)):
* [`0dcd614`](https://github.com/npm/npm/commit/0dcd61446335eaf541bf5f2d5186ec1419f86a42) test/tap/cache-shasum-fork.js
* [`97f861c`](https://github.com/npm/npm/commit/97f861c967606a7e51e3d5047cf805d9d1adea5a) test/tap/false_name.js *
[`d01b3de`](https://github.com/npm/npm/commit/d01b3de6ce03f25bbf3db97bfcd3cc85830d6801) test/tap/git-cache-locking.js * [`7b63016`](https://github.com/npm/npm/commit/7b63016778124c6728d6bd89a045c841ae3900b6) test/tap/pack-scoped.js * [`c877553`](https://github.com/npm/npm/commit/c877553265c39673e03f0a97972f692af81a595d) test/tap/scripts-whitespace-windows.js * [`df98525`](https://github.com/npm/npm/commit/df98525331e964131299d457173c697cfb3d95b9) test/tap/prepublish.js * [`99c4cfc`](https://github.com/npm/npm/commit/99c4cfceed413396d952cf05f4e3c710f9682c23) test/tap/prune.js ### v2.1.3 (2014-10-02): BREAKING CHANGE FOR THE SQRT(i) PEOPLE ACTUALLY USING `npm submodule`: * [`1e64473`](https://github.com/npm/npm/commit/1e6447360207f45ad6188e5780fdf4517de6e23d) `rm -rf npm submodule` command, which has been broken since the Carter Administration ([@isaacs](https://github.com/isaacs)) BREAKING CHANGE IF YOU ARE FOR SOME REASON STILL USING NODE 0.6 AND YOU SHOULD NOT BE DOING THAT CAN YOU NOT: * [`3e431f9`](https://github.com/npm/npm/commit/3e431f9d6884acb4cde8bcb8a0b122a76b33ee1d) [joyent/node#8492](https://github.com/joyent/node/issues/8492) bye bye customFds, hello stdio ([@othiym23](https://github.com/othiym23)) Other changes: * [`ea607a8`](https://github.com/npm/npm/commit/ea607a8a20e891ad38eed11b5ce2c3c0a65484b9) [#6372](https://github.com/npm/npm/issues/6372) noisily error (without aborting) on multi-{install,build} ([@othiym23](https://github.com/othiym23)) * [`3ee2799`](https://github.com/npm/npm/commit/3ee2799b629fd079d2db21d7e8f25fa7fa1660d0) [#6372](https://github.com/npm/npm/issues/6372) only make cache creation requests in flight ([@othiym23](https://github.com/othiym23)) * [`1a90ec2`](https://github.com/npm/npm/commit/1a90ec2f2cfbefc8becc6ef0c480e5edacc8a4cb) [#6372](https://github.com/npm/npm/issues/6372) wait to put Git URLs in flight until normalized ([@othiym23](https://github.com/othiym23)) * [`664795b`](https://github.com/npm/npm/commit/664795bb7d8da7142417b3f4ef5986db3a394071) [#6372](https://github.com/npm/npm/issues/6372) log what is and isn't in flight ([@othiym23](https://github.com/othiym23)) * [`00ef580`](https://github.com/npm/npm/commit/00ef58025a1f52dfabf2c4dc3898621d16a6e062) `inflight@1.0.3`: fix largely theoretical race condition, because we really really hate race conditions ([@isaacs](https://github.com/isaacs)) * [`1cde465`](https://github.com/npm/npm/commit/1cde4658d897ae0f93ff1d65b258e1571b391182) [#6363](https://github.com/npm/npm/issues/6363) `realize-package-specifier@1.1.0`: handle local dependencies better ([@iarna](https://github.com/iarna)) * [`86f084c`](https://github.com/npm/npm/commit/86f084c6c6d7935cd85d72d9d94b8784c914d51e) `realize-package-specifier@1.0.2`: dependency realization! in its own module! 
([@iarna](https://github.com/iarna)) * [`553d830`](https://github.com/npm/npm/commit/553d830334552b83606b6bebefd821c9ea71e964) `npm-package-arg@2.1.3`: simplified semver, better tests ([@iarna](https://github.com/iarna)) * [`bec9b61`](https://github.com/npm/npm/commit/bec9b61a316c19f5240657594f0905a92a474352) `readable-stream@1.0.32`: for some reason ([@rvagg](https://github.com/rvagg)) * [`ff08ec5`](https://github.com/npm/npm/commit/ff08ec5f6d717bdbd559de0b2ede769306a9a763) `dezalgo@1.0.1`: use wrappy for instrumentability ([@isaacs](https://github.com/isaacs)) ### v2.1.2 (2014-09-29): * [`a1aa20e`](https://github.com/npm/npm/commit/a1aa20e44bb8285c6be1e7fa63b9da920e3a70ed) [#6282](https://github.com/npm/npm/issues/6282) `normalize-package-data@1.0.3`: don't prune bundledDependencies ([@isaacs](https://github.com/isaacs)) * [`a1f5fe1`](https://github.com/npm/npm/commit/a1f5fe1005043ce20a06e8b17a3e201aa3215357) move locks back into cache, now path-aware ([@othiym23](https://github.com/othiym23)) * [`a432c4b`](https://github.com/npm/npm/commit/a432c4b48c881294d6d79b5f41c2e1c16ad15a8a) convert lib/utils/tar.js to use atomic streams ([@othiym23](https://github.com/othiym23)) * [`b8c3c74`](https://github.com/npm/npm/commit/b8c3c74a3c963564233204161cc263e0912c930b) `fs-write-stream-atomic@1.0.2`: Now works with streams1 fs.WriteStreams. ([@isaacs](https://github.com/isaacs)) * [`c7ab76f`](https://github.com/npm/npm/commit/c7ab76f44cce5f42add5e3ba879bd10e7e00c3e6) logging cleanup ([@othiym23](https://github.com/othiym23)) * [`4b2d95d`](https://github.com/npm/npm/commit/4b2d95d0641435b09d047ae5cb2226f292bf38f0) [#6329](https://github.com/npm/npm/issues/6329) efficiently validate tmp tarballs safely ([@othiym23](https://github.com/othiym23)) ### v2.1.1 (2014-09-26): * [`563225d`](https://github.com/npm/npm/commit/563225d813ea4c12f46d4f7821ac7f76ba8ee2d6) [#6318](https://github.com/npm/npm/issues/6318) clean up locking; prefix lockfile with "." ([@othiym23](https://github.com/othiym23)) * [`c7f30e4`](https://github.com/npm/npm/commit/c7f30e4550fea882d31fcd4a55b681cd30713c44) [#6318](https://github.com/npm/npm/issues/6318) remove locking code around tarball packing and unpacking ([@othiym23](https://github.com/othiym23)) ### v2.1.0 (2014-09-25): NEW FEATURE: * [`3635601`](https://github.com/npm/npm/commit/36356011b6f2e6a5a81490e85a0a44eb27199dd7) [#5520](https://github.com/npm/npm/issues/5520) Add `'npm view .'`. ([@evanlucas](https://github.com/evanlucas)) Other changes: * [`f24b552`](https://github.com/npm/npm/commit/f24b552b596d0627549cdd7c2d68fcf9006ea50a) [#6294](https://github.com/npm/npm/issues/6294) Lock cache → lock cache target. ([@othiym23](https://github.com/othiym23)) * [`ad54450`](https://github.com/npm/npm/commit/ad54450104f94c82c501138b4eee488ce3a4555e) [#6296](https://github.com/npm/npm/issues/6296) Ensure that npm-debug.log file is created when rollbacks are done. ([@isaacs](https://github.com/isaacs)) * [`6810071`](https://github.com/npm/npm/commit/681007155a40ac9d165293bd6ec5d8a1423ccfca) docs: Default loglevel "http" → "warn". ([@othiym23](https://github.com/othiym23)) * [`35ac89a`](https://github.com/npm/npm/commit/35ac89a940f23db875e882ce2888208395130336) Skip installation of installed scoped packages. ([@timoxley](https://github.com/timoxley)) * [`e468527`](https://github.com/npm/npm/commit/e468527256ec599892b9b88d61205e061d1ab735) Ensure cleanup executes for scripts-whitespace-windows test. 
([@timoxley](https://github.com/timoxley)) * [`ef9101b`](https://github.com/npm/npm/commit/ef9101b7f346797749415086956a0394528a12c4) Ensure cleanup executes for packed-scope test. ([@timoxley](https://github.com/timoxley)) * [`69b4d18`](https://github.com/npm/npm/commit/69b4d18cdbc2ae04c9afaffbd273b436a394f398) `fs-write-stream-atomic@1.0.1`: Fix a race condition in our race-condition fixer. ([@isaacs](https://github.com/isaacs)) * [`26b17ff`](https://github.com/npm/npm/commit/26b17ff2e3b21ee26c6fdbecc8273520cff45718) [#6272](https://github.com/npm/npm/issues/6272) `npmconf` decides what the default prefix is. ([@othiym23](https://github.com/othiym23)) * [`846faca`](https://github.com/npm/npm/commit/846facacc6427dafcf5756dcd36d9036539938de) Fix development dependency is preferred over dependency. ([@andersjanmyr](https://github.com/andersjanmyr)) * [`9d1a9db`](https://github.com/npm/npm/commit/9d1a9db3af5adc48a7158a5a053eeb89ee41a0e7) [#3265](https://github.com/npm/npm/issues/3265) Re-apply a71615a. Fixes [#3265](https://github.com/npm/npm/issues/3265) again, with a test! ([@glasser](https://github.com/glasser)) * [`1d41db0`](https://github.com/npm/npm/commit/1d41db0b2744a7bd50971c35cc060ea0600fb4bf) `marked-man@0.1.4`: Fixes formatting of synopsis blocks in man docs. ([@kapouer](https://github.com/kapouer)) * [`a623da0`](https://github.com/npm/npm/commit/a623da01bea1b2d3f3a18b9117cfd2d8e3cbdd77) [#5867](https://github.com/npm/npm/issues/5867) Specify dummy git template dir when cloning to prevent copying hooks. ([@boneskull](https://github.com/boneskull)) ### v2.0.2 (2014-09-19): * [`42c872b`](https://github.com/npm/npm/commit/42c872b32cadc0e555638fc78eab3a38a04401d8) [#5920](https://github.com/npm/npm/issues/5920) `fs-write-stream-atomic@1.0.0` ([@isaacs](https://github.com/isaacs)) * [`6784767`](https://github.com/npm/npm/commit/6784767fe15e28b44c81a1d4bb1738c642a65d78) [#5920](https://github.com/npm/npm/issues/5920) make all write streams atomic ([@isaacs](https://github.com/isaacs)) * [`f6fac00`](https://github.com/npm/npm/commit/f6fac000dd98ebdd5ea1d5921175735d463d328b) [#5920](https://github.com/npm/npm/issues/5920) barf on 0-length cached tarballs ([@isaacs](https://github.com/isaacs)) * [`3b37592`](https://github.com/npm/npm/commit/3b37592a92ea98336505189ae8ca29248b0589f4) `write-file-atomic@1.1.0`: use graceful-fs ([@iarna](https://github.com/iarna)) ### v2.0.1 (2014-09-18): * [`74c5ab0`](https://github.com/npm/npm/commit/74c5ab0a676793c6dc19a3fd5fe149f85fecb261) [#6201](https://github.com/npm/npm/issues/6201) `npmconf@2.1.0`: scope always-auth to registry URI ([@othiym23](https://github.com/othiym23)) * [`774b127`](https://github.com/npm/npm/commit/774b127da1dd6fefe2f1299e73505d9146f00294) [#6201](https://github.com/npm/npm/issues/6201) `npm-registry-client@3.2.2`: use scoped always-auth settings ([@othiym23](https://github.com/othiym23)) * [`f2d2190`](https://github.com/npm/npm/commit/f2d2190aa365d22378d03afab0da13f95614a583) [#6201](https://github.com/npm/npm/issues/6201) support saving `--always-auth` when logging in ([@othiym23](https://github.com/othiym23)) * [`17c941a`](https://github.com/npm/npm/commit/17c941a2d583210fe97ed47e2968d94ce9f774ba) [#6163](https://github.com/npm/npm/issues/6163) use `write-file-atomic` instead of `fs.writeFile()` ([@fiws](https://github.com/fiws)) * [`fb5724f`](https://github.com/npm/npm/commit/fb5724fd98e1509c939693568df83d11417ea337) [#5925](https://github.com/npm/npm/issues/5925) `npm init -f`: allow `npm init` to run without prompting 
([@michaelnisi](https://github.com/michaelnisi))
* [`b706d63`](https://github.com/npm/npm/commit/b706d637d5965dbf8f7ce07dc5c4bc80887f30d8) [#3059](https://github.com/npm/npm/issues/3059) disable prepublish when running `npm install --production` ([@jussi-kalliokoski](https://github.com/jussi-kalliokoski))
* [`119f068`](https://github.com/npm/npm/commit/119f068eae2a36fa8b9c9ca557c70377792243a4) attach the node version used when publishing a package to its registry metadata ([@othiym23](https://github.com/othiym23))
* [`8fe0081`](https://github.com/npm/npm/commit/8fe008181665519c2ac201ee432a3ece9798c31f) seriously, don't use `npm -g update npm` ([@thomblake](https://github.com/thomblake))
* [`ea5b3d4`](https://github.com/npm/npm/commit/ea5b3d446b86dcabb0dbc6dba374d3039342ecb3) `request@2.44.0` ([@othiym23](https://github.com/othiym23))

### v2.0.0 (2014-09-12):

BREAKING CHANGES:

* [`4378a17`](https://github.com/npm/npm/commit/4378a17db340404a725ffe2eb75c9936f1612670) `semver@4.0.0`: prerelease versions no longer show up in ranges; `^0.x.y` behaves the way it did in `semver@2` rather than `semver@3`; docs have been reorganized for comprehensibility ([@isaacs](https://github.com/isaacs))
* [`c6ddb64`](https://github.com/npm/npm/commit/c6ddb6462fe32bf3a27b2c4a62a032a92e982429) npm now assumes that node is newer than 0.6 ([@isaacs](https://github.com/isaacs))

Other changes:

* [`ea515c3`](https://github.com/npm/npm/commit/ea515c3b858bf493a7b87fa4cdc2110a0d9cef7f) [#6043](https://github.com/npm/npm/issues/6043) `slide@1.1.6`: wait until all callbacks have finished before proceeding ([@othiym23](https://github.com/othiym23))
* [`0b0a59d`](https://github.com/npm/npm/commit/0b0a59d504f20f424294b1590ace73a7464f0378) [#6043](https://github.com/npm/npm/issues/6043) defer rollbacks until just before the CLI exits ([@isaacs](https://github.com/isaacs))
* [`a11c88b`](https://github.com/npm/npm/commit/a11c88bdb1488b87d8dcac69df9a55a7a91184b6) [#6175](https://github.com/npm/npm/issues/6175) pack scoped packages correctly ([@othiym23](https://github.com/othiym23))
* [`e4e48e0`](https://github.com/npm/npm/commit/e4e48e037d4e95fdb6acec80b04c5c6eaee59970) [#6121](https://github.com/npm/npm/issues/6121) `read-installed@3.1.2`: don't mark linked dev dependencies as extraneous ([@isaacs](https://github.com/isaacs))
* [`d673e41`](https://github.com/npm/npm/commit/d673e4185d43362c2b2a91acbca8c057e7303c7b) `cmd-shim@2.0.1`: depend on `graceful-fs` directly ([@ForbesLindesay](https://github.com/ForbesLindesay))
* [`9d54d45`](https://github.com/npm/npm/commit/9d54d45e602d595bdab7eae09b9fa1dc46370147) `npm-registry-couchapp@2.5.3`: make tests more reliable on Travis ([@iarna](https://github.com/iarna))
* [`673d738`](https://github.com/npm/npm/commit/673d738c6142c3d043dcee0b7aa02c9831a2e0ca) ensure permissions are set correctly in cache when running as root ([@isaacs](https://github.com/isaacs))
* [`6e6a5fb`](https://github.com/npm/npm/commit/6e6a5fb74af10fd345411df4e121e554e2e3f33e) prepare for upgrade to `node-semver@4.0.0` ([@isaacs](https://github.com/isaacs))
* [`ab8dd87`](https://github.com/npm/npm/commit/ab8dd87b943262f5996744e8d4cc30cc9358b7d7) swap out `ronn` for `marked-man@0.1.3` ([@isaacs](https://github.com/isaacs))
* [`803da54`](https://github.com/npm/npm/commit/803da5404d5a0b7c9defa3fe7fa0f2d16a2b19d3) `npm-registry-client@3.2.0`: prepare for `node-semver@4.0.0` and include more error information ([@isaacs](https://github.com/isaacs)) *
[`4af0e71`](https://github.com/npm/npm/commit/4af0e7134f5757c3d456d83e8349224a4ba12660) make default error display less scary ([@isaacs](https://github.com/isaacs)) * [`4fd9e79`](https://github.com/npm/npm/commit/4fd9e7901a15abff7a3dd478d99ce239b9580bca) `npm-registry-client@3.2.1`: handle errors returned by the registry much, much better ([@othiym23](https://github.com/othiym23)) * [`ca791e2`](https://github.com/npm/npm/commit/ca791e27e97e51c1dd491bff6622ac90b54c3e23) restore a long (always?) missing pass for deduping ([@othiym23](https://github.com/othiym23)) * [`ca0ef0e`](https://github.com/npm/npm/commit/ca0ef0e99bbdeccf28d550d0296baa4cb5e7ece2) correctly interpret relative paths for local dependencies ([@othiym23](https://github.com/othiym23)) * [`5eb8db2`](https://github.com/npm/npm/commit/5eb8db2c370eeb4cd34f6e8dc6a935e4ea325621) `npm-package-arg@2.1.2`: support git+file:// URLs for local bare repos ([@othiym23](https://github.com/othiym23)) * [`860a185`](https://github.com/npm/npm/commit/860a185c43646aca84cb93d1c05e2266045c316b) tweak docs to no longer advocate checking in `node_modules` ([@hunterloftis](https://github.com/hunterloftis)) * [`80e9033`](https://github.com/npm/npm/commit/80e9033c40e373775e35c674faa6c1948661782b) add links to nodejs.org downloads to docs ([@meetar](https://github.com/meetar)) ### v1.4.28 (2014-09-12): * [`f4540b6`](https://github.com/npm/npm/commit/f4540b6537a87e653d7495a9ddcf72949fdd4d14) [#6043](https://github.com/npm/npm/issues/6043) defer rollbacks until just before the CLI exits ([@isaacs](https://github.com/isaacs)) * [`1eabfd5`](https://github.com/npm/npm/commit/1eabfd5c03f33c2bd28823714ff02059eeee3899) [#6043](https://github.com/npm/npm/issues/6043) `slide@1.1.6`: wait until all callbacks have finished before proceeding ([@othiym23](https://github.com/othiym23)) ### v2.0.0-beta.3 (2014-09-04): * [`fa79413`](https://github.com/npm/npm/commit/fa794138bec8edb7b88639db25ee9c010d2f4c2b) [#6119](https://github.com/npm/npm/issues/6119) fall back to registry installs if package.json is missing in a local directory ([@iarna](https://github.com/iarna)) * [`16073e2`](https://github.com/npm/npm/commit/16073e2d8ae035961c4c189b602d4aacc6d6b387) `npm-package-arg@2.1.0`: support file URIs as local specs ([@othiym23](https://github.com/othiym23)) * [`9164acb`](https://github.com/npm/npm/commit/9164acbdee28956fa816ce5e473c559395ae4ec2) `github-url-from-username-repo@1.0.2`: don't match strings that are already URIs ([@othiym23](https://github.com/othiym23)) * [`4067d6b`](https://github.com/npm/npm/commit/4067d6bf303a69be13f3af4b19cf4fee1b0d3e12) [#5629](https://github.com/npm/npm/issues/5629) support saving of local packages in `package.json` ([@dylang](https://github.com/dylang)) * [`1b2ffdf`](https://github.com/npm/npm/commit/1b2ffdf359a8c897a78f91fc5a5d535c97aaec97) [#6097](https://github.com/npm/npm/issues/6097) document scoped packages ([@seldo](https://github.com/seldo)) * [`0a67d53`](https://github.com/npm/npm/commit/0a67d536067c4808a594d81288d34c0f7e97e105) [#6007](https://github.com/npm/npm/issues/6007) `request@2.42.0`: properly set headers on proxy requests ([@isaacs](https://github.com/isaacs)) * [`9bac6b8`](https://github.com/npm/npm/commit/9bac6b860b674d24251bb7b8ba412fdb26cbc836) `npmconf@2.0.8`: disallow semver ranges in tag configuration ([@isaacs](https://github.com/isaacs)) * [`d2d4d7c`](https://github.com/npm/npm/commit/d2d4d7cd3c32f91a87ffa11fe464d524029011c3) [#6082](https://github.com/npm/npm/issues/6082) don't allow tagging with a semver 
range as the tag name ([@isaacs](https://github.com/isaacs)) ### v1.4.27 (2014-09-04): * [`4cf3c8f`](https://github.com/npm/npm/commit/4cf3c8fd78c9e2693a5f899f50c28f4823c88e2e) [#6007](https://github.com/npm/npm/issues/6007) request@2.42.0: properly set headers on proxy requests ([@isaacs](https://github.com/isaacs)) * [`403cb52`](https://github.com/npm/npm/commit/403cb526be1472bb7545fa8e62d4976382cdbbe5) [#6055](https://github.com/npm/npm/issues/6055) npmconf@1.1.8: restore case-insensitivity of environmental config ([@iarna](https://github.com/iarna)) ### v2.0.0-beta.2 (2014-08-29): SPECIAL LABOR DAY WEEKEND RELEASE PARTY WOOO * [`ed207e8`](https://github.com/npm/npm/commit/ed207e88019de3150037048df6267024566e1093) `npm-registry-client@3.1.7`: Clean up auth logic and improve logging around auth decisions. Also error on trying to change a user document without writing to it. ([@othiym23](https://github.com/othiym23)) * [`66c7423`](https://github.com/npm/npm/commit/66c7423b7fb07a326b83c83727879410d43c439f) `npmconf@2.0.7`: support -C as an alias for --prefix ([@isaacs](https://github.com/isaacs)) * [`0dc6a07`](https://github.com/npm/npm/commit/0dc6a07c778071c94c2251429c7d107e88a45095) [#6059](https://github.com/npm/npm/issues/6059) run commands in prefix, not cwd ([@isaacs](https://github.com/isaacs)) * [`65d2179`](https://github.com/npm/npm/commit/65d2179af96737eb9038eaa24a293a62184aaa13) `github-url-from-username-repo@1.0.1`: part 3 handle slashes in branch names ([@robertkowalski](https://github.com/robertkowalski)) * [`e8d75d0`](https://github.com/npm/npm/commit/e8d75d0d9f148ce2b3e8f7671fa281945bac363d) [#6057](https://github.com/npm/npm/issues/6057) `read-installed@3.1.1`: properly handle extraneous dev dependencies of required dependencies ([@othiym23](https://github.com/othiym23)) * [`0602f70`](https://github.com/npm/npm/commit/0602f708f070d524ad41573afd4c57171cab21ad) [#6064](https://github.com/npm/npm/issues/6064) ls: do not show deps of extraneous deps ([@isaacs](https://github.com/isaacs)) ### v2.0.0-beta.1 (2014-08-28): * [`78a1fc1`](https://github.com/npm/npm/commit/78a1fc12307a0cbdbc944775ed831b876ee65855) `github-url-from-git@1.4.0`: add support for git+https and git+ssh ([@stefanbuck](https://github.com/stefanbuck)) * [`bf247ed`](https://github.com/npm/npm/commit/bf247edf5429c6b3ec4d4cb798fa0eb0a9c19fc1) `columnify@1.2.1` ([@othiym23](https://github.com/othiym23)) * [`4bbe682`](https://github.com/npm/npm/commit/4bbe682a6d4eabcd23f892932308c9f228bf4de3) `cmd-shim@2.0.0`: upgrade to graceful-fs 3 ([@ForbesLindesay](https://github.com/ForbesLindesay)) * [`ae1d590`](https://github.com/npm/npm/commit/ae1d590bdfc2476a4ed446e760fea88686e3ae05) `npm-package-arg@2.0.4`: accept slashes in branch names ([@thealphanerd](https://github.com/thealphanerd)) * [`b2f51ae`](https://github.com/npm/npm/commit/b2f51aecadf585711e145b6516f99e7c05f53614) `semver@3.0.1`: semver.clean() is cleaner ([@isaacs](https://github.com/isaacs)) * [`1d041a8`](https://github.com/npm/npm/commit/1d041a8a5ebd5bf6cecafab2072d4ec07823adab) `github-url-from-username-repo@1.0.0`: accept slashes in branch names ([@robertkowalski](https://github.com/robertkowalski)) * [`02c85d5`](https://github.com/npm/npm/commit/02c85d592c4058e5d9eafb0be36b6743ae631998) `async-some@1.0.1` ([@othiym23](https://github.com/othiym23)) * [`5af493e`](https://github.com/npm/npm/commit/5af493efa8a463cd1acc4a9a394699e2c0793b9c) ensure lifecycle spawn errors caught properly ([@isaacs](https://github.com/isaacs)) * 
[`60fe012`](https://github.com/npm/npm/commit/60fe012fac9570d6c72554cdf34a6fa95bf0f0a6) `npmconf@2.0.6`: init.version defaults to 1.0.0 ([@isaacs](https://github.com/isaacs)) * [`b4c717b`](https://github.com/npm/npm/commit/b4c717bbf58fb6a0d64ad229036c79a184297ee2) `npm-registry-client@3.1.4`: properly encode % in passwords ([@isaacs](https://github.com/isaacs)) * [`7b55f44`](https://github.com/npm/npm/commit/7b55f44420252baeb3f30da437d22956315c31c9) doc: Fix 'npm help index' ([@isaacs](https://github.com/isaacs)) ### v1.4.26 (2014-08-28): * [`eceea95`](https://github.com/npm/npm/commit/eceea95c804fa15b18e91c52c0beb08d42a3e77d) `github-url-from-git@1.4.0`: add support for git+https and git+ssh ([@stefanbuck](https://github.com/stefanbuck)) * [`e561758`](https://github.com/npm/npm/commit/e5617587e7d7ab686192391ce55357dbc7fed0a3) `columnify@1.2.1` ([@othiym23](https://github.com/othiym23)) * [`0c4fab3`](https://github.com/npm/npm/commit/0c4fab372ee76eab01dda83b6749429a8564902e) `cmd-shim@2.0.0`: upgrade to graceful-fs 3 ([@ForbesLindesay](https://github.com/ForbesLindesay)) * [`2d69e4d`](https://github.com/npm/npm/commit/2d69e4d95777671958b5e08d3b2f5844109d73e4) `github-url-from-username-repo@1.0.0`: accept slashes in branch names ([@robertkowalski](https://github.com/robertkowalski)) * [`81f9b2b`](https://github.com/npm/npm/commit/81f9b2bac9d34c223ea093281ba3c495f23f10d1) ensure lifecycle spawn errors caught properly ([@isaacs](https://github.com/isaacs)) * [`bfaab8c`](https://github.com/npm/npm/commit/bfaab8c6e0942382a96b250634ded22454c36b5a) `npm-registry-client@2.0.7`: properly encode % in passwords ([@isaacs](https://github.com/isaacs)) * [`91cfb58`](https://github.com/npm/npm/commit/91cfb58dda851377ec604782263519f01fd96ad8) doc: Fix 'npm help index' ([@isaacs](https://github.com/isaacs)) ### v2.0.0-beta.0 (2014-08-21): * [`685f8be`](https://github.com/npm/npm/commit/685f8be1f2770cc75fd0e519a8d7aac72735a270) `npm-registry-client@3.1.3`: Print the notification header returned by the registry, and make sure status codes are printed without gratuitous quotes around them. ([@isaacs](https://github.com/isaacs) / [@othiym23](https://github.com/othiym23)) * [`a8cb676`](https://github.com/npm/npm/commit/a8cb676aef0561eaf04487d2719672b097392c85) [#5900](https://github.com/npm/npm/issues/5900) remove `npm` from its own `engines` field in `package.json`. None of us remember why it was there. 
([@timoxley](https://github.com/timoxley))
* [`6c47201`](https://github.com/npm/npm/commit/6c47201a7d071e8bf091b36933daf4199cc98e80) [#5752](https://github.com/npm/npm/issues/5752), [#6013](https://github.com/npm/npm/issues/6013) save git URLs correctly in `_resolved` fields ([@isaacs](https://github.com/isaacs))
* [`e4e1223`](https://github.com/npm/npm/commit/e4e1223a91c37688ba3378e1fc9d5ae045654d00) [#5936](https://github.com/npm/npm/issues/5936) document the use of tags in `package.json` ([@KenanY](https://github.com/KenanY))
* [`c92b8d4`](https://github.com/npm/npm/commit/c92b8d4db7bde2a501da5b7d612684de1d629a42) [#6004](https://github.com/npm/npm/issues/6004) manually installed scoped packages are tracked correctly ([@dead-horse](https://github.com/dead-horse))
* [`21ca0aa`](https://github.com/npm/npm/commit/21ca0aaacbcfe2b89b0a439d914da0cae62de550) [#5945](https://github.com/npm/npm/issues/5945) link scoped packages correctly ([@dead-horse](https://github.com/dead-horse))
* [`16bead7`](https://github.com/npm/npm/commit/16bead7f2c82aec35b83ff0ec04df051ba456764) [#5958](https://github.com/npm/npm/issues/5958) ensure that file streams work in all versions of node ([@dead-horse](https://github.com/dead-horse))
* [`dbf0cab`](https://github.com/npm/npm/commit/dbf0cab29d0db43ac95e4b5a1fbdea1e0af75f10) you can now pass quoted args to `npm run-script` ([@bcoe](https://github.com/bcoe))
* [`0583874`](https://github.com/npm/npm/commit/05838743f01ccb8d2432b3858d66847002fb62df) `tar@1.0.1`: Add test for removing an extract target immediately after unpacking. ([@isaacs](https://github.com/isaacs))
* [`cdf3b04`](https://github.com/npm/npm/commit/cdf3b0428bc0b0183fb41dcde9e34e8f42c5e3a7) `lockfile@1.0.0`: Fix incorrect interaction between `wait`, `stale`, and `retries` options. Part 2 of race condition leading to `ENOENT` errors. ([@isaacs](https://github.com/isaacs))
* [`22d72a8`](https://github.com/npm/npm/commit/22d72a87a9e1a9ab56d9585397f63551887d9125) `fstream@1.0.2`: Fix a double-finish call which can result in excess FS operations after the `close` event. Part 1 of race condition leading to `ENOENT` errors. ([@isaacs](https://github.com/isaacs))

### v1.4.25 (2014-08-21):

* [`64c0ec2`](https://github.com/npm/npm/commit/64c0ec241ef5d83761ca8de54acb3c41b079956e) `npm-registry-client@2.0.6`: Print the notification header returned by the registry, and make sure status codes are printed without gratuitous quotes around them. ([@othiym23](https://github.com/othiym23))
* [`a8ed12b`](https://github.com/npm/npm/commit/a8ed12b) `tar@1.0.1`: Add test for removing an extract target immediately after unpacking. ([@isaacs](https://github.com/isaacs))
* [`70fd11d`](https://github.com/npm/npm/commit/70fd11d) `lockfile@1.0.0`: Fix incorrect interaction between `wait`, `stale`, and `retries` options. Part 2 of race condition leading to `ENOENT` errors. ([@isaacs](https://github.com/isaacs))
* [`0072c4d`](https://github.com/npm/npm/commit/0072c4d) `fstream@1.0.2`: Fix a double-finish call which can result in excess FS operations after the `close` event. Part 2 of race condition leading to `ENOENT` errors.
([@isaacs](https://github.com/isaacs)) ### v2.0.0-alpha.7 (2014-08-14): * [`f23f1d8`](https://github.com/npm/npm/commit/f23f1d8e8f86ec1b7ab8dad68250bccaa67d61b1) doc: update version doc to include `pre-*` increment args ([@isaacs](https://github.com/isaacs)) * [`b6bb746`](https://github.com/npm/npm/commit/b6bb7461824d4dc1c0936f46bd7929b5cd597986) build: add 'make tag' to tag current release as latest ([@isaacs](https://github.com/isaacs)) * [`27c4bb6`](https://github.com/npm/npm/commit/27c4bb606e46e5eaf604b19fe8477bc6567f8b2e) build: publish with `--tag=v1.4-next` ([@isaacs](https://github.com/isaacs)) * [`cff66c3`](https://github.com/npm/npm/commit/cff66c3bf2850880058ebe2a26655dafd002495e) build: add script to output `v1.4-next` publish tag ([@isaacs](https://github.com/isaacs)) * [`22abec8`](https://github.com/npm/npm/commit/22abec8833474879ac49b9604c103bc845dad779) build: remove outdated `docpublish` make target ([@isaacs](https://github.com/isaacs)) * [`1be4de5`](https://github.com/npm/npm/commit/1be4de51c3976db8564f72b00d50384c921f0917) build: remove `unpublish` step from `make publish` ([@isaacs](https://github.com/isaacs)) * [`e429e20`](https://github.com/npm/npm/commit/e429e2011f4d78e398f2461bca3e5a9a146fbd0c) doc: add new changelog ([@othiym23](https://github.com/othiym23)) * [`9243d20`](https://github.com/npm/npm/commit/9243d207896ea307082256604c10817f7c318d68) lifecycle: test lifecycle path modification ([@isaacs](https://github.com/isaacs)) * [`021770b`](https://github.com/npm/npm/commit/021770b9cb07451509f0a44afff6c106311d8cf6) lifecycle: BREAKING CHANGE do not add the directory containing node executable ([@chulkilee](https://github.com/chulkilee)) * [`1d5c41d`](https://github.com/npm/npm/commit/1d5c41dd0d757bce8b87f10c4135f04ece55aeb9) install: rename .gitignore when unpacking foreign tarballs ([@isaacs](https://github.com/isaacs)) * [`9aac267`](https://github.com/npm/npm/commit/9aac2670a73423544d92b27cc301990a16a9563b) cache: detect non-gzipped tar files more reliably ([@isaacs](https://github.com/isaacs)) * [`3f24755`](https://github.com/npm/npm/commit/3f24755c8fce3c7ab11ed1dc632cc40d7ef42f62) `readdir-scoped-modules@1.0.0` ([@isaacs](https://github.com/isaacs)) * [`151cd2f`](https://github.com/npm/npm/commit/151cd2ff87b8ac2fc9ea366bc9b7f766dc5b9684) `read-installed@3.1.0` ([@isaacs](https://github.com/isaacs)) * [`f5a9434`](https://github.com/npm/npm/commit/f5a94343a8ebe4a8cd987320b55137aef53fb3fd) test: fix Travis timeouts ([@dylang](https://github.com/dylang)) * [`126cafc`](https://github.com/npm/npm/commit/126cafcc6706814c88af3042f2ffff408747bff4) `npm-registry-couchapp@2.5.0` ([@othiym23](https://github.com/othiym23)) ### v1.4.24 (2014-08-14): * [`9344bd9`](https://github.com/npm/npm/commit/9344bd9b2929b5c399a0e0e0b34d45bce7bc24bb) doc: add new changelog ([@othiym23](https://github.com/othiym23)) * [`4be76fd`](https://github.com/npm/npm/commit/4be76fd65e895883c337a99f275ccc8c801adda3) doc: update version doc to include `pre-*` increment args ([@isaacs](https://github.com/isaacs)) * [`e4f2620`](https://github.com/npm/npm/commit/e4f262036080a282ad60e236a9aeebd39fde9fe4) build: add `make tag` to tag current release as `latest` ([@isaacs](https://github.com/isaacs)) * [`ec2596a`](https://github.com/npm/npm/commit/ec2596a7cb626772780b25b0a94a7e547a812bd5) build: publish with `--tag=v1.4-next` ([@isaacs](https://github.com/isaacs)) * [`9ee55f8`](https://github.com/npm/npm/commit/9ee55f892b8b473032a43c59912c5684fd1b39e6) build: add script to output `v1.4-next` publish tag 
([@isaacs](https://github.com/isaacs)) * [`aecb56f`](https://github.com/npm/npm/commit/aecb56f95a84687ea46920a0b98aaa587fee1568) build: remove outdated `docpublish` make target ([@isaacs](https://github.com/isaacs)) * [`b57a9b7`](https://github.com/npm/npm/commit/b57a9b7ccd13e6b38831ed63595c8ea5763da247) build: remove unpublish step from `make publish` ([@isaacs](https://github.com/isaacs)) * [`2c6acb9`](https://github.com/npm/npm/commit/2c6acb96c71c16106965d5cd829b67195dd673c7) install: rename `.gitignore` when unpacking foreign tarballs ([@isaacs](https://github.com/isaacs)) * [`22f3681`](https://github.com/npm/npm/commit/22f3681923e993a47fc1769ba735bfa3dd138082) cache: detect non-gzipped tar files more reliably ([@isaacs](https://github.com/isaacs)) ### v2.0.0-alpha.6 (2014-08-07): BREAKING CHANGE: * [`ea547e2`](https://github.com/npm/npm/commit/ea547e2) Bump semver to version 3: `^0.x.y` is now functionally the same as `=0.x.y`. ([@isaacs](https://github.com/isaacs)) Other changes: * [`d987707`](https://github.com/npm/npm/commit/d987707) move fetch into npm-registry-client ([@othiym23](https://github.com/othiym23)) * [`9b318e2`](https://github.com/npm/npm/commit/9b318e2) `read-installed@3.0.0` ([@isaacs](https://github.com/isaacs)) * [`9d73de7`](https://github.com/npm/npm/commit/9d73de7) remove unnecessary mkdirps ([@isaacs](https://github.com/isaacs)) * [`33ccd13`](https://github.com/npm/npm/commit/33ccd13) Don't squash execute perms in `_git-remotes/` dir ([@adammeadows](https://github.com/adammeadows)) * [`48fd233`](https://github.com/npm/npm/commit/48fd233) `npm-package-arg@2.0.1` ([@isaacs](https://github.com/isaacs)) ### v1.4.23 (2014-07-31): * [`8dd11d1`](https://github.com/npm/npm/commit/8dd11d1) update several dependencies to avoid using `semver`s starting with 0. 
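
To make the caret-range behaviour mentioned in the v2.0.0 and v2.0.0-alpha.6 entries above concrete, here is a small illustrative sketch (not part of the changelog itself) using the standalone `semver` package; which behaviour you see for `^0.x.y` depends on which semver major is installed, exactly as those entries describe.

```js
// Illustrative only -- not from the changelog. Assumes the standalone
// `semver` package is installed (`npm install semver`).
var semver = require('semver');

// For versions >= 1.0.0 a caret range allows minor and patch updates:
console.log(semver.satisfies('1.4.23', '^1.4.0')); // true
console.log(semver.satisfies('2.0.0', '^1.4.0'));  // false

// For 0.x.y versions the majors disagree: semver@3 treated `^0.2.3` like
// `=0.2.3`, while semver@2 and semver@4 also accept patch updates such as
// 0.2.4 -- which is why ranges on `semver`s starting with 0 were avoided.
console.log(semver.validRange('^0.2.3'));
```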
### v1.4.22 (2014-07-31): * [`d9a9e84`](https://github.com/npm/npm/commit/d9a9e84) `read-package-json@1.2.4` ([@isaacs](https://github.com/isaacs)) * [`86f0340`](https://github.com/npm/npm/commit/86f0340) `github-url-from-git@1.2.0` ([@isaacs](https://github.com/isaacs)) * [`a94136a`](https://github.com/npm/npm/commit/a94136a) `fstream@0.1.29` ([@isaacs](https://github.com/isaacs)) * [`bb82d18`](https://github.com/npm/npm/commit/bb82d18) `glob@4.0.5` ([@isaacs](https://github.com/isaacs)) * [`5b6bcf4`](https://github.com/npm/npm/commit/5b6bcf4) `cmd-shim@1.1.2` ([@isaacs](https://github.com/isaacs)) * [`c2aa8b3`](https://github.com/npm/npm/commit/c2aa8b3) license: Cleaned up legalese with actual lawyer ([@isaacs](https://github.com/isaacs)) * [`63fe0ee`](https://github.com/npm/npm/commit/63fe0ee) `init-package-json@1.0.0` ([@isaacs](https://github.com/isaacs)) ### v2.0.0-alpha-5 (2014-07-22): This release bumps up to 2.0 because of this breaking change, which could potentially affect how your package's scripts are run: * [`df4b0e7`](https://github.com/npm/npm/commit/df4b0e7fc1abd9a54f98db75ec9e4d03d37d125b) [#5518](https://github.com/npm/npm/issues/5518) BREAKING CHANGE: support passing arguments to `run` scripts ([@bcoe](https://github.com/bcoe)) Other changes: * [`cd422c9`](https://github.com/npm/npm/commit/cd422c9de510766797c65720d70f085000f50543) [#5748](https://github.com/npm/npm/issues/5748) link binaries for scoped packages ([@othiym23](https://github.com/othiym23)) * [`4c3c778`](https://github.com/npm/npm/commit/4c3c77839920e830991e0c229c3c6a855c914d67) [#5758](https://github.com/npm/npm/issues/5758) `npm link` includes scope when linking scoped package ([@fengmk2](https://github.com/fengmk2)) * [`f9f58dd`](https://github.com/npm/npm/commit/f9f58dd0f5b715d4efa6619f13901916d8f99c47) [#5707](https://github.com/npm/npm/issues/5707) document generic pre- / post-commands ([@sudodoki](https://github.com/sudodoki)) * [`ac7a480`](https://github.com/npm/npm/commit/ac7a4801d80361b41dce4a18f22bcdf75e396000) [#5406](https://github.com/npm/npm/issues/5406) `npm cache` displays usage when called without arguments ([@michaelnisi](https://github.com/michaelnisi)) * [`f4554e9`](https://github.com/npm/npm/commit/f4554e99d34f77a8a02884493748f7d49a9a9d8b) Test fixes for Windows ([@isaacs](https://github.com/isaacs)) * update dependencies ([@othiym23](https://github.com/othiym23)) ### v1.5.0-alpha-4 (2014-07-18): * fall back to `_auth` config as default auth when using default registry ([@isaacs](https://github.com/isaacs)) * support for 'init.version' for those who don't want to deal with semver 0.0.x oddities ([@rvagg](https://github.com/rvagg)) * [`be06213`](https://github.com/npm/npm/commit/be06213415f2d51a50d2c792b4cd0d3412a9a7b1) remove residual support for `win` log level ([@aterris](https://github.com/aterris)) ### v1.5.0-alpha-3 (2014-07-17): * [`a3a85dd`](https://github.com/npm/npm/commit/a3a85dd004c9245a71ad2f0213bd1a9a90d64cd6) `--save` scoped packages correctly ([@othiym23](https://github.com/othiym23)) * [`18a3385`](https://github.com/npm/npm/commit/18a3385bcf8bfb8312239216afbffb7eec759150) `npm-registry-client@3.0.2` ([@othiym23](https://github.com/othiym23)) * [`375988b`](https://github.com/npm/npm/commit/375988b9bf5aa5170f06a790d624d31b1eb32c6d) invalid package names are an early error for optional deps ([@othiym23](https://github.com/othiym23)) * consistently use `node-package-arg` instead of arbitrary package spec splitting ([@othiym23](https://github.com/othiym23)) ### v1.4.21 
(2014-07-14): * [`88f51aa`](https://github.com/npm/npm/commit/88f51aa27eb9a958d1fa7ec50fee5cfdedd05110) fix handling for 301s in `npm-registry-client@2.0.3` ([@Raynos](https://github.com/Raynos)) ### v1.5.0-alpha-2 (2014-07-01): * [`54cf625`](https://github.com/npm/npm/commit/54cf62534e3331e3f454e609e44f0b944e819283) fix handling for 301s in `npm-registry-client@3.0.1` ([@Raynos](https://github.com/Raynos)) * [`e410861`](https://github.com/npm/npm/commit/e410861c69a3799c1874614cb5b87af8124ff98d) don't crash if no username set on `whoami` ([@isaacs](https://github.com/isaacs)) * [`0353dde`](https://github.com/npm/npm/commit/0353ddeaca8171aa7dbdd8102b7e2eb581a86406) respect `--json` for output ([@isaacs](https://github.com/isaacs)) * [`b3d112a`](https://github.com/npm/npm/commit/b3d112ae190b984cc1779b9e6de92218f22380c6) outdated: Don't show headings if there's nothing to output ([@isaacs](https://github.com/isaacs)) * [`bb4b90c`](https://github.com/npm/npm/commit/bb4b90c80dbf906a1cb26d85bc0625dc2758acc3) outdated: Default to `latest` rather than `*` for unspecified deps ([@isaacs](https://github.com/isaacs)) ### v1.4.20 (2014-07-02): * [`0353dde`](https://github.com/npm/npm/commit/0353ddeaca8171aa7dbdd8102b7e2eb581a86406) respect `--json` for output ([@isaacs](https://github.com/isaacs)) * [`b3d112a`](https://github.com/npm/npm/commit/b3d112ae190b984cc1779b9e6de92218f22380c6) outdated: Don't show headings if there's nothing to output ([@isaacs](https://github.com/isaacs)) * [`bb4b90c`](https://github.com/npm/npm/commit/bb4b90c80dbf906a1cb26d85bc0625dc2758acc3) outdated: Default to `latest` rather than `*` for unspecified deps ([@isaacs](https://github.com/isaacs)) ### v1.5.0-alpha-1 (2014-07-01): * [`eef4884`](https://github.com/npm/npm/commit/eef4884d6487ee029813e60a5f9c54e67925d9fa) use the correct piece of the spec for GitHub shortcuts ([@othiym23](https://github.com/othiym23)) ### v1.5.0-alpha-0 (2014-07-01): * [`7f55057`](https://github.com/npm/npm/commit/7f55057807cfdd9ceaf6331968e666424f48116c) install scoped packages ([#5239](https://github.com/npm/npm/issues/5239)) ([@othiym23](https://github.com/othiym23)) * [`0df7e16`](https://github.com/npm/npm/commit/0df7e16c0232d8f4d036ebf4ec3563215517caac) publish scoped packages ([#5239](https://github.com/npm/npm/issues/5239)) ([@othiym23](https://github.com/othiym23)) * [`0689ba2`](https://github.com/npm/npm/commit/0689ba249b92b4c6279a26804c96af6f92b3a501) support (and save) --scope=@s config ([@othiym23](https://github.com/othiym23)) * [`f34878f`](https://github.com/npm/npm/commit/f34878fc4cee29901e4daf7bace94be01e25cad7) scope credentials to registry ([@othiym23](https://github.com/othiym23)) * [`0ac7ca2`](https://github.com/npm/npm/commit/0ac7ca233f7a69751fe4386af6c4daa3ee9fc0da) capture and store bearer tokens when sent by registry ([@othiym23](https://github.com/othiym23)) * [`63c3277`](https://github.com/npm/npm/commit/63c3277f089b2c4417e922826bdc313ac854cad6) only delete files that are created by npm ([@othiym23](https://github.com/othiym23)) * [`4f54043`](https://github.com/npm/npm/commit/4f540437091d1cbca3915cd20c2da83c2a88bb8e) `npm-package-arg@2.0.0` ([@othiym23](https://github.com/othiym23)) * [`9e1460e`](https://github.com/npm/npm/commit/9e1460e6ac9433019758481ec031358f4af4cd44) `read-package-json@1.2.3` ([@othiym23](https://github.com/othiym23)) * [`719d8ad`](https://github.com/npm/npm/commit/719d8adb9082401f905ff4207ede494661f8a554) `fs-vacuum@1.2.1` ([@othiym23](https://github.com/othiym23)) * 
[`9ef8fe4`](https://github.com/npm/npm/commit/9ef8fe4d6ead3acb3e88c712000e2d3a9480ebec) `async-some@1.0.0` ([@othiym23](https://github.com/othiym23)) * [`a964f65`](https://github.com/npm/npm/commit/a964f65ab662107b62a4ca58535ce817e8cca331) `npmconf@2.0.1` ([@othiym23](https://github.com/othiym23)) * [`113765b`](https://github.com/npm/npm/commit/113765bfb7d3801917c1d9f124b8b3d942bec89a) `npm-registry-client@3.0.0` ([@othiym23](https://github.com/othiym23)) ### v1.4.19 (2014-07-01): * [`f687433`](https://github.com/npm/npm/commit/f687433) relative URLS for working non-root registry URLS ([@othiym23](https://github.com/othiym23)) * [`bea190c`](https://github.com/npm/npm/commit/bea190c) [#5591](https://github.com/npm/npm/issues/5591) bump nopt and npmconf ([@isaacs](https://github.com/isaacs)) ### v1.4.18 (2014-06-29): * Bump glob dependency from 4.0.2 to 4.0.3. It now uses graceful-fs when available, increasing resilience to [various filesystem errors](https://github.com/isaacs/node-graceful-fs#improvements-over-fs-module). ([@isaacs](https://github.com/isaacs)) ### v1.4.17 (2014-06-27): * replace escape codes with ansicolors ([@othiym23](https://github.com/othiym23)) * Allow to build all the docs OOTB. ([@GeJ](https://github.com/GeJ)) * Use core.longpaths on win32 git - fixes [#5525](https://github.com/npm/npm/issues/5525) ([@bmeck](https://github.com/bmeck)) * `npmconf@1.1.2` ([@isaacs](https://github.com/isaacs)) * Consolidate color sniffing in config/log loading process ([@isaacs](https://github.com/isaacs)) * add verbose log when project config file is ignored ([@isaacs](https://github.com/isaacs)) * npmconf: Float patch to remove 'scope' from config defs ([@isaacs](https://github.com/isaacs)) * doc: npm-explore can't handle a version ([@robertkowalski](https://github.com/robertkowalski)) * Add user-friendly errors for ENOSPC and EROFS. ([@voodootikigod](https://github.com/voodootikigod)) * bump tar and fstream deps ([@isaacs](https://github.com/isaacs)) * Run the npm-registry-couchapp tests along with npm tests ([@isaacs](https://github.com/isaacs)) ### v1.2.8000 (2014-06-17): * Same as v1.4.16, but with the spinner disabled, and a version number that starts with v1.2. 
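
The `graceful-fs` upgrades that keep appearing in these releases (for example the glob bump in v1.4.18 above) work because `graceful-fs` is a drop-in replacement for Node's core `fs`. A minimal sketch, not from the changelog, assuming `graceful-fs` is installed:

```js
// Minimal sketch (assumes `npm install graceful-fs`). graceful-fs mirrors the
// core `fs` API, so swapping the require is enough; it then queues operations
// and retries on errors such as EMFILE instead of failing outright.
var fs = require('graceful-fs'); // instead of require('fs')

fs.readFile(__filename, 'utf8', function (err, contents) {
  if (err) throw err;
  console.log('read %d bytes via graceful-fs', Buffer.byteLength(contents));
});
```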
### v1.4.16 (2014-06-17): * `npm-registry-client@2.0.2` ([@isaacs](https://github.com/isaacs)) * `fstream@0.1.27` ([@isaacs](https://github.com/isaacs)) * `sha@1.2.4` ([@isaacs](https://github.com/isaacs)) * `rimraf@2.2.8` ([@isaacs](https://github.com/isaacs)) * `npmlog@1.0.1` ([@isaacs](https://github.com/isaacs)) * `npm-registry-client@2.0.1` ([@isaacs](https://github.com/isaacs)) * removed redundant dependency ([@othiym23](https://github.com/othiym23)) * `npmconf@1.0.5` ([@isaacs](https://github.com/isaacs)) * Properly handle errors that can occur in the config-loading process ([@isaacs](https://github.com/isaacs)) ### v1.4.15 (2014-06-10): * cache: atomic de-race-ified package.json writing ([@isaacs](https://github.com/isaacs)) * `fstream@0.1.26` ([@isaacs](https://github.com/isaacs)) * `graceful-fs@3.0.2` ([@isaacs](https://github.com/isaacs)) * `osenv@0.1.0` ([@isaacs](https://github.com/isaacs)) * Only spin the spinner when we're fetching stuff ([@isaacs](https://github.com/isaacs)) * Update `osenv@0.1.0` which removes ~/tmp as possible tmp-folder ([@robertkowalski](https://github.com/robertkowalski)) * `ini@1.2.1` ([@isaacs](https://github.com/isaacs)) * `graceful-fs@3` ([@isaacs](https://github.com/isaacs)) * Update glob and things depending on glob ([@isaacs](https://github.com/isaacs)) * github-url-from-username-repo and read-package-json updates ([@isaacs](https://github.com/isaacs)) * `editor@0.1.0` ([@isaacs](https://github.com/isaacs)) * `columnify@1.1.0` ([@isaacs](https://github.com/isaacs)) * bump ansi and associated deps ([@isaacs](https://github.com/isaacs)) ### v1.4.14 (2014-06-05): * char-spinner: update to not bork windows ([@isaacs](https://github.com/isaacs)) ### v1.4.13 (2014-05-23): * Fix `npm install` on a tarball. ([`ed3abf1`](https://github.com/npm/npm/commit/ed3abf1aa10000f0f687330e976d78d1955557f6), [#5330](https://github.com/npm/npm/issues/5330), [@othiym23](https://github.com/othiym23)) * Fix an issue with the spinner on Node 0.8. ([`9f00306`](https://github.com/npm/npm/commit/9f003067909440390198c0b8f92560d84da37762), [@isaacs](https://github.com/isaacs)) * Re-add `npm.commands.cache.clean` and `npm.commands.cache.read` APIs, and document `npm.commands.cache.*` as npm-cache(3). 
([`e06799e`](https://github.com/npm/npm/commit/e06799e77e60c1fc51869619083a25e074d368b3), [@isaacs](https://github.com/isaacs)) ### v1.4.12 (2014-05-23): * remove normalize-package-data from top level, de-^-ify inflight dep ([@isaacs](https://github.com/isaacs)) * Always sort saved bundleDependencies ([@isaacs](https://github.com/isaacs)) * add inflight to bundledDependencies ([@othiym23](https://github.com/othiym23)) ### v1.4.11 (2014-05-22): * fix `npm ls` labeling issue * `node-gyp@0.13.1` * default repository to https:// instead of git:// * addLocalTarball: Remove extraneous unpack ([@isaacs](https://github.com/isaacs)) * Massive cache folder refactor ([@othiym23](https://github.com/othiym23) and [@isaacs](https://github.com/isaacs)) * Busy Spinner, no http noise ([@isaacs](https://github.com/isaacs)) * Per-project .npmrc file support ([@isaacs](https://github.com/isaacs)) * `npmconf@1.0.0`, Refactor config/uid/prefix loading process ([@isaacs](https://github.com/isaacs)) * Allow once-disallowed characters in passwords ([@isaacs](https://github.com/isaacs)) * Send npm version as 'version' header ([@isaacs](https://github.com/isaacs)) * fix cygwin encoding issue (Karsten Tinnefeld) * Allow non-github repositories with `npm repo` ([@evanlucas](https://github.com/evanlucas)) * Allow peer deps to be satisfied by grandparent * Stop optional deps moving into deps on `update --save` ([@timoxley](https://github.com/timoxley)) * Ensure only matching deps update with `update --save*` ([@timoxley](https://github.com/timoxley)) * Add support for `prerelease`, `preminor`, `prepatch` to `npm version` ### v1.4.10 (2014-05-05): * Don't set referer if already set * fetch: Send referer and npm-session headers * `run-script`: Support `--parseable` and `--json` * list runnable scripts ([@evanlucas](https://github.com/evanlucas)) * Use marked instead of ronn for html docs ### v1.4.9 (2014-05-01): * Send referer header (with any potentially private stuff redacted) * Fix critical typo bug in previous npm release ### v1.4.8 (2014-05-01): * Check SHA before using files from cache * adduser: allow change of the saved password * Make `npm install` respect `config.unicode` * Fix lifecycle to pass `Infinity` for config env value * Don't return 0 exit code on invalid command * cache: Handle 404s and other HTTP errors as errors * Resolve ~ in path configs to env.HOME * Include npm version in default user-agent conf * npm init: Use ISC as default license, use save-prefix for deps * Many test and doc fixes ### v1.4.7 (2014-04-15): * Add `--save-prefix` option that can be used to override the default of `^` when using `npm install --save` and its counterparts. ([`64eefdf`](https://github.com/npm/npm/commit/64eefdfe26bb27db8dc90e3ab5d27a5ef18a4470), [@thlorenz](https://github.com/thlorenz)) * Allow `--silent` to silence the echoing of commands that occurs with `npm run`. ([`c95cf08`](https://github.com/npm/npm/commit/c95cf086e5b97dbb48ff95a72517b203a8f29eab), [@Raynos](https://github.com/Raynos)) * Some speed improvements to the cache, which should improve install times. ([`cb94310`](https://github.com/npm/npm/commit/cb94310a6adb18cb7b881eacb8d67171eda8b744), [`3b0870f`](https://github.com/npm/npm/commit/3b0870fb2f40358b3051abdab6be4319d196b99d), [`120f5a9`](https://github.com/npm/npm/commit/120f5a93437bbbea9249801574a2f33e44e81c33), [@isaacs](https://github.com/isaacs)) * Improve ability to retry registry requests when a subset of the registry servers are down. 
([`4a5257d`](https://github.com/npm/npm/commit/4a5257de3870ac3dafa39667379f19f6dcd6093e), https://github.com/npm/npm-registry-client/commit/7686d02cb0b844626d6a401e58c0755ef3bc8432, [@isaacs](https://github.com/isaacs)) * Fix marking of peer dependencies as extraneous. ([`779b164`](https://github.com/npm/npm/commit/779b1649764607b062c031c7e5c972151b4a1754), https://github.com/npm/read-installed/commit/6680ba6ef235b1ca3273a00b70869798ad662ddc, [@isaacs](https://github.com/isaacs)) * Fix npm crashing when doing `npm shrinkwrap` in the presence of a `package.json` with no dependencies. ([`a9d9fa5`](https://github.com/npm/npm/commit/a9d9fa5ad3b8c925a589422b7be28d2735f320b0), [@kislyuk](https://github.com/kislyuk)) * Fix error when using `npm view` on packages that have no versions or have been unpublished. ([`94df2f5`](https://github.com/npm/npm/commit/94df2f56d684b35d1df043660180fc321b743dc8), [@juliangruber](https://github.com/juliangruber); [`2241a09`](https://github.com/npm/npm/commit/2241a09c843669c70633c399ce698cec3add40b3), [@isaacs](https://github.com/isaacs)) ### v1.4.6 (2014-03-19): * Fix extraneous package detection to work in more cases. ([`f671286`](https://github.com/npm/npm/commit/f671286), npm/read-installed#20, [@LaurentVB](https://github.com/LaurentVB)) ### v1.4.5 (2014-03-18): * Sort dependencies in `package.json` when doing `npm install --save` and all its variants. ([`6fd6ff7`](https://github.com/npm/npm/commit/6fd6ff7e536ea6acd33037b1878d4eca1f931985), [@domenic](https://github.com/domenic)) * Add `--save-exact` option, usable alongside `--save` and its variants, which will write the exact version number into `package.json` instead of the appropriate semver-compatibility range. ([`17f07df`](https://github.com/npm/npm/commit/17f07df8ad8e594304c2445bf7489cb53346f2c5), [@timoxley](https://github.com/timoxley)) * Accept gzipped content from the registry to speed up downloads and save bandwidth. ([`a3762de`](https://github.com/npm/npm/commit/a3762de843b842be8fa0ab57cdcd6b164f145942), npm/npm-registry-client#40, [@fengmk2](https://github.com/fengmk2)) * Fix `npm ls`'s `--depth` and `--log` options. ([`1d29b17`](https://github.com/npm/npm/commit/1d29b17f5193d52a5c4faa412a95313dcf41ed91), npm/read-installed#13, [@zertosh](https://github.com/zertosh)) * Fix "Adding a cache directory to the cache will make the world implode" in certain cases. ([`9a4b2c4`](https://github.com/npm/npm/commit/9a4b2c4667c2b1e0054e3d5611ab86acb1760834), domenic/path-is-inside#1, [@pmarques](https://github.com/pmarques)) * Fix readmes not being uploaded in certain rare cases. ([`527b72c`](https://github.com/npm/npm/commit/527b72cca6c55762b51e592c48a9f28cc7e2ff8b), [@isaacs](https://github.com/isaacs)) ### v1.4.4 (2014-02-20): * Add `npm t` as an alias for `npm test` (which is itself an alias for `npm run test`, or even `npm run-script test`). We like making running your tests easy. ([`14e650b`](https://github.com/npm/npm/commit/14e650bce0bfebba10094c961ac104a61417a5de), [@isaacs](https://github.com/isaacs)) ### v1.4.3 (2014-02-16): * Add back `npm prune --production`, which was removed in 1.3.24. ([`acc4d02`](https://github.com/npm/npm/commit/acc4d023c57d07704b20a0955e4bf10ee91bdc83), [@davglass](https://github.com/davglass)) * Default `npm install --save` and its counterparts to use the `^` version specifier, instead of `~`. 
([`0a3151c`](https://github.com/npm/npm/commit/0a3151c9cbeb50c1c65895685c2eabdc7e2608dc), [@mikolalysenko](https://github.com/mikolalysenko)) * Make `npm shrinkwrap` output dependencies in a sorted order, so that diffs between shrinkwrap files should be saner now. ([`059b2bf`](https://github.com/npm/npm/commit/059b2bfd06ae775205a37257dca80142596a0113), [@Raynos](https://github.com/Raynos)) * Fix `npm dedupe` not correctly respecting dependency constraints. ([`86028e9`](https://github.com/npm/npm/commit/86028e9fd8524d5e520ce01ba2ebab5a030103fc), [@rafeca](https://github.com/rafeca)) * Fix `npm ls` giving spurious warnings when you used `"latest"` as a version specifier. (https://github.com/npm/read-installed/commit/d2956400e0386931c926e0f30c334840e0938f14, [@bajtos](https://github.com/bajtos)) * Fixed a bug where using `npm link` on packages without a `name` value could cause npm to delete itself. ([`401a642`](https://github.com/npm/npm/commit/401a64286aa6665a94d1d2f13604f7014c5fce87), [@isaacs](https://github.com/isaacs)) * Fixed `npm install ./pkg@1.2.3` to actually install the directory at `pkg@1.2.3`; before it would try to find version `1.2.3` of the package `./pkg` in the npm registry. ([`46d8768`](https://github.com/npm/npm/commit/46d876821d1dd94c050d5ebc86444bed12c56739), [@rlidwka](https://github.com/rlidwka); see also [`f851b79`](https://github.com/npm/npm/commit/f851b79a71d9a5f5125aa85877c94faaf91bea5f)) * Fix `npm outdated` to respect the `color` configuration option. ([`d4f6f3f`](https://github.com/npm/npm/commit/d4f6f3ff83bd14fb60d3ac6392cb8eb6b1c55ce1), [@timoxley](https://github.com/timoxley)) * Fix `npm outdated --parseable`. ([`9575a23`](https://github.com/npm/npm/commit/9575a23f955ce3e75b509c89504ef0bd707c8cf6), [@yhpark](https://github.com/yhpark)) * Fix a lockfile-related errors when using certain Git URLs. ([`164b97e`](https://github.com/npm/npm/commit/164b97e6089f64e686db7a9a24016f245effc37f), [@nigelzor](https://github.com/nigelzor)) ### v1.4.2 (2014-02-13): * Fixed an issue related to mid-publish GET requests made against the registry. (https://github.com/npm/npm-registry-client/commit/acbec48372bc1816c67c9e7cbf814cf50437ff93, [@isaacs](https://github.com/isaacs)) ### v1.4.1 (2014-02-13): * Fix `npm shrinkwrap` forgetting to shrinkwrap dependencies that were also development dependencies. ([`9c575c5`](https://github.com/npm/npm/commit/9c575c56efa9b0c8b0d4a17cb9c1de3833004bcd), [@diwu1989](https://github.com/diwu1989)) * Fixed publishing of pre-existing packages with uppercase characters in their name. (https://github.com/npm/npm-registry-client/commit/9345d3b6c3d8510dd5c4418f27ee1fce59acebad, [@isaacs](https://github.com/isaacs)) ### v1.4.0 (2014-02-12): * Remove `npm publish --force`. See https://github.com/npm/npmjs.org/issues/148. ([@isaacs](https://github.com/isaacs), npm/npm-registry-client@2c8dba990de6a59af6545b75cc00a6dc12777c2a) * Other changes to the registry client related to saved configs and couch logins. ([@isaacs](https://github.com/isaacs); npm/npm-registry-client@25e2b019a1588155e5f87d035c27e79963b75951, npm/npm-registry-client@9e41e9101b68036e0f078398785f618575f3cdde, npm/npm-registry-client@2c8dba990de6a59af6545b75cc00a6dc12777c2a) * Show an error to the user when doing `npm update` and the `package.json` specifies a version that does not exist. 
([@evanlucas](https://github.com/evanlucas), [`027a33a`](https://github.com/npm/npm/commit/027a33a5c594124cc1d82ddec5aee2c18bc8dc32)) * Fix some issues with cache ownership in certain installation configurations. ([@outcoldman](https://github.com/outcoldman), [`a132690`](https://github.com/npm/npm/commit/a132690a2876cda5dcd1e4ca751f21dfcb11cb9e)) * Fix issues where GitHub shorthand dependencies `user/repo` were not always treated the same as full Git URLs. ([@robertkowalski](https://github.com/robertkowalski), https://github.com/meryn/normalize-package-data/commit/005d0b637aec1895117fcb4e3b49185eebf9e240) ### v1.3.26 (2014-02-02): * Fixes and updates to publishing code ([`735427a`](https://github.com/npm/npm/commit/735427a69ba4fe92aafa2d88f202aaa42920a9e2) and [`c0ac832`](https://github.com/npm/npm/commit/c0ac83224d49aa62e55577f8f27d53bbfd640dc5), [@isaacs](https://github.com/isaacs)) * Fix `npm bugs` with no arguments. ([`b99d465`](https://github.com/npm/npm/commit/b99d465221ac03bca30976cbf4d62ca80ab34091), [@Hoops](https://github.com/Hoops)) ### v1.3.25 (2014-01-25): * Remove gubblebum blocky font from documentation headers. ([`6940c9a`](https://github.com/npm/npm/commit/6940c9a100160056dc6be8f54a7ad7fa8ceda7e2), [@isaacs](https://github.com/isaacs)) ### v1.3.24 (2014-01-19): * Make the search output prettier, with nice truncated columns, and a `--long` option to create wrapping columns. ([`20439b2`](https://github.com/npm/npm/commit/20439b2) and [`3a6942d`](https://github.com/npm/npm/commit/3a6942d), [@timoxley](https://github.com/timoxley)) * Support multiple packagenames in `npm docs`. ([`823010b`](https://github.com/npm/npm/commit/823010b), [@timoxley](https://github.com/timoxley)) * Fix the `npm adduser` bug regarding "Error: default value must be string or number" again. ([`b9b4248`](https://github.com/npm/npm/commit/b9b4248), [@isaacs](https://github.com/isaacs)) * Fix `scripts` entries containing whitespaces on Windows. ([`80282ed`](https://github.com/npm/npm/commit/80282ed), [@robertkowalski](https://github.com/robertkowalski)) * Fix `npm update` for Git URLs that have credentials in them ([`93fc364`](https://github.com/npm/npm/commit/93fc364), [@danielsantiago](https://github.com/danielsantiago)) * Fix `npm install` overwriting `npm link`-ed dependencies when they are tagged Git dependencies. ([`af9bbd9`](https://github.com/npm/npm/commit/af9bbd9), [@evanlucas](https://github.com/evanlucas)) * Remove `npm prune --production` since it buggily removed some dependencies that were necessary for production; see [#4509](https://github.com/npm/npm/issues/4509). Hopefully it can make its triumphant return, one day. ([`1101b6a`](https://github.com/npm/npm/commit/1101b6a), [@isaacs](https://github.com/isaacs)) Dependency updates: * [`909cccf`](https://github.com/npm/npm/commit/909cccf) `read-package-json@1.1.6` * [`a3891b6`](https://github.com/npm/npm/commit/a3891b6) `rimraf@2.2.6` * [`ac6efbc`](https://github.com/npm/npm/commit/ac6efbc) `sha@1.2.3` * [`dd30038`](https://github.com/npm/npm/commit/dd30038) `node-gyp@0.12.2` * [`c8c3ebe`](https://github.com/npm/npm/commit/c8c3ebe) `npm-registry-client@0.3.3` * [`4315286`](https://github.com/npm/npm/commit/4315286) `npmconf@0.1.12` ### v1.3.23 (2014-01-03): * Properly handle installations that contained a certain class of circular dependencies. 
([`5dc93e8`](https://github.com/npm/npm/commit/5dc93e8c82604c45b6067b1acf1c768e0bfce754), [@substack](https://github.com/substack)) ### v1.3.22 (2013-12-25): * Fix a critical bug in `npm adduser` that would manifest in the error message "Error: default value must be string or number." ([`fba4bd2`](https://github.com/npm/npm/commit/fba4bd24bc2ab00ccfeda2043aa53af7d75ef7ce), [@isaacs](https://github.com/isaacs)) * Allow `npm bugs` in the current directory to open the current package's bugs URL. ([`d04cf64`](https://github.com/npm/npm/commit/d04cf6483932c693452f3f778c2fa90f6153a4af), [@evanlucas](https://github.com/evanlucas)) * Several fixes to various error messages to include more useful or updated information. ([`1e6f2a7`](https://github.com/npm/npm/commit/1e6f2a72ca058335f9f5e7ca22d01e1a8bb0f9f7), [`ff46366`](https://github.com/npm/npm/commit/ff46366bd40ff0ef33c7bac8400bc912c56201d1), [`8b4bb48`](https://github.com/npm/npm/commit/8b4bb4815d80a3612186dc5549d698e7b988eb03); [@rlidwka](https://github.com/rlidwka), [@evanlucas](https://github.com/evanlucas)) ### v1.3.21 (2013-12-17): * Fix a critical bug that prevented publishing due to incorrect hash calculation. ([`4ca4a2c`](https://github.com/npm/npm-registry-client/commit/4ca4a2c6333144299428be6b572e2691aa59852e), [@dominictarr](https://github.com/dominictarr)) ### v1.3.20 (2013-12-17): * Fixes a critical bug in v1.3.19. Thankfully, due to that bug, no one could install npm v1.3.19 :) ### v1.3.19 (2013-12-16): * Adds atomic PUTs for publishing packages, which should result in far fewer requests and less room for replication errors on the server-side. ### v1.3.18 (2013-12-16): * Added an `--ignore-scripts` option, which will prevent `package.json` scripts from being run. Most notably, this will work on `npm install`, so e.g. `npm install --ignore-scripts` will not run preinstall and prepublish scripts. ([`d7e67bf`](https://github.com/npm/npm/commit/d7e67bf0d94b085652ec1c87d595afa6f650a8f6), [@sqs](https://github.com/sqs)) * Fixed a bug introduced in 1.3.16 that would manifest with certain cache configurations, by causing spurious errors saying "Adding a cache directory to the cache will make the world implode." ([`966373f`](https://github.com/npm/npm/commit/966373fad8d741637f9744882bde9f6e94000865), [@domenic](https://github.com/domenic)) * Re-fixed the multiple download of URL dependencies, whose fix was reverted in 1.3.17. ([`a362c3f`](https://github.com/npm/npm/commit/a362c3f1919987419ed8a37c8defa19d2e6697b0), [@spmason](https://github.com/spmason)) ### v1.3.17 (2013-12-11): * This release reverts [`644c2ff`](https://github.com/npm/npm/commit/644c2ff3e3d9c93764f7045762477f48864d64a7), which avoided re-downloading URL and shinkwrap dependencies when doing `npm install`. You can see the in-depth reasoning in [`d8c907e`](https://github.com/npm/npm/commit/d8c907edc2019b75cff0f53467e34e0ffd7e5fba); the problem was, that the patch changed the behavior of `npm install -f` to reinstall all dependencies. * A new version of the no-re-downloading fix has been submitted as [#4303](https://github.com/npm/npm/issues/4303) and will hopefully be included in the next release. ### v1.3.16 (2013-12-11): * Git URL dependencies are now updated on `npm install`, fixing a two-year old bug ([`5829ecf`](https://github.com/npm/npm/commit/5829ecf032b392d2133bd351f53d3c644961396b), [@robertkowalski](https://github.com/robertkowalski)). 
Additional progress on reducing the resulting Git-related I/O is tracked as [#4191](https://github.com/npm/npm/issues/4191), but for now, this will be a big improvement. * Added a `--json` mode to `npm outdated` to give a parseable output. ([`0b6c9b7`](https://github.com/npm/npm/commit/0b6c9b7c8c5579f4d7d37a0c24d9b7a12ccbe5fe), [@yyx990803](https://github.com/yyx990803)) * Made `npm outdated` much prettier and more useful. It now outputs a color-coded and easy-to-read table. ([`fd3017f`](https://github.com/npm/npm/commit/fd3017fc3e9d42acf6394a5285122edb4dc16106), [@quimcalpe](https://github.com/quimcalpe)) * Added the `--depth` option to `npm outdated`, so that e.g. you can do `npm outdated --depth=0` to show only top-level outdated dependencies. ([`1d184ef`](https://github.com/npm/npm/commit/1d184ef3f4b4bc309d38e9128732e3e6fb46d49c), [@yyx990803](https://github.com/yyx990803)) * Added a `--no-git-tag-version` option to `npm version`, for doing the usual job of `npm version` minus the Git tagging. This could be useful if you need to increase the version in other related files before actually adding the tag. ([`59ca984`](https://github.com/npm/npm/commit/59ca9841ba4f4b2f11b8e72533f385c77ae9f8bd), [@evanlucas](https://github.com/evanlucas)) * Made `npm repo` and `npm docs` work without any arguments, adding them to the list of npm commands that work on the package in the current directory when invoked without arguments. ([`bf9048e`](https://github.com/npm/npm/commit/bf9048e2fa16d43fbc4b328d162b0a194ca484e8), [@robertkowalski](https://github.com/robertkowalski); [`07600d0`](https://github.com/npm/npm/commit/07600d006c652507cb04ac0dae9780e35073dd67), [@wilmoore](https://github.com/wilmoore)). There are a few other commands we still want to implement this for; see [#4204](https://github.com/npm/npm/issues/4204). * Pass through the `GIT_SSL_NO_VERIFY` environment variable to Git, if it is set; we currently do this with a few other environment variables, but we missed that one. ([`c625de9`](https://github.com/npm/npm/commit/c625de91770df24c189c77d2e4bc821f2265efa8), [@arikon](https://github.com/arikon)) * Fixed `npm dedupe` on Windows due to incorrect path separators being used ([`7677de4`](https://github.com/npm/npm/commit/7677de4583100bc39407093ecc6bc13715bf8161), [@mcolyer](https://github.com/mcolyer)). * Fixed the `npm help` command when multiple words were searched for; it previously gave a `ReferenceError`. ([`6a28dd1`](https://github.com/npm/npm/commit/6a28dd147c6957a93db12b1081c6e0da44fe5e3c), [@dereckson](https://github.com/dereckson)) * Stopped re-downloading URL and shrinkwrap dependencies, as demonstrated in [#3463](https://github.com/npm/npm/issues/3463) ([`644c2ff`](https://github.com/isaacs/npm/commit/644c2ff3e3d9c93764f7045762477f48864d64a7), [@spmason](https://github.com/spmason)). You can use the `--force` option to force re-download and installation of all dependencies. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/cli.js000755 000766 000024 00000000060 12455173731 022006 0ustar00iojsstaff000000 000000 #!/usr/bin/env node require("./bin/npm-cli.js") iojs-v1.0.2-darwin-x64/lib/node_modules/npm/configure000755 000766 000024 00000001011 12455173731 022602 0ustar00iojsstaff000000 000000 #!/bin/bash # set configurations that will be "sticky" on this system, # surviving npm self-updates. CONFIGS=() i=0 # get the location of this file. unset CDPATH CONFFILE=$(cd $(dirname "$0"); pwd -P)/npmrc while [ $# -gt 0 ]; do conf="$1" case $conf in --help) echo "./configure --param=value ..." 
exit 0 ;; --*) CONFIGS[$i]="${conf:2}" ;; *) CONFIGS[$i]="$conf" ;; esac let i++ shift done for c in "${CONFIGS[@]}"; do echo "$c" >> "$CONFFILE" done iojs-v1.0.2-darwin-x64/lib/node_modules/npm/CONTRIBUTING.md000644 000766 000024 00000000767 12455173731 023145 0ustar00iojsstaff000000 000000 ## Before you submit a new issue * Check if there's a simple solution in the [Troubleshooting](https://github.com/npm/npm/wiki/Troubleshooting) wiki. * [Search for similar issues](https://github.com/npm/npm/search?q=Similar%20issues&type=Issues). * Ensure your new issue conforms to the [Contributing Guidelines](https://github.com/npm/npm/wiki/Contributing-Guidelines). Participation in this open source project is subject to the [npm Code of Conduct](http://www.npmjs.com/policies/conduct). iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/000755 000766 000024 00000000000 12456115117 021442 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/000755 000766 000024 00000000000 12456115117 021641 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/000755 000766 000024 00000000000 12456115117 021443 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/LICENSE000644 000766 000024 00000026067 12455173731 021722 0ustar00iojsstaff000000 000000 Copyright (c) npm, Inc. and Contributors All rights reserved. npm is released under the Artistic License 2.0, subject to additional terms that are listed below. The text of the npm License follows and the text of the additional terms follows the Artistic License 2.0 terms: -------- The Artistic License 2.0 Copyright (c) 2000-2006, The Perl Foundation. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble This license establishes the terms under which a given free software Package may be copied, modified, distributed, and/or redistributed. The intent is that the Copyright Holder maintains some artistic control over the development of that Package while still keeping the Package available as open source and free software. You are always permitted to make arrangements wholly outside of this license directly with the Copyright Holder of a given Package. If the terms of this license do not permit the full use that you propose to make of the Package, you should contact the Copyright Holder and seek a different licensing arrangement. Definitions "Copyright Holder" means the individual(s) or organization(s) named in the copyright notice for the entire Package. "Contributor" means any party that has contributed code or other material to the Package, in accordance with the Copyright Holder's procedures. "You" and "your" means any person who would like to copy, distribute, or modify the Package. "Package" means the collection of files distributed by the Copyright Holder, and derivatives of that collection and/or of those files. A given Package may consist of either the Standard Version, or a Modified Version. "Distribute" means providing a copy of the Package or making it accessible to anyone else, or in the case of a company or organization, to others outside of your company or organization. "Distributor Fee" means any fee that you charge for Distributing this Package or providing support for this Package to another party. It does not mean licensing fees. "Standard Version" refers to the Package if it has not been modified, or has been modified only in ways explicitly requested by the Copyright Holder. 
"Modified Version" means the Package, if it has been changed, and such changes were not explicitly requested by the Copyright Holder. "Original License" means this Artistic License as Distributed with the Standard Version of the Package, in its current version or as it may be modified by The Perl Foundation in the future. "Source" form means the source code, documentation source, and configuration files for the Package. "Compiled" form means the compiled bytecode, object code, binary, or any other form resulting from mechanical transformation or translation of the Source form. Permission for Use and Modification Without Distribution (1) You are permitted to use the Standard Version and create and use Modified Versions for any purpose without restriction, provided that you do not Distribute the Modified Version. Permissions for Redistribution of the Standard Version (2) You may Distribute verbatim copies of the Source form of the Standard Version of this Package in any medium without restriction, either gratis or for a Distributor Fee, provided that you duplicate all of the original copyright notices and associated disclaimers. At your discretion, such verbatim copies may or may not include a Compiled form of the Package. (3) You may apply any bug fixes, portability changes, and other modifications made available from the Copyright Holder. The resulting Package will still be considered the Standard Version, and as such will be subject to the Original License. Distribution of Modified Versions of the Package as Source (4) You may Distribute your Modified Version as Source (either gratis or for a Distributor Fee, and with or without a Compiled form of the Modified Version) provided that you clearly document how it differs from the Standard Version, including, but not limited to, documenting any non-standard features, executables, or modules, and provided that you do at least ONE of the following: (a) make the Modified Version available to the Copyright Holder of the Standard Version, under the Original License, so that the Copyright Holder may include your modifications in the Standard Version. (b) ensure that installation of your Modified Version does not prevent the user installing or running the Standard Version. In addition, the Modified Version must bear a name that is different from the name of the Standard Version. (c) allow anyone who receives a copy of the Modified Version to make the Source form of the Modified Version available to others under (i) the Original License or (ii) a license that permits the licensee to freely copy, modify and redistribute the Modified Version using the same licensing terms that apply to the copy that the licensee received, and requires that the Source form of the Modified Version, and of any works derived from it, be made freely available in that license fees are prohibited but Distributor Fees are allowed. Distribution of Compiled Forms of the Standard Version or Modified Versions without the Source (5) You may Distribute Compiled forms of the Standard Version without the Source, provided that you include complete instructions on how to get the Source of the Standard Version. Such instructions must be valid at the time of your distribution. If these instructions, at any time while you are carrying out such distribution, become invalid, you must provide new instructions on demand or cease further distribution. 
If you provide valid instructions or cease distribution within thirty days after you become aware that the instructions are invalid, then you do not forfeit any of your rights under this license. (6) You may Distribute a Modified Version in Compiled form without the Source, provided that you comply with Section 4 with respect to the Source of the Modified Version. Aggregating or Linking the Package (7) You may aggregate the Package (either the Standard Version or Modified Version) with other packages and Distribute the resulting aggregation provided that you do not charge a licensing fee for the Package. Distributor Fees are permitted, and licensing fees for other components in the aggregation are permitted. The terms of this license apply to the use and Distribution of the Standard or Modified Versions as included in the aggregation. (8) You are permitted to link Modified and Standard Versions with other works, to embed the Package in a larger work of your own, or to build stand-alone binary or bytecode versions of applications that include the Package, and Distribute the result without restriction, provided the result does not expose a direct interface to the Package. Items That are Not Considered Part of a Modified Version (9) Works (including, but not limited to, modules and scripts) that merely extend or make use of the Package, do not, by themselves, cause the Package to be a Modified Version. In addition, such works are not considered parts of the Package itself, and are not subject to the terms of this license. General Provisions (10) Any use, modification, and distribution of the Standard or Modified Versions is governed by this Artistic License. By using, modifying or distributing the Package, you accept this license. Do not use, modify, or distribute the Package, if you do not accept this license. (11) If your Modified Version has been derived from a Modified Version made by someone other than you, you are nevertheless required to ensure that your Modified Version complies with the requirements of this license. (12) This license does not grant you the right to use any trademark, service mark, tradename, or logo of the Copyright Holder. (13) This license includes the non-exclusive, worldwide, free-of-charge patent license to make, have made, use, offer to sell, sell, import and otherwise transfer the Package with respect to any patent claims licensable by the Copyright Holder that are necessarily infringed by the Package. If you institute patent litigation (including a cross-claim or counterclaim) against any party alleging that the Package constitutes direct or contributory patent infringement, then this Artistic License to you shall terminate on the date that such litigation is filed. (14) Disclaimer of Warranty: THE PACKAGE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS "AS IS' AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES. THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT ARE DISCLAIMED TO THE EXTENT PERMITTED BY YOUR LOCAL LAW. UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -------- The following additional terms shall apply to use of the npm software, the npm website, the npm repository and any other services or products offered by npm, Inc.: "Node.js" trademark Joyent, Inc. 
npm is not officially part of the Node.js project, and is neither owned by nor affiliated with Joyent, Inc. "npm" and "The npm Registry" are owned by npm, Inc. All rights reserved. Modules published on the npm registry are not officially endorsed by npm, Inc. or the Node.js project. Data published to the npm registry is not part of npm itself, and is the sole property of the publisher. While every effort is made to ensure accountability, there is absolutely no guarantee, warrantee, or assertion expressed or implied as to the quality, fitness for a specific purpose, or lack of malice in any given npm package. Packages downloaded through the npm registry are independently licensed and are not covered by this license. Additional policies relating to, and restrictions on use of, npm products and services are available on the npm website. All such policies and restrictions, as updated from time to time, are hereby incorporated into this license agreement. By using npm, you acknowledge your agreement to all such policies and restrictions. If you have a complaint about a package in the public npm registry, and cannot resolve it with the package owner, please email support@npmjs.com and explain the situation. See the [npm Dispute Resolution policy](https://github.com/npm/policies/blob/master/disputes.md) for more details. Any data published to The npm Registry (including user account information) may be removed or modified at the sole discretion of the npm server administrators. "npm Logo" created by Mathias Pettersson and Brian Hammond, used with permission. "Gubblebum Blocky" font Copyright (c) by Tjarda Koster, http://jelloween.deviantart.com included for use in the npm website and documentation, used with permission. This program uses several Node modules contained in the node_modules/ subdirectory, according to the terms of their respective licenses. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/make.bat000644 000766 000024 00000000234 12455370707 022310 0ustar00iojsstaff000000 000000 :: The tests run "make doc" in the prepublish script, :: so this file gives windows something that'll exit :: successfully, without having to install make. 
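This make.bat is intentionally empty apart from its comments: the prepublish script in the bundled package.json (quoted further down in this archive) ends with `make -j8 doc`, and on Windows this stub is presumably what `make` resolves to, so that step can still exit successfully without GNU make installed. As a rough sketch, the command chain that prepublish hook runs is:

    # From "scripts"."prepublish" in the bundled package.json (see below in this
    # archive); the final `make -j8 doc` is the step the no-op make.bat exists
    # to satisfy on Windows.
    node bin/npm-cli.js prune --prefix=. --no-global \
      && rm -rf test/*/*/node_modules \
      && make -j8 doc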
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/Makefile000644 000766 000024 00000016076 12455173731 022354 0ustar00iojsstaff000000 000000 # vim: set softtabstop=2 shiftwidth=2: SHELL = bash PUBLISHTAG = $(shell node scripts/publish-tag.js) BRANCH = $(shell git rev-parse --abbrev-ref HEAD) markdowns = $(shell find doc -name '*.md' | grep -v 'index') README.md html_docdeps = html/dochead.html \ html/docfoot.html \ scripts/doc-build.sh \ package.json cli_mandocs = $(shell find doc/cli -name '*.md' \ |sed 's|.md|.1|g' \ |sed 's|doc/cli/|man/man1/|g' ) \ man/man1/npm-README.1 api_mandocs = $(shell find doc/api -name '*.md' \ |sed 's|.md|.3|g' \ |sed 's|doc/api/|man/man3/|g' ) files_mandocs = $(shell find doc/files -name '*.md' \ |sed 's|.md|.5|g' \ |sed 's|doc/files/|man/man5/|g' ) \ man/man5/npm-json.5 \ man/man5/npm-global.5 misc_mandocs = $(shell find doc/misc -name '*.md' \ |sed 's|.md|.7|g' \ |sed 's|doc/misc/|man/man7/|g' ) \ man/man7/npm-index.7 cli_partdocs = $(shell find doc/cli -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/cli/|html/partial/doc/cli/|g' ) \ html/partial/doc/README.html api_partdocs = $(shell find doc/api -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/api/|html/partial/doc/api/|g' ) files_partdocs = $(shell find doc/files -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/files/|html/partial/doc/files/|g' ) \ html/partial/doc/files/npm-json.html \ html/partial/doc/files/npm-global.html misc_partdocs = $(shell find doc/misc -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/misc/|html/partial/doc/misc/|g' ) \ html/partial/doc/index.html cli_htmldocs = $(shell find doc/cli -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/cli/|html/doc/cli/|g' ) \ html/doc/README.html api_htmldocs = $(shell find doc/api -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/api/|html/doc/api/|g' ) files_htmldocs = $(shell find doc/files -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/files/|html/doc/files/|g' ) \ html/doc/files/npm-json.html \ html/doc/files/npm-global.html misc_htmldocs = $(shell find doc/misc -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/misc/|html/doc/misc/|g' ) \ html/doc/index.html mandocs = $(api_mandocs) $(cli_mandocs) $(files_mandocs) $(misc_mandocs) partdocs = $(api_partdocs) $(cli_partdocs) $(files_partdocs) $(misc_partdocs) htmldocs = $(api_htmldocs) $(cli_htmldocs) $(files_htmldocs) $(misc_htmldocs) all: doc latest: @echo "Installing latest published npm" @echo "Use 'make install' or 'make link' to install the code" @echo "in this folder that you're looking at right now." node cli.js install -g -f npm install: all node cli.js install -g -f # backwards compat dev: install link: uninstall node cli.js link -f clean: markedclean marked-manclean doc-clean uninstall rm -rf npmrc node cli.js cache clean uninstall: node cli.js rm npm -g -f doc: $(mandocs) $(htmldocs) $(partdocs) markedclean: rm -rf node_modules/marked node_modules/.bin/marked .building_marked marked-manclean: rm -rf node_modules/marked-man node_modules/.bin/marked-man .building_marked-man docclean: doc-clean doc-clean: rm -rf \ .building_marked \ .building_marked-man \ html/doc \ html/api \ man # use `npm install marked-man` for this to work. 
man/man1/npm-README.1: README.md scripts/doc-build.sh package.json @[ -d man/man1 ] || mkdir -p man/man1 scripts/doc-build.sh $< $@ man/man1/%.1: doc/cli/%.md scripts/doc-build.sh package.json @[ -d man/man1 ] || mkdir -p man/man1 scripts/doc-build.sh $< $@ man/man3/%.3: doc/api/%.md scripts/doc-build.sh package.json @[ -d man/man3 ] || mkdir -p man/man3 scripts/doc-build.sh $< $@ man/man5/npm-json.5: man/man5/package.json.5 cp $< $@ man/man5/npm-global.5: man/man5/npm-folders.5 cp $< $@ man/man5/%.5: doc/files/%.md scripts/doc-build.sh package.json @[ -d man/man5 ] || mkdir -p man/man5 scripts/doc-build.sh $< $@ man/man7/%.7: doc/misc/%.md scripts/doc-build.sh package.json @[ -d man/man7 ] || mkdir -p man/man7 scripts/doc-build.sh $< $@ doc/misc/npm-index.md: scripts/index-build.js package.json node scripts/index-build.js > $@ # html/doc depends on html/partial/doc html/doc/%.html: html/partial/doc/%.html @[ -d html/doc ] || mkdir -p html/doc scripts/doc-build.sh $< $@ html/doc/README.html: html/partial/doc/README.html @[ -d html/doc ] || mkdir -p html/doc scripts/doc-build.sh $< $@ html/doc/cli/%.html: html/partial/doc/cli/%.html @[ -d html/doc/cli ] || mkdir -p html/doc/cli scripts/doc-build.sh $< $@ html/doc/misc/%.html: html/partial/doc/misc/%.html @[ -d html/doc/misc ] || mkdir -p html/doc/misc scripts/doc-build.sh $< $@ html/doc/files/%.html: html/partial/doc/files/%.html @[ -d html/doc/files ] || mkdir -p html/doc/files scripts/doc-build.sh $< $@ html/doc/api/%.html: html/partial/doc/api/%.html @[ -d html/doc/api ] || mkdir -p html/doc/api scripts/doc-build.sh $< $@ html/partial/doc/index.html: doc/misc/npm-index.md $(html_docdeps) @[ -d html/partial/doc ] || mkdir -p html/partial/doc scripts/doc-build.sh $< $@ html/partial/doc/README.html: README.md $(html_docdeps) @[ -d html/partial/doc ] || mkdir -p html/partial/doc scripts/doc-build.sh $< $@ html/partial/doc/cli/%.html: doc/cli/%.md $(html_docdeps) @[ -d html/partial/doc/cli ] || mkdir -p html/partial/doc/cli scripts/doc-build.sh $< $@ html/partial/doc/api/%.html: doc/api/%.md $(html_docdeps) @[ -d html/partial/doc/api ] || mkdir -p html/partial/doc/api scripts/doc-build.sh $< $@ html/partial/doc/files/npm-json.html: html/partial/doc/files/package.json.html cp $< $@ html/partial/doc/files/npm-global.html: html/partial/doc/files/npm-folders.html cp $< $@ html/partial/doc/files/%.html: doc/files/%.md $(html_docdeps) @[ -d html/partial/doc/files ] || mkdir -p html/partial/doc/files scripts/doc-build.sh $< $@ html/partial/doc/misc/%.html: doc/misc/%.md $(html_docdeps) @[ -d html/partial/doc/misc ] || mkdir -p html/partial/doc/misc scripts/doc-build.sh $< $@ marked: node_modules/.bin/marked node_modules/.bin/marked: node cli.js install marked --no-global marked-man: node_modules/.bin/marked-man node_modules/.bin/marked-man: node cli.js install marked-man --no-global doc: man man: $(cli_docs) $(api_docs) test: doc node cli.js test tag: npm tag npm@$(PUBLISHTAG) latest publish: link doc @git push origin :v$(shell npm -v) 2>&1 || true git clean -fd &&\ git push origin $(BRANCH) &&\ git push origin --tags &&\ npm publish --tag=$(PUBLISHTAG) release: @bash scripts/release.sh sandwich: @[ $$(whoami) = "root" ] && (echo "ok"; echo "ham" > sandwich) || (echo "make it yourself" && exit 13) .PHONY: all latest install dev link doc clean uninstall test man doc-clean docclean release iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/000755 000766 000024 00000000000 12456115117 021450 5ustar00iojsstaff000000 000000 
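The Makefile above drives the whole npm documentation build: every Markdown file under doc/ is turned into a man page and an HTML page by pattern rules that call scripts/doc-build.sh with a source and destination path, after a small mkdir guard for the target directory. A minimal bash sketch of what one of those rules (man/man1/%.1: doc/cli/%.md) amounts to, assuming the doc/cli layout and the `scripts/doc-build.sh <src> <dest>` interface shown above:

    #!/usr/bin/env bash
    # Sketch only: replays the Makefile's man/man1/%.1 pattern rule for the
    # CLI docs, assuming doc/cli/*.md sources and the
    # "scripts/doc-build.sh <src> <dest>" interface used throughout the Makefile.
    set -o errexit
    for src in doc/cli/*.md; do
      name=$(basename "${src%.md}")
      dest="man/man1/${name}.1"
      [ -d man/man1 ] || mkdir -p man/man1   # same guard the real rule uses
      scripts/doc-build.sh "$src" "$dest"
    done

The real rules also list scripts/doc-build.sh and package.json as prerequisites, so the generated pages are rebuilt whenever the build script or the package metadata changes.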
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/000755 000766 000024 00000000000 12456115120 023344 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/package.json000644 000766 000024 00000060564 12455173731 023203 0ustar00iojsstaff000000 000000 { "version": "2.1.18", "name": "npm", "description": "A package manager for node", "keywords": [ "package manager", "modules", "install", "package.json" ], "preferGlobal": true, "config": { "publishtest": false }, "homepage": "https://docs.npmjs.com/", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "repository": { "type": "git", "url": "https://github.com/npm/npm" }, "bugs": { "url": "http://github.com/npm/npm/issues", "email": "npm-@googlegroups.com" }, "directories": { "doc": "./doc", "man": "./man", "lib": "./lib", "bin": "./bin" }, "main": "./lib/npm.js", "bin": { "npm": "./bin/npm-cli.js" }, "dependencies": { "abbrev": "~1.0.5", "ansi": "~0.3.0", "ansicolors": "~0.3.2", "ansistyles": "~0.1.3", "archy": "~1.0.0", "async-some": "~1.0.1", "block-stream": "0.0.7", "char-spinner": "~1.0.1", "child-process-close": "~0.1.1", "chmodr": "~0.1.0", "chownr": "0", "cmd-shim": "~2.0.1", "columnify": "~1.3.2", "config-chain": "~1.1.8", "dezalgo": "~1.0.1", "editor": "~0.1.0", "fs-vacuum": "~1.2.5", "fs-write-stream-atomic": "~1.0.2", "fstream": "~1.0.3", "fstream-npm": "~1.0.1", "github-url-from-git": "~1.4.0", "github-url-from-username-repo": "~1.0.2", "glob": "~4.3.2", "graceful-fs": "~3.0.5", "inflight": "~1.0.4", "inherits": "~2.0.1", "ini": "~1.3.2", "init-package-json": "~1.1.3", "lockfile": "~1.0.0", "lru-cache": "~2.5.0", "minimatch": "~2.0.1", "mkdirp": "~0.5.0", "node-gyp": "~1.0.2", "nopt": "~3.0.1", "normalize-git-url": "~1.0.0", "normalize-package-data": "~1.0.3", "npm-cache-filename": "~1.0.1", "npm-install-checks": "~1.0.2", "npm-package-arg": "~2.1.3", "npm-registry-client": "~4.0.5", "npm-user-validate": "~0.1.1", "npmlog": "~0.1.1", "once": "~1.3.1", "opener": "~1.4.0", "osenv": "~0.1.0", "path-is-inside": "~1.0.0", "read": "~1.0.4", "read-installed": "~3.1.5", "read-package-json": "~1.2.7", "readable-stream": "~1.0.33", "realize-package-specifier": "~1.3.0", "request": "~2.51.0", "retry": "~0.6.1", "rimraf": "~2.2.8", "semver": "~4.2.0", "sha": "~1.3.0", "slide": "~1.1.6", "sorted-object": "~1.0.0", "tar": "~1.0.3", "text-table": "~0.2.0", "uid-number": "0.0.6", "which": "~1.0.8", "wrappy": "~1.0.1", "write-file-atomic": "~1.1.0" }, "bundleDependencies": [ "abbrev", "ansi", "ansicolors", "ansistyles", "archy", "async-some", "block-stream", "char-spinner", "child-process-close", "chmodr", "chownr", "cmd-shim", "columnify", "config-chain", "dezalgo", "editor", "fs-vacuum", "fs-write-stream-atomic", "fstream", "fstream-npm", "github-url-from-git", "github-url-from-username-repo", "glob", "graceful-fs", "inflight", "inherits", "ini", "init-package-json", "lockfile", "lru-cache", "minimatch", "mkdirp", "node-gyp", "nopt", "normalize-git-url", "normalize-package-data", "npm-cache-filename", "npm-install-checks", "npm-package-arg", "npm-registry-client", "npm-user-validate", "npmlog", "once", "opener", "osenv", "path-is-inside", "read", "read-installed", "read-package-json", "readable-stream", "realize-package-specifier", "request", "retry", "rimraf", "semver", "sha", "slide", "sorted-object", "tar", "text-table", "uid-number", "which", "wrappy", "write-file-atomic" ], "devDependencies": { "marked": "~0.3.2", "marked-man": "~0.1.4", "nock": "~0.52.4", 
"npm-registry-couchapp": "~2.6.2", "npm-registry-mock": "~0.6.3", "require-inject": "~1.1.0", "tap": "~0.4.12" }, "scripts": { "test-legacy": "node ./test/run.js", "test": "tap --timeout 120 test/tap/*.js", "tap": "tap --timeout 120 test/tap/*.js", "test-all": "node ./test/run.js && tap test/tap/*.js", "prepublish": "node bin/npm-cli.js prune --prefix=. --no-global && rm -rf test/*/*/node_modules && make -j8 doc", "dumpconf": "env | grep npm | sort | uniq" }, "license": "Artistic-2.0", "contributors": [ { "name": "Isaac Z. Schlueter", "email": "i@izs.me" }, { "name": "Steve Steiner", "email": "ssteinerX@gmail.com" }, { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com" }, { "name": "Aaron Blohowiak", "email": "aaron.blohowiak@gmail.com" }, { "name": "Martyn Smith", "email": "martyn@dollyfish.net.nz" }, { "name": "Mathias Pettersson", "email": "mape@mape.me" }, { "name": "Brian Hammond", "email": "brian@fictorial.com" }, { "name": "Charlie Robbins", "email": "charlie.robbins@gmail.com" }, { "name": "Francisco Treacy", "email": "francisco.treacy@gmail.com" }, { "name": "Cliffano Subagio", "email": "cliffano@gmail.com" }, { "name": "Christian Eager", "email": "christian.eager@nokia.com" }, { "name": "Dav Glass", "email": "davglass@gmail.com" }, { "name": "Alex K. Wolfe", "email": "alexkwolfe@gmail.com" }, { "name": "James Sanders", "email": "jimmyjazz14@gmail.com" }, { "name": "Reid Burke", "email": "me@reidburke.com" }, { "name": "Arlo Breault", "email": "arlolra@gmail.com" }, { "name": "Timo Derstappen", "email": "teemow@gmail.com" }, { "name": "Bradley Meck", "email": "bradley.meck@gmail.com" }, { "name": "Bart Teeuwisse", "email": "bart.teeuwisse@thecodemill.biz" }, { "name": "Ben Noordhuis", "email": "info@bnoordhuis.nl" }, { "name": "Tor Valamo", "email": "tor.valamo@gmail.com" }, { "name": "Whyme.Lyu", "email": "5longluna@gmail.com" }, { "name": "Olivier Melcher", "email": "olivier.melcher@gmail.com" }, { "name": "Tomaž Muraus", "email": "kami@k5-storitve.net" }, { "name": "Evan Meagher", "email": "evan.meagher@gmail.com" }, { "name": "Orlando Vazquez", "email": "ovazquez@gmail.com" }, { "name": "George Miroshnykov", "email": "gmiroshnykov@lohika.com" }, { "name": "Geoff Flarity", "email": "geoff.flarity@gmail.com" }, { "name": "Pete Kruckenberg", "email": "pete@kruckenberg.com" }, { "name": "Laurie Harper", "email": "laurie@holoweb.net" }, { "name": "Chris Wong", "email": "chris@chriswongstudio.com" }, { "name": "Max Goodman", "email": "c@chromacode.com" }, { "name": "Scott Bronson", "email": "brons_github@rinspin.com" }, { "name": "Federico Romero", "email": "federomero@gmail.com" }, { "name": "Visnu Pitiyanuvath", "email": "visnupx@gmail.com" }, { "name": "Irakli Gozalishvili", "email": "rfobic@gmail.com" }, { "name": "Mark Cahill", "email": "mark@tiemonster.info" }, { "name": "Zearin", "email": "zearin@gonk.net" }, { "name": "Iain Sproat", "email": "iainsproat@gmail.com" }, { "name": "Trent Mick", "email": "trentm@gmail.com" }, { "name": "Felix Geisendörfer", "email": "felix@debuggable.com" }, { "name": "Conny Brunnkvist", "email": "cbrunnkvist@gmail.com" }, { "name": "Will Elwood", "email": "w.elwood08@gmail.com" }, { "name": "Oleg Efimov", "email": "efimovov@gmail.com" }, { "name": "Martin Cooper", "email": "mfncooper@gmail.com" }, { "name": "Jameson Little", "email": "t.jameson.little@gmail.com" }, { "name": "cspotcode", "email": "cspotcode@gmail.com" }, { "name": "Maciej Małecki", "email": "maciej.malecki@notimplemented.org" }, { "name": "Stephen Sugden", "email": 
"glurgle@gmail.com" }, { "name": "Gautham Pai", "email": "buzypi@gmail.com" }, { "name": "David Trejo", "email": "david.daniel.trejo@gmail.com" }, { "name": "Paul Vorbach", "email": "paul@vorb.de" }, { "name": "George Ornbo", "email": "george@shapeshed.com" }, { "name": "Tim Oxley", "email": "secoif@gmail.com" }, { "name": "Tyler Green", "email": "tyler.green2@gmail.com" }, { "name": "atomizer", "email": "danila.gerasimov@gmail.com" }, { "name": "Rod Vagg", "email": "rod@vagg.org" }, { "name": "Christian Howe", "email": "coderarity@gmail.com" }, { "name": "Andrew Lunny", "email": "alunny@gmail.com" }, { "name": "Henrik Hodne", "email": "dvyjones@binaryhex.com" }, { "name": "Adam Blackburn", "email": "regality@gmail.com" }, { "name": "Kris Windham", "email": "kriswindham@gmail.com" }, { "name": "Jens Grunert", "email": "jens.grunert@gmail.com" }, { "name": "Joost-Wim Boekesteijn", "email": "joost-wim@boekesteijn.nl" }, { "name": "Dalmais Maxence", "email": "github@maxired.fr" }, { "name": "Marcus Ekwall", "email": "marcus.ekwall@gmail.com" }, { "name": "Aaron Stacy", "email": "aaron.r.stacy@gmail.com" }, { "name": "Phillip Howell", "email": "phowell@cothm.org" }, { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com" }, { "name": "James Halliday", "email": "mail@substack.net" }, { "name": "Jeremy Cantrell", "email": "jmcantrell@gmail.com" }, { "name": "Ribettes", "email": "patlogan29@gmail.com" }, { "name": "Einar Otto Stangvik", "email": "einaros@gmail.com" }, { "name": "Don Park", "email": "donpark@docuverse.com" }, { "name": "Kei Son", "email": "heyacct@gmail.com" }, { "name": "Nicolas Morel", "email": "marsup@gmail.com" }, { "name": "Mark Dube", "email": "markisdee@gmail.com" }, { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net" }, { "name": "Maxim Bogushevich", "email": "boga1@mail.ru" }, { "name": "Justin Beckwith", "email": "justbe@microsoft.com" }, { "name": "Meaglin", "email": "Meaglin.wasabi@gmail.com" }, { "name": "Ben Evans", "email": "ben@bensbit.co.uk" }, { "name": "Nathan Zadoks", "email": "nathan@nathan7.eu" }, { "name": "Brian White", "email": "mscdex@gmail.com" }, { "name": "Jed Schmidt", "email": "tr@nslator.jp" }, { "name": "Ian Livingstone", "email": "ianl@cs.dal.ca" }, { "name": "Patrick Pfeiffer", "email": "patrick@buzzle.at" }, { "name": "Paul Miller", "email": "paul@paulmillr.com" }, { "name": "seebees", "email": "seebees@gmail.com" }, { "name": "Carl Lange", "email": "carl@flax.ie" }, { "name": "Jan Lehnardt", "email": "jan@apache.org" }, { "name": "Alexey Kreschuk", "email": "akrsch@gmail.com" }, { "name": "Di Wu", "email": "dwu@palantir.com" }, { "name": "Florian Margaine", "email": "florian@margaine.com" }, { "name": "Forbes Lindesay", "email": "forbes@lindesay.co.uk" }, { "name": "Ian Babrou", "email": "ibobrik@gmail.com" }, { "name": "Jaakko Manninen", "email": "jaakko@rocketpack.fi" }, { "name": "Johan Nordberg", "email": "its@johan-nordberg.com" }, { "name": "Johan Sköld", "email": "johan@skold.cc" }, { "name": "Larz Conwell", "email": "larz@larz-laptop.(none)", "url": "none" }, { "name": "Luke Arduini", "email": "luke.arduini@gmail.com" }, { "name": "Marcel Klehr", "email": "mklehr@gmx.net" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be" }, { "name": "Matt Lunn", "email": "matt@mattlunn.me.uk" }, { "name": "Matt McClure", "email": "matt.mcclure@mapmyfitness.com" }, { "name": "Nirk Niggler", "email": "nirk.niggler@gmail.com" }, { "name": "Paolo Fragomeni", "email": "paolo@async.ly" }, { "name": "Jake Verbaten", "email": 
"raynos2@gmail.com", "url": "Raynos" }, { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, { "name": "Schabse Laks", "email": "Dev@SLaks.net" }, { "name": "Stuart Knightley", "email": "stuart@stuartk.com" }, { "name": "Stuart P. Bentley", "email": "stuart@testtrack4.com" }, { "name": "Vaz Allen", "email": "vaz@tryptid.com" }, { "name": "elisee", "email": "elisee@sparklin.org" }, { "name": "Evan You", "email": "yyx990803@gmail.com" }, { "name": "Wil Moore III", "email": "wil.moore@wilmoore.com" }, { "name": "Dylan Greene", "email": "dylang@gmail.com" }, { "name": "zeke", "email": "zeke@sikelianos.com" }, { "name": "Andrew Horton", "email": "andrew.j.horton@gmail.com" }, { "name": "Denis Gladkikh", "email": "outcoldman@gmail.com" }, { "name": "Daniel Santiago", "email": "daniel.santiago@highlevelwebs.com" }, { "name": "Alex Kocharin", "email": "alex@kocharin.ru" }, { "name": "Evan Lucas", "email": "evanlucas@me.com" }, { "name": "Steve Mason", "email": "stevem@brandwatch.com" }, { "name": "Quinn Slack", "email": "qslack@qslack.com" }, { "name": "Sébastien Santoro", "email": "dereckson@espace-win.org" }, { "name": "CamilleM", "email": "camille.moulin@alterway.fr" }, { "name": "Tom Huang", "email": "hzlhu.dargon@gmail.com" }, { "name": "Sergey Belov", "email": "peimei@ya.ru" }, { "name": "Younghoon Park", "email": "sola92@gmail.com" }, { "name": "Yazhong Liu", "email": "yorkiefixer@gmail.com" }, { "name": "Mikola Lysenko", "email": "mikolalysenko@gmail.com" }, { "name": "Rafael de Oleza", "email": "rafa@spotify.com" }, { "name": "Yeonghoon Park", "email": "sola92@gmail.com" }, { "name": "Franck Cuny", "email": "franck.cuny@gmail.com" }, { "name": "Alan Shaw", "email": "alan@freestyle-developments.co.uk" }, { "name": "Alex Rodionov", "email": "p0deje@gmail.com" }, { "name": "Alexej Yaroshevich", "email": "alex@qfox.ru" }, { "name": "Elan Shanker", "email": "elan.shanker@gmail.com" }, { "name": "François Frisch", "email": "francoisfrisch@gmail.com" }, { "name": "Gabriel Falkenberg", "email": "gabriel.falkenberg@gmail.com" }, { "name": "Jason Diamond", "email": "jason@diamond.name" }, { "name": "Jess Martin", "email": "jessmartin@gmail.com" }, { "name": "Jon Spencer", "email": "jon@jonspencer.ca" }, { "name": "Matt Colyer", "email": "matt@colyer.name" }, { "name": "Matt McClure", "email": "matt.mcclure@mapmyfitness.com" }, { "name": "Maximilian Antoni", "email": "maximilian.antoni@juliusbaer.com" }, { "name": "Nicholas Kinsey", "email": "pyro@feisty.io" }, { "name": "Paulo Cesar", "email": "pauloc062@gmail.com" }, { "name": "Quim Calpe", "email": "quim@kalpe.com" }, { "name": "Robert Gieseke", "email": "robert.gieseke@gmail.com" }, { "name": "Spain Train", "email": "michael.spainhower@opower.com" }, { "name": "TJ Holowaychuk", "email": "tj@vision-media.ca" }, { "name": "Thom Blake", "email": "tblake@brightroll.com" }, { "name": "Trevor Burnham", "email": "tburnham@hubspot.com" }, { "name": "bitspill", "email": "bitspill+github@bitspill.net" }, { "name": "Neil Gentleman", "email": "ngentleman@gmail.com" } ], "man": [ "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-README.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-adduser.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-bin.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-bugs.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-build.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-bundle.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-cache.1", 
"/Users/ogd/Documents/projects/npm/npm/man/man1/npm-completion.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-config.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-dedupe.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-deprecate.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-docs.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-edit.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-explore.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-help-search.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-help.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-init.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-install.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-link.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-ls.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-outdated.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-owner.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-pack.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-prefix.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-prune.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-publish.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-rebuild.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-repo.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-restart.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-rm.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-root.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-run-script.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-search.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-shrinkwrap.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-star.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-stars.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-start.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-stop.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-submodule.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-tag.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-test.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-uninstall.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-unpublish.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-update.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-version.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-view.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm-whoami.1", "/Users/ogd/Documents/projects/npm/npm/man/man1/npm.1", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-bin.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-bugs.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-cache.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-commands.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-config.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-deprecate.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-docs.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-edit.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-explore.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-help-search.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-init.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-install.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-link.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-load.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-ls.3", 
"/Users/ogd/Documents/projects/npm/npm/man/man3/npm-outdated.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-owner.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-pack.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-prefix.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-prune.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-publish.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-rebuild.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-repo.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-restart.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-root.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-run-script.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-search.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-shrinkwrap.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-start.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-stop.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-submodule.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-tag.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-test.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-uninstall.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-unpublish.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-update.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-version.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-view.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm-whoami.3", "/Users/ogd/Documents/projects/npm/npm/man/man3/npm.3", "/Users/ogd/Documents/projects/npm/npm/man/man5/npm-folders.5", "/Users/ogd/Documents/projects/npm/npm/man/man5/npm-global.5", "/Users/ogd/Documents/projects/npm/npm/man/man5/npm-json.5", "/Users/ogd/Documents/projects/npm/npm/man/man5/npmrc.5", "/Users/ogd/Documents/projects/npm/npm/man/man5/package.json.5", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-coding-style.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-config.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-developers.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-disputes.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-faq.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-index.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-registry.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-scope.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/npm-scripts.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/removing-npm.7", "/Users/ogd/Documents/projects/npm/npm/man/man7/semver.7" ], "gitHead": "cad3d1ed571981b13c8165ba4516b836bf79293c", "_id": "npm@2.1.18", "_shasum": "e2af4c5f848fb023851cd2ec129005d33090bd57", "_from": "npm@2.1.18", "_npmVersion": "2.1.18", "_nodeVersion": "0.10.35", "_npmUser": { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "dist": { "shasum": "e2af4c5f848fb023851cd2ec129005d33090bd57", "tarball": "http://registry.npmjs.org/npm/-/npm-2.1.18.tgz" }, "_resolved": "https://registry.npmjs.org/npm/-/npm-2.1.18.tgz", "readme": "ERROR: No README data found!" 
} iojs-v1.0.2-darwin-x64/lib/node_modules/npm/README.md000644 000766 000024 00000016061 12455173731 022165 0ustar00iojsstaff000000 000000 npm(1) -- a JavaScript package manager ============================== [![Build Status](https://img.shields.io/travis/npm/npm/master.svg)](https://travis-ci.org/npm/npm) ## SYNOPSIS This is just enough info to get you up and running. Much more info available via `npm help` once it's installed. ## IMPORTANT **You need node v0.8 or higher to run this program.** To install an old **and unsupported** version of npm that works on node 0.3 and prior, clone the git repo and dig through the old tags and branches. ## Super Easy Install npm comes with [node](http://nodejs.org/download/) now. ### Windows Computers [Get the MSI](http://nodejs.org/download/). npm is in it. ### Apple Macintosh Computers [Get the pkg](http://nodejs.org/download/). npm is in it. ### Other Sorts of Unices Run `make install`. npm will be installed with node. If you want a more fancy pants install (a different version, customized paths, etc.) then read on. ## Fancy Install (Unix) There's a pretty robust install script at . You can download that and run it. Here's an example using curl: curl -L https://npmjs.com/install.sh | sh ### Slightly Fancier You can set any npm configuration params with that script: npm_config_prefix=/some/path sh install.sh Or, you can run it in uber-debuggery mode: npm_debug=1 sh install.sh ### Even Fancier Get the code with git. Use `make` to build the docs and do other stuff. If you plan on hacking on npm, `make link` is your friend. If you've got the npm source code, you can also semi-permanently set arbitrary config keys using the `./configure --key=val ...`, and then run npm commands by doing `node cli.js `. (This is helpful for testing, or running stuff without actually installing npm itself.) ## Windows Install or Upgrade You can download a zip file from , and unpack it in the same folder where node.exe lives. The latest version in a zip file is 1.4.12. To upgrade to npm 2, follow the Windows upgrade instructions in the npm Troubleshooting Guide: If that's not fancy enough for you, then you can fetch the code with git, and mess with it directly. ## Installing on Cygwin No. ## Uninstalling So sad to see you go. sudo npm uninstall npm -g Or, if that fails, sudo make uninstall ## More Severe Uninstalling Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed. If you would like to remove all the packages that you have installed, then you can use the `npm ls` command to find them, and then `npm rm` to remove them. To remove cruft left behind by npm 0.x, you can use the included `clean-old.sh` script file. You can run it conveniently like this: npm explore npm -g -- sh scripts/clean-old.sh npm uses two configuration files, one for per-user configs, and another for global (every-user) configs. You can view them by doing: npm config get userconfig # defaults to ~/.npmrc npm config get globalconfig # defaults to /usr/local/etc/npmrc Uninstalling npm does not remove configuration files by default. You must remove them yourself manually if you want them gone. Note that this means that future npm installs will not remember the settings that you have chosen. ## Using npm Programmatically If you would like to use npm programmatically, you can do that. It's not very well documented, but it *is* rather simple. 
Most of the time, unless you actually want to do all the things that npm does, you should try using one of npm's dependencies rather than using npm itself, if possible. Eventually, npm will be just a thin cli wrapper around the modules that it depends on, but for now, there are some things that you must use npm itself to do. var npm = require("npm") npm.load(myConfigObject, function (er) { if (er) return handlError(er) npm.commands.install(["some", "args"], function (er, data) { if (er) return commandFailed(er) // command succeeded, and data might have some info }) npm.registry.log.on("log", function (message) { .... }) }) The `load` function takes an object hash of the command-line configs. The various `npm.commands.` functions take an **array** of positional argument **strings**. The last argument to any `npm.commands.` function is a callback. Some commands take other optional arguments. Read the source. You cannot set configs individually for any single npm function at this time. Since `npm` is a singleton, any call to `npm.config.set` will change the value for *all* npm commands in that process. See `./bin/npm-cli.js` for an example of pulling config values off of the command line arguments using nopt. You may also want to check out `npm help config` to learn about all the options you can set there. ## More Docs Check out the [docs](https://docs.npmjs.com/), especially the [faq](https://docs.npmjs.com/misc/faq). You can use the `npm help` command to read any of them. If you're a developer, and you want to use npm to publish your program, you should [read this](https://docs.npmjs.com/misc/developers) ## Legal Stuff "npm" and "The npm Registry" are owned by npm, Inc. All rights reserved. See the included LICENSE file for more details. "Node.js" and "node" are trademarks owned by Joyent, Inc. Modules published on the npm registry are not officially endorsed by npm, Inc. or the Node.js project. Data published to the npm registry is not part of npm itself, and is the sole property of the publisher. While every effort is made to ensure accountability, there is absolutely no guarantee, warrantee, or assertion expressed or implied as to the quality, fitness for a specific purpose, or lack of malice in any given npm package. If you have a complaint about a package in the public npm registry, and cannot [resolve it with the package owner](https://docs.npmjs.com/misc/disputes), please email and explain the situation. Any data published to The npm Registry (including user account information) may be removed or modified at the sole discretion of the npm server administrators. ### In plainer english npm is the property of npm, Inc. If you publish something, it's yours, and you are solely accountable for it. If other people publish something, it's theirs. Users can publish Bad Stuff. It will be removed promptly if reported. But there is no vetting process for published modules, and you use them at your own risk. Please inspect the source. If you publish Bad Stuff, we may delete it from the registry, or even ban your account in extreme cases. So don't do that. ## BUGS When you find issues, please report them: * web: Be sure to include *all* of the output from the npm command that didn't work as expected. The `npm-debug.log` file is also helpful to provide. You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email. 
## SEE ALSO * npm(1) * npm-faq(7) * npm-help(1) * npm-index(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/000755 000766 000024 00000000000 12456115120 022356 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/wercker.yml000644 000766 000024 00000001206 12455173731 023066 0ustar00iojsstaff000000 000000 box: wercker/nodejs # Build definition build: # The steps that will be executed on build steps: # A step that executes `npm install` command - npm-install # A step that executes `npm test` command - npm-test # A custom script step, name value is used in the UI # and the code value contains the command that get executed - script: name: echo nodejs information code: | echo "node version $(node -v) running" echo "npm version $(npm -v) running" after-steps: - sherzberg/slack-notify: subdomain: npm-inc token: $SLACK_TOKEN channel: github-commits iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/clean-old.sh000755 000766 000024 00000010242 12455173731 024565 0ustar00iojsstaff000000 000000 #!/bin/bash # look for old 0.x cruft, and get rid of it. # Should already be sitting in the npm folder. # This doesn't have to be quite as cross-platform as install.sh. # There are some bash-isms, because maintaining *two* # fully-portable posix/bourne sh scripts is too much for # one project with a sane maintainer. # If readlink isn't available, then this is just too tricky. # However, greadlink is fine, so Solaris can join the party, too. readlink="readlink" which $readlink >/dev/null 2>/dev/null if [ $? -ne 0 ]; then readlink="greadlink" which $readlink >/dev/null 2>/dev/null if [ $? -ne 0 ]; then echo "Can't find the readlink or greadlink command. Aborting." exit 1 fi fi if [ "x$npm_config_prefix" != "x" ]; then PREFIXES=$npm_config_prefix else node="$NODE" if [ "x$node" = "x" ]; then node=`which node` fi if [ "x$node" = "x" ]; then echo "Can't find node to determine prefix. Aborting." exit 1 fi PREFIX=`dirname $node` PREFIX=`dirname $PREFIX` echo "cleanup prefix=$PREFIX" PREFIXES=$PREFIX altprefix=`"$node" -e process.installPrefix` if [ "x$altprefix" != "x" ] && [ "x$altprefix" != "x$PREFIX" ]; then echo "altprefix=$altprefix" PREFIXES="$PREFIX $altprefix" fi fi # now prefix is where npm would be rooted by default # go hunting. packages= for prefix in $PREFIXES; do packages="$packages "`ls "$prefix"/lib/node/.npm 2>/dev/null | grep -v .cache` done packages=`echo $packages` filelist=() fid=0 for prefix in $PREFIXES; do # remove any links into the .npm dir, or links to # version-named shims/symlinks. for folder in share/man bin lib/node; do find $prefix/$folder -type l | while read file; do target=`$readlink $file | grep '/\.npm/'` if [ "x$target" != "x" ]; then # found one! filelist[$fid]="$file" let 'fid++' # also remove any symlinks to this file. base=`basename "$file"` base=`echo "$base" | awk -F@ '{print $1}'` if [ "x$base" != "x" ]; then find "`dirname $file`" -type l -name "$base"'*' \ | while read l; do target=`$readlink "$l" | grep "$base"` if [ "x$target" != "x" ]; then filelist[$fid]="$1" let 'fid++' fi done fi fi done # Scour for shim files. These are relics of 0.2 npm installs. # note: grep -r is not portable. find $prefix/$folder -type f \ | xargs grep -sl '// generated by npm' \ | while read file; do filelist[$fid]="$file" let 'fid++' done done # now remove the package modules, and the .npm folder itself. 
if [ "x$packages" != "x" ]; then for pkg in $packages; do filelist[$fid]="$prefix/lib/node/$pkg" let 'fid++' for i in $prefix/lib/node/$pkg\@*; do filelist[$fid]="$i" let 'fid++' done done fi for folder in lib/node/.npm lib/npm share/npm; do if [ -d $prefix/$folder ]; then filelist[$fid]="$prefix/$folder" let 'fid++' fi done done # now actually clean, but only if there's anything TO clean if [ "${#filelist[@]}" -gt 0 ]; then echo "" echo "This script will find and eliminate any shims, symbolic" echo "links, and other cruft that was installed by npm 0.x." echo "" if [ "x$packages" != "x" ]; then echo "The following packages appear to have been installed with" echo "an old version of npm, and will be removed forcibly:" for pkg in $packages; do echo " $pkg" done echo "Make a note of these. You may want to install them" echo "with npm 1.0 when this process is completed." echo "" fi OK= if [ "x$1" = "x-y" ]; then OK="yes" fi while [ "$OK" != "y" ] && [ "$OK" != "yes" ] && [ "$OK" != "no" ]; do echo "Is this OK?" echo " enter 'yes' or 'no'" echo " or 'show' to see a list of files " read OK if [ "x$OK" = "xshow" ] || [ "x$OK" = "xs" ]; then for i in "${filelist[@]}"; do echo "$i" done fi done if [ "$OK" = "no" ]; then echo "Aborting" exit 1 fi for i in "${filelist[@]}"; do rm -rf "$i" done fi echo "" echo 'All clean!' exit 0 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/doc-build.sh000755 000766 000024 00000006460 12455173731 024600 0ustar00iojsstaff000000 000000 #!/usr/bin/env bash if [[ $DEBUG != "" ]]; then set -x fi set -o errexit set -o pipefail if ! [ -x node_modules/.bin/marked-man ]; then ps=0 if [ -f .building_marked-man ]; then pid=$(cat .building_marked-man) ps=$(ps -p $pid | grep $pid | wc -l) || true fi if [ -f .building_marked-man ] && [ $ps != 0 ]; then while [ -f .building_marked-man ]; do sleep 1 done else # a race to see which make process will be the one to install marked-man echo $$ > .building_marked-man sleep 1 if [ $(cat .building_marked-man) == $$ ]; then make node_modules/.bin/marked-man rm .building_marked-man else while [ -f .building_marked-man ]; do sleep 1 done fi fi fi if ! [ -x node_modules/.bin/marked ]; then ps=0 if [ -f .building_marked ]; then pid=$(cat .building_marked) ps=$(ps -p $pid | grep $pid | wc -l) || true fi if [ -f .building_marked ] && [ $ps != 0 ]; then while [ -f .building_marked ]; do sleep 1 done else # a race to see which make process will be the one to install marked echo $$ > .building_marked sleep 1 if [ $(cat .building_marked) == $$ ]; then make node_modules/.bin/marked rm .building_marked else while [ -f .building_marked ]; do sleep 1 done fi fi fi src=$1 dest=$2 name=$(basename ${src%.*}) date=$(date -u +'%Y-%M-%d %H:%m:%S') version=$(node cli.js -v) mkdir -p $(dirname $dest) html_replace_tokens () { local url=$1 sed "s|@NAME@|$name|g" \ | sed "s|@DATE@|$date|g" \ | sed "s|@URL@|$url|g" \ | sed "s|@VERSION@|$version|g" \ | perl -p -e 's/]*)>([^\(]*\([0-9]\)) -- (.*?)<\/h1>/

\2<\/h1>

\3<\/p>/g' \ | perl -p -e 's/npm-npm/npm/g' \ | perl -p -e 's/([^"-])(npm-)?README(?!\.html)(\(1\))?/\1README<\/a>/g' \ | perl -p -e 's/<a href="[^"]+README.html">README<\/a><\/title>/<title>README<\/title>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(1\))/\1<a href="..\/cli\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(3\))/\1<a href="..\/api\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(5\))/\1<a href="..\/files\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(7\))/\1<a href="..\/misc\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/\([1357]\)<\/a><\/h1>/<\/a><\/h1>/g' \ | (if [ $(basename $(dirname $dest)) == "doc" ]; then perl -p -e 's/ href="\.\.\// href="/g' else cat fi) } man_replace_tokens () { sed "s|@VERSION@|$version|g" \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(1\)/npm help \2/g' \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(([57])\)/npm help \3 \2/g' \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(3\)/npm apihelp \2/g' \ | perl -p -e 's/npm\(1\)/npm help npm/g' \ | perl -p -e 's/npm\(3\)/npm apihelp npm/g' } case $dest in *.[1357]) ./node_modules/.bin/marked-man --roff $src \ | man_replace_tokens > $dest exit $? ;; html/partial/*.html) url=${dest/html\/partial\//} cat $src | ./node_modules/.bin/marked | html_replace_tokens $url > $dest ;; html/*.html) url=${dest/html\//} (cat html/dochead.html && \ cat $src && \ cat html/docfoot.html)\ | html_replace_tokens $url \ > $dest exit $? ;; *) echo "Invalid destination type: $dest" >&2 exit 1 ;; esac ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/index-build.js����������������������������������000755 �000766 �000024 �00000003272 12455173731 025142� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var fs = require("fs") , path = require("path") , root = path.resolve(__dirname, "..") , glob = require("glob") , conversion = { "cli": 1, "api": 3, "files": 5, "misc": 7 } glob(root + "/{README.md,doc/*/*.md}", function (er, files) { if (er) throw er output(files.map(function (f) { var b = path.basename(f) if (b === "README.md") return [0, b] if (b === "index.md") return null var s = conversion[path.basename(path.dirname(f))] return [s, f] }).filter(function (f) { return f }).sort(function (a, b) { return (a[0] === b[0]) ? ( path.basename(a[1]) === "npm.md" ? -1 : path.basename(b[1]) === "npm.md" ? 1 : a[1] > b[1] ? 
1 : -1 ) : a[0] - b[0] })) }) return function output (files) { console.log( "npm-index(7) -- Index of all npm documentation\n" + "==============================================\n") writeLines(files, 0) writeLines(files, 1, "Command Line Documentation", "Using npm on the command line") writeLines(files, 3, "API Documentation", "Using npm in your Node programs") writeLines(files, 5, "Files", "File system structures npm uses") writeLines(files, 7, "Misc", "Various other bits and bobs") } function writeLines (files, sxn, heading, desc) { if (heading) { console.log("## %s\n\n%s\n", heading, desc) } files.filter(function (f) { return f[0] === sxn }).forEach(writeLine) } function writeLine (sd) { var sxn = sd[0] || 1 , doc = sd[1] , d = path.basename(doc, ".md") var content = fs.readFileSync(doc, "utf8").split("\n")[0].split("-- ")[1] console.log("### %s(%d)\n", d, sxn) console.log(content + "\n") } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/install.sh��������������������������������������000755 �000766 �000024 �00000014137 12455173731 024404� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh # A word about this shell script: # # It must work everywhere, including on systems that lack # a /bin/bash, map 'sh' to ksh, ksh97, bash, ash, or zsh, # and potentially have either a posix shell or bourne # shell living at /bin/sh. # # See this helpful document on writing portable shell scripts: # http://www.gnu.org/s/hello/manual/autoconf/Portable-Shell.html # # The only shell it won't ever work on is cmd.exe. if [ "x$0" = "xsh" ]; then # run as curl | sh # on some systems, you can just do cat>npm-install.sh # which is a bit cuter. But on others, &1 is already closed, # so catting to another script file won't do anything. # Follow Location: headers, and fail on errors curl -f -L -s https://www.npmjs.org/install.sh > npm-install-$$.sh ret=$? if [ $ret -eq 0 ]; then (exit 0) else rm npm-install-$$.sh echo "Failed to download script" >&2 exit $ret fi sh npm-install-$$.sh ret=$? rm npm-install-$$.sh exit $ret fi # See what "npm_config_*" things there are in the env, # and make them permanent. # If this fails, it's not such a big deal. configures="`env | grep 'npm_config_' | sed -e 's|^npm_config_||g'`" npm_config_loglevel="error" if [ "x$npm_debug" = "x" ]; then (exit 0) else echo "Running in debug mode." echo "Note that this requires bash or zsh." set -o xtrace set -o pipefail npm_config_loglevel="verbose" fi export npm_config_loglevel # make sure that node exists node=`which node 2>&1` ret=$? if [ $ret -eq 0 ] && [ -x "$node" ]; then (exit 0) else echo "npm cannot be installed without node.js." >&2 echo "Install node first, and then try again." >&2 echo "" >&2 echo "Maybe node is installed, but not in the PATH?" >&2 echo "Note that running as sudo can change envs." 
>&2 echo "" echo "PATH=$PATH" >&2 exit $ret fi # set the temp dir TMP="${TMPDIR}" if [ "x$TMP" = "x" ]; then TMP="/tmp" fi TMP="${TMP}/npm.$$" rm -rf "$TMP" || true mkdir "$TMP" if [ $? -ne 0 ]; then echo "failed to mkdir $TMP" >&2 exit 1 fi BACK="$PWD" ret=0 tar="${TAR}" if [ -z "$tar" ]; then tar="${npm_config_tar}" fi if [ -z "$tar" ]; then tar=`which tar 2>&1` ret=$? fi if [ $ret -eq 0 ] && [ -x "$tar" ]; then echo "tar=$tar" echo "version:" $tar --version ret=$? fi if [ $ret -eq 0 ]; then (exit 0) else echo "No suitable tar program found." exit 1 fi # Try to find a suitable make # If the MAKE environment var is set, use that. # otherwise, try to find gmake, and then make. # If no make is found, then just execute the necessary commands. # XXX For some reason, make is building all the docs every time. This # is an annoying source of bugs. Figure out why this happens. MAKE=NOMAKE if [ "x$MAKE" = "x" ]; then make=`which gmake 2>&1` if [ $? -eq 0 ] && [ -x "$make" ]; then (exit 0) else make=`which make 2>&1` if [ $? -eq 0 ] && [ -x "$make" ]; then (exit 0) else make=NOMAKE fi fi else make="$MAKE" fi if [ -x "$make" ]; then (exit 0) else # echo "Installing without make. This may fail." >&2 make=NOMAKE fi # If there's no bash, then don't even try to clean if [ -x "/bin/bash" ]; then (exit 0) else clean="no" fi node_version=`"$node" --version 2>&1` ret=$? if [ $ret -ne 0 ]; then echo "You need node to run this program." >&2 echo "node --version reports: $node_version" >&2 echo "with exit code = $ret" >&2 echo "Please install node before continuing." >&2 exit $ret fi t="${npm_install}" if [ -z "$t" ]; then # switch based on node version. # note that we can only use strict sh-compatible patterns here. case $node_version in 0.[01234567].* | v0.[01234567].*) echo "You are using an outdated and unsupported version of" >&2 echo "node ($node_version). Please update node and try again." >&2 exit 99 ;; *) echo "install npm@latest" t="latest" ;; esac fi # need to echo "" after, because Posix sed doesn't treat EOF # as an implied end of line. url=`(curl -SsL https://registry.npmjs.org/npm/$t; echo "") \ | sed -e 's/^.*tarball":"//' \ | sed -e 's/".*$//'` ret=$? if [ "x$url" = "x" ]; then ret=125 # try without the -e arg to sed. url=`(curl -SsL https://registry.npmjs.org/npm/$t; echo "") \ | sed 's/^.*tarball":"//' \ | sed 's/".*$//'` ret=$? if [ "x$url" = "x" ]; then ret=125 fi fi if [ $ret -ne 0 ]; then echo "Failed to get tarball url for npm/$t" >&2 exit $ret fi echo "fetching: $url" >&2 cd "$TMP" \ && curl -SsL "$url" \ | $tar -xzf - \ && cd "$TMP"/* \ && (ver=`"$node" bin/read-package-json.js package.json version` isnpm10=0 if [ $ret -eq 0 ]; then if [ -d node_modules ]; then if "$node" node_modules/semver/bin/semver -v "$ver" -r "1" then isnpm10=1 fi else if "$node" bin/semver -v "$ver" -r ">=1.0"; then isnpm10=1 fi fi fi ret=0 if [ $isnpm10 -eq 1 ] && [ -f "scripts/clean-old.sh" ]; then if [ "x$skipclean" = "x" ]; then (exit 0) else clean=no fi if [ "x$clean" = "xno" ] \ || [ "x$clean" = "xn" ]; then echo "Skipping 0.x cruft clean" >&2 ret=0 elif [ "x$clean" = "xy" ] || [ "x$clean" = "xyes" ]; then NODE="$node" /bin/bash "scripts/clean-old.sh" "-y" ret=$? else NODE="$node" /bin/bash "scripts/clean-old.sh" </dev/tty ret=$? fi fi if [ $ret -ne 0 ]; then echo "Aborted 0.x cleanup. Exiting." 
>&2 exit $ret fi) \ && (if [ "x$configures" = "x" ]; then (exit 0) else echo "./configure $configures" echo "$configures" > npmrc fi) \ && (if [ "$make" = "NOMAKE" ]; then (exit 0) elif "$make" uninstall install; then (exit 0) else make="NOMAKE" fi if [ "$make" = "NOMAKE" ]; then "$node" cli.js rm npm -gf "$node" cli.js install -gf fi) \ && cd "$BACK" \ && rm -rf "$TMP" \ && echo "It worked" ret=$? if [ $ret -ne 0 ]; then echo "It failed" >&2 fi exit $ret ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/publish-tag.js����������������������������������000644 �000766 �000024 �00000000227 12455173731 025147� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var semver = require("semver") var version = semver.parse(require("../package.json").version) console.log('v%s.%s-next', version.major, version.minor) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/release.sh��������������������������������������000644 �000766 �000024 �00000001156 12455173731 024350� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash # script for creating a zip and tarball for inclusion in node unset CDPATH set -e rm -rf release *.tgz || true mkdir release npm pack --loglevel error >/dev/null mv *.tgz release cd release tar xzf *.tgz mkdir node_modules mv package node_modules/npm # make the zip for windows users cp node_modules/npm/bin/*.cmd . zipname=npm-$(npm -v).zip zip -q -9 -r -X "$zipname" *.cmd node_modules # make the tar for node's deps cd node_modules tarname=npm-$(npm -v).tgz tar czf "$tarname" npm cd .. mv "node_modules/$tarname" . 
rm -rf *.cmd rm -rf node_modules echo "release/$tarname" echo "release/$zipname" ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/scripts/relocate.sh�������������������������������������000755 �000766 �000024 �00000001231 12455173731 024523� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash # Change the cli shebang to point at the specified node # Useful for when the program is moved around after install. # Also used by the default 'make install' in node to point # npm at the newly installed node, rather than the first one # in the PATH, which would be the default otherwise. # bash /path/to/npm/scripts/relocate.sh $nodepath # If $nodepath is blank, then it'll use /usr/bin/env dir="$(dirname "$(dirname "$0")")" cli="$dir"/bin/npm-cli.js tmp="$cli".tmp node="$1" if [ "x$node" = "x" ]; then node="/usr/bin/env node" fi node="#!$node" sed -e 1d "$cli" > "$tmp" echo "$node" > "$cli" cat "$tmp" >> "$cli" rm "$tmp" chmod ogu+x $cli �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024200� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024613� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024304� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansicolors/��������������������������������000755 �000766 �000024 �00000000000 12456115117 025526� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansistyles/��������������������������������000755 �000766 �000024 �00000000000 12456115117 025550� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/�������������������������������������000755 �000766 �000024 �00000000000 12456115117 024460� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/��������������������������������000755 �000766 �000024 �00000000000 12456115117 025430� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/������������������������������000755 �000766 �000024 �00000000000 12456115117 025735� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/char-spinner/������������������������������000755 �000766 �000024 �00000000000 12456115117 025743� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/child-process-close/�����������������������000755 �000766 �000024 �00000000000 12456115117 027214� 
5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chmodr/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024626� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chownr/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024652� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025053� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/���������������������������������000755 �000766 �000024 �00000000000 12456115117 025357� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/������������������������������000755 �000766 �000024 �00000000000 12456115117 025677� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/�����������������������������������000755 �000766 �000024 �00000000000 12456115117 024777� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/������������������������������������000755 �000766 �000024 
�00000000000 12456115117 024640� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/���������������������������������000755 �000766 �000024 �00000000000 12456115117 025260� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-write-stream-atomic/��������������������000755 �000766 �000024 �00000000000 12456115117 027655� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/�����������������������������������000755 �000766 �000024 �00000000000 12456115117 025013� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/�������������������������������000755 �000766 �000024 �00000000000 12456115117 025603� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/�����������������������000755 �000766 �000024 �00000000000 12456115117 027156� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/�������������000755 �000766 �000024 �00000000000 12456115117 031155� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024275� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/�������������������������������000755 �000766 �000024 �00000000000 12456115117 025550� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025156� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025177� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/���������������������������������������000755 �000766 �000024 �00000000000 12456115117 024131� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/�������������������������000755 �000766 �000024 �00000000000 12456115117 026655� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lockfile/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025142� 
5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/���������������������������������000755 �000766 �000024 �00000000000 12456115117 025215� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/���������������������������������000755 �000766 �000024 �00000000000 12456115117 025323� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024640� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025074� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024332� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/�������������������������000755 �000766 �000024 �00000000000 12456115117 026733� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/��������������������000755 �000766 �000024 
�00000000000 12456115117 027652� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/������������������������000755 �000766 �000024 �00000000000 12456115117 026763� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-install-checks/������������������������000755 �000766 �000024 �00000000000 12456115117 027046� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-package-arg/���������������������������000755 �000766 �000024 �00000000000 12456115117 026304� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/�����������������������000755 �000766 �000024 �00000000000 12456115117 027266� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/�������������������������000755 �000766 �000024 �00000000000 12456115117 026707� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024646� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/once/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024276� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/opener/������������������������������������000755 �000766 �000024 �00000000000 12456115117 024642� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/osenv/�������������������������������������000755 �000766 �000024 �00000000000 12456115117 024504� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/����������������������������000755 �000766 �000024 �00000000000 12456115117 026170� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/��������������������������������������000755 �000766 �000024 �00000000000 12456115117 024265� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/����������������������������000755 �000766 �000024 �00000000000 12456115117 026242� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/�������������������������000755 �000766 �000024 �00000000000 12456115117 026625� 
5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/���������������������������000755 �000766 �000024 �00000000000 12456115117 026402� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/realize-package-specifier/�����������������000755 �000766 �000024 �00000000000 12456115117 030345� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/�����������������������������������000755 �000766 �000024 �00000000000 12456115120 025034� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/�������������������������������������000755 �000766 �000024 �00000000000 12456115120 024511� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/������������������������������������000755 �000766 �000024 �00000000000 12456115120 024624� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/������������������������������������000755 �000766 �000024 �00000000000 12456115120 024645� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/���������������������������������������000755 �000766 �000024 
�00000000000 12456115120 024117� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/�������������������������������������000755 �000766 �000024 �00000000000 12456115120 024444� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/�����������������������������000755 �000766 �000024 �00000000000 12456115120 026110� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/���������������������������������������000755 �000766 �000024 �00000000000 12456115120 024132� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/��������������������������������000755 �000766 �000024 �00000000000 12456115120 025415� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/��������������������������������000755 �000766 �000024 �00000000000 12456115120 025413� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/�������������������������������������000755 �000766 �000024 �00000000000 12456115120 024446� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/wrappy/������������������������������������000755 �000766 �000024 �00000000000 12456115120 024666� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/write-file-atomic/�������������������������000755 �000766 �000024 �00000000000 12456115120 026665� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/write-file-atomic/.npmignore���������������000644 �000766 �000024 �00000000033 12455173731 030673� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������*~ DEADJOE .#* node_modules�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/write-file-atomic/index.js�����������������000644 �000766 �000024 �00000002621 12455173731 030346� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var fs = require('graceful-fs'); var chain = require('slide').chain; var crypto = require('crypto'); var md5hex = function () { var hash = crypto.createHash('md5'); for (var ii=0; ii<arguments.length; ++ii) hash.update(''+arguments[ii]) return hash.digest('hex') } var invocations = 0; var getTmpname = function (filename) { return filename + "." + md5hex(__filename, process.pid, ++invocations) } module.exports = function writeFile(filename, data, options, callback) { if (options instanceof Function) { callback = options; options = null; } if (!options) options = {}; var tmpfile = getTmpname(filename); chain([ [fs, fs.writeFile, tmpfile, data, options], options.chown && [fs, fs.chown, tmpfile, options.chown.uid, options.chown.gid], [fs, fs.rename, tmpfile, filename] ], function (err) { err ? 
fs.unlink(tmpfile, function () { callback(err) }) : callback() }) } module.exports.sync = function writeFileSync(filename, data, options) { if (!options) options = {}; var tmpfile = getTmpname(filename); try { fs.writeFileSync(tmpfile, data, options); if (options.chown) fs.chownSync(tmpfile, options.chown.uid, options.chown.gid); fs.renameSync(tmpfile, filename); } catch (err) { try { fs.unlinkSync(tmpfile) } catch(e) {} throw err; } } ���������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/write-file-atomic/package.json�������������000644 �000766 �000024 �00000002655 12455173731 031176� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "write-file-atomic", "version": "1.1.0", "description": "Write files in an atomic fashion w/configurable ownership", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git@github.com:iarna/write-file-atomic.git" }, "keywords": [ "writeFile", "atomic" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org", "url": "http://re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/write-file-atomic/issues" }, "homepage": "https://github.com/iarna/write-file-atomic", "dependencies": { "graceful-fs": "^3.0.2", "slide": "^1.1.5" }, "devDependencies": { "require-inject": "^1.1.0", "tap": "^0.4.12" }, "gitHead": "28e4df86547c6728eab0b51bca6f00cf44ef392c", "_id": "write-file-atomic@1.1.0", "_shasum": "e114cfb8f82188353f98217c5945451c9b4dc060", "_from": "write-file-atomic@>=1.1.0 <2.0.0", "_npmVersion": "1.4.28", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "dist": { "shasum": "e114cfb8f82188353f98217c5945451c9b4dc060", "tarball": "http://registry.npmjs.org/write-file-atomic/-/write-file-atomic-1.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/write-file-atomic/-/write-file-atomic-1.1.0.tgz" } �����������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/write-file-atomic/README.md����������������000644 �000766 �000024 �00000003121 12455173731 030154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������write-file-atomic ----------------- This is an extension for node's `fs.writeFile` that makes its operation atomic and allows you set ownership (uid/gid of the file). 
### var writeFileAtomic = require('write-file-atomic')<br>writeFileAtomic(filename, data, [options], callback)

* filename **String**
* data **String** | **Buffer**
* options **Object**
  * chown **Object**
    * uid **Number**
    * gid **Number**
  * encoding **String** | **Null** default = 'utf8'
  * mode **Number** default = 438 (aka 0666 in Octal)
* callback **Function**

Atomically and asynchronously writes data to a file, replacing the file if it already exists. data can be a string or a buffer.

The file is initially named `filename + "." + md5hex(__filename, process.pid, ++invocations)`. If the write completes successfully and the **chown** option was passed, it will then change the ownership of the file. Finally it renames the file back to the filename you specified. If it encounters errors at any of these steps it will attempt to unlink the temporary file and then pass the error back to the caller.

If provided, the **chown** option requires both **uid** and **gid** properties or else you'll get an error. The **encoding** option is ignored if **data** is a buffer. It defaults to 'utf8'.

Example:

```javascript
writeFileAtomic('message.txt', 'Hello Node', {chown:{uid:100,gid:50}}, function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
});
```

### var writeFileAtomicSync = require('write-file-atomic').sync<br>writeFileAtomicSync(filename, data, [options])

The synchronous version of **writeFileAtomic**.
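As a companion to the asynchronous example above, a minimal sketch of the synchronous form; the file name and uid/gid values are illustrative only, and per index.js the sync variant throws on failure instead of taking a callback:

```javascript
var writeFileAtomic = require('write-file-atomic')

// Synchronous variant: throws if the write, chown, or rename step fails.
try {
  writeFileAtomic.sync('message.txt', 'Hello Node', {chown: {uid: 100, gid: 50}})
  console.log('It\'s saved!')
} catch (err) {
  console.error('write failed:', err.message)
}
```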
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/wrappy/LICENSE
The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/wrappy/package.json
{ "name": "wrappy", "version": "1.0.1", "description": "Callback wrapping utility", "main": "wrappy.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^0.4.12" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/wrappy" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/wrappy/issues" }, "homepage": "https://github.com/npm/wrappy", "gitHead": "006a8cbac6b99988315834c207896eed71fd069a", "_id": "wrappy@1.0.1", "_shasum": "1e65969965ccbc2db4548c6b84a6f2c5aedd4739", "_from": "wrappy@1.0.1", "_npmVersion": "2.0.0", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "1e65969965ccbc2db4548c6b84a6f2c5aedd4739", "tarball": "http://registry.npmjs.org/wrappy/-/wrappy-1.0.1.tgz" }, "_resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.1.tgz", "readme": "ERROR: No README data found!" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/wrappy/README.md
# wrappy

Callback wrapping utility

## USAGE

```javascript
var wrappy = require("wrappy")

// var wrapper = wrappy(wrapperFunction)

// make sure a cb is called only once
// See also: http://npm.im/once for this specific use case
var once = wrappy(function (cb) {
  var called = false
  return function () {
    if (called) return
    called = true
    return cb.apply(this, arguments)
  }
})

function printBoo () {
  console.log('boo')
}
// has some rando property
printBoo.iAmBooPrinter = true

var onlyPrintOnce = once(printBoo)

onlyPrintOnce() // prints 'boo'
onlyPrintOnce() // does nothing

// random property is retained!
assert.equal(onlyPrintOnce.iAmBooPrinter, true)
```

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/wrappy/wrappy.js
// Returns a wrapper function that returns a wrapped callback
// The wrapper function should do some stuff, and return a
// presumably different callback function.
// This makes sure that own properties are retained, so that
// decorations and such are not lost along the way.
module.exports = wrappy
function wrappy (fn, cb) {
  if (fn && cb) return wrappy(fn)(cb)

  if (typeof fn !== 'function')
    throw new TypeError('need wrapper function')

  Object.keys(fn).forEach(function (k) {
    wrapper[k] = fn[k]
  })

  return wrapper

  function wrapper() {
    var args = new Array(arguments.length)
    for (var i = 0; i < args.length; i++) {
      args[i] = arguments[i]
    }
    var ret = fn.apply(this, args)
    var cb = args[args.length-1]
    if (typeof ret === 'function' && ret !== cb) {
      Object.keys(cb).forEach(function (k) {
        ret[k] = cb[k]
      })
    }
    return ret
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/bin/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/LICENSE
The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/package.json�������������������������000644 �000766 �000024 �00000002252 12455173731 026750� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "which", "description": "Like which(1) unix command. Find the first instance of an executable in the PATH.", "version": "1.0.8", "repository": { "type": "git", "url": "git://github.com/isaacs/node-which.git" }, "main": "which.js", "bin": { "which": "./bin/which" }, "license": "ISC", "gitHead": "681a9ebbc447cb428232ddf6c0983006d89e7755", "bugs": { "url": "https://github.com/isaacs/node-which/issues" }, "homepage": "https://github.com/isaacs/node-which", "_id": "which@1.0.8", "scripts": {}, "_shasum": "c2ff319534ac4a1fa45df2221b56c36279903ded", "_from": "which@>=1.0.8 <1.1.0", "_npmVersion": "2.1.11", "_nodeVersion": "0.10.16", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "c2ff319534ac4a1fa45df2221b56c36279903ded", "tarball": "http://registry.npmjs.org/which/-/which-1.0.8.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/which/-/which-1.0.8.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/README.md����������������������������000644 �000766 �000024 �00000000307 12455173731 025740� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The "which" util from npm's guts. Finds the first instance of a specified executable in the PATH environment variable. Does not cache the results, so `hash -r` is not needed when the PATH changes. 
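The README above has no code sample; the following is a minimal usage sketch based on the exported `which(cmd, cb)` and `which.sync(cmd)` shown in which.js below. The `node` lookup target is only an illustration.

```javascript
var which = require('which')

// Asynchronous lookup: cb receives an Error if no executable is found on PATH.
which('node', function (er, resolvedPath) {
  if (er) return console.error(er.message)
  console.log(resolvedPath) // e.g. /usr/local/bin/node
})

// Synchronous lookup: returns the resolved path, or throws "not found: <cmd>".
try {
  console.log(which.sync('node'))
} catch (er) {
  console.error(er.message)
}
```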
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/which.js
module.exports = which
which.sync = whichSync

var path = require("path")
  , fs
  , COLON = process.platform === "win32" ? ";" : ":"
  , isExe
  , fs = require("fs")

if (process.platform == "win32") {
  // On windows, there is no good way to check that a file is executable
  isExe = function isExe () { return true }
} else {
  isExe = function isExe (mod, uid, gid) {
    //console.error(mod, uid, gid);
    //console.error("isExe?", (mod & 0111).toString(8))
    var ret = (mod & 0001)
        || (mod & 0010) && process.getgid && gid === process.getgid()
        || (mod & 0100) && process.getuid && uid === process.getuid()
    //console.error("isExe?", ret)
    return ret
  }
}

function which (cmd, cb) {
  if (isAbsolute(cmd)) return cb(null, cmd)
  var pathEnv = (process.env.PATH || "").split(COLON)
    , pathExt = [""]
  if (process.platform === "win32") {
    pathEnv.push(process.cwd())
    pathExt = (process.env.PATHEXT || ".EXE").split(COLON)
    if (cmd.indexOf(".") !== -1) pathExt.unshift("")
  }
  //console.error("pathEnv", pathEnv)
  ;(function F (i, l) {
    if (i === l) return cb(new Error("not found: "+cmd))
    var p = path.resolve(pathEnv[i], cmd)
    ;(function E (ii, ll) {
      if (ii === ll) return F(i + 1, l)
      var ext = pathExt[ii]
      //console.error(p + ext)
      fs.stat(p + ext, function (er, stat) {
        if (!er && stat && stat.isFile() && isExe(stat.mode, stat.uid, stat.gid)) {
          //console.error("yes, exe!", p + ext)
          return cb(null, p + ext)
        }
        return E(ii + 1, ll)
      })
    })(0, pathExt.length)
  })(0, pathEnv.length)
}

function whichSync (cmd) {
  if (isAbsolute(cmd)) return cmd
  var pathEnv = (process.env.PATH || "").split(COLON)
    , pathExt = [""]
  if (process.platform === "win32") {
    pathEnv.push(process.cwd())
    pathExt = (process.env.PATHEXT || ".EXE").split(COLON)
    if (cmd.indexOf(".") !== -1) pathExt.unshift("")
  }
  for (var i = 0, l = pathEnv.length; i < l; i ++) {
    var p = path.join(pathEnv[i], cmd)
    for (var j = 0, ll = pathExt.length; j < ll; j ++) {
      var cur = p + pathExt[j]
      var stat
      try { stat = fs.statSync(cur) } catch (ex) {}
      if (stat && stat.isFile() && isExe(stat.mode, stat.uid, stat.gid))
        return cur
    }
  }
  throw new Error("not found: "+cmd)
}

var isAbsolute = process.platform === "win32" ? absWin : absUnix

function absWin (p) {
  if (absUnix(p)) return true
  // pull off the device/UNC bit from a windows path.
  // from node's lib/path.js
  var splitDeviceRe =
        /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/][^\\\/]+)?([\\\/])?/
    , result = splitDeviceRe.exec(p)
    , device = result[1] || ''
    , isUnc = device && device.charAt(1) !== ':'
    , isAbsolute = !!result[2] || isUnc // UNC paths are always absolute
  return isAbsolute
}

function absUnix (p) {
  return p.charAt(0) === "/" || p === ""
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/which/bin/which
#!/usr/bin/env node
var which = require("../")
if (process.argv.length < 3) {
  console.error("Usage: which <thing>")
  process.exit(1)
}
which(process.argv[2], function (er, thing) {
  if (er) {
    console.error(er.message)
    process.exit(er.errno || 127)
  }
  console.log(thing)
})

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/get-uid-gid.js
if (module !== require.main) {
  throw new Error("This file should not be loaded with require()")
}

if (!process.getuid || !process.getgid) {
  throw new Error("this file should not be called without uid/gid support")
}

var argv = process.argv.slice(2)
  , user = argv[0] || process.getuid()
  , group = argv[1] || process.getgid()

if (!isNaN(user)) user = +user
if (!isNaN(group)) group = +group

console.error([user, group])

try {
  process.setgid(group)
  process.setuid(user)
  console.log(JSON.stringify({uid:+process.getuid(), gid:+process.getgid()}))
} catch (ex) {
  console.log(JSON.stringify({error:ex.message,errno:ex.errno}))
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/LICENSE
The ISC License

Copyright (c) Isaac Z.
Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/package.json��������������������000644 �000766 �000024 �00000002370 12455173731 027716� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "uid-number", "description": "Convert a username/group name to a uid/gid number", "version": "0.0.6", "repository": { "type": "git", "url": "git://github.com/isaacs/uid-number.git" }, "main": "uid-number.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "license": "ISC", "gitHead": "aab48f5d6bda85794946b26d945d2ee452e0e9ab", "bugs": { "url": "https://github.com/isaacs/uid-number/issues" }, "homepage": "https://github.com/isaacs/uid-number", "_id": "uid-number@0.0.6", "scripts": {}, "_shasum": "0ea10e8035e8eb5b8e4449f06da1c730663baa81", "_from": "uid-number@>=0.0.6 <0.1.0", "_npmVersion": "2.1.3", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "0ea10e8035e8eb5b8e4449f06da1c730663baa81", "tarball": "http://registry.npmjs.org/uid-number/-/uid-number-0.0.6.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/uid-number/-/uid-number-0.0.6.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/README.md�����������������������000644 �000766 �000024 �00000000531 12455173731 026704� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Use this module to convert a username/groupname to a uid/gid number. 
Usage:

```
npm install uid-number
```

Then, in your node program:

```javascript
var uidNumber = require("uid-number")
uidNumber("isaacs", function (er, uid, gid) {
  // gid is null because we didn't ask for a group name
  // uid === 24561 because that's my number.
})
```

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/uid-number/uid-number.js
module.exports = uidNumber

// This module calls into get-uid-gid.js, which sets the
// uid and gid to the supplied argument, in order to find out their
// numeric value.  This can't be done in the main node process,
// because otherwise node would be running as that user from this
// point on.

var child_process = require("child_process")
  , path = require("path")
  , uidSupport = process.getuid && process.setuid
  , uidCache = {}
  , gidCache = {}

function uidNumber (uid, gid, cb) {
  if (!uidSupport) return cb()
  if (typeof cb !== "function") cb = gid, gid = null
  if (typeof cb !== "function") cb = uid, uid = null
  if (gid == null) gid = process.getgid()
  if (uid == null) uid = process.getuid()
  if (!isNaN(gid)) gid = gidCache[gid] = +gid
  if (!isNaN(uid)) uid = uidCache[uid] = +uid

  if (uidCache.hasOwnProperty(uid)) uid = uidCache[uid]
  if (gidCache.hasOwnProperty(gid)) gid = gidCache[gid]

  if (typeof gid === "number" && typeof uid === "number") {
    return process.nextTick(cb.bind(null, null, uid, gid))
  }

  var getter = require.resolve("./get-uid-gid.js")

  child_process.execFile(process.execPath
  , [getter, uid, gid]
  , function (code, out, stderr) {
    if (code) {
      var er = new Error("could not get uid/gid\n" + stderr)
      er.code = code
      return cb(er)
    }
    try {
      out = JSON.parse(out+"")
    } catch (ex) { return cb(ex) }

    if (out.error) {
      var er = new Error(out.error)
      er.errno = out.errno
      return cb(er)
    }

    if (isNaN(out.uid) || isNaN(out.gid)) return cb(new Error(
      "Could not get uid/gid: "+JSON.stringify(out)))

    cb(null, uidCache[uid] = +out.uid, gidCache[gid] = +out.gid)
  })
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/.travis.yml
language: node_js
node_js:
  - "0.8"
  - "0.10"
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/index.js
module.exports = function (rows_, opts) {
    if (!opts) opts = {};
    var hsep = opts.hsep === undefined ? ' ' : opts.hsep;
    var align = opts.align || [];
    var stringLength = opts.stringLength
        || function (s) { return String(s).length; }
    ;

    var dotsizes = reduce(rows_, function (acc, row) {
        forEach(row, function (c, ix) {
            var n = dotindex(c);
            if (!acc[ix] || n > acc[ix]) acc[ix] = n;
        });
        return acc;
    }, []);

    var rows = map(rows_, function (row) {
        return map(row, function (c_, ix) {
            var c = String(c_);
            if (align[ix] === '.') {
                var index = dotindex(c);
                var size = dotsizes[ix] + (/\./.test(c) ? 1 : 2)
                    - (stringLength(c) - index)
                ;
                return c + Array(size).join(' ');
            }
            else return c;
        });
    });

    var sizes = reduce(rows, function (acc, row) {
        forEach(row, function (c, ix) {
            var n = stringLength(c);
            if (!acc[ix] || n > acc[ix]) acc[ix] = n;
        });
        return acc;
    }, []);

    return map(rows, function (row) {
        return map(row, function (c, ix) {
            var n = (sizes[ix] - stringLength(c)) || 0;
            var s = Array(Math.max(n + 1, 1)).join(' ');
            if (align[ix] === 'r' || align[ix] === '.') {
                return s + c;
            }
            if (align[ix] === 'c') {
                return Array(Math.ceil(n / 2 + 1)).join(' ')
                    + c + Array(Math.floor(n / 2 + 1)).join(' ')
                ;
            }
            return c + s;
        }).join(hsep).replace(/\s+$/, '');
    }).join('\n');
};

function dotindex (c) {
    var m = /\.[^.]*$/.exec(c);
    return m ? m.index + 1 : c.length;
}

function reduce (xs, f, init) {
    if (xs.reduce) return xs.reduce(f, init);
    var i = 0;
    var acc = arguments.length >= 3 ?
        init : xs[i++];
    for (; i < xs.length; i++) {
        f(acc, xs[i], i);
    }
    return acc;
}

function forEach (xs, f) {
    if (xs.forEach) return xs.forEach(f);
    for (var i = 0; i < xs.length; i++) {
        f.call(xs, xs[i], i);
    }
}

function map (xs, f) {
    if (xs.map) return xs.map(f);
    var res = [];
    for (var i = 0; i < xs.length; i++) {
        res.push(f.call(xs, xs[i], i));
    }
    return res;
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/LICENSE
This software is released under the MIT license:

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/package.json��������������������000644 �000766 �000024 �00000007150 12455173731 027721� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "text-table", "version": "0.2.0", "description": "borderless text tables with alignment", "main": "index.js", "devDependencies": { "tap": "~0.4.0", "tape": "~1.0.2", "cli-color": "~0.2.3" }, "scripts": { "test": "tap test/*.js" }, "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "chrome/20..latest", "firefox/10..latest", "safari/latest", "opera/11.0..latest", "iphone/6", "ipad/6" ] }, "repository": { "type": "git", "url": "git://github.com/substack/text-table.git" }, "homepage": "https://github.com/substack/text-table", "keywords": [ "text", "table", "align", "ascii", "rows", "tabular" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "readme": "# text-table\n\ngenerate borderless text table strings suitable for printing to stdout\n\n[![build status](https://secure.travis-ci.org/substack/text-table.png)](http://travis-ci.org/substack/text-table)\n\n[![browser support](https://ci.testling.com/substack/text-table.png)](http://ci.testling.com/substack/text-table)\n\n# example\n\n## default align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'master', '0123456789abcdef' ],\n [ 'staging', 'fedcba9876543210' ]\n]);\nconsole.log(t);\n```\n\n```\nmaster 0123456789abcdef\nstaging fedcba9876543210\n```\n\n## left-right align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024' ],\n [ 'boop', '33450' ],\n [ 'foo', '1006' ],\n [ 'bar', '45' ]\n], { align: [ 'l', 'r' ] });\nconsole.log(t);\n```\n\n```\nbeep 1024\nboop 33450\nfoo 1006\nbar 45\n```\n\n## dotted align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024' ],\n [ 'boop', '334.212' ],\n [ 'foo', '1006' ],\n [ 'bar', '45.6' ],\n [ 'baz', '123.' ]\n], { align: [ 'l', '.' 
] });\nconsole.log(t);\n```\n\n```\nbeep 1024\nboop 334.212\nfoo 1006\nbar 45.6\nbaz 123.\n```\n\n## centered\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024', 'xyz' ],\n [ 'boop', '3388450', 'tuv' ],\n [ 'foo', '10106', 'qrstuv' ],\n [ 'bar', '45', 'lmno' ]\n], { align: [ 'l', 'c', 'l' ] });\nconsole.log(t);\n```\n\n```\nbeep 1024 xyz\nboop 3388450 tuv\nfoo 10106 qrstuv\nbar 45 lmno\n```\n\n# methods\n\n``` js\nvar table = require('text-table')\n```\n\n## var s = table(rows, opts={})\n\nReturn a formatted table string `s` from an array of `rows` and some options\n`opts`.\n\n`rows` should be an array of arrays containing strings, numbers, or other\nprintable values.\n\noptions can be:\n\n* `opts.hsep` - separator to use between columns, default `' '`\n* `opts.align` - array of alignment types for each column, default `['l','l',...]`\n* `opts.stringLength` - callback function to use when calculating the string length\n\nalignment types are:\n\n* `'l'` - left\n* `'r'` - right\n* `'c'` - center\n* `'.'` - decimal\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install text-table\n```\n\n# Use with ANSI-colors\n\nSince the string length of ANSI color schemes does not equal the length\nJavaScript sees internally it is necessary to pass the a custom string length\ncalculator during the main function call.\n\nSee the `test/ansi-colors.js` file for an example.\n\n# license\n\nMIT\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/substack/text-table/issues" }, "_id": "text-table@0.2.0", "_from": "text-table@~0.2.0" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/readme.markdown�����������������000644 �000766 �000024 �00000004646 12455173731 030443� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# text-table generate borderless text table strings suitable for printing to stdout [![build status](https://secure.travis-ci.org/substack/text-table.png)](http://travis-ci.org/substack/text-table) [![browser support](https://ci.testling.com/substack/text-table.png)](http://ci.testling.com/substack/text-table) # example ## default align ``` js var table = require('text-table'); var t = table([ [ 'master', '0123456789abcdef' ], [ 'staging', 'fedcba9876543210' ] ]); console.log(t); ``` ``` master 0123456789abcdef staging fedcba9876543210 ``` ## left-right align ``` js var table = require('text-table'); var t = table([ [ 'beep', '1024' ], [ 'boop', '33450' ], [ 'foo', '1006' ], [ 'bar', '45' ] ], { align: [ 'l', 'r' ] }); console.log(t); ``` ``` beep 1024 boop 33450 foo 1006 bar 45 ``` ## dotted align ``` js var table = require('text-table'); var t = table([ [ 'beep', '1024' ], [ 'boop', '334.212' ], [ 'foo', '1006' ], [ 'bar', '45.6' ], [ 'baz', '123.' ] ], { align: [ 'l', '.' 
] });
console.log(t);
```

```
beep  1024
boop  334.212
foo   1006
bar   45.6
baz   123.
```

## centered

``` js
var table = require('text-table');
var t = table([
    [ 'beep', '1024', 'xyz' ],
    [ 'boop', '3388450', 'tuv' ],
    [ 'foo', '10106', 'qrstuv' ],
    [ 'bar', '45', 'lmno' ]
], { align: [ 'l', 'c', 'l' ] });
console.log(t);
```

```
beep  1024     xyz
boop  3388450  tuv
foo   10106    qrstuv
bar   45       lmno
```

# methods

``` js
var table = require('text-table')
```

## var s = table(rows, opts={})

Return a formatted table string `s` from an array of `rows` and some options `opts`.

`rows` should be an array of arrays containing strings, numbers, or other printable values.

options can be:

* `opts.hsep` - separator to use between columns, default `' '`
* `opts.align` - array of alignment types for each column, default `['l','l',...]`
* `opts.stringLength` - callback function to use when calculating the string length

alignment types are:

* `'l'` - left
* `'r'` - right
* `'c'` - center
* `'.'` - decimal

# install

With [npm](https://npmjs.org) do:

```
npm install text-table
```

# Use with ANSI-colors

Since the string length of ANSI color schemes does not equal the length JavaScript sees internally it is necessary to pass a custom string length calculator during the main function call.

See the `test/ansi-colors.js` file for an example.

# license

MIT
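As a companion to the ANSI-colors note above (the referenced `test/ansi-colors.js` is not reproduced in this archive listing), here is a minimal sketch of passing a custom `stringLength` that ignores ANSI escape codes; the `stripAnsi` helper and its escape-code regex are illustrative assumptions, not part of the package:

``` js
var table = require('text-table');

// Hypothetical helper: remove ANSI escape sequences before measuring width.
function stripAnsi (s) {
  return String(s).replace(/\x1b\[[0-9;]*m/g, '');
}

var t = table([
  [ '\x1b[32mpass\x1b[0m', '42 tests' ],
  [ '\x1b[31mfail\x1b[0m', '1 test' ]
], {
  stringLength: function (s) { return stripAnsi(s).length; }
});
console.log(t);
```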
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/align.js
var table = require('../');
var t = table([
    [ 'beep', '1024' ],
    [ 'boop', '33450' ],
    [ 'foo', '1006' ],
    [ 'bar', '45' ]
], { align: [ 'l', 'r' ] });
console.log(t);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/center.js
var table = require('../');
var t = table([
    [ 'beep', '1024', 'xyz' ],
    [ 'boop', '3388450', 'tuv' ],
    [ 'foo', '10106', 'qrstuv' ],
    [ 'bar', '45', 'lmno' ]
], { align: [ 'l', 'c', 'l' ] });
console.log(t);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/dotalign.js
var table = require('../');
var t = table([
    [ 'beep', '1024' ],
    [ 'boop', '334.212' ],
    [ 'foo', '1006' ],
    [ 'bar', '45.6' ],
    [ 'baz', '123.' ]
], { align: [ 'l', '.' ] });
console.log(t);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/doubledot.js
var table = require('../');
var t = table([
    [ '0.1.2' ],
    [ '11.22.33' ],
    [ '5.6.7' ],
    [ '1.22222' ],
    [ '12345.' ],
    [ '5555.' ],
    [ '123' ]
], { align: [ '.' ] });
console.log(t);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/text-table/example/table.js
var table = require('../');
var t = table([
    [ 'master', '0123456789abcdef' ],
    [ 'staging', 'fedcba9876543210' ]
]);
console.log(t);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/.npmignore
.*.swp
node_modules
examples/extract/
test/tmp/
test/fixtures/
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/.travis.yml����������������������������000644 �000766 �000024 �00000000055 12455173731 026256� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.10 - 0.11 �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/examples/������������������������������000755 �000766 �000024 �00000000000 12456115120 025750� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/�����������������������������������000755 �000766 �000024 �00000000000 12456115120 024700� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/LICENCE��������������������������������000644 �000766 �000024 �00000002446 12455173731 025140� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/package.json���������������������������000644 �000766 �000024 �00000004131 12455173731 026432� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "tar", "description": "tar for node", "version": "1.0.3", "repository": { "type": "git", "url": "git://github.com/isaacs/node-tar.git" }, "main": "tar.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "block-stream": "*", "fstream": "^1.0.2", "inherits": "2" }, "devDependencies": { "graceful-fs": "^3.0.2", "rimraf": "1.x", "tap": "0.x", "mkdirp": "^0.5.0" }, "license": "BSD", "readme": "# node-tar\n\nTar for Node.js.\n\n[![NPM](https://nodei.co/npm/tar.png)](https://nodei.co/npm/tar/)\n\n## API\n\nSee `examples/` for usage examples.\n\n### var tar = require('tar')\n\nReturns an object with `.Pack`, `.Extract` and `.Parse` methods.\n\n### tar.Pack([properties])\n\nReturns a through stream. Use\n[fstream](https://npmjs.org/package/fstream) to write files into the\npack stream and you will receive tar archive data from the pack\nstream.\n\nThis only works with directories, it does not work with individual files.\n\nThe optional `properties` object are used to set properties in the tar\n'Global Extended Header'.\n\n### tar.Extract([options])\n\nReturns a through stream. Write tar data to the stream and the files\nin the tarball will be extracted onto the filesystem.\n\n`options` can be:\n\n```js\n{\n path: '/path/to/extract/tar/into',\n strip: 0, // how many path segments to strip from the root when extracting\n}\n```\n\n`options` also get passed to the `fstream.Writer` instance that `tar`\nuses internally.\n\n### tar.Parse()\n\nReturns a writable stream. Write tar data to it and it will emit\n`entry` events for each entry parsed from the tarball. 
This is used by\n`tar.Extract`.\n", "readmeFilename": "README.md", "gitHead": "f4151128c585da236c6b1e278b762ecaedc20c15", "bugs": { "url": "https://github.com/isaacs/node-tar/issues" }, "homepage": "https://github.com/isaacs/node-tar", "_id": "tar@1.0.3", "_shasum": "15bcdab244fa4add44e4244a0176edb8aa9a2b44", "_from": "tar@>=1.0.3 <1.1.0" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/README.md
# node-tar

Tar for Node.js.

[![NPM](https://nodei.co/npm/tar.png)](https://nodei.co/npm/tar/)

## API

See `examples/` for usage examples.

### var tar = require('tar')

Returns an object with `.Pack`, `.Extract` and `.Parse` methods.

### tar.Pack([properties])

Returns a through stream. Use [fstream](https://npmjs.org/package/fstream) to write files into the pack stream and you will receive tar archive data from the pack stream.

This only works with directories, it does not work with individual files.

The optional `properties` object is used to set properties in the tar 'Global Extended Header'.

### tar.Extract([options])

Returns a through stream. Write tar data to the stream and the files in the tarball will be extracted onto the filesystem.

`options` can be:

```js
{
  path: '/path/to/extract/tar/into',
  strip: 0, // how many path segments to strip from the root when extracting
}
```

`options` also get passed to the `fstream.Writer` instance that `tar` uses internally.

### tar.Parse()

Returns a writable stream. Write tar data to it and it will emit `entry` events for each entry parsed from the tarball. This is used by `tar.Extract`.
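The README above describes the streams but stops short of a full round trip; the following is a minimal sketch of packing a directory and extracting a tarball, assuming hypothetical paths (`./files`, `out.tar`, `archive.tar`) and the `fstream` package named in the Pack section:

```js
var tar = require('tar')
var fstream = require('fstream')
var fs = require('fs')

// Pack: read a directory with fstream and pipe it through tar.Pack() into a file.
fstream.Reader({ path: './files', type: 'Directory' })
  .pipe(tar.Pack())
  .pipe(fs.createWriteStream('out.tar'))

// Extract: pipe raw tar data into tar.Extract(), which writes files to disk.
fs.createReadStream('archive.tar')
  .pipe(tar.Extract({ path: './unpacked', strip: 0 }))
```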
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/tar.js
// field paths that every tar file must have.
// header is padded to 512 bytes.
var f = 0
  , fields = {}
  , path = fields.path = f++
  , mode = fields.mode = f++
  , uid = fields.uid = f++
  , gid = fields.gid = f++
  , size = fields.size = f++
  , mtime = fields.mtime = f++
  , cksum = fields.cksum = f++
  , type = fields.type = f++
  , linkpath = fields.linkpath = f++
  , headerSize = 512
  , blockSize = 512
  , fieldSize = []

fieldSize[path] = 100
fieldSize[mode] = 8
fieldSize[uid] = 8
fieldSize[gid] = 8
fieldSize[size] = 12
fieldSize[mtime] = 12
fieldSize[cksum] = 8
fieldSize[type] = 1
fieldSize[linkpath] = 100

// "ustar\0" may introduce another bunch of headers.
// these are optional, and will be nulled out if not present.
var ustar = fields.ustar = f++
  , ustarver = fields.ustarver = f++
  , uname = fields.uname = f++
  , gname = fields.gname = f++
  , devmaj = fields.devmaj = f++
  , devmin = fields.devmin = f++
  , prefix = fields.prefix = f++
  , fill = fields.fill = f++

// terminate fields.
fields[f] = null
fieldSize[ustar] = 6
fieldSize[ustarver] = 2
fieldSize[uname] = 32
fieldSize[gname] = 32
fieldSize[devmaj] = 8
fieldSize[devmin] = 8
fieldSize[prefix] = 155
fieldSize[fill] = 12

// nb: prefix field may in fact be 130 bytes of prefix,
// a null char, 12 bytes for atime, 12 bytes for ctime.
//
// To recognize this format:
// 1. prefix[130] === ' ' or '\0'
// 2. atime and ctime are octal numeric values
// 3. atime and ctime have ' ' in their last byte

var fieldEnds = {}
  , fieldOffs = {}
  , fe = 0
for (var i = 0; i < f; i ++) {
  fieldOffs[i] = fe
  fieldEnds[i] = (fe += fieldSize[i])
}

// build a translation table of field paths.
Object.keys(fields).forEach(function (f) {
  if (fields[f] !== null) fields[fields[f]] = f
})

// different values of the 'type' field
// paths match the values of Stats.isX() functions, where appropriate
var types =
  { 0: "File"
  , "\0": "OldFile" // like 0
  , "": "OldFile"
  , 1: "Link"
  , 2: "SymbolicLink"
  , 3: "CharacterDevice"
  , 4: "BlockDevice"
  , 5: "Directory"
  , 6: "FIFO"
  , 7: "ContiguousFile" // like 0
  // posix headers
  , g: "GlobalExtendedHeader" // k=v for the rest of the archive
  , x: "ExtendedHeader" // k=v for the next file
  // vendor-specific stuff
  , A: "SolarisACL" // skip
  , D: "GNUDumpDir" // like 5, but with data, which should be skipped
  , I: "Inode" // metadata only, skip
  , K: "NextFileHasLongLinkpath" // data = link path of next file
  , L: "NextFileHasLongPath" // data = path of next file
  , M: "ContinuationFile" // skip
  , N: "OldGnuLongPath" // like L
  , S: "SparseFile" // skip
  , V: "TapeVolumeHeader" // skip
  , X: "OldExtendedHeader" // like x
  }

Object.keys(types).forEach(function (t) {
  types[types[t]] = types[types[t]] || t
})

// values for the mode field
var modes =
  { suid: 04000 // set uid on extraction
  , sgid: 02000 // set gid on extraction
  , svtx: 01000 // set restricted deletion flag on dirs on extraction
  , uread: 0400
  , uwrite: 0200
  , uexec: 0100
  , gread: 040
  , gwrite: 020
  , gexec: 010
  , oread: 4
  , owrite: 2
  , oexec: 1
  , all: 07777
  }

var numeric =
  { mode: true
  , uid: true
  , gid: true
  , size: true
  , mtime: true
  , devmaj: true
  , devmin: true
  , cksum: true
  , atime: true
  , ctime: true
  , dev: true
  , ino: true
  , nlink: true
  }

Object.keys(modes).forEach(function (t) {
  modes[modes[t]] = modes[modes[t]] || t
})

var knownExtended =
  { atime: true
  , charset: true
  , comment: true
  , ctime: true
  , gid: true
  , gname: true
  , linkpath: true
  , mtime: true
  , path: true
  , realtime: true
  , security: true
  , size: true
  , uid: true
  , uname: true
  }

exports.fields = fields
exports.fieldSize = fieldSize
exports.fieldOffs = fieldOffs
exports.fieldEnds = fieldEnds
exports.types = types
exports.modes = modes
exports.numeric = numeric
exports.headerSize = headerSize
exports.blockSize = blockSize
exports.knownExtended = knownExtended

exports.Pack = require("./lib/pack.js")
exports.Parse = require("./lib/parse.js")
exports.Extract = require("./lib/extract.js")

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/buffer-entry.js

// just like the Entry class, but it buffers the contents
//
// XXX It would be good to set a maximum BufferEntry filesize,
// since it eats up memory.  In normal operation,
// these are only for long filenames or link names, which are
// rarely very big.

module.exports = BufferEntry

var inherits = require("inherits")
  , Entry = require("./entry.js")

function BufferEntry () {
  Entry.apply(this, arguments)
  this._buffer = new Buffer(this.props.size)
  this._offset = 0
  this.body = ""
  this.on("end", function () {
    this.body = this._buffer.toString().slice(0, -1)
  })
}

inherits(BufferEntry, Entry)

// collect the bytes as they come in.
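// Note (added commentary, not in the original source): write() below simply
// copies each incoming chunk into the Buffer pre-allocated from props.size
// in the constructor; the trailing NUL terminator is dropped by the "end"
// handler above when `body` is built.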
BufferEntry.prototype.write = function (c) { c.copy(this._buffer, this._offset) this._offset += c.length Entry.prototype.write.call(this, c) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/entry-writer.js��������������������000644 �000766 �000024 �00000007244 12455173731 027733� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = EntryWriter var tar = require("../tar.js") , TarHeader = require("./header.js") , Entry = require("./entry.js") , inherits = require("inherits") , BlockStream = require("block-stream") , ExtendedHeaderWriter , Stream = require("stream").Stream , EOF = {} inherits(EntryWriter, Stream) function EntryWriter (props) { var me = this if (!(me instanceof EntryWriter)) { return new EntryWriter(props) } Stream.apply(this) me.writable = true me.readable = true me._stream = new BlockStream(512) me._stream.on("data", function (c) { me.emit("data", c) }) me._stream.on("drain", function () { me.emit("drain") }) me._stream.on("end", function () { me.emit("end") me.emit("close") }) me.props = props if (props.type === "Directory") { props.size = 0 } props.ustar = "ustar\0" props.ustarver = "00" me.path = props.path me._buffer = [] me._didHeader = false me._meta = false me.on("pipe", function () { me._process() }) } EntryWriter.prototype.write = function (c) { // console.error(".. ew write") if (this._ended) return this.emit("error", new Error("write after end")) this._buffer.push(c) this._process() this._needDrain = this._buffer.length > 0 return !this._needDrain } EntryWriter.prototype.end = function (c) { // console.error(".. ew end") if (c) this._buffer.push(c) this._buffer.push(EOF) this._ended = true this._process() this._needDrain = this._buffer.length > 0 } EntryWriter.prototype.pause = function () { // console.error(".. ew pause") this._paused = true this.emit("pause") } EntryWriter.prototype.resume = function () { // console.error(".. ew resume") this._paused = false this.emit("resume") this._process() } EntryWriter.prototype.add = function (entry) { // console.error(".. ew add") if (!this.parent) return this.emit("error", new Error("no parent")) // make sure that the _header and such is emitted, and clear out // the _currentEntry link on the parent. if (!this._ended) this.end() return this.parent.add(entry) } EntryWriter.prototype._header = function () { // console.error(".. ew header") if (this._didHeader) return this._didHeader = true var headerBlock = TarHeader.encode(this.props) if (this.props.needExtended && !this._meta) { var me = this ExtendedHeaderWriter = ExtendedHeaderWriter || require("./extended-header-writer.js") ExtendedHeaderWriter(this.props) .on("data", function (c) { me.emit("data", c) }) .on("error", function (er) { me.emit("error", er) }) .end() } // console.error(".. .. ew headerBlock emitting") this.emit("data", headerBlock) this.emit("header") } EntryWriter.prototype._process = function () { // console.error(".. .. 
ew process") if (!this._didHeader && !this._meta) { this._header() } if (this._paused || this._processing) { // console.error(".. .. .. paused=%j, processing=%j", this._paused, this._processing) return } this._processing = true var buf = this._buffer for (var i = 0; i < buf.length; i ++) { // console.error(".. .. .. i=%d", i) var c = buf[i] if (c === EOF) this._stream.end() else this._stream.write(c) if (this._paused) { // console.error(".. .. .. paused mid-emission") this._processing = false if (i < buf.length) { this._needDrain = true this._buffer = buf.slice(i + 1) } return } } // console.error(".. .. .. emitted") this._buffer.length = 0 this._processing = false // console.error(".. .. .. emitting drain") this.emit("drain") } EntryWriter.prototype.destroy = function () {} ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/entry.js���������������������������000644 �000766 �000024 �00000011746 12455173731 026423� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// A passthrough read/write stream that sets its properties // based on a header, extendedHeader, and globalHeader // // Can be either a file system object of some sort, or // a pax/ustar metadata entry. module.exports = Entry var TarHeader = require("./header.js") , tar = require("../tar") , assert = require("assert").ok , Stream = require("stream").Stream , inherits = require("inherits") , fstream = require("fstream").Abstract function Entry (header, extended, global) { Stream.call(this) this.readable = true this.writable = true this._needDrain = false this._paused = false this._reading = false this._ending = false this._ended = false this._remaining = 0 this._queue = [] this._index = 0 this._queueLen = 0 this._read = this._read.bind(this) this.props = {} this._header = header this._extended = extended || {} // globals can change throughout the course of // a file parse operation. Freeze it at its current state. this._global = {} var me = this Object.keys(global || {}).forEach(function (g) { me._global[g] = global[g] }) this._setProps() } inherits(Entry, Stream) Entry.prototype.write = function (c) { if (this._ending) this.error("write() after end()", null, true) if (this._remaining === 0) { this.error("invalid bytes past eof") } // often we'll get a bunch of \0 at the end of the last write, // since chunks will always be 512 bytes when reading a tarball. if (c.length > this._remaining) { c = c.slice(0, this._remaining) } this._remaining -= c.length // put it on the stack. 
var ql = this._queueLen this._queue.push(c) this._queueLen ++ this._read() // either paused, or buffered if (this._paused || ql > 0) { this._needDrain = true return false } return true } Entry.prototype.end = function (c) { if (c) this.write(c) this._ending = true this._read() } Entry.prototype.pause = function () { this._paused = true this.emit("pause") } Entry.prototype.resume = function () { // console.error(" Tar Entry resume", this.path) this.emit("resume") this._paused = false this._read() return this._queueLen - this._index > 1 } // This is bound to the instance Entry.prototype._read = function () { // console.error(" Tar Entry _read", this.path) if (this._paused || this._reading || this._ended) return // set this flag so that event handlers don't inadvertently // get multiple _read() calls running. this._reading = true // have any data to emit? while (this._index < this._queueLen && !this._paused) { var chunk = this._queue[this._index ++] this.emit("data", chunk) } // check if we're drained if (this._index >= this._queueLen) { this._queue.length = this._queueLen = this._index = 0 if (this._needDrain) { this._needDrain = false this.emit("drain") } if (this._ending) { this._ended = true this.emit("end") } } // if the queue gets too big, then pluck off whatever we can. // this should be fairly rare. var mql = this._maxQueueLen if (this._queueLen > mql && this._index > 0) { mql = Math.min(this._index, mql) this._index -= mql this._queueLen -= mql this._queue = this._queue.slice(mql) } this._reading = false } Entry.prototype._setProps = function () { // props = extended->global->header->{} var header = this._header , extended = this._extended , global = this._global , props = this.props // first get the values from the normal header. var fields = tar.fields for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , val = header[field] if (typeof val !== "undefined") props[field] = val } // next, the global header for this file. // numeric values, etc, will have already been parsed. ;[global, extended].forEach(function (p) { Object.keys(p).forEach(function (f) { if (typeof p[f] !== "undefined") props[f] = p[f] }) }) // no nulls allowed in path or linkpath ;["path", "linkpath"].forEach(function (p) { if (props.hasOwnProperty(p)) { props[p] = props[p].split("\0")[0] } }) // set date fields to be a proper date ;["mtime", "ctime", "atime"].forEach(function (p) { if (props.hasOwnProperty(p)) { props[p] = new Date(props[p] * 1000) } }) // set the type so that we know what kind of file to create var type switch (tar.types[props.type]) { case "OldFile": case "ContiguousFile": type = "File" break case "GNUDumpDir": type = "Directory" break case undefined: type = "Unknown" break case "Link": case "SymbolicLink": case "CharacterDevice": case "BlockDevice": case "Directory": case "FIFO": default: type = tar.types[props.type] } this.type = type this.path = props.path this.size = props.size // size is special, since it signals when the file needs to end. 
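  // (added commentary, not in the original source) _remaining counts down as
  // data is written; Entry.prototype.write() above truncates the final
  // 512-byte block to this value, which is how the NUL padding at the end of
  // each file's data is discarded.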
this._remaining = props.size } Entry.prototype.warn = fstream.warn Entry.prototype.error = fstream.error ��������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/extended-header-writer.js����������000644 �000766 �000024 �00000012321 12455173731 031610� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = ExtendedHeaderWriter var inherits = require("inherits") , EntryWriter = require("./entry-writer.js") inherits(ExtendedHeaderWriter, EntryWriter) var tar = require("../tar.js") , path = require("path") , TarHeader = require("./header.js") // props is the props of the thing we need to write an // extended header for. // Don't be shy with it. Just encode everything. function ExtendedHeaderWriter (props) { // console.error(">> ehw ctor") var me = this if (!(me instanceof ExtendedHeaderWriter)) { return new ExtendedHeaderWriter(props) } me.fields = props var p = { path : ("PaxHeader" + path.join("/", props.path || "")) .replace(/\\/g, "/").substr(0, 100) , mode : props.mode || 0666 , uid : props.uid || 0 , gid : props.gid || 0 , size : 0 // will be set later , mtime : props.mtime || Date.now() / 1000 , type : "x" , linkpath : "" , ustar : "ustar\0" , ustarver : "00" , uname : props.uname || "" , gname : props.gname || "" , devmaj : props.devmaj || 0 , devmin : props.devmin || 0 } EntryWriter.call(me, p) // console.error(">> ehw props", me.props) me.props = p me._meta = true } ExtendedHeaderWriter.prototype.end = function () { // console.error(">> ehw end") var me = this if (me._ended) return me._ended = true me._encodeFields() if (me.props.size === 0) { // nothing to write! me._ready = true me._stream.end() return } me._stream.write(TarHeader.encode(me.props)) me.body.forEach(function (l) { me._stream.write(l) }) me._ready = true // console.error(">> ehw _process calling end()", me.props) this._stream.end() } ExtendedHeaderWriter.prototype._encodeFields = function () { // console.error(">> ehw _encodeFields") this.body = [] if (this.fields.prefix) { this.fields.path = this.fields.prefix + "/" + this.fields.path this.fields.prefix = "" } encodeFields(this.fields, "", this.body, this.fields.noProprietary) var me = this this.body.forEach(function (l) { me.props.size += l.length }) } function encodeFields (fields, prefix, body, nop) { // console.error(">> >> ehw encodeFields") // "%d %s=%s\n", <length>, <keyword>, <value> // The length is a decimal number, and includes itself and the \n // Numeric values are decimal strings. Object.keys(fields).forEach(function (k) { var val = fields[k] , numeric = tar.numeric[k] if (prefix) k = prefix + "." + k // already including NODETAR.type, don't need File=true also if (k === fields.type && val === true) return switch (k) { // don't include anything that's always handled just fine // in the normal header, or only meaningful in the context // of nodetar case "mode": case "cksum": case "ustar": case "ustarver": case "prefix": case "basename": case "dirname": case "needExtended": case "block": case "filter": return case "rdev": if (val === 0) return break case "nlink": case "dev": // Truly a hero among men, Creator of Star! case "ino": // Speak his name with reverent awe! It is: k = "SCHILY." 
+ k break default: break } if (val && typeof val === "object" && !Buffer.isBuffer(val)) encodeFields(val, k, body, nop) else if (val === null || val === undefined) return else body.push.apply(body, encodeField(k, val, nop)) }) return body } function encodeField (k, v, nop) { // lowercase keys must be valid, otherwise prefix with // "NODETAR." if (k.charAt(0) === k.charAt(0).toLowerCase()) { var m = k.split(".")[0] if (!tar.knownExtended[m]) k = "NODETAR." + k } // no proprietary if (nop && k.charAt(0) !== k.charAt(0).toLowerCase()) { return [] } if (typeof val === "number") val = val.toString(10) var s = new Buffer(" " + k + "=" + v + "\n") , digits = Math.floor(Math.log(s.length) / Math.log(10)) + 1 // console.error("1 s=%j digits=%j s.length=%d", s.toString(), digits, s.length) // if adding that many digits will make it go over that length, // then add one to it. For example, if the string is: // " foo=bar\n" // then that's 9 characters. With the "9", that bumps the length // up to 10. However, this is invalid: // "10 foo=bar\n" // but, since that's actually 11 characters, since 10 adds another // character to the length, and the length includes the number // itself. In that case, just bump it up again. if (s.length + digits >= Math.pow(10, digits)) digits += 1 // console.error("2 s=%j digits=%j s.length=%d", s.toString(), digits, s.length) var len = digits + s.length // console.error("3 s=%j digits=%j s.length=%d len=%d", s.toString(), digits, s.length, len) var lenBuf = new Buffer("" + len) if (lenBuf.length + s.length !== len) { throw new Error("Bad length calculation\n"+ "len="+len+"\n"+ "lenBuf="+JSON.stringify(lenBuf.toString())+"\n"+ "lenBuf.length="+lenBuf.length+"\n"+ "digits="+digits+"\n"+ "s="+JSON.stringify(s.toString())+"\n"+ "s.length="+s.length) } return [lenBuf, s] } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/extended-header.js�����������������000644 �000766 �000024 �00000006753 12455173731 030312� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// An Entry consisting of: // // "%d %s=%s\n", <length>, <keyword>, <value> // // The length is a decimal number, and includes itself and the \n // \0 does not terminate anything. Only the length terminates the string. // Numeric values are decimal strings. 
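// Worked example (added commentary, not in the original source): the record
// for path=some/file.txt is
//
//   "22 path=some/file.txt\n"
//
// " path=some/file.txt\n" is 20 bytes, and a two-digit length brings the
// total to 22; the length field counts its own digits, the separating space,
// and the trailing newline.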
module.exports = ExtendedHeader var Entry = require("./entry.js") , inherits = require("inherits") , tar = require("../tar.js") , numeric = tar.numeric , keyTrans = { "SCHILY.dev": "dev" , "SCHILY.ino": "ino" , "SCHILY.nlink": "nlink" } function ExtendedHeader () { Entry.apply(this, arguments) this.on("data", this._parse) this.fields = {} this._position = 0 this._fieldPos = 0 this._state = SIZE this._sizeBuf = [] this._keyBuf = [] this._valBuf = [] this._size = -1 this._key = "" } inherits(ExtendedHeader, Entry) ExtendedHeader.prototype._parse = parse var s = 0 , states = ExtendedHeader.states = {} , SIZE = states.SIZE = s++ , KEY = states.KEY = s++ , VAL = states.VAL = s++ , ERR = states.ERR = s++ Object.keys(states).forEach(function (s) { states[states[s]] = states[s] }) states[s] = null // char code values for comparison var _0 = "0".charCodeAt(0) , _9 = "9".charCodeAt(0) , point = ".".charCodeAt(0) , a = "a".charCodeAt(0) , Z = "Z".charCodeAt(0) , a = "a".charCodeAt(0) , z = "z".charCodeAt(0) , space = " ".charCodeAt(0) , eq = "=".charCodeAt(0) , cr = "\n".charCodeAt(0) function parse (c) { if (this._state === ERR) return for ( var i = 0, l = c.length ; i < l ; this._position++, this._fieldPos++, i++) { // console.error("top of loop, size="+this._size) var b = c[i] if (this._size >= 0 && this._fieldPos > this._size) { error(this, "field exceeds length="+this._size) return } switch (this._state) { case ERR: return case SIZE: // console.error("parsing size, b=%d, rest=%j", b, c.slice(i).toString()) if (b === space) { this._state = KEY // this._fieldPos = this._sizeBuf.length this._size = parseInt(new Buffer(this._sizeBuf).toString(), 10) this._sizeBuf.length = 0 continue } if (b < _0 || b > _9) { error(this, "expected [" + _0 + ".." + _9 + "], got " + b) return } this._sizeBuf.push(b) continue case KEY: // can be any char except =, not > size. 
if (b === eq) { this._state = VAL this._key = new Buffer(this._keyBuf).toString() if (keyTrans[this._key]) this._key = keyTrans[this._key] this._keyBuf.length = 0 continue } this._keyBuf.push(b) continue case VAL: // field must end with cr if (this._fieldPos === this._size - 1) { // console.error("finished with "+this._key) if (b !== cr) { error(this, "expected \\n at end of field") return } var val = new Buffer(this._valBuf).toString() if (numeric[this._key]) { val = parseFloat(val) } this.fields[this._key] = val this._valBuf.length = 0 this._state = SIZE this._size = -1 this._fieldPos = -1 continue } this._valBuf.push(b) continue } } } function error (me, msg) { msg = "invalid header: " + msg + "\nposition=" + me._position + "\nfield position=" + me._fieldPos me.error(msg) me.state = ERR } ���������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/extract.js�������������������������000644 �000766 �000024 �00000004272 12455173731 026730� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// give it a tarball and a path, and it'll dump the contents module.exports = Extract var tar = require("../tar.js") , fstream = require("fstream") , inherits = require("inherits") , path = require("path") function Extract (opts) { if (!(this instanceof Extract)) return new Extract(opts) tar.Parse.apply(this) // have to dump into a directory opts.type = "Directory" opts.Directory = true if (typeof opts !== "object") { opts = { path: opts } } // better to drop in cwd? seems more standard. opts.path = opts.path || path.resolve("node-tar-extract") opts.type = "Directory" opts.Directory = true // similar to --strip or --strip-components opts.strip = +opts.strip if (!opts.strip || opts.strip <= 0) opts.strip = 0 this._fst = fstream.Writer(opts) this.pause() var me = this // Hardlinks in tarballs are relative to the root // of the tarball. So, they need to be resolved against // the target directory in order to be created properly. me.on("entry", function (entry) { // if there's a "strip" argument, then strip off that many // path components. if (opts.strip) { var p = entry.path.split("/").slice(opts.strip).join("/") entry.path = entry.props.path = p if (entry.linkpath) { var lp = entry.linkpath.split("/").slice(opts.strip).join("/") entry.linkpath = entry.props.linkpath = lp } } if (entry.type !== "Link") return entry.linkpath = entry.props.linkpath = path.join(opts.path, path.join("/", entry.props.linkpath)) }) this._fst.on("ready", function () { me.pipe(me._fst, { end: false }) me.resume() }) this._fst.on('error', function(err) { me.emit('error', err) }) this._fst.on('drain', function() { me.emit('drain') }) // this._fst.on("end", function () { // console.error("\nEEEE Extract End", me._fst.path) // }) this._fst.on("close", function () { // console.error("\nEEEE Extract End", me._fst.path) me.emit("end") me.emit("close") }) } inherits(Extract, tar.Parse) Extract.prototype._streamEnd = function () { var me = this if (!me._ended) me.error("unexpected eof") me._fst.end() // my .end() is coming later. 
} ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/global-header-writer.js������������000644 �000766 �000024 �00000000604 12455173731 031251� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = GlobalHeaderWriter var ExtendedHeaderWriter = require("./extended-header-writer.js") , inherits = require("inherits") inherits(GlobalHeaderWriter, ExtendedHeaderWriter) function GlobalHeaderWriter (props) { if (!(this instanceof GlobalHeaderWriter)) { return new GlobalHeaderWriter(props) } ExtendedHeaderWriter.call(this, props) this.props.type = "g" } ����������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/header.js��������������������������000644 �000766 �000024 �00000025432 12455173731 026507� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// parse a 512-byte header block to a data object, or vice-versa // If the data won't fit nicely in a simple header, then generate // the appropriate extended header file, and return that. module.exports = TarHeader var tar = require("../tar.js") , fields = tar.fields , fieldOffs = tar.fieldOffs , fieldEnds = tar.fieldEnds , fieldSize = tar.fieldSize , numeric = tar.numeric , assert = require("assert").ok , space = " ".charCodeAt(0) , slash = "/".charCodeAt(0) , bslash = process.platform === "win32" ? "\\".charCodeAt(0) : null function TarHeader (block) { if (!(this instanceof TarHeader)) return new TarHeader(block) if (block) this.decode(block) } TarHeader.prototype = { decode : decode , encode: encode , calcSum: calcSum , checkSum: checkSum } TarHeader.parseNumeric = parseNumeric TarHeader.encode = encode TarHeader.decode = decode // note that this will only do the normal ustar header, not any kind // of extended posix header file. If something doesn't fit comfortably, // then it will set obj.needExtended = true, and set the block to // the closest approximation. function encode (obj) { if (!obj && !(this instanceof TarHeader)) throw new Error( "encode must be called on a TarHeader, or supplied an object") obj = obj || this var block = obj.block = new Buffer(512) // if the object has a "prefix", then that's actually an extension of // the path field. 
if (obj.prefix) { // console.error("%% header encoding, got a prefix", obj.prefix) obj.path = obj.prefix + "/" + obj.path // console.error("%% header encoding, prefixed path", obj.path) obj.prefix = "" } obj.needExtended = false if (obj.mode) { if (typeof obj.mode === "string") obj.mode = parseInt(obj.mode, 8) obj.mode = obj.mode & 0777 } for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , off = fieldOffs[f] , end = fieldEnds[f] , ret switch (field) { case "cksum": // special, done below, after all the others break case "prefix": // special, this is an extension of the "path" field. // console.error("%% header encoding, skip prefix later") break case "type": // convert from long name to a single char. var type = obj.type || "0" if (type.length > 1) { type = tar.types[obj.type] if (!type) type = "0" } writeText(block, off, end, type) break case "path": // uses the "prefix" field if > 100 bytes, but <= 255 var pathLen = Buffer.byteLength(obj.path) , pathFSize = fieldSize[fields.path] , prefFSize = fieldSize[fields.prefix] // paths between 100 and 255 should use the prefix field. // longer than 255 if (pathLen > pathFSize && pathLen <= pathFSize + prefFSize) { // need to find a slash somewhere in the middle so that // path and prefix both fit in their respective fields var searchStart = pathLen - 1 - pathFSize , searchEnd = prefFSize , found = false , pathBuf = new Buffer(obj.path) for ( var s = searchStart ; (s <= searchEnd) ; s ++ ) { if (pathBuf[s] === slash || pathBuf[s] === bslash) { found = s break } } if (found !== false) { prefix = pathBuf.slice(0, found).toString("utf8") path = pathBuf.slice(found + 1).toString("utf8") ret = writeText(block, off, end, path) off = fieldOffs[fields.prefix] end = fieldEnds[fields.prefix] // console.error("%% header writing prefix", off, end, prefix) ret = writeText(block, off, end, prefix) || ret break } } // paths less than 100 chars don't need a prefix // and paths longer than 255 need an extended header and will fail // on old implementations no matter what we do here. // Null out the prefix, and fallthrough to default. // console.error("%% header writing no prefix") var poff = fieldOffs[fields.prefix] , pend = fieldEnds[fields.prefix] writeText(block, poff, pend, "") // fallthrough // all other fields are numeric or text default: ret = numeric[field] ? writeNumeric(block, off, end, obj[field]) : writeText(block, off, end, obj[field] || "") break } obj.needExtended = obj.needExtended || ret } var off = fieldOffs[fields.cksum] , end = fieldEnds[fields.cksum] writeNumeric(block, off, end, calcSum.call(this, block)) return block } // if it's a negative number, or greater than will fit, // then use write256. var MAXNUM = { 12: 077777777777 , 11: 07777777777 , 8 : 07777777 , 7 : 0777777 } function writeNumeric (block, off, end, num) { var writeLen = end - off , maxNum = MAXNUM[writeLen] || 0 num = num || 0 // console.error(" numeric", num) if (num instanceof Date || Object.prototype.toString.call(num) === "[object Date]") { num = num.getTime() / 1000 } if (num > maxNum || num < 0) { write256(block, off, end, num) // need an extended header if negative or too big. return true } // god, tar is so annoying // if the string is small enough, you should put a space // between the octal string and the \0, but if it doesn't // fit, then don't. 
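  // Worked example (added commentary, not in the original source): a size of
  // 1234 bytes in the 12-byte size field becomes octal "2322", gets a
  // trailing space, is zero-padded to 11 characters ("0000002322 "), and the
  // 12th byte is set to NUL below.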
var numStr = Math.floor(num).toString(8) if (num < MAXNUM[writeLen - 1]) numStr += " " // pad with "0" chars if (numStr.length < writeLen) { numStr = (new Array(writeLen - numStr.length).join("0")) + numStr } if (numStr.length !== writeLen - 1) { throw new Error("invalid length: " + JSON.stringify(numStr) + "\n" + "expected: "+writeLen) } block.write(numStr, off, writeLen, "utf8") block[end - 1] = 0 } function write256 (block, off, end, num) { var buf = block.slice(off, end) var positive = num >= 0 buf[0] = positive ? 0x80 : 0xFF // get the number as a base-256 tuple if (!positive) num *= -1 var tuple = [] do { var n = num % 256 tuple.push(n) num = (num - n) / 256 } while (num) var bytes = tuple.length var fill = buf.length - bytes for (var i = 1; i < fill; i ++) { buf[i] = positive ? 0 : 0xFF } // tuple is a base256 number, with [0] as the *least* significant byte // if it's negative, then we need to flip all the bits once we hit the // first non-zero bit. The 2's-complement is (0x100 - n), and the 1's- // complement is (0xFF - n). var zero = true for (i = bytes; i > 0; i --) { var byte = tuple[bytes - i] if (positive) buf[fill + i] = byte else if (zero && byte === 0) buf[fill + i] = 0 else if (zero) { zero = false buf[fill + i] = 0x100 - byte } else buf[fill + i] = 0xFF - byte } } function writeText (block, off, end, str) { // strings are written as utf8, then padded with \0 var strLen = Buffer.byteLength(str) , writeLen = Math.min(strLen, end - off) // non-ascii fields need extended headers // long fields get truncated , needExtended = strLen !== str.length || strLen > writeLen // write the string, and null-pad if (writeLen > 0) block.write(str, off, writeLen, "utf8") for (var i = off + writeLen; i < end; i ++) block[i] = 0 return needExtended } function calcSum (block) { block = block || this.block assert(Buffer.isBuffer(block) && block.length === 512) if (!block) throw new Error("Need block to checksum") // now figure out what it would be if the cksum was " " var sum = 0 , start = fieldOffs[fields.cksum] , end = fieldEnds[fields.cksum] for (var i = 0; i < fieldOffs[fields.cksum]; i ++) { sum += block[i] } for (var i = start; i < end; i ++) { sum += space } for (var i = end; i < 512; i ++) { sum += block[i] } return sum } function checkSum (block) { var sum = calcSum.call(this, block) block = block || this.block var cksum = block.slice(fieldOffs[fields.cksum], fieldEnds[fields.cksum]) cksum = parseNumeric(cksum) return cksum === sum } function decode (block) { block = block || this.block assert(Buffer.isBuffer(block) && block.length === 512) this.block = block this.cksumValid = this.checkSum() var prefix = null // slice off each field. for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , val = block.slice(fieldOffs[f], fieldEnds[f]) switch (field) { case "ustar": // if not ustar, then everything after that is just padding. 
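        // (added commentary, not in the original source) The "ustar\0" magic
        // lives at byte offset 257; when it is absent the block is an
        // old-style v7 header, so the uname/gname/dev/prefix fields that
        // follow it are treated as padding and decoding stops here.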
if (val.toString() !== "ustar\0") { this.ustar = false return } else { // console.error("ustar:", val, val.toString()) this.ustar = val.toString() } break // prefix is special, since it might signal the xstar header case "prefix": var atime = parseNumeric(val.slice(131, 131 + 12)) , ctime = parseNumeric(val.slice(131 + 12, 131 + 12 + 12)) if ((val[130] === 0 || val[130] === space) && typeof atime === "number" && typeof ctime === "number" && val[131 + 12] === space && val[131 + 12 + 12] === space) { this.atime = atime this.ctime = ctime val = val.slice(0, 130) } prefix = val.toString("utf8").replace(/\0+$/, "") // console.error("%% header reading prefix", prefix) break // all other fields are null-padding text // or a number. default: if (numeric[field]) { this[field] = parseNumeric(val) } else { this[field] = val.toString("utf8").replace(/\0+$/, "") } break } } // if we got a prefix, then prepend it to the path. if (prefix) { this.path = prefix + "/" + this.path // console.error("%% header got a prefix", this.path) } } function parse256 (buf) { // first byte MUST be either 80 or FF // 80 for positive, FF for 2's comp var positive if (buf[0] === 0x80) positive = true else if (buf[0] === 0xFF) positive = false else return null // build up a base-256 tuple from the least sig to the highest var zero = false , tuple = [] for (var i = buf.length - 1; i > 0; i --) { var byte = buf[i] if (positive) tuple.push(byte) else if (zero && byte === 0) tuple.push(0) else if (zero) { zero = false tuple.push(0x100 - byte) } else tuple.push(0xFF - byte) } for (var sum = 0, i = 0, l = tuple.length; i < l; i ++) { sum += tuple[i] * Math.pow(256, i) } return positive ? sum : -1 * sum } function parseNumeric (f) { if (f[0] & 0x80) return parse256(f) var str = f.toString("utf8").split("\0")[0].trim() , res = parseInt(str, 8) return isNaN(res) ? null : res } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/pack.js����������������������������000644 �000766 �000024 �00000012701 12455173731 026170� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// pipe in an fstream, and it'll make a tarball. // key-value pair argument is global extended header props. 
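// Usage sketch (added commentary, not in the original source; see
// examples/packer.js later in this package for the full version):
//
//   var packer = require("tar").Pack({ noProprietary: true })
//   fstream.Reader({ path: "some/dir", type: "Directory" })
//     .pipe(packer)
//     .pipe(fs.createWriteStream("dir.tar"))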
module.exports = Pack var EntryWriter = require("./entry-writer.js") , Stream = require("stream").Stream , path = require("path") , inherits = require("inherits") , GlobalHeaderWriter = require("./global-header-writer.js") , collect = require("fstream").collect , eof = new Buffer(512) for (var i = 0; i < 512; i ++) eof[i] = 0 inherits(Pack, Stream) function Pack (props) { // console.error("-- p ctor") var me = this if (!(me instanceof Pack)) return new Pack(props) if (props) me._noProprietary = props.noProprietary else me._noProprietary = false me._global = props me.readable = true me.writable = true me._buffer = [] // console.error("-- -- set current to null in ctor") me._currentEntry = null me._processing = false me._pipeRoot = null me.on("pipe", function (src) { if (src.root === me._pipeRoot) return me._pipeRoot = src src.on("end", function () { me._pipeRoot = null }) me.add(src) }) } Pack.prototype.addGlobal = function (props) { // console.error("-- p addGlobal") if (this._didGlobal) return this._didGlobal = true var me = this GlobalHeaderWriter(props) .on("data", function (c) { me.emit("data", c) }) .end() } Pack.prototype.add = function (stream) { if (this._global && !this._didGlobal) this.addGlobal(this._global) if (this._ended) return this.emit("error", new Error("add after end")) collect(stream) this._buffer.push(stream) this._process() this._needDrain = this._buffer.length > 0 return !this._needDrain } Pack.prototype.pause = function () { this._paused = true if (this._currentEntry) this._currentEntry.pause() this.emit("pause") } Pack.prototype.resume = function () { this._paused = false if (this._currentEntry) this._currentEntry.resume() this.emit("resume") this._process() } Pack.prototype.end = function () { this._ended = true this._buffer.push(eof) this._process() } Pack.prototype._process = function () { var me = this if (me._paused || me._processing) { return } var entry = me._buffer.shift() if (!entry) { if (me._needDrain) { me.emit("drain") } return } if (entry.ready === false) { // console.error("-- entry is not ready", entry) me._buffer.unshift(entry) entry.on("ready", function () { // console.error("-- -- ready!", entry) me._process() }) return } me._processing = true if (entry === eof) { // need 2 ending null blocks. me.emit("data", eof) me.emit("data", eof) me.emit("end") me.emit("close") return } // Change the path to be relative to the root dir that was // added to the tarball. // // XXX This should be more like how -C works, so you can // explicitly set a root dir, and also explicitly set a pathname // in the tarball to use. That way we can skip a lot of extra // work when resolving symlinks for bundled dependencies in npm. var root = path.dirname((entry.root || entry).path) var wprops = {} Object.keys(entry.props || {}).forEach(function (k) { wprops[k] = entry.props[k] }) if (me._noProprietary) wprops.noProprietary = true wprops.path = path.relative(root, entry.path || '') // actually not a matter of opinion or taste. if (process.platform === "win32") { wprops.path = wprops.path.replace(/\\/g, "/") } if (!wprops.type) wprops.type = 'Directory' switch (wprops.type) { // sockets not supported case "Socket": return case "Directory": wprops.path += "/" wprops.size = 0 break case "Link": var lp = path.resolve(path.dirname(entry.path), entry.linkpath) wprops.linkpath = path.relative(root, lp) || "." 
wprops.size = 0 break case "SymbolicLink": var lp = path.resolve(path.dirname(entry.path), entry.linkpath) wprops.linkpath = path.relative(path.dirname(entry.path), lp) || "." wprops.size = 0 break } // console.error("-- new writer", wprops) // if (!wprops.type) { // // console.error("-- no type?", entry.constructor.name, entry) // } // console.error("-- -- set current to new writer", wprops.path) var writer = me._currentEntry = EntryWriter(wprops) writer.parent = me // writer.on("end", function () { // // console.error("-- -- writer end", writer.path) // }) writer.on("data", function (c) { me.emit("data", c) }) writer.on("header", function () { Buffer.prototype.toJSON = function () { return this.toString().split(/\0/).join(".") } // console.error("-- -- writer header %j", writer.props) if (writer.props.size === 0) nextEntry() }) writer.on("close", nextEntry) var ended = false function nextEntry () { if (ended) return ended = true // console.error("-- -- writer close", writer.path) // console.error("-- -- set current to null", wprops.path) me._currentEntry = null me._processing = false me._process() } writer.on("error", function (er) { // console.error("-- -- writer error", writer.path) me.emit("error", er) }) // if it's the root, then there's no need to add its entries, // or data, since they'll be added directly. if (entry === me._pipeRoot) { // console.error("-- is the root, don't auto-add") writer.add = null } entry.pipe(writer) } Pack.prototype.destroy = function () {} Pack.prototype.write = function () {} ���������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/lib/parse.js���������������������������000644 �000766 �000024 �00000015243 12455173731 026370� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // A writable stream. // It emits "entry" events, which provide a readable stream that has // header info attached. module.exports = Parse.create = Parse var stream = require("stream") , Stream = stream.Stream , BlockStream = require("block-stream") , tar = require("../tar.js") , TarHeader = require("./header.js") , Entry = require("./entry.js") , BufferEntry = require("./buffer-entry.js") , ExtendedHeader = require("./extended-header.js") , assert = require("assert").ok , inherits = require("inherits") , fstream = require("fstream") // reading a tar is a lot like reading a directory // However, we're actually not going to run the ctor, // since it does a stat and various other stuff. // This inheritance gives us the pause/resume/pipe // behavior that is desired. inherits(Parse, fstream.Reader) function Parse () { var me = this if (!(me instanceof Parse)) return new Parse() // doesn't apply fstream.Reader ctor? 
// no, becasue we don't want to stat/etc, we just // want to get the entry/add logic from .pipe() Stream.apply(me) me.writable = true me.readable = true me._stream = new BlockStream(512) me.position = 0 me._ended = false me._stream.on("error", function (e) { me.emit("error", e) }) me._stream.on("data", function (c) { me._process(c) }) me._stream.on("end", function () { me._streamEnd() }) me._stream.on("drain", function () { me.emit("drain") }) } // overridden in Extract class, since it needs to // wait for its DirWriter part to finish before // emitting "end" Parse.prototype._streamEnd = function () { var me = this if (!me._ended) me.error("unexpected eof") me.emit("end") } // a tar reader is actually a filter, not just a readable stream. // So, you should pipe a tarball stream into it, and it needs these // write/end methods to do that. Parse.prototype.write = function (c) { if (this._ended) { // gnutar puts a LOT of nulls at the end. // you can keep writing these things forever. // Just ignore them. for (var i = 0, l = c.length; i > l; i ++) { if (c[i] !== 0) return this.error("write() after end()") } return } return this._stream.write(c) } Parse.prototype.end = function (c) { this._ended = true return this._stream.end(c) } // don't need to do anything, since we're just // proxying the data up from the _stream. // Just need to override the parent's "Not Implemented" // error-thrower. Parse.prototype._read = function () {} Parse.prototype._process = function (c) { assert(c && c.length === 512, "block size should be 512") // one of three cases. // 1. A new header // 2. A part of a file/extended header // 3. One of two or more EOF null blocks if (this._entry) { var entry = this._entry entry.write(c) if (entry._remaining === 0) { entry.end() this._entry = null } } else { // either zeroes or a header var zero = true for (var i = 0; i < 512 && zero; i ++) { zero = c[i] === 0 } // eof is *at least* 2 blocks of nulls, and then the end of the // file. you can put blocks of nulls between entries anywhere, // so appending one tarball to another is technically valid. // ending without the eof null blocks is not allowed, however. if (zero) { if (this._eofStarted) this._ended = true this._eofStarted = true } else { this._eofStarted = false this._startEntry(c) } } this.position += 512 } // take a header chunk, start the right kind of entry. Parse.prototype._startEntry = function (c) { var header = new TarHeader(c) , self = this , entry , ev , EntryType , onend , meta = false if (null === header.size || !header.cksumValid) { var e = new Error("invalid tar file") e.header = header e.tar_file_offset = this.position e.tar_block = this.position / 512 return this.emit("error", e) } switch (tar.types[header.type]) { case "File": case "OldFile": case "Link": case "SymbolicLink": case "CharacterDevice": case "BlockDevice": case "Directory": case "FIFO": case "ContiguousFile": case "GNUDumpDir": // start a file. // pass in any extended headers // These ones consumers are typically most interested in. 
EntryType = Entry ev = "entry" break case "GlobalExtendedHeader": // extended headers that apply to the rest of the tarball EntryType = ExtendedHeader onend = function () { self._global = self._global || {} Object.keys(entry.fields).forEach(function (k) { self._global[k] = entry.fields[k] }) } ev = "globalExtendedHeader" meta = true break case "ExtendedHeader": case "OldExtendedHeader": // extended headers that apply to the next entry EntryType = ExtendedHeader onend = function () { self._extended = entry.fields } ev = "extendedHeader" meta = true break case "NextFileHasLongLinkpath": // set linkpath=<contents> in extended header EntryType = BufferEntry onend = function () { self._extended = self._extended || {} self._extended.linkpath = entry.body } ev = "longLinkpath" meta = true break case "NextFileHasLongPath": case "OldGnuLongPath": // set path=<contents> in file-extended header EntryType = BufferEntry onend = function () { self._extended = self._extended || {} self._extended.path = entry.body } ev = "longPath" meta = true break default: // all the rest we skip, but still set the _entry // member, so that we can skip over their data appropriately. // emit an event to say that this is an ignored entry type? EntryType = Entry ev = "ignoredEntry" break } var global, extended if (meta) { global = extended = null } else { var global = this._global var extended = this._extended // extendedHeader only applies to one entry, so once we start // an entry, it's over. this._extended = null } entry = new EntryType(header, extended, global) entry.meta = meta // only proxy data events of normal files. if (!meta) { entry.on("data", function (c) { me.emit("data", c) }) } if (onend) entry.on("end", onend) this._entry = entry var me = this entry.on("pause", function () { me.pause() }) entry.on("resume", function () { me.resume() }) if (this.listeners("*").length) { this.emit("*", ev, entry) } this.emit(ev, entry) // Zero-byte entry. End immediately. 
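  // (added commentary, not in the original source) Directories, links and
  // metadata entries carry no data blocks, so a size of 0 means the entry is
  // complete as soon as its header has been parsed.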
if (entry.props.size === 0) { entry.end() this._entry = null } } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/examples/extracter.js������������������000644 �000766 �000024 �00000000601 12455173731 030317� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var tar = require("../tar.js") , fs = require("fs") function onError(err) { console.error('An error occurred:', err) } function onEnd() { console.log('Extracted!') } var extractor = tar.Extract({path: __dirname + "/extract"}) .on('error', onError) .on('end', onEnd); fs.createReadStream(__dirname + "/../test/fixtures/c.tar") .on('error', onError) .pipe(extractor); �������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/examples/packer.js���������������������000644 �000766 �000024 �00000000750 12455173731 027570� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var tar = require("../tar.js") , fstream = require("fstream") , fs = require("fs") var dirDest = fs.createWriteStream('dir.tar') function onError(err) { console.error('An error occurred:', err) } function onEnd() { console.log('Packed!') } var packer = tar.Pack({ noProprietary: true }) .on('error', onError) .on('end', onEnd); // This must be a "directory" fstream.Reader({ path: __dirname, type: "Directory" }) .on('error', onError) .pipe(packer) .pipe(dirDest) ������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/tar/examples/reader.js���������������������000644 �000766 �000024 �00000001754 12455173731 027572� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var tar = require("../tar.js") , fs = require("fs") fs.createReadStream(__dirname + "/../test/fixtures/c.tar") .pipe(tar.Parse()) .on("extendedHeader", function (e) { console.error("extended pax header", e.props) e.on("end", function () { console.error("extended pax fields:", e.fields) }) }) .on("ignoredEntry", function (e) { console.error("ignoredEntry?!?", e.props) }) .on("longLinkpath", function (e) { console.error("longLinkpath entry", e.props) e.on("end", function () { console.error("value=%j", e.body.toString()) }) }) .on("longPath", function (e) { console.error("longPath entry", e.props) e.on("end", function 
() { console.error("value=%j", e.body.toString()) }) }) .on("entry", function (e) { console.error("entry", e.props) e.on("data", function (c) { console.error(" >>>" + c.toString().replace(/\n/g, "\\n")) }) e.on("end", function () { console.error(" <<<EOF") }) }) ��������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/lib/�������������������������000755 �000766 �000024 �00000000000 12456115120 026656� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/LICENSE.txt������������������000644 �000766 �000024 �00000001346 12455173731 027752� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright © 2014 Domenic Denicola <domenic@domenicdenicola.com> This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar <sam@hocevar.net> Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/package.json�����������������000644 �000766 �000024 �00000003576 12455173731 030424� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "sorted-object", "description": "Returns a copy of an object with its keys sorted", "keywords": [ "sort", "keys", "object" ], "version": "1.0.0", "author": { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com", "url": "http://domenic.me/" }, "license": "WTFPL", "repository": { "type": "git", "url": "git://github.com/domenic/sorted-object.git" }, "bugs": { "url": "http://github.com/domenic/sorted-object/issues" }, "main": "lib/sorted-object.js", "scripts": { "test": "tape test/tests.js", "lint": "jshint lib && jshint test" }, "devDependencies": { "jshint": "~2.4.3", "tape": "~2.4.2" }, "readme": "# Get a Version of an Object with Sorted Keys\n\nAlthough objects in JavaScript are theoretically unsorted, in practice most engines use insertion order—at least, ignoring numeric keys. This manifests itself most prominently when dealing with an object's JSON serialization.\n\nSo, for example, you might be trying to serialize some object to a JSON file. But every time you write it, it ends up being output in a different order, depending on how you created it in the first place! This makes for some ugly diffs.\n\n**sorted-object** gives you the answer. Just use this package to create a version of your object with its keys sorted before serializing, and you'll get a consistent order every time.\n\n```js\nvar sortedObject = require(\"sorted-object\");\n\nvar objectToSerialize = generateStuffNondeterministically();\n\n// Before:\nfs.writeFileSync(\"dest.json\", JSON.stringify(objectToSerialize));\n\n// After:\nvar sortedVersion = sortedObject(objectToSerialize);\nfs.writeFileSync(\"dest.json\", JSON.stringify(sortedVersion));\n```\n", "readmeFilename": "README.md", "homepage": "https://github.com/domenic/sorted-object", "_id": "sorted-object@1.0.0", "_from": "sorted-object@" } ����������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/README.md��������������������000644 �000766 �000024 �00000001774 12455173731 027413� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Get a Version of an Object with Sorted Keys Although objects in JavaScript are theoretically unsorted, in practice most engines use insertion order—at least, ignoring numeric keys. This manifests itself most prominently when dealing with an object's JSON serialization. 
So, for example, you might be trying to serialize some object to a JSON file. But every time you write it, it ends up being output in a different order, depending on how you created it in the first place! This makes for some ugly diffs. **sorted-object** gives you the answer. Just use this package to create a version of your object with its keys sorted before serializing, and you'll get a consistent order every time. ```js var sortedObject = require("sorted-object"); var objectToSerialize = generateStuffNondeterministically(); // Before: fs.writeFileSync("dest.json", JSON.stringify(objectToSerialize)); // After: var sortedVersion = sortedObject(objectToSerialize); fs.writeFileSync("dest.json", JSON.stringify(sortedVersion)); ``` ����iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sorted-object/lib/sorted-object.js���������000644 �000766 �000024 �00000000322 12455173731 031770� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict"; module.exports = function (input) { var output = Object.create(null); Object.keys(input).sort().forEach(function (key) { output[key] = input[key]; }); return output; }; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/index.js�����������������������������000644 �000766 �000024 �00000000046 12455173731 026124� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports=require("./lib/slide") ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/���������������������������������000755 �000766 �000024 �00000000000 12456115120 025212� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/LICENSE������������������������������000644 �000766 �000024 �00000001354 12455173731 025467� 
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/index.js

module.exports=require("./lib/slide")

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/LICENSE

The ISC License

Copyright (c) Isaac Z. Schlueter

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/package.json

{
  "name": "slide",
  "version": "1.1.6",
  "author": {
    "name": "Isaac Z. Schlueter",
    "email": "i@izs.me",
    "url": "http://blog.izs.me/"
  },
  "contributors": [
    {
      "name": "S. Sriram",
      "email": "ssriram@gmail.com",
      "url": "http://www.565labs.com"
    }
  ],
  "description": "A flow control lib small enough to fit on in a slide presentation. Derived live at Oak.JS",
  "main": "./lib/slide.js",
  "dependencies": {},
  "devDependencies": {},
  "engines": {
    "node": "*"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/slide-flow-control.git"
  },
  "license": "ISC",
  "gitHead": "8345e51ee41e35825abc1a40750ea11462f57028",
  "bugs": {
    "url": "https://github.com/isaacs/slide-flow-control/issues"
  },
  "homepage": "https://github.com/isaacs/slide-flow-control",
  "_id": "slide@1.1.6",
  "scripts": {},
  "_shasum": "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707",
  "_from": "slide@>=1.1.6 <1.2.0",
  "_npmVersion": "2.0.0-beta.3",
  "_npmUser": {
    "name": "isaacs",
    "email": "i@izs.me"
  },
  "maintainers": [
    {
      "name": "isaacs",
      "email": "i@izs.me"
    }
  ],
  "dist": {
    "shasum": "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707",
    "tarball": "http://registry.npmjs.org/slide/-/slide-1.1.6.tgz"
  },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/slide/-/slide-1.1.6.tgz"
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/README.md

# Controlling Flow: callbacks are easy

## What's actually hard?

- Doing a bunch of things in a specific order.
- Knowing when stuff is done.
- Handling failures.
- Breaking up functionality into parts (avoid nested inline callbacks).

## Common Mistakes

- Abandoning convention and consistency.
- Putting all callbacks inline.
- Using libraries without grokking them.
- Trying to make async code look sync.

## Define Conventions

- Two kinds of functions: *actors* take action, *callbacks* get results.
- Essentially the continuation pattern. Resulting code *looks* similar to fibers, but is *much* simpler to implement.
- Node works this way in the low-level APIs already, and it's very flexible.

## Callbacks

- Simple responders.
- Must always be prepared to handle errors; that's why the error is the first argument.
- Often inline anonymous, but not always.
- Can trap and call other callbacks with modified data, or pass errors upwards.

## Actors

- Last argument is a callback.
- If any error occurs, and can't be handled, pass it to the callback and return.
- Must not throw. Return value ignored.
- return x ==> return cb(null, x)
- throw er ==> return cb(er)

```javascript
// return true if a path is either
// a symlink or a directory.
function isLinkOrDir (path, cb) {
  fs.lstat(path, function (er, s) {
    if (er) return cb(er)
    return cb(null, s.isDirectory() || s.isSymbolicLink())
  })
}
```

# asyncMap

## Use cases

- I have a list of 10 files, and need to read all of them, and then continue when they're all done.
- I have a dozen URLs, and need to fetch them all, and then continue when they're all done.
- I have 4 connected users, and need to send a message to all of them, and then continue when that's done.
- I have a list of n things, and I need to do something with all of them, in parallel, and get the results once they're all complete.
## Solution

```javascript
var asyncMap = require("slide").asyncMap
function writeFiles (files, what, cb) {
  asyncMap(files, function (f, cb) {
    fs.writeFile(f, what, cb)
  }, cb)
}

writeFiles([my, file, list], "foo", cb)
```

# chain

## Use cases

- I have to do a bunch of things, in order. Get db credentials out of a file, read the data from the db, write that data to another file.
- If anything fails, do not continue.
- I still have to provide an array of functions, which is a lot of boilerplate, and a pita if your functions take args like

```javascript
function (cb) {
  blah(a, b, c, cb)
}
```

- Results are discarded, which is a bit lame.
- No way to branch.

## Solution

- reduces boilerplate by converting an array of [fn, args] to an actor that takes no arguments (except cb)
- A bit like Function#bind, but tailored for our use-case.
- bindActor(obj, "method", a, b, c)
- bindActor(fn, a, b, c)
- bindActor(obj, fn, a, b, c)
- branching, skipping over falsey arguments

```javascript
chain([
  doThing && [thing, a, b, c]
, isFoo && [doFoo, "foo"]
, subChain && [chain, [one, two]]
], cb)
```

- tracking results: results are stored in an optional array passed as argument, last result is always in results[results.length - 1].
- treat chain.first and chain.last as placeholders for the first/last result up until that point.

## Non-trivial example

- Read number files in a directory
- Add the results together
- Ping a web service with the result
- Write the response to a file
- Delete the number files

```javascript
var chain = require("slide").chain
function myProgram (cb) {
  var res = [], last = chain.last
    , first = chain.first
  chain([
    [fs, "readdir", "the-directory"]
  , [readFiles, "the-directory", last]
  , [sum, last]
  , [ping, "POST", "example.com", 80, "/foo", last]
  , [fs, "writeFile", "result.txt", last]
  , [rmFiles, "./the-directory", first]
  ], res, cb)
}
```

# Conclusion: Convention Profits

- Consistent API from top to bottom.
- Sneak in at any point to inject functionality. Testable, reusable, ...
- When Ruby and Python users whine, you can smile condescendingly.
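The `[fn, args]` entries, the `bindActor` forms, and the `chain.last` placeholder described above are easiest to see wired together in one place. The sketch below is not part of the package: `statConfiguredPath` and `parsePath` are invented for illustration, and it follows the two-argument `chain(things, cb)` signature of the bundled `lib/chain.js` (shown further down), which hands its accumulated results array to the final callback.

```javascript
var chain = require("slide").chain
var fs = require("fs")

// Hypothetical pipeline: read a JSON config file, pull a path out of it,
// then stat that path. chain.last resolves to the previous step's result.
function statConfiguredPath (configFile, cb) {
  chain([
    [fs, "readFile", configFile, "utf8"]
  , [parsePath, chain.last]   // receives the file contents
  , [fs, "stat", chain.last]  // receives the parsed path
  ], cb)
}

// Helper invented for this sketch: extract a "path" property from JSON text.
function parsePath (text, cb) {
  var parsed
  try { parsed = JSON.parse(text) } catch (er) { return cb(er) }
  cb(null, parsed.path)
}

statConfiguredPath("config.json", function (er, results) {
  if (er) return console.error("chain failed:", er)
  // The last entry in the results array is the fs.Stats object.
  console.log("size:", results[results.length - 1].size)
})
```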
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/async-map-ordered.js

throw new Error("TODO: Not yet implemented.")

/*
usage:

Like asyncMap, but only can take a single cb, and guarantees the order of
the results.
*/

module.exports = asyncMapOrdered

function asyncMapOrdered (list, fn, cb_) {
  if (typeof cb_ !== "function") throw new Error(
    "No callback provided to asyncMapOrdered")

  if (typeof fn !== "function") throw new Error(
    "No map function provided to asyncMapOrdered")

  if (list === undefined || list === null) return cb_(null, [])
  if (!Array.isArray(list)) list = [list]
  if (!list.length) return cb_(null, [])

  var errState = null
    , l = list.length
    , a = l
    , res = []
    , resCount = 0
    , maxArgLen = 0

  function cb (index) { return function () {
    if (errState) return
    var er = arguments[0]
    var argLen = arguments.length
    maxArgLen = Math.max(maxArgLen, argLen)
    res[index] = argLen === 1 ? [er] : Array.apply(null, arguments)

    // see if any new things have been added.
    if (list.length > l) {
      var newList = list.slice(l)
      a += (list.length - l)
      var oldLen = l
      l = list.length
      process.nextTick(function () {
        newList.forEach(function (ar, i) {
          fn(ar, cb(i + oldLen))
        })
      })
    }

    if (er || --a === 0) {
      errState = er
      cb_.apply(null, [errState].concat(flip(res, resCount, maxArgLen)))
    }
  }}

  // expect the supplied cb function to be called
  // "n" times for each thing in the array.
  list.forEach(function (ar) {
    steps.forEach(function (fn, i) {
      fn(ar, cb(i))
    })
  })
}

function flip (res, resCount, argLen) {
  var flat = []
  // res = [[er, x, y], [er, x1, y1], [er, x2, y2, z2]]
  // return [[x, x1, x2], [y, y1, y2], [undefined, undefined, z2]]
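The file above is explicitly a stub: it throws immediately, and the dumped source is cut off midway through `flip`. For readers wondering what its stated contract ("like asyncMap, but only can take a single cb, and guarantees the order of the results") could look like, here is a minimal order-preserving sketch written against that comment alone; it is not the package's implementation, and `asyncMapOrderedSketch` is an invented name.

```javascript
// Minimal order-preserving async map, sketched from the usage comment above.
module.exports = asyncMapOrderedSketch

function asyncMapOrderedSketch (list, fn, cb) {
  if (typeof fn !== "function" || typeof cb !== "function")
    throw new Error("asyncMapOrdered needs a map function and a callback")
  if (list === undefined || list === null) return cb(null, [])
  if (!Array.isArray(list)) list = [list]
  if (!list.length) return cb(null, [])

  var results = new Array(list.length)
  var pending = list.length
  var errState = null

  list.forEach(function (item, index) {
    fn(item, function (er, value) {
      if (errState) return
      if (er) return cb(errState = er)
      results[index] = value          // slot by index, so order is preserved
      if (--pending === 0) cb(null, results)
    })
  })
}
```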
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/async-map.js

/*
usage:

// do something to a list of things
asyncMap(myListOfStuff, function (thing, cb) { doSomething(thing.foo, cb) }, cb)
// do more than one thing to each item
asyncMap(list, fooFn, barFn, cb)
*/

module.exports = asyncMap

function asyncMap () {
  var steps = Array.prototype.slice.call(arguments)
    , list = steps.shift() || []
    , cb_ = steps.pop()
  if (typeof cb_ !== "function") throw new Error(
    "No callback provided to asyncMap")
  if (!list) return cb_(null, [])
  if (!Array.isArray(list)) list = [list]
  var n = steps.length
    , data = [] // 2d array
    , errState = null
    , l = list.length
    , a = l * n
  if (!a) return cb_(null, [])
  function cb (er) {
    if (er && !errState) errState = er
    var argLen = arguments.length
    for (var i = 1; i < argLen; i ++) if (arguments[i] !== undefined) {
      data[i - 1] = (data[i - 1] || []).concat(arguments[i])
    }
    // see if any new things have been added.
    if (list.length > l) {
      var newList = list.slice(l)
      a += (list.length - l) * n
      l = list.length
      process.nextTick(function () {
        newList.forEach(function (ar) {
          steps.forEach(function (fn) { fn(ar, cb) })
        })
      })
    }
    if (--a === 0) cb_.apply(null, [errState].concat(data))
  }
  // expect the supplied cb function to be called
  // "n" times for each thing in the array.
  list.forEach(function (ar) {
    steps.forEach(function (fn) { fn(ar, cb) })
  })
}
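As a usage note: the collector above concatenates whatever non-error arguments each per-item callback produces and hands the final callback one flat array per argument position, in completion order rather than list order (the gap the unimplemented async-map-ordered.js was meant to fill). A small hypothetical sketch of the single-step form; `fileSizes` and the file names are invented for illustration:

```javascript
var asyncMap = require("slide").asyncMap
var fs = require("fs")

// Stat a list of files in parallel and collect their sizes.
function fileSizes (files, cb) {
  asyncMap(files, function (file, done) {
    fs.stat(file, function (er, st) {
      if (er) return done(er)
      done(null, st.size)
    })
  }, cb)
}

fileSizes(["a.txt", "b.txt"], function (er, sizes) {
  if (er) return console.error(er)
  console.log("sizes:", sizes)   // e.g. [ 12, 408 ] -- completion order
})
```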
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/bind-actor.js

module.exports = bindActor
function bindActor () {
  var args = Array.prototype.slice.call(arguments) // jswtf.
    , obj = null
    , fn
  if (typeof args[0] === "object") {
    obj = args.shift()
    fn = args.shift()
    if (typeof fn === "string") fn = obj[ fn ]
  } else fn = args.shift()
  return function (cb) {
    fn.apply(obj, args.concat(cb))
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/chain.js

module.exports = chain
var bindActor = require("./bind-actor.js")
chain.first = {} ; chain.last = {}

function chain (things, cb) {
  var res = []
  ;(function LOOP (i, len) {
    if (i >= len) return cb(null, res)
    if (Array.isArray(things[i]))
      things[i] = bindActor.apply(null, things[i].map(function (i) {
        return (i === chain.first) ? res[0]
             : (i === chain.last) ? res[res.length - 1]
             : i
      }))
    if (!things[i]) return LOOP(i + 1, len)
    things[i](function (er, data) {
      if (er) return cb(er, res)
      if (data !== undefined) res = res.concat(data)
      LOOP(i + 1, len)
    })
  })(0, things.length)
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/slide/lib/slide.js

exports.asyncMap = require("./async-map")
exports.bindActor = require("./bind-actor")
exports.chain = require("./chain")

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/.npmignore

node_modules
test
.gitignore
.travis.yml

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/index.js

'use strict'

var Transform = require('stream').Transform || require('readable-stream').Transform
var crypto = require('crypto')
var fs
try {
  fs = require('graceful-fs')
} catch (ex) {
  fs = require('fs')
}

try {
  process.binding('crypto')
} catch (e) {
  var er = new Error(
    'crypto binding not found.\n' +
    'Please build node with openssl.\n' +
    e.message
  )
  throw er
}

exports.check = check
exports.checkSync =
checkSync exports.get = get exports.getSync = getSync exports.stream = stream function check(file, expected, options, cb) { if (typeof options === 'function') { cb = options options = undefined } expected = expected.toLowerCase().trim() get(file, options, function (er, actual) { if (er) { if (er.message) er.message += ' while getting shasum for ' + file return cb(er) } if (actual === expected) return cb(null) cb(new Error( 'shasum check failed for ' + file + '\n' + 'Expected: ' + expected + '\n' + 'Actual: ' + actual)) }) } function checkSync(file, expected, options) { expected = expected.toLowerCase().trim() var actual try { actual = getSync(file, options) } catch (er) { if (er.message) er.message += ' while getting shasum for ' + file throw er } if (actual !== expected) { var ex = new Error( 'shasum check failed for ' + file + '\n' + 'Expected: ' + expected + '\n' + 'Actual: ' + actual) throw ex } } function get(file, options, cb) { if (typeof options === 'function') { cb = options options = undefined } options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var source = fs.createReadStream(file) var errState = null source .on('error', function (er) { if (errState) return return cb(errState = er) }) .on('data', function (chunk) { if (errState) return hash.update(chunk) }) .on('end', function () { if (errState) return var actual = hash.digest("hex").toLowerCase().trim() cb(null, actual) }) } function getSync(file, options) { options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var source = fs.readFileSync(file) hash.update(source) return hash.digest("hex").toLowerCase().trim() } function stream(expected, options) { expected = expected.toLowerCase().trim() options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var stream = new Transform() stream._transform = function (chunk, encoding, callback) { hash.update(chunk) stream.push(chunk) callback() } stream._flush = function (cb) { var actual = hash.digest("hex").toLowerCase().trim() if (actual === expected) return cb(null) cb(new Error( 'shasum check failed for:\n' + ' Expected: ' + expected + '\n' + ' Actual: ' + actual)) this.push(null) } return stream }����������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/LICENSE��������������������������������000644 �000766 �000024 �00000004426 12455173731 025145� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2013 Forbes Lindesay The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. The MIT License (MIT) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/��������������������������000755 �000766 �000024 �00000000000 12456115120 026574� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/package.json���������������������������000644 �000766 �000024 �00000002317 12455173731 026423� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "sha", "version": "1.3.0", "description": "Check and get file hashes", "scripts": { "test": "mocha -R spec" }, "repository": { "type": "git", "url": "https://github.com/ForbesLindesay/sha.git" }, "license": "BSD", "optionalDependencies": { "graceful-fs": "2 || 3", "readable-stream": "~1.1" }, "devDependencies": { "mocha": "~1.9.0" }, "gitHead": "f1985eefbf7538e5809a2157c728d2f740901600", "bugs": { "url": "https://github.com/ForbesLindesay/sha/issues" }, "homepage": 
"https://github.com/ForbesLindesay/sha", "dependencies": { "graceful-fs": "2 || 3", "readable-stream": "~1.1" }, "_id": "sha@1.3.0", "_shasum": "79f4787045d0ede7327d702c25c443460dbc6764", "_from": "sha@>=1.3.0 <1.4.0", "_npmVersion": "1.5.0-alpha-4", "_npmUser": { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" }, "maintainers": [ { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" } ], "dist": { "shasum": "79f4787045d0ede7327d702c25c443460dbc6764", "tarball": "http://registry.npmjs.org/sha/-/sha-1.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/sha/-/sha-1.3.0.tgz" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/README.md������������������������������000644 �000766 �000024 �00000003320 12455173731 025407� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# sha Check and get file hashes (using any algorithm) [![Build Status](https://img.shields.io/travis/ForbesLindesay/sha/master.svg)](https://travis-ci.org/ForbesLindesay/sha) [![Dependency Status](https://img.shields.io/gemnasium/ForbesLindesay/sha.svg)](https://gemnasium.com/ForbesLindesay/sha) [![NPM version](https://img.shields.io/npm/v/sha.svg)](http://badge.fury.io/js/sha) ## Installation $ npm install sha ## API ### check(fileName, expected, [options,] cb) / checkSync(filename, expected, [options]) Asynchronously check that `fileName` has a "hash" of `expected`. The callback will be called with either `null` or an error (indicating that they did not match). Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ### get(fileName, [options,] cb) / getSync(filename, [options]) Asynchronously get the "hash" of `fileName`. The callback will be called with an optional `error` object and the (lower cased) hex digest of the hash. Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ### stream(expected, [options]) Check the hash of a stream without ever buffering it. This is a pass through stream so you can do things like: ```js fs.createReadStream('src') .pipe(sha.stream('expected')) .pipe(fs.createWriteStream('dest')) ``` `dest` will be a complete copy of `src` and an error will be emitted if the hash did not match `'expected'`. Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ## License You may use this software under the BSD or MIT. Take your pick. 
If you want me to release it under another license, open a pull request.����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/����������000755 �000766 �000024 �00000000000 12456115120 031624� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/.npmignore000644 �000766 �000024 �00000000044 12455173731 033634� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������build/ test/ examples/ fs.js zlib.js��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/duplex.js�000644 �000766 �000024 �00000000064 12455173731 033476� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_duplex.js") ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/float.patch����������������������000644 �000766 �000024 �00000072562 12455173731 033722� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� module.exports = PassThrough; -var Transform = require('_stream_transform'); +var Transform = 
require('./_stream_transform'); var util = require('util'); util.inherits(PassThrough, Transform); diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index 0c3fe3e..90a8298 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -23,10 +23,34 @@ module.exports = Readable; Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; +if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { + return emitter.listeners(type).length; +}; + +if (!global.setImmediate) global.setImmediate = function setImmediate(fn) { + return setTimeout(fn, 0); +}; +if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) { + return clearTimeout(i); +}; + var Stream = require('stream'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var StringDecoder; -var debug = util.debuglog('stream'); +var debug; +if (util.debuglog) + debug = util.debuglog('stream'); +else try { + debug = require('debuglog')('stream'); +} catch (er) { + debug = function() {}; +} util.inherits(Readable, Stream); @@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) { function onEofChunk(stream, state) { - if (state.decoder && !state.ended) { + if (state.decoder && !state.ended && state.decoder.end) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js index b1f9fcc..b0caf57 100644 --- a/lib/_stream_transform.js +++ b/lib/_stream_transform.js @@ -64,8 +64,14 @@ module.exports = Transform; -var Duplex = require('_stream_duplex'); +var Duplex = require('./_stream_duplex'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} util.inherits(Transform, Duplex); diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index ba2e920..f49288b 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -27,6 +27,12 @@ module.exports = Writable; Writable.WritableState = WritableState; var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var Stream = require('stream'); util.inherits(Writable, Stream); @@ -119,7 +125,7 @@ function WritableState(options, stream) { function Writable(options) { // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. 
- if (!(this instanceof Writable) && !(this instanceof Stream.Duplex)) + if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex'))) return new Writable(options); this._writableState = new WritableState(options, this); diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js index e3787e4..8cd2127 100644 --- a/test/simple/test-stream-big-push.js +++ b/test/simple/test-stream-big-push.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var str = 'asdfasdfasdfasdfasdf'; var r = new stream.Readable({ diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js index bb73777..d40efc7 100644 --- a/test/simple/test-stream-end-paused.js +++ b/test/simple/test-stream-end-paused.js @@ -25,7 +25,7 @@ var gotEnd = false; // Make sure we don't miss the end event for paused 0-length streams -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var stream = new Readable(); var calledRead = false; stream._read = function() { diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js index b46ee90..0be8366 100644 --- a/test/simple/test-stream-pipe-after-end.js +++ b/test/simple/test-stream-pipe-after-end.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var util = require('util'); util.inherits(TestReadable, Readable); diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js deleted file mode 100644 index f689358..0000000 --- a/test/simple/test-stream-pipe-cleanup.js +++ /dev/null @@ -1,122 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -// This test asserts that Stream.prototype.pipe does not leave listeners -// hanging on the source or dest. 
- -var common = require('../common'); -var stream = require('stream'); -var assert = require('assert'); -var util = require('util'); - -function Writable() { - this.writable = true; - this.endCalls = 0; - stream.Stream.call(this); -} -util.inherits(Writable, stream.Stream); -Writable.prototype.end = function() { - this.endCalls++; -}; - -Writable.prototype.destroy = function() { - this.endCalls++; -}; - -function Readable() { - this.readable = true; - stream.Stream.call(this); -} -util.inherits(Readable, stream.Stream); - -function Duplex() { - this.readable = true; - Writable.call(this); -} -util.inherits(Duplex, Writable); - -var i = 0; -var limit = 100; - -var w = new Writable(); - -var r; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('end'); -} -assert.equal(0, r.listeners('end').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('close'); -} -assert.equal(0, r.listeners('close').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -r = new Readable(); - -for (i = 0; i < limit; i++) { - w = new Writable(); - r.pipe(w); - w.emit('close'); -} -assert.equal(0, w.listeners('close').length); - -r = new Readable(); -w = new Writable(); -var d = new Duplex(); -r.pipe(d); // pipeline A -d.pipe(w); // pipeline B -assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup -assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -r.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 0); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -d.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 1); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 0); -assert.equal(d.listeners('close').length, 0); -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 0); diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js index c5d724b..c7d6b7d 100644 --- a/test/simple/test-stream-pipe-error-handling.js +++ b/test/simple/test-stream-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream').Stream; +var Stream = require('../../').Stream; (function testErrorListenerCatches() { var source = new Stream(); diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js index cb9d5fe..56f8d61 100644 --- a/test/simple/test-stream-pipe-event.js +++ b/test/simple/test-stream-pipe-event.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. 
var common = require('../common'); -var stream = require('stream'); +var stream = require('../../'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js index f2e6ec2..a5c9bf9 100644 --- a/test/simple/test-stream-push-order.js +++ b/test/simple/test-stream-push-order.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var assert = require('assert'); var s = new Readable({ diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js index 06f43dc..1701a9a 100644 --- a/test/simple/test-stream-push-strings.js +++ b/test/simple/test-stream-push-strings.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var util = require('util'); util.inherits(MyStream, Readable); diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js index ba6a577..a8e6f7b 100644 --- a/test/simple/test-stream-readable-event.js +++ b/test/simple/test-stream-readable-event.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; (function first() { // First test, not reading when the readable is added. diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js index 2891ad6..11689ba 100644 --- a/test/simple/test-stream-readable-flow-recursion.js +++ b/test/simple/test-stream-readable-flow-recursion.js @@ -27,7 +27,7 @@ var assert = require('assert'); // more data continuously, but without triggering a nextTick // warning or RangeError. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; // throw an error if we trigger a nextTick warning. process.throwDeprecation = true; diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js index 0c96476..7827538 100644 --- a/test/simple/test-stream-unshift-empty-chunk.js +++ b/test/simple/test-stream-unshift-empty-chunk.js @@ -24,7 +24,7 @@ var assert = require('assert'); // This test verifies that stream.unshift(Buffer(0)) or // stream.unshift('') does not set state.reading=false. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var nChunks = 10; diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js index 83fd9fa..17c18aa 100644 --- a/test/simple/test-stream-unshift-read-race.js +++ b/test/simple/test-stream-unshift-read-race.js @@ -29,7 +29,7 @@ var assert = require('assert'); // 3. push() after the EOF signaling null is an error. // 4. _read() is not called after pushing the EOF null chunk. -var stream = require('stream'); +var stream = require('../../'); var hwm = 10; var r = stream.Readable({ highWaterMark: hwm }); var chunks = 10; @@ -51,7 +51,14 @@ r._read = function(n) { function push(fast) { assert(!pushedNull, 'push() after null push'); - var c = pos >= data.length ? 
null : data.slice(pos, pos + n); + var c; + if (pos >= data.length) + c = null; + else { + if (n + pos > data.length) + n = data.length - pos; + c = data.slice(pos, pos + n); + } pushedNull = c === null; if (fast) { pos += n; diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js index 5b49e6e..b5321f3 100644 --- a/test/simple/test-stream-writev.js +++ b/test/simple/test-stream-writev.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var queue = []; for (var decode = 0; decode < 2; decode++) { diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js index 3814bf0..248c1be 100644 --- a/test/simple/test-stream2-basic.js +++ b/test/simple/test-stream2-basic.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js index 6cdd4e9..f0fa84b 100644 --- a/test/simple/test-stream2-compatibility.js +++ b/test/simple/test-stream2-compatibility.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js index 39b274f..006a19b 100644 --- a/test/simple/test-stream2-finish-pipe.js +++ b/test/simple/test-stream2-finish-pipe.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Buffer = require('buffer').Buffer; var r = new stream.Readable(); diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js deleted file mode 100644 index e162406..0000000 --- a/test/simple/test-stream2-fs.js +++ /dev/null @@ -1,72 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- - -var common = require('../common.js'); -var R = require('_stream_readable'); -var assert = require('assert'); - -var fs = require('fs'); -var FSReadable = fs.ReadStream; - -var path = require('path'); -var file = path.resolve(common.fixturesDir, 'x1024.txt'); - -var size = fs.statSync(file).size; - -var expectLengths = [1024]; - -var util = require('util'); -var Stream = require('stream'); - -util.inherits(TestWriter, Stream); - -function TestWriter() { - Stream.apply(this); - this.buffer = []; - this.length = 0; -} - -TestWriter.prototype.write = function(c) { - this.buffer.push(c.toString()); - this.length += c.length; - return true; -}; - -TestWriter.prototype.end = function(c) { - if (c) this.buffer.push(c.toString()); - this.emit('results', this.buffer); -} - -var r = new FSReadable(file); -var w = new TestWriter(); - -w.on('results', function(res) { - console.error(res, w.length); - assert.equal(w.length, size); - var l = 0; - assert.deepEqual(res.map(function (c) { - return c.length; - }), expectLengths); - console.log('ok'); -}); - -r.pipe(w); diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js deleted file mode 100644 index 15cffc2..0000000 --- a/test/simple/test-stream2-httpclient-response-end.js +++ /dev/null @@ -1,52 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- -var common = require('../common.js'); -var assert = require('assert'); -var http = require('http'); -var msg = 'Hello'; -var readable_event = false; -var end_event = false; -var server = http.createServer(function(req, res) { - res.writeHead(200, {'Content-Type': 'text/plain'}); - res.end(msg); -}).listen(common.PORT, function() { - http.get({port: common.PORT}, function(res) { - var data = ''; - res.on('readable', function() { - console.log('readable event'); - readable_event = true; - data += res.read(); - }); - res.on('end', function() { - console.log('end event'); - end_event = true; - assert.strictEqual(msg, data); - server.close(); - }); - }); -}); - -process.on('exit', function() { - assert(readable_event); - assert(end_event); -}); - diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js index 2fbfbca..667985b 100644 --- a/test/simple/test-stream2-large-read-stall.js +++ b/test/simple/test-stream2-large-read-stall.js @@ -30,7 +30,7 @@ var PUSHSIZE = 20; var PUSHCOUNT = 1000; var HWM = 50; -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable({ highWaterMark: HWM }); @@ -39,23 +39,23 @@ var rs = r._readableState; r._read = push; r.on('readable', function() { - console.error('>> readable'); + //console.error('>> readable'); do { - console.error(' > read(%d)', READSIZE); + //console.error(' > read(%d)', READSIZE); var ret = r.read(READSIZE); - console.error(' < %j (%d remain)', ret && ret.length, rs.length); + //console.error(' < %j (%d remain)', ret && ret.length, rs.length); } while (ret && ret.length === READSIZE); - console.error('<< after read()', - ret && ret.length, - rs.needReadable, - rs.length); + //console.error('<< after read()', + // ret && ret.length, + // rs.needReadable, + // rs.length); }); var endEmitted = false; r.on('end', function() { endEmitted = true; - console.error('end'); + //console.error('end'); }); var pushes = 0; @@ -64,11 +64,11 @@ function push() { return; if (pushes++ === PUSHCOUNT) { - console.error(' push(EOF)'); + //console.error(' push(EOF)'); return r.push(null); } - console.error(' push #%d', pushes); + //console.error(' push #%d', pushes); if (r.push(new Buffer(PUSHSIZE))) setTimeout(push); } diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js index 3e6931d..ff47d89 100644 --- a/test/simple/test-stream2-objects.js +++ b/test/simple/test-stream2-objects.js @@ -21,8 +21,8 @@ var common = require('../common.js'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var assert = require('assert'); // tiny node-tap lookalike. 
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js index cf7531c..e3f3e4e 100644 --- a/test/simple/test-stream2-pipe-error-handling.js +++ b/test/simple/test-stream2-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); (function testErrorListenerCatches() { var count = 1000; diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js index 5e8e3cb..53b2616 100755 --- a/test/simple/test-stream2-pipe-error-once-listener.js +++ b/test/simple/test-stream2-pipe-error-once-listener.js @@ -24,7 +24,7 @@ var common = require('../common.js'); var assert = require('assert'); var util = require('util'); -var stream = require('stream'); +var stream = require('../../'); var Read = function() { diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js index b63edc3..eb2b0e9 100644 --- a/test/simple/test-stream2-push.js +++ b/test/simple/test-stream2-push.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; var assert = require('assert'); diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js index e8a7305..9740a47 100644 --- a/test/simple/test-stream2-read-sync-stack.js +++ b/test/simple/test-stream2-read-sync-stack.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var N = 256 * 1024; diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js index cd30178..4b1659d 100644 --- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js +++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js @@ -22,10 +22,9 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; test1(); -test2(); function test1() { var r = new Readable(); @@ -88,31 +87,3 @@ function test1() { console.log('ok'); }); } - -function test2() { - var r = new Readable({ encoding: 'base64' }); - var reads = 5; - r._read = function(n) { - if (!reads--) - return r.push(null); // EOF - else - return r.push(new Buffer('x')); - }; - - var results = []; - function flow() { - var chunk; - while (null !== (chunk = r.read())) - results.push(chunk + ''); - } - r.on('readable', flow); - r.on('end', function() { - results.push('EOF'); - }); - flow(); - - process.on('exit', function() { - assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]); - console.log('ok'); - }); -} diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js index 7c96ffe..04a96f5 100644 --- a/test/simple/test-stream2-readable-from-list.js +++ b/test/simple/test-stream2-readable-from-list.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var fromList = require('_stream_readable')._fromList; +var fromList = require('../../lib/_stream_readable')._fromList; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js index 675da8e..51fd3d5 100644 --- a/test/simple/test-stream2-readable-legacy-drain.js +++ b/test/simple/test-stream2-readable-legacy-drain.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream'); +var Stream = require('../../'); var Readable = Stream.Readable; var r = new Readable(); diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js index 7314ae7..c971898 100644 --- a/test/simple/test-stream2-readable-non-empty-end.js +++ b/test/simple/test-stream2-readable-non-empty-end.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var len = 0; var chunks = new Array(10); diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js index 2e5cf25..fd8a3dc 100644 --- a/test/simple/test-stream2-readable-wrap-empty.js +++ b/test/simple/test-stream2-readable-wrap-empty.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var EE = require('events').EventEmitter; var oldStream = new EE(); diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js index 90eea01..6b177f7 100644 --- a/test/simple/test-stream2-readable-wrap.js +++ b/test/simple/test-stream2-readable-wrap.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var EE = require('events').EventEmitter; var testRuns = 0, completedRuns = 0; diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js index 5d2c32a..685531b 100644 --- a/test/simple/test-stream2-set-encoding.js +++ b/test/simple/test-stream2-set-encoding.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var util = require('util'); // tiny node-tap lookalike. diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js index 9c9ddd8..a0cacc6 100644 --- a/test/simple/test-stream2-transform.js +++ b/test/simple/test-stream2-transform.js @@ -21,8 +21,8 @@ var assert = require('assert'); var common = require('../common.js'); -var PassThrough = require('_stream_passthrough'); -var Transform = require('_stream_transform'); +var PassThrough = require('../../').PassThrough; +var Transform = require('../../').Transform; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js index d66dc3c..365b327 100644 --- a/test/simple/test-stream2-unpipe-drain.js +++ b/test/simple/test-stream2-unpipe-drain.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var crypto = require('crypto'); var util = require('util'); diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js index 99f8746..17c92ae 100644 --- a/test/simple/test-stream2-unpipe-leak.js +++ b/test/simple/test-stream2-unpipe-leak.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var chunk = new Buffer('hallo'); diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js index 704100c..209c3a6 100644 --- a/test/simple/test-stream2-writable.js +++ b/test/simple/test-stream2-writable.js @@ -20,8 +20,8 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var W = require('_stream_writable'); -var D = require('_stream_duplex'); +var W = require('../../').Writable; +var D = require('../../').Duplex; var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js index b91bde3..2f72c15 100644 --- a/test/simple/test-stream3-pause-then-read.js +++ b/test/simple/test-stream3-pause-then-read.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; ����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/������000755 �000766 �000024 �00000000000 12456115120 032372� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/LICENSE���000644 �000766 �000024 �00000002110 12455173731 032636� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/��������������������000755 �000766 �000024 �00000000000 12456115120 034222� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/package.json���������������������000644 �000766 �000024 �00000003265 12455173731 034054� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "readable-stream", "version": "1.1.13", "description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "3b672fd7ae92acf5b4ffdbabf74b372a0a56b051", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.1.13", "_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "_from": "readable-stream@>=1.1.0 <1.2.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz", "readme": "ERROR: No README data found!" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/passthrough.js�������������������000644 �000766 �000024 �00000000071 12455173731 034463� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_passthrough.js") �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/readable.js����������������������000644 �000766 �000024 �00000000551 12455173731 033656� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = require('stream'); exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); �������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/README.md�000644 �000766 �000024 �00000002430 12455173731 033115� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# readable-stream ***Node-core streams for userland*** [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/) This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core. **readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12. **readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/transform.js���������������������000644 �000766 �000024 �00000000067 12455173731 034134� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_transform.js") �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/writable.js����������������������000644 �000766 �000024 �00000000066 12455173731 033731� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_writable.js") 
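A minimal sketch, assuming only the public API that readable.js and writable.js above re-export: consumers require `readable-stream` in place of the core `stream` module to pin a stable streams implementation, which is the same substitution the bundled test diffs earlier in this archive make with `require('../../')`.

```js
// Minimal sketch (illustrative only, not part of the bundled tarball):
// use readable-stream as a drop-in for the core 'stream' module.
var Readable = require('readable-stream').Readable;

var r = new Readable();
r._read = function () {
  r.push('hello ');
  r.push('world\n');
  r.push(null); // signal EOF
};
r.pipe(process.stdout);
```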
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/�������000755 �000766 �000024 �00000000000 12456115120 036536� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/������������000755 �000766 �000024 �00000000000 12456115120 035674� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/�����000755 �000766 �000024 �00000000000 12456115120 037215� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/.npmignore������������000644 �000766 �000024 �00000000013 12455173731 041221� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������build test ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/index.js�000644 �000766 �000024 �00000015006 12455173731 040677� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������var Buffer = 
require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/LICENSE��000644 �000766 �000024 �00000002064 12455173731 040237� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
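The StringDecoder source above documents that it splits a series of buffers into strings without breaking multi-byte characters apart: a partial character at the end of one buffer is held back until the remaining bytes arrive. A minimal usage sketch, assuming only the interface shown in index.js above (the euro-sign bytes are just an illustrative input):

```js
// Minimal sketch (illustrative only): feed a 3-byte UTF-8 character in two pieces.
var StringDecoder = require('string_decoder').StringDecoder;
var decoder = new StringDecoder('utf8');

var euro = new Buffer([0xE2, 0x82, 0xAC]); // '€' encoded as UTF-8

console.log(JSON.stringify(decoder.write(euro.slice(0, 1)))); // "" — incomplete char is buffered
console.log(JSON.stringify(decoder.write(euro.slice(1))));    // "€" — emitted once complete
```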
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/package.json����������000644 �000766 �000024 �00000002527 12455173731 041524� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_from": "string_decoder@>=0.10.0 <0.11.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "readme": "ERROR: No README data found!" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/README.md000644 �000766 �000024 �00000000762 12455173731 040514� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. 
**Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.��������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/build/������000755 �000766 �000024 �00000000000 12456115120 036773� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/component.json��000644 �000766 �000024 �00000000726 12455173731 040611� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/index.js����000644 �000766 �000024 �00000000170 12455173731 037352� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/package.json000644 �000766 �000024 �00000005534 12455173731 040204� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" }, 
"dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "_id": "isarray@0.0.1", "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "_from": "isarray@0.0.1", "_npmVersion": "1.2.18", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/README.md���000644 �000766 �000024 �00000003025 12455173731 037166� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). 
With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/isarray/build/build.js��000644 �000766 �000024 �00000007771 12455173731 040457� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib��������������������������������������������������������������������������������������������������������������������������������������������� /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. 
* * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. 
*/ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); �������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/float.patch000644 �000766 �000024 �00000037626 12455173731 040715� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. 
- */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? 
': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? ': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}����������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/lib/���000755 �000766 �000024 �00000000000 12456115120 037304� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/package.json������������000644 �000766 �000024 �00000002523 12455173731 041041� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": 
"git://github.com/isaacs/core-util-is" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is", "_id": "core-util-is@1.0.1", "dist": { "shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" }, "_from": "core-util-is@>=1.0.0 <1.1.0", "_npmVersion": "1.3.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz", "scripts": {} } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/README.md��000644 �000766 �000024 �00000000103 12455173731 040022� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������# core-util-is The `util.is*` functions introduced in Node v0.12. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/util.js000644 �000766 �000024 �00000003526 12455173731 040072� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/lib/util.js000644 �000766 �000024 �00000003562 12455173731 040640� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }����������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_duplex.js������������000644 �000766 �000024 �00000003215 12455173731 035700� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. 
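// Illustrative sketch of the Readable+Writable combination described above,
// written against Node's public `stream` and `util` modules rather than this
// internal file (the LoopbackDuplex name is made up for the example): the
// constructor runs both base constructors, _read() serves the readable side
// and _write() serves the writable side.
var publicStream = require('stream');
var nodeUtil = require('util');

function LoopbackDuplex(options) {
  if (!(this instanceof LoopbackDuplex)) return new LoopbackDuplex(options);
  publicStream.Duplex.call(this, options);
}
nodeUtil.inherits(LoopbackDuplex, publicStream.Duplex);

// Readable side: nothing to pull on demand; data arrives through _write below.
LoopbackDuplex.prototype._read = function(n) {};

// Writable side: echo every written chunk back out of the readable side.
LoopbackDuplex.prototype._write = function(chunk, encoding, cb) {
  this.push(chunk);
  cb();
};

var loop = new LoopbackDuplex();
loop.on('data', function(chunk) { console.log('echoed:', chunk.toString()); });
loop.on('finish', function() { loop.push(null); });  // end the readable side too
loop.write('hello');
loop.end();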
module.exports = Duplex; /*<replacement>*/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /*</replacement>*/ /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_passthrough.js�������000644 �000766 �000024 �00000001121 12455173731 036740� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 
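// Illustrative usage sketch of the pass-through behaviour described above,
// assuming Node's public stream.PassThrough export rather than this internal
// file: every chunk written in comes out unchanged, which makes it handy as an
// observation point ("tap") in the middle of a pipe chain.
var PublicPassThrough = require('stream').PassThrough;

var tap = new PublicPassThrough();
tap.on('data', function(chunk) {
  console.log('passed through unchanged:', chunk.toString());
});

tap.write('first chunk');
tap.write('second chunk');
tap.end();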
module.exports = PassThrough; var Transform = require('./_stream_transform'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_readable.js����������000644 �000766 �000024 �00000060371 12455173731 036144� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = Readable; /*<replacement>*/ var isArray = require('isarray'); /*</replacement>*/ /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /*<replacement>*/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /*</replacement>*/ var Stream = require('stream'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var StringDecoder; /*<replacement>*/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /*</replacement>*/ util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // Crypto is kind of old and crusty. 
Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (util.isString(chunk) && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (util.isNullOrUndefined(chunk)) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. 
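// Illustrative usage sketch for setEncoding(), defined just below (assumes the
// public require('stream').Readable rather than this internal file): attaching
// a StringDecoder makes 'data'/read() hand back strings instead of Buffers, and
// a multi-byte character split across two pushes is still decoded whole.
var PublicReadable = require('stream').Readable;

var src = new PublicReadable();
src._read = function() {};        // data is pushed manually below
src.setEncoding('utf8');

src.on('data', function(chunk) {
  console.log(typeof chunk, JSON.stringify(chunk));   // string "€"
});

// The euro sign is the 3-byte UTF-8 sequence e2 82 ac, split here on purpose.
src.push(new Buffer([0xe2, 0x82]));
src.push(new Buffer([0xac]));
src.push(null);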
Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (isNaN(n) || util.isNull(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (!util.isNumber(n) || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. 
if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (util.isNull(ret)) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (!util.isNull(ret)) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. 
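// Illustrative sketch of overriding the abstract _read() hook documented above
// (assumes the public require('stream').Readable and require('util').inherits;
// the Counter name is made up for the example): a small source that pushes a
// few chunks on demand and then signals EOF with push(null).
var ReadableCtor = require('stream').Readable;

function Counter(limit) {
  if (!(this instanceof Counter)) return new Counter(limit);
  ReadableCtor.call(this);
  this._index = 0;
  this._limit = limit;
}
require('util').inherits(Counter, ReadableCtor);

Counter.prototype._read = function(n) {
  // Each call asks the source for more data; push(null) signals EOF.
  if (this._index >= this._limit) return this.push(null);
  this.push(String(this._index++) + '\n');
};

new Counter(3).pipe(process.stdout);   // prints 0, 1, 2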
Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. 
if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { var self = this; process.nextTick(function() { debug('readable nexttick read 0'); self.read(0); }); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; if (!state.reading) { debug('resume read 0'); this.read(0); } resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; process.nextTick(function() { resume_(stream, state); }); } } function resume_(stream, state) { state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. 
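// Illustrative usage sketch for wrap(), defined just below (assumes the public
// require('stream').Readable; the legacySource emitter is a made-up stand-in
// for an old-style stream that only emits 'data'/'end'): wrapping it yields a
// streams2 Readable that can be consumed with read()/pipe().
var StreamsReadable = require('stream').Readable;
var LegacyEmitter = require('events').EventEmitter;

var legacySource = new LegacyEmitter();
legacySource.pause = function() {};    // old-style streams expose pause/resume
legacySource.resume = function() {};

var wrapped = new StreamsReadable().wrap(legacySource);
wrapped.on('data', function(chunk) {
  console.log('via wrap():', chunk.toString());
});

legacySource.emit('data', new Buffer('legacy chunk'));
legacySource.emit('end');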
Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); if (!chunk || !state.objectMode && !chunk.length) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (util.isFunction(stream[i]) && util.isUndefined(this[i])) { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. 
if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } }); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_transform.js���������000644 �000766 �000024 �00000014107 12455173731 036414� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
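// Illustrative sketch of the transform behaviour described above, assuming the
// public require('stream').Transform rather than this internal file (the
// UpperCase name is made up for the example): each written chunk is upper-cased
// and pushed to the readable side, and calling cb() tells the machinery the
// chunk has been consumed so back-pressure keeps working.
var PublicTransform = require('stream').Transform;

function UpperCase(options) {
  if (!(this instanceof UpperCase)) return new UpperCase(options);
  PublicTransform.call(this, options);
}
require('util').inherits(UpperCase, PublicTransform);

UpperCase.prototype._transform = function(chunk, encoding, cb) {
  // One written chunk may produce zero or more push()es before cb() is called.
  cb(null, chunk.toString().toUpperCase());
};

process.stdin.pipe(new UpperCase()).pipe(process.stdout);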
module.exports = Transform; var Duplex = require('./_stream_duplex'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (!util.isNullOrUndefined(data)) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('prefinish', function() { if (util.isFunction(this._flush)) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/sha/node_modules/readable-stream/lib/_stream_writable.js����������000644 �000766 �000024 �00000027237 12455173731 036222� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Writable.WritableState = WritableState; /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. 
this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
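// Illustrative sketch of the objectMode accounting described above, assuming
// the public require('stream').Writable rather than this internal file (the
// highWaterMark of 2 is just for demonstration): in objectMode every chunk
// counts as length 1, so write() starts returning false after two buffered
// objects instead of after a byte count.
var PublicWritable = require('stream').Writable;

var sink = new PublicWritable({ objectMode: true, highWaterMark: 2 });
sink._write = function(obj, encoding, cb) {
  console.log('got object:', JSON.stringify(obj));
  setImmediate(cb);   // defer completion so the buffer can actually fill up
};

console.log(sink.write({ n: 1 }));   // true  -- still below the 2-object mark
console.log(sink.write({ n: 2 }));   // false -- watermark of 2 objects reached
sink.end({ n: 3 });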
function validChunk(stream, state, chunk, cb) { var valid = true; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (util.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (!util.isFunction(cb)) cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.buffer.length) clearBuffer(this, state); } }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && util.isString(chunk)) { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (util.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
if (!ret) state.needDrain = true; if (state.writing || state.corked) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, false, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { state.pendingcb--; cb(er); }); else { state.pendingcb--; cb(er); } stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.corked && !state.bufferProcessing && state.buffer.length) { clearBuffer(stream, state); } if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; if (stream._writev && state.buffer.length > 1) { // Fast case, write everything using _writev() var cbs = []; for (var c = 0; c < state.buffer.length; c++) cbs.push(state.buffer[c].callback); // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; doWrite(stream, state, true, state.length, state.buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer state.buffer = []; } else { // Slow case, write chunks one-by-one for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { c++; break; } } if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (util.isFunction(chunk)) { cb = chunk; chunk = null; encoding = null; } else if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (!util.isNullOrUndefined(chunk)) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(stream, state) { return (state.ending && state.length === 0 && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(stream, state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else prefinish(stream, state); } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) process.nextTick(cb); else stream.once('finish', cb); } state.ended = true; } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/.npmignore��������������������������000644 �000766 �000024 �00000000007 12455173731 026654� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# nada �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/bin/��������������������������������000755 �000766 �000024 �00000000000 12456115120 025415� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/foot.js.txt�������������������������000644 �000766 �000024 �00000000164 12455173731 027004� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ })( typeof exports === 'object' ? exports : typeof define === 'function' && define.amd ? {} : semver = {} ); ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/head.js.txt�������������������������000644 �000766 �000024 �00000000027 12455173731 026734� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;(function(exports) { ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/LICENSE�����������������������������000644 �000766 �000024 �00000002436 12455173731 025672� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/Makefile����������������������������000644 �000766 �000024 �00000000754 12455173731 026326� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������files = semver.browser.js \ semver.min.js \ semver.browser.js.gz \ semver.min.js.gz all: $(files) clean: rm -f $(files) semver.browser.js: head.js.txt semver.js foot.js.txt ( cat head.js.txt; \ cat semver.js | \ egrep -v '^ *\/\* nomin \*\/' | \ perl -pi -e 's/debug\([^\)]+\)//g'; \ cat foot.js.txt ) > semver.browser.js semver.min.js: semver.browser.js uglifyjs -m <semver.browser.js >semver.min.js %.gz: % gzip --stdout -9 <$< >$@ .PHONY: all clean ��������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/package.json������������������������000644 �000766 �000024 �00000002534 12455173731 027152� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "semver", "version": "4.2.0", "description": "The semantic version parser used by npm.", "main": "semver.js", "browser": "semver.browser.js", "min": "semver.min.js", "scripts": { "test": "tap test/*.js", "prepublish": "make" }, "devDependencies": { "tap": "0.x >=0.0.4", "uglify-js": "~2.3.6" }, "license": "BSD", "repository": { "type": "git", "url": "git://github.com/isaacs/node-semver.git" }, "bin": { "semver": "./bin/semver" }, "gitHead": "f353d3337dd9bef990b6873e281342260b4e63ae", "bugs": { "url": "https://github.com/isaacs/node-semver/issues" }, "homepage": "https://github.com/isaacs/node-semver", "_id": "semver@4.2.0", "_shasum": "a571fd4adbe974fe32bd9cb4c5e249606f498423", "_from": "semver@>=4.2.0 <4.3.0", "_npmVersion": "2.1.14", "_nodeVersion": "0.10.33", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "dist": { "shasum": "a571fd4adbe974fe32bd9cb4c5e249606f498423", "tarball": "http://registry.npmjs.org/semver/-/semver-4.2.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/semver/-/semver-4.2.0.tgz", "readme": "ERROR: No README data found!" 
} ��������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/README.md���������������������������000644 �000766 �000024 �00000026635 12455173731 026153� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������semver(1) -- The semantic versioner for npm =========================================== ## Usage $ npm install semver semver.valid('1.2.3') // '1.2.3' semver.valid('a.b.c') // null semver.clean(' =v1.2.3 ') // '1.2.3' semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true semver.gt('1.2.3', '9.8.7') // false semver.lt('1.2.3', '9.8.7') // true As a command-line utility: $ semver -h Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | --preid <identifier> | -l | -rv] Test if version(s) satisfy the supplied range(s), and sort them. Multiple versions or ranges may be supplied, unless increment option is specified. In that case, only a single version may be used, and it is incremented by the specified level Program exits successfully if any valid version satisfies all supplied ranges, and prints all satisfying versions. If no versions are valid, or ranges are not satisfied, then exits failure. Versions are printed in ascending order, so supplying multiple versions to the utility will just sort them. ## Versions A "version" is described by the `v2.0.0` specification found at <http://semver.org/>. A leading `"="` or `"v"` character is stripped off and ignored. ## Ranges A `version range` is a set of `comparators` which specify versions that satisfy the range. A `comparator` is composed of an `operator` and a `version`. The set of primitive `operators` is: * `<` Less than * `<=` Less than or equal to * `>` Greater than * `>=` Greater than or equal to * `=` Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included. For example, the comparator `>=1.2.7` would match the versions `1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` or `1.1.0`. Comparators can be joined by whitespace to form a `comparator set`, which is satisfied by the **intersection** of all of the comparators it includes. A range is composed of one or more comparator sets, joined by `||`. A version matches a range if and only if every comparator in at least one of the `||`-separated comparator sets is satisfied by the version. For example, the range `>=1.2.7 <1.3.0` would match the versions `1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, or `1.1.0`. The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, `1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. ### Prerelease Tags If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then it will only be allowed to satisfy comparator sets if at least one comparator with the same `[major, minor, patch]` tuple also has a prerelease tag. 
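In code, that rule looks like the following (the same cases are walked through in prose below):

```javascript
semver.satisfies('1.2.3-alpha.7', '>1.2.3-alpha.3') // true  (same [major, minor, patch] tuple)
semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3') // false (prerelease on a different tuple)
semver.satisfies('3.4.5',         '>1.2.3-alpha.3') // true  (no prerelease tag at all)
```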
For example, the range `>1.2.3-alpha.3` would be allowed to match the version `1.2.3-alpha.7`, but it would *not* be satisfied by `3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater than" `1.2.3-alpha.3` according to the SemVer sort rules. The version range only accepts prerelease tags on the `1.2.3` version. The version `3.4.5` *would* satisfy the range, because it does not have a prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics. Second, a user who has opted into using a prerelease version has clearly indicated the intent to use *that specific* set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the *next* set of prerelease versions. #### Prerelease Identifiers The method `.inc` takes an additional `identifier` string argument that will append the value of the string as a prerelease identifier: ````javascript > semver.inc('1.2.3', 'pre', 'beta') '1.2.4-beta.0' ``` command-line example: ```shell $ semver 1.2.3 -i prerelease --preid beta 1.2.4-beta.0 ``` Which then can be used to increment further: ```shell $ semver 1.2.4-beta.0 -i prerelease 1.2.4-beta.1 ``` ### Advanced Range Syntax Advanced range syntax desugars to primitive comparators in deterministic ways. Advanced ranges may be combined in the same way as primitive comparators using white space or `||`. #### Hyphen Ranges `X.Y.Z - A.B.C` Specifies an inclusive set. * `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes. * `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts. * `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` * `1.2.3 - 2` := `>=1.2.3 <3.0.0` #### X-Ranges `1.2.x` `1.X` `1.2.*` `*` Any of `X`, `x`, or `*` may be used to "stand in" for one of the numeric values in the `[major, minor, patch]` tuple. * `*` := `>=0.0.0` (Any version satisfies) * `1.x` := `>=1.0.0 <2.0.0` (Matching major version) * `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) A partial version range is treated as an X-Range, so the special character is in fact optional. * `""` (empty string) := `*` := `>=0.0.0` * `1` := `1.x.x` := `>=1.0.0 <2.0.0` * `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` #### Tilde Ranges `~1.2.3` `~1.2` `~1` Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not. * `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` * `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) * `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) * `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` * `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) * `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) * `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. 
So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. #### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` Allows changes that do not modify the left-most non-zero digit in the `[major, minor, patch]` tuple. In other words, this allows patch and minor updates for versions `1.0.0` and above, patch updates for versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. Many authors treat a `0.x` version as if the `x` were the major "breaking-change" indicator. Caret ranges are ideal when an author may make breaking changes between `0.2.4` and `0.3.0` releases, which is a common practice. However, it presumes that there will *not* be breaking changes between `0.2.4` and `0.2.5`. It allows for changes that are presumed to be additive (but non-breaking), according to commonly observed practices. * `^1.2.3` := `>=1.2.3 <2.0.0` * `^0.2.3` := `>=0.2.3 <0.3.0` * `^0.0.3` := `>=0.0.3 <0.0.4` * `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. * `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the `0.0.3` version *only* will be allowed, if they are greater than or equal to `beta`. So, `0.0.3-pr.2` would be allowed. When parsing caret ranges, a missing `patch` value desugars to the number `0`, but will allow flexibility within that value, even if the major and minor versions are both `0`. * `^1.2.x` := `>=1.2.0 <2.0.0` * `^0.0.x` := `>=0.0.0 <0.1.0` * `^0.0` := `>=0.0.0 <0.1.0` A missing `minor` and `patch` values will desugar to zero, but also allow flexibility within those values, even if the major version is zero. * `^1.x` := `>=1.0.0 <2.0.0` * `^0.x` := `>=0.0.0 <1.0.0` ## Functions All methods and classes take a final `loose` boolean argument that, if true, will be more forgiving about not-quite-valid semver strings. The resulting output will always be 100% strict, of course. Strict-mode Comparators and Ranges will be strict about the SemVer strings that they parse. * `valid(v)`: Return the parsed version, or null if it's not valid. * `inc(v, release)`: Return the version incremented by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if it's not valid * `premajor` in one call will bump the version up to the next major version and down to a prerelease of that major version. `preminor`, and `prepatch` work the same way. * If called from a non-prerelease version, the `prerelease` will work the same as `prepatch`. It increments the patch version, then makes a prerelease. If the input version is already a prerelease it simply increments it. ### Comparison * `gt(v1, v2)`: `v1 > v2` * `gte(v1, v2)`: `v1 >= v2` * `lt(v1, v2)`: `v1 < v2` * `lte(v1, v2)`: `v1 <= v2` * `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, even if they're not the exact same string. You already know how to compare strings. * `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. * `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call the corresponding function above. `"==="` and `"!=="` do simple string comparison, but are included for completeness. Throws if an invalid comparison string is provided. * `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if `v2` is greater. 
Sorts in ascending order if passed to `Array.sort()`. * `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions in descending order when passed to `Array.sort()`. * `diff(v1, v2)`: Returns difference between two versions by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if the versions are the same. ### Ranges * `validRange(range)`: Return the valid range or null if it's not valid * `satisfies(version, range)`: Return true if the version satisfies the range. * `maxSatisfying(versions, range)`: Return the highest version in the list that satisfies the range, or `null` if none of them do. * `gtr(version, range)`: Return `true` if version is greater than all the versions possible in the range. * `ltr(version, range)`: Return `true` if version is less than all the versions possible in the range. * `outside(version, range, hilo)`: Return true if the version is outside the bounds of the range in either the high or low direction. The `hilo` argument must be either the string `'>'` or `'<'`. (This is the function called by `gtr` and `ltr`.) Note that, since ranges may be non-contiguous, a version might not be greater than a range, less than a range, *or* satisfy a range! For example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` until `2.0.0`, so the version `1.2.10` would not be greater than the range (because `2.0.1` satisfies, which is higher), nor less than the range (since `1.2.8` satisfies, which is lower), and it also does not satisfy the range. If you want to know if a version satisfies or does not satisfy a range, use the `satisfies(version, range)` function. ���������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/semver.browser.js�������������������000644 �000766 �000024 �00000073642 12455173731 030215� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;(function(exports) { // export the class if we are in a Node-like system. if (typeof module === 'object' && module.exports === exports) exports = module.exports = SemVer; // The debug function is excluded entirely from the minified version. // Note: this is the semver.org version of the spec that it implements // Not necessarily the package version of this code. exports.SEMVER_SPEC_VERSION = '2.0.0'; // The actual regexps go on exports.re var re = exports.re = []; var src = exports.src = []; var R = 0; // The following Regular Expressions can be used for tokenizing, // validating, and parsing SemVer version strings. // ## Numeric Identifier // A single `0`, or a non-zero digit followed by zero or more digits. var NUMERICIDENTIFIER = R++; src[NUMERICIDENTIFIER] = '0|[1-9]\\d*'; var NUMERICIDENTIFIERLOOSE = R++; src[NUMERICIDENTIFIERLOOSE] = '[0-9]+'; // ## Non-numeric Identifier // Zero or more digits, followed by a letter or hyphen, and then zero or // more letters, digits, or hyphens. var NONNUMERICIDENTIFIER = R++; src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*'; // ## Main Version // Three dot-separated numeric identifiers. var MAINVERSION = R++; src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' 
+ '(' + src[NUMERICIDENTIFIER] + ')\\.' + '(' + src[NUMERICIDENTIFIER] + ')'; var MAINVERSIONLOOSE = R++; src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')'; // ## Pre-release Version Identifier // A numeric identifier, or a non-numeric identifier. var PRERELEASEIDENTIFIER = R++; src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] + '|' + src[NONNUMERICIDENTIFIER] + ')'; var PRERELEASEIDENTIFIERLOOSE = R++; src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] + '|' + src[NONNUMERICIDENTIFIER] + ')'; // ## Pre-release Version // Hyphen, followed by one or more dot-separated pre-release version // identifiers. var PRERELEASE = R++; src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] + '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))'; var PRERELEASELOOSE = R++; src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] + '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))'; // ## Build Metadata Identifier // Any combination of digits, letters, or hyphens. var BUILDIDENTIFIER = R++; src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+'; // ## Build Metadata // Plus sign, followed by one or more period-separated build metadata // identifiers. var BUILD = R++; src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] + '(?:\\.' + src[BUILDIDENTIFIER] + ')*))'; // ## Full Version String // A main version, followed optionally by a pre-release version and // build metadata. // Note that the only major, minor, patch, and pre-release sections of // the version string are capturing groups. The build metadata is not a // capturing group, because it should not ever be used in version // comparison. var FULL = R++; var FULLPLAIN = 'v?' + src[MAINVERSION] + src[PRERELEASE] + '?' + src[BUILD] + '?'; src[FULL] = '^' + FULLPLAIN + '$'; // like full, but allows v1.2.3 and =1.2.3, which people do sometimes. // also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty // common in the npm registry. var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] + src[PRERELEASELOOSE] + '?' + src[BUILD] + '?'; var LOOSE = R++; src[LOOSE] = '^' + LOOSEPLAIN + '$'; var GTLT = R++; src[GTLT] = '((?:<|>)?=?)'; // Something like "2.*" or "1.2.x". // Note that "x.x" is a valid xRange identifer, meaning "any version" // Only the first item is strictly required. var XRANGEIDENTIFIERLOOSE = R++; src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*'; var XRANGEIDENTIFIER = R++; src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*'; var XRANGEPLAIN = R++; src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:' + src[PRERELEASE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGEPLAINLOOSE = R++; src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:' + src[PRERELEASELOOSE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGE = R++; src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$'; var XRANGELOOSE = R++; src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$'; // Tilde ranges. 
// Meaning is "reasonably at or greater than" var LONETILDE = R++; src[LONETILDE] = '(?:~>?)'; var TILDETRIM = R++; src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+'; re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g'); var tildeTrimReplace = '$1~'; var TILDE = R++; src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$'; var TILDELOOSE = R++; src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$'; // Caret ranges. // Meaning is "at least and backwards compatible with" var LONECARET = R++; src[LONECARET] = '(?:\\^)'; var CARETTRIM = R++; src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+'; re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g'); var caretTrimReplace = '$1^'; var CARET = R++; src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$'; var CARETLOOSE = R++; src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$'; // A simple gt/lt/eq thing, or just "" to indicate "any version" var COMPARATORLOOSE = R++; src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$'; var COMPARATOR = R++; src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$'; // An expression to strip any whitespace between the gtlt and the thing // it modifies, so that `> 1.2.3` ==> `>1.2.3` var COMPARATORTRIM = R++; src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] + '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')'; // this one has to use the /g flag re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g'); var comparatorTrimReplace = '$1$2$3'; // Something like `1.2.3 - 1.2.4` // Note that these all use the loose form, because they'll be // checked against either the strict or loose comparator form // later. var HYPHENRANGE = R++; src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAIN] + ')' + '\\s*$'; var HYPHENRANGELOOSE = R++; src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAINLOOSE] + ')' + '\\s*$'; // Star ranges basically just allow anything at all. var STAR = R++; src[STAR] = '(<|>)?=?\\s*\\*'; // Compile to actual regexp objects. // All are flag-free, unless they were created above with a flag. for (var i = 0; i < R; i++) { ; if (!re[i]) re[i] = new RegExp(src[i]); } exports.parse = parse; function parse(version, loose) { var r = loose ? re[LOOSE] : re[FULL]; return (r.test(version)) ? new SemVer(version, loose) : null; } exports.valid = valid; function valid(version, loose) { var v = parse(version, loose); return v ? v.version : null; } exports.clean = clean; function clean(version, loose) { var s = parse(version.trim().replace(/^[=v]+/, ''), loose); return s ? s.version : null; } exports.SemVer = SemVer; function SemVer(version, loose) { if (version instanceof SemVer) { if (version.loose === loose) return version; else version = version.version; } else if (typeof version !== 'string') { throw new TypeError('Invalid Version: ' + version); } if (!(this instanceof SemVer)) return new SemVer(version, loose); ; this.loose = loose; var m = version.trim().match(loose ? re[LOOSE] : re[FULL]); if (!m) throw new TypeError('Invalid Version: ' + version); this.raw = version; // these are actually numbers this.major = +m[1]; this.minor = +m[2]; this.patch = +m[3]; // numberify any prerelease numeric ids if (!m[4]) this.prerelease = []; else this.prerelease = m[4].split('.').map(function(id) { return (/^[0-9]+$/.test(id)) ? +id : id; }); this.build = m[5] ? m[5].split('.') : []; this.format(); } SemVer.prototype.format = function() { this.version = this.major + '.' + this.minor + '.' 
+ this.patch; if (this.prerelease.length) this.version += '-' + this.prerelease.join('.'); return this.version; }; SemVer.prototype.inspect = function() { return '<SemVer "' + this + '">'; }; SemVer.prototype.toString = function() { return this.version; }; SemVer.prototype.compare = function(other) { ; if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return this.compareMain(other) || this.comparePre(other); }; SemVer.prototype.compareMain = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return compareIdentifiers(this.major, other.major) || compareIdentifiers(this.minor, other.minor) || compareIdentifiers(this.patch, other.patch); }; SemVer.prototype.comparePre = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); // NOT having a prerelease is > having one if (this.prerelease.length && !other.prerelease.length) return -1; else if (!this.prerelease.length && other.prerelease.length) return 1; else if (!this.prerelease.length && !other.prerelease.length) return 0; var i = 0; do { var a = this.prerelease[i]; var b = other.prerelease[i]; ; if (a === undefined && b === undefined) return 0; else if (b === undefined) return 1; else if (a === undefined) return -1; else if (a === b) continue; else return compareIdentifiers(a, b); } while (++i); }; // preminor will bump the version up to the next minor release, and immediately // down to pre-release. premajor and prepatch work the same way. SemVer.prototype.inc = function(release, identifier) { switch (release) { case 'premajor': this.prerelease.length = 0; this.patch = 0; this.minor = 0; this.major++; this.inc('pre', identifier); break; case 'preminor': this.prerelease.length = 0; this.patch = 0; this.minor++; this.inc('pre', identifier); break; case 'prepatch': // If this is already a prerelease, it will bump to the next version // drop any prereleases that might already exist, since they are not // relevant at this point. this.prerelease.length = 0; this.inc('patch', identifier); this.inc('pre', identifier); break; // If the input is a non-prerelease version, this acts the same as // prepatch. case 'prerelease': if (this.prerelease.length === 0) this.inc('patch', identifier); this.inc('pre', identifier); break; case 'major': // If this is a pre-major version, bump up to the same major version. // Otherwise increment major. // 1.0.0-5 bumps to 1.0.0 // 1.1.0 bumps to 2.0.0 if (this.minor !== 0 || this.patch !== 0 || this.prerelease.length === 0) this.major++; this.minor = 0; this.patch = 0; this.prerelease = []; break; case 'minor': // If this is a pre-minor version, bump up to the same minor version. // Otherwise increment minor. // 1.2.0-5 bumps to 1.2.0 // 1.2.1 bumps to 1.3.0 if (this.patch !== 0 || this.prerelease.length === 0) this.minor++; this.patch = 0; this.prerelease = []; break; case 'patch': // If this is not a pre-release version, it will increment the patch. // If it is a pre-release it will bump up to the same patch version. // 1.2.0-5 patches to 1.2.0 // 1.2.0 patches to 1.2.1 if (this.prerelease.length === 0) this.patch++; this.prerelease = []; break; // This probably shouldn't be used publicly. // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. 
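// ('pre' increments the right-most numeric prerelease identifier, or appends a 0
// when there is no numeric identifier to bump; when an `identifier` argument is
// passed, the prerelease becomes [identifier, 0] unless it already starts with
// that identifier followed by a number.)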
case 'pre': if (this.prerelease.length === 0) this.prerelease = [0]; else { var i = this.prerelease.length; while (--i >= 0) { if (typeof this.prerelease[i] === 'number') { this.prerelease[i]++; i = -2; } } if (i === -1) // didn't increment anything this.prerelease.push(0); } if (identifier) { // 1.2.0-beta.1 bumps to 1.2.0-beta.2, // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 if (this.prerelease[0] === identifier) { if (isNaN(this.prerelease[1])) this.prerelease = [identifier, 0]; } else this.prerelease = [identifier, 0]; } break; default: throw new Error('invalid increment argument: ' + release); } this.format(); return this; }; exports.inc = inc; function inc(version, release, loose, identifier) { if (typeof(loose) === 'string') { identifier = loose; loose = undefined; } try { return new SemVer(version, loose).inc(release, identifier).version; } catch (er) { return null; } } exports.diff = diff; function diff(version1, version2) { if (eq(version1, version2)) { return null; } else { var v1 = parse(version1); var v2 = parse(version2); if (v1.prerelease.length || v2.prerelease.length) { for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return 'pre'+key; } } } return 'prerelease'; } for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return key; } } } } } exports.compareIdentifiers = compareIdentifiers; var numeric = /^[0-9]+$/; function compareIdentifiers(a, b) { var anum = numeric.test(a); var bnum = numeric.test(b); if (anum && bnum) { a = +a; b = +b; } return (anum && !bnum) ? -1 : (bnum && !anum) ? 1 : a < b ? -1 : a > b ? 1 : 0; } exports.rcompareIdentifiers = rcompareIdentifiers; function rcompareIdentifiers(a, b) { return compareIdentifiers(b, a); } exports.compare = compare; function compare(a, b, loose) { return new SemVer(a, loose).compare(b); } exports.compareLoose = compareLoose; function compareLoose(a, b) { return compare(a, b, true); } exports.rcompare = rcompare; function rcompare(a, b, loose) { return compare(b, a, loose); } exports.sort = sort; function sort(list, loose) { return list.sort(function(a, b) { return exports.compare(a, b, loose); }); } exports.rsort = rsort; function rsort(list, loose) { return list.sort(function(a, b) { return exports.rcompare(a, b, loose); }); } exports.gt = gt; function gt(a, b, loose) { return compare(a, b, loose) > 0; } exports.lt = lt; function lt(a, b, loose) { return compare(a, b, loose) < 0; } exports.eq = eq; function eq(a, b, loose) { return compare(a, b, loose) === 0; } exports.neq = neq; function neq(a, b, loose) { return compare(a, b, loose) !== 0; } exports.gte = gte; function gte(a, b, loose) { return compare(a, b, loose) >= 0; } exports.lte = lte; function lte(a, b, loose) { return compare(a, b, loose) <= 0; } exports.cmp = cmp; function cmp(a, op, b, loose) { var ret; switch (op) { case '===': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a === b; break; case '!==': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a !== b; break; case '': case '=': case '==': ret = eq(a, b, loose); break; case '!=': ret = neq(a, b, loose); break; case '>': ret = gt(a, b, loose); break; case '>=': ret = gte(a, b, loose); break; case '<': ret = lt(a, b, loose); break; case '<=': ret = lte(a, b, loose); break; default: throw new TypeError('Invalid operator: ' + op); } return ret; } exports.Comparator = Comparator; function 
Comparator(comp, loose) { if (comp instanceof Comparator) { if (comp.loose === loose) return comp; else comp = comp.value; } if (!(this instanceof Comparator)) return new Comparator(comp, loose); ; this.loose = loose; this.parse(comp); if (this.semver === ANY) this.value = ''; else this.value = this.operator + this.semver.version; ; } var ANY = {}; Comparator.prototype.parse = function(comp) { var r = this.loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var m = comp.match(r); if (!m) throw new TypeError('Invalid comparator: ' + comp); this.operator = m[1]; if (this.operator === '=') this.operator = ''; // if it literally is just '>' or '' then allow anything. if (!m[2]) this.semver = ANY; else this.semver = new SemVer(m[2], this.loose); }; Comparator.prototype.inspect = function() { return '<SemVer Comparator "' + this + '">'; }; Comparator.prototype.toString = function() { return this.value; }; Comparator.prototype.test = function(version) { ; if (this.semver === ANY) return true; if (typeof version === 'string') version = new SemVer(version, this.loose); return cmp(version, this.operator, this.semver, this.loose); }; exports.Range = Range; function Range(range, loose) { if ((range instanceof Range) && range.loose === loose) return range; if (!(this instanceof Range)) return new Range(range, loose); this.loose = loose; // First, split based on boolean or || this.raw = range; this.set = range.split(/\s*\|\|\s*/).map(function(range) { return this.parseRange(range.trim()); }, this).filter(function(c) { // throw out any that are not relevant for whatever reason return c.length; }); if (!this.set.length) { throw new TypeError('Invalid SemVer Range: ' + range); } this.format(); } Range.prototype.inspect = function() { return '<SemVer Range "' + this.range + '">'; }; Range.prototype.format = function() { this.range = this.set.map(function(comps) { return comps.join(' ').trim(); }).join('||').trim(); return this.range; }; Range.prototype.toString = function() { return this.range; }; Range.prototype.parseRange = function(range) { var loose = this.loose; range = range.trim(); ; // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE]; range = range.replace(hr, hyphenReplace); ; // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace); ; // `~ 1.2.3` => `~1.2.3` range = range.replace(re[TILDETRIM], tildeTrimReplace); // `^ 1.2.3` => `^1.2.3` range = range.replace(re[CARETTRIM], caretTrimReplace); // normalize spaces range = range.split(/\s+/).join(' '); // At this point, the range is completely trimmed and // ready to be split into comparators. var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var set = range.split(' ').map(function(comp) { return parseComparator(comp, loose); }).join(' ').split(/\s+/); if (this.loose) { // in loose mode, throw out any that are not valid comparators set = set.filter(function(comp) { return !!comp.match(compRe); }); } set = set.map(function(comp) { return new Comparator(comp, loose); }); return set; }; // Mostly just for testing and legacy API reasons exports.toComparators = toComparators; function toComparators(range, loose) { return new Range(range, loose).set.map(function(comp) { return comp.map(function(c) { return c.value; }).join(' ').trim().split(' '); }); } // comprised of xranges, tildes, stars, and gtlt's at this point. // already replaced the hyphen ranges // turn into a set of JUST comparators. 
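// Each replace* pass below desugars one flavour of range sugar into plain
// comparators, e.g. '^1.2.3' => '>=1.2.3 <2.0.0' and '~1.2.3' => '>=1.2.3 <1.3.0'.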
function parseComparator(comp, loose) { ; comp = replaceCarets(comp, loose); ; comp = replaceTildes(comp, loose); ; comp = replaceXRanges(comp, loose); ; comp = replaceStars(comp, loose); ; return comp; } function isX(id) { return !id || id.toLowerCase() === 'x' || id === '*'; } // ~, ~> --> * (any, kinda silly) // ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 // ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 // ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 // ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 // ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 function replaceTildes(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceTilde(comp, loose); }).join(' '); } function replaceTilde(comp, loose) { var r = loose ? re[TILDELOOSE] : re[TILDE]; return comp.replace(r, function(_, M, m, p, pr) { ; var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) // ~1.2 == >=1.2.0- <1.3.0- ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else if (pr) { ; if (pr.charAt(0) !== '-') pr = '-' + pr; ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else // ~1.2.3 == >=1.2.3 <1.3.0 ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; ; return ret; }); } // ^ --> * (any, kinda silly) // ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 // ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 // ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 // ^1.2.3 --> >=1.2.3 <2.0.0 // ^1.2.0 --> >=1.2.0 <2.0.0 function replaceCarets(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceCaret(comp, loose); }).join(' '); } function replaceCaret(comp, loose) { ; var r = loose ? re[CARETLOOSE] : re[CARET]; return comp.replace(r, function(_, M, m, p, pr) { ; var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) { if (M === '0') ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0'; } else if (pr) { ; if (pr.charAt(0) !== '-') pr = '-' + pr; if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + (+M + 1) + '.0.0'; } else { ; if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + ' <' + (+M + 1) + '.0.0'; } ; return ret; }); } function replaceXRanges(comp, loose) { ; return comp.split(/\s+/).map(function(comp) { return replaceXRange(comp, loose); }).join(' '); } function replaceXRange(comp, loose) { comp = comp.trim(); var r = loose ? re[XRANGELOOSE] : re[XRANGE]; return comp.replace(r, function(ret, gtlt, M, m, p, pr) { ; var xM = isX(M); var xm = xM || isX(m); var xp = xm || isX(p); var anyX = xp; if (gtlt === '=' && anyX) gtlt = ''; if (xM) { if (gtlt === '>' || gtlt === '<') { // nothing is allowed ret = '<0.0.0'; } else { // nothing is forbidden ret = '*'; } } else if (gtlt && anyX) { // replace X with 0 if (xm) m = 0; if (xp) p = 0; if (gtlt === '>') { // >1 => >=2.0.0 // >1.2 => >=1.3.0 // >1.2.3 => >= 1.2.4 gtlt = '>='; if (xm) { M = +M + 1; m = 0; p = 0; } else if (xp) { m = +m + 1; p = 0; } } else if (gtlt === '<=') { // <=0.7.x is actually <0.8.0, since any 0.7.x should // pass. Similarly, <=7.x is actually <8.0.0, etc. 
gtlt = '<' if (xm) M = +M + 1 else m = +m + 1 } ret = gtlt + M + '.' + m + '.' + p; } else if (xm) { ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; } else if (xp) { ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; } ; return ret; }); } // Because * is AND-ed with everything else in the comparator, // and '' means "any version", just remove the *s entirely. function replaceStars(comp, loose) { ; // Looseness is ignored here. star is always as loose as it gets! return comp.trim().replace(re[STAR], ''); } // This function is passed to string.replace(re[HYPHENRANGE]) // M, m, patch, prerelease, build // 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 // 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do // 1.2 - 3.4 => >=1.2.0 <3.5.0 function hyphenReplace($0, from, fM, fm, fp, fpr, fb, to, tM, tm, tp, tpr, tb) { if (isX(fM)) from = ''; else if (isX(fm)) from = '>=' + fM + '.0.0'; else if (isX(fp)) from = '>=' + fM + '.' + fm + '.0'; else from = '>=' + from; if (isX(tM)) to = ''; else if (isX(tm)) to = '<' + (+tM + 1) + '.0.0'; else if (isX(tp)) to = '<' + tM + '.' + (+tm + 1) + '.0'; else if (tpr) to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr; else to = '<=' + to; return (from + ' ' + to).trim(); } // if ANY of the sets match ALL of its comparators, then pass Range.prototype.test = function(version) { if (!version) return false; if (typeof version === 'string') version = new SemVer(version, this.loose); for (var i = 0; i < this.set.length; i++) { if (testSet(this.set[i], version)) return true; } return false; }; function testSet(set, version) { for (var i = 0; i < set.length; i++) { if (!set[i].test(version)) return false; } if (version.prerelease.length) { // Find the set of versions that are allowed to have prereleases // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 // That should allow `1.2.3-pr.2` to pass. // However, `1.2.4-alpha.notready` should NOT be allowed, // even though it's within the range set by the comparators. for (var i = 0; i < set.length; i++) { ; if (set[i].semver === ANY) return true; if (set[i].semver.prerelease.length > 0) { var allowed = set[i].semver; if (allowed.major === version.major && allowed.minor === version.minor && allowed.patch === version.patch) return true; } } // Version has a -pre, but it's not one of the ones we like. return false; } return true; } exports.satisfies = satisfies; function satisfies(version, range, loose) { try { range = new Range(range, loose); } catch (er) { return false; } return range.test(version); } exports.maxSatisfying = maxSatisfying; function maxSatisfying(versions, range, loose) { return versions.filter(function(version) { return satisfies(version, range, loose); }).sort(function(a, b) { return rcompare(a, b, loose); })[0] || null; } exports.validRange = validRange; function validRange(range, loose) { try { // Return '*' instead of '' so that truthiness works. // This will throw if it's invalid anyway return new Range(range, loose).range || '*'; } catch (er) { return null; } } // Determine if version is less than all the versions possible in the range exports.ltr = ltr; function ltr(version, range, loose) { return outside(version, range, '<', loose); } // Determine if version is greater than all the versions possible in the range. 
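// e.g. gtr('2.0.1', '^1.2.3') === true  (^1.2.3 allows nothing at or above 2.0.0)
//      ltr('1.0.0', '^1.2.3') === true  (everything the range allows is above 1.0.0)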
exports.gtr = gtr;
function gtr(version, range, loose) {
  return outside(version, range, '>', loose);
}

exports.outside = outside;
function outside(version, range, hilo, loose) {
  version = new SemVer(version, loose);
  range = new Range(range, loose);

  var gtfn, ltefn, ltfn, comp, ecomp;
  switch (hilo) {
    case '>':
      gtfn = gt;
      ltefn = lte;
      ltfn = lt;
      comp = '>';
      ecomp = '>=';
      break;
    case '<':
      gtfn = lt;
      ltefn = gte;
      ltfn = gt;
      comp = '<';
      ecomp = '<=';
      break;
    default:
      throw new TypeError('Must provide a hilo val of "<" or ">"');
  }

  // If it satisfies the range it is not outside
  if (satisfies(version, range, loose)) {
    return false;
  }

  // From now on, variable terms are as if we're in "gtr" mode.
  // but note that everything is flipped for the "ltr" function.
  for (var i = 0; i < range.set.length; ++i) {
    var comparators = range.set[i];

    var high = null;
    var low = null;

    comparators.forEach(function(comparator) {
      high = high || comparator;
      low = low || comparator;
      if (gtfn(comparator.semver, high.semver, loose)) {
        high = comparator;
      } else if (ltfn(comparator.semver, low.semver, loose)) {
        low = comparator;
      }
    });

    // If the edge version comparator has an operator then our version
    // isn't outside it
    if (high.operator === comp || high.operator === ecomp) {
      return false;
    }

    // If the lowest version comparator has an operator and our version
    // is less than it then it isn't higher than the range
    if ((!low.operator || low.operator === comp) &&
        ltefn(version, low.semver)) {
      return false;
    } else if (low.operator === ecomp && ltfn(version, low.semver)) {
      return false;
    }
  }
  return true;
}

// Use the define() function if we're in AMD land
if (typeof define === 'function' && define.amd)
  define(exports);

})(
  typeof exports === 'object' ? exports :
  typeof define === 'function' && define.amd ? {} :
  semver = {}
);
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/semver.browser.js.gz [binary gzip of semver.browser.js omitted]
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/semver.js
// export the class if we are in a Node-like system.
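// (semver.browser.js above is generated from this file: the Makefile wraps it
// in head.js.txt/foot.js.txt and strips the /* nomin */ debug lines.)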
if (typeof module === 'object' && module.exports === exports) exports = module.exports = SemVer; // The debug function is excluded entirely from the minified version. /* nomin */ var debug; /* nomin */ if (typeof process === 'object' && /* nomin */ process.env && /* nomin */ process.env.NODE_DEBUG && /* nomin */ /\bsemver\b/i.test(process.env.NODE_DEBUG)) /* nomin */ debug = function() { /* nomin */ var args = Array.prototype.slice.call(arguments, 0); /* nomin */ args.unshift('SEMVER'); /* nomin */ console.log.apply(console, args); /* nomin */ }; /* nomin */ else /* nomin */ debug = function() {}; // Note: this is the semver.org version of the spec that it implements // Not necessarily the package version of this code. exports.SEMVER_SPEC_VERSION = '2.0.0'; // The actual regexps go on exports.re var re = exports.re = []; var src = exports.src = []; var R = 0; // The following Regular Expressions can be used for tokenizing, // validating, and parsing SemVer version strings. // ## Numeric Identifier // A single `0`, or a non-zero digit followed by zero or more digits. var NUMERICIDENTIFIER = R++; src[NUMERICIDENTIFIER] = '0|[1-9]\\d*'; var NUMERICIDENTIFIERLOOSE = R++; src[NUMERICIDENTIFIERLOOSE] = '[0-9]+'; // ## Non-numeric Identifier // Zero or more digits, followed by a letter or hyphen, and then zero or // more letters, digits, or hyphens. var NONNUMERICIDENTIFIER = R++; src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*'; // ## Main Version // Three dot-separated numeric identifiers. var MAINVERSION = R++; src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' + '(' + src[NUMERICIDENTIFIER] + ')\\.' + '(' + src[NUMERICIDENTIFIER] + ')'; var MAINVERSIONLOOSE = R++; src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')'; // ## Pre-release Version Identifier // A numeric identifier, or a non-numeric identifier. var PRERELEASEIDENTIFIER = R++; src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] + '|' + src[NONNUMERICIDENTIFIER] + ')'; var PRERELEASEIDENTIFIERLOOSE = R++; src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] + '|' + src[NONNUMERICIDENTIFIER] + ')'; // ## Pre-release Version // Hyphen, followed by one or more dot-separated pre-release version // identifiers. var PRERELEASE = R++; src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] + '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))'; var PRERELEASELOOSE = R++; src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] + '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))'; // ## Build Metadata Identifier // Any combination of digits, letters, or hyphens. var BUILDIDENTIFIER = R++; src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+'; // ## Build Metadata // Plus sign, followed by one or more period-separated build metadata // identifiers. var BUILD = R++; src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] + '(?:\\.' + src[BUILDIDENTIFIER] + ')*))'; // ## Full Version String // A main version, followed optionally by a pre-release version and // build metadata. // Note that the only major, minor, patch, and pre-release sections of // the version string are capturing groups. The build metadata is not a // capturing group, because it should not ever be used in version // comparison. var FULL = R++; var FULLPLAIN = 'v?' + src[MAINVERSION] + src[PRERELEASE] + '?' + src[BUILD] + '?'; src[FULL] = '^' + FULLPLAIN + '$'; // like full, but allows v1.2.3 and =1.2.3, which people do sometimes. 
// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty // common in the npm registry. var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] + src[PRERELEASELOOSE] + '?' + src[BUILD] + '?'; var LOOSE = R++; src[LOOSE] = '^' + LOOSEPLAIN + '$'; var GTLT = R++; src[GTLT] = '((?:<|>)?=?)'; // Something like "2.*" or "1.2.x". // Note that "x.x" is a valid xRange identifer, meaning "any version" // Only the first item is strictly required. var XRANGEIDENTIFIERLOOSE = R++; src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*'; var XRANGEIDENTIFIER = R++; src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*'; var XRANGEPLAIN = R++; src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:' + src[PRERELEASE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGEPLAINLOOSE = R++; src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:' + src[PRERELEASELOOSE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGE = R++; src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$'; var XRANGELOOSE = R++; src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$'; // Tilde ranges. // Meaning is "reasonably at or greater than" var LONETILDE = R++; src[LONETILDE] = '(?:~>?)'; var TILDETRIM = R++; src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+'; re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g'); var tildeTrimReplace = '$1~'; var TILDE = R++; src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$'; var TILDELOOSE = R++; src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$'; // Caret ranges. // Meaning is "at least and backwards compatible with" var LONECARET = R++; src[LONECARET] = '(?:\\^)'; var CARETTRIM = R++; src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+'; re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g'); var caretTrimReplace = '$1^'; var CARET = R++; src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$'; var CARETLOOSE = R++; src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$'; // A simple gt/lt/eq thing, or just "" to indicate "any version" var COMPARATORLOOSE = R++; src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$'; var COMPARATOR = R++; src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$'; // An expression to strip any whitespace between the gtlt and the thing // it modifies, so that `> 1.2.3` ==> `>1.2.3` var COMPARATORTRIM = R++; src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] + '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')'; // this one has to use the /g flag re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g'); var comparatorTrimReplace = '$1$2$3'; // Something like `1.2.3 - 1.2.4` // Note that these all use the loose form, because they'll be // checked against either the strict or loose comparator form // later. var HYPHENRANGE = R++; src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAIN] + ')' + '\\s*$'; var HYPHENRANGELOOSE = R++; src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAINLOOSE] + ')' + '\\s*$'; // Star ranges basically just allow anything at all. var STAR = R++; src[STAR] = '(<|>)?=?\\s*\\*'; // Compile to actual regexp objects. // All are flag-free, unless they were created above with a flag. 
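// (Only the slots not already filled above are compiled by the loop below;
// TILDETRIM, CARETTRIM and COMPARATORTRIM were created earlier with the /g flag.)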
for (var i = 0; i < R; i++) { debug(i, src[i]); if (!re[i]) re[i] = new RegExp(src[i]); } exports.parse = parse; function parse(version, loose) { var r = loose ? re[LOOSE] : re[FULL]; return (r.test(version)) ? new SemVer(version, loose) : null; } exports.valid = valid; function valid(version, loose) { var v = parse(version, loose); return v ? v.version : null; } exports.clean = clean; function clean(version, loose) { var s = parse(version.trim().replace(/^[=v]+/, ''), loose); return s ? s.version : null; } exports.SemVer = SemVer; function SemVer(version, loose) { if (version instanceof SemVer) { if (version.loose === loose) return version; else version = version.version; } else if (typeof version !== 'string') { throw new TypeError('Invalid Version: ' + version); } if (!(this instanceof SemVer)) return new SemVer(version, loose); debug('SemVer', version, loose); this.loose = loose; var m = version.trim().match(loose ? re[LOOSE] : re[FULL]); if (!m) throw new TypeError('Invalid Version: ' + version); this.raw = version; // these are actually numbers this.major = +m[1]; this.minor = +m[2]; this.patch = +m[3]; // numberify any prerelease numeric ids if (!m[4]) this.prerelease = []; else this.prerelease = m[4].split('.').map(function(id) { return (/^[0-9]+$/.test(id)) ? +id : id; }); this.build = m[5] ? m[5].split('.') : []; this.format(); } SemVer.prototype.format = function() { this.version = this.major + '.' + this.minor + '.' + this.patch; if (this.prerelease.length) this.version += '-' + this.prerelease.join('.'); return this.version; }; SemVer.prototype.inspect = function() { return '<SemVer "' + this + '">'; }; SemVer.prototype.toString = function() { return this.version; }; SemVer.prototype.compare = function(other) { debug('SemVer.compare', this.version, this.loose, other); if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return this.compareMain(other) || this.comparePre(other); }; SemVer.prototype.compareMain = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return compareIdentifiers(this.major, other.major) || compareIdentifiers(this.minor, other.minor) || compareIdentifiers(this.patch, other.patch); }; SemVer.prototype.comparePre = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); // NOT having a prerelease is > having one if (this.prerelease.length && !other.prerelease.length) return -1; else if (!this.prerelease.length && other.prerelease.length) return 1; else if (!this.prerelease.length && !other.prerelease.length) return 0; var i = 0; do { var a = this.prerelease[i]; var b = other.prerelease[i]; debug('prerelease compare', i, a, b); if (a === undefined && b === undefined) return 0; else if (b === undefined) return 1; else if (a === undefined) return -1; else if (a === b) continue; else return compareIdentifiers(a, b); } while (++i); }; // preminor will bump the version up to the next minor release, and immediately // down to pre-release. premajor and prepatch work the same way. SemVer.prototype.inc = function(release, identifier) { switch (release) { case 'premajor': this.prerelease.length = 0; this.patch = 0; this.minor = 0; this.major++; this.inc('pre', identifier); break; case 'preminor': this.prerelease.length = 0; this.patch = 0; this.minor++; this.inc('pre', identifier); break; case 'prepatch': // If this is already a prerelease, it will bump to the next version // drop any prereleases that might already exist, since they are not // relevant at this point. 
this.prerelease.length = 0; this.inc('patch', identifier); this.inc('pre', identifier); break; // If the input is a non-prerelease version, this acts the same as // prepatch. case 'prerelease': if (this.prerelease.length === 0) this.inc('patch', identifier); this.inc('pre', identifier); break; case 'major': // If this is a pre-major version, bump up to the same major version. // Otherwise increment major. // 1.0.0-5 bumps to 1.0.0 // 1.1.0 bumps to 2.0.0 if (this.minor !== 0 || this.patch !== 0 || this.prerelease.length === 0) this.major++; this.minor = 0; this.patch = 0; this.prerelease = []; break; case 'minor': // If this is a pre-minor version, bump up to the same minor version. // Otherwise increment minor. // 1.2.0-5 bumps to 1.2.0 // 1.2.1 bumps to 1.3.0 if (this.patch !== 0 || this.prerelease.length === 0) this.minor++; this.patch = 0; this.prerelease = []; break; case 'patch': // If this is not a pre-release version, it will increment the patch. // If it is a pre-release it will bump up to the same patch version. // 1.2.0-5 patches to 1.2.0 // 1.2.0 patches to 1.2.1 if (this.prerelease.length === 0) this.patch++; this.prerelease = []; break; // This probably shouldn't be used publicly. // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. case 'pre': if (this.prerelease.length === 0) this.prerelease = [0]; else { var i = this.prerelease.length; while (--i >= 0) { if (typeof this.prerelease[i] === 'number') { this.prerelease[i]++; i = -2; } } if (i === -1) // didn't increment anything this.prerelease.push(0); } if (identifier) { // 1.2.0-beta.1 bumps to 1.2.0-beta.2, // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 if (this.prerelease[0] === identifier) { if (isNaN(this.prerelease[1])) this.prerelease = [identifier, 0]; } else this.prerelease = [identifier, 0]; } break; default: throw new Error('invalid increment argument: ' + release); } this.format(); return this; }; exports.inc = inc; function inc(version, release, loose, identifier) { if (typeof(loose) === 'string') { identifier = loose; loose = undefined; } try { return new SemVer(version, loose).inc(release, identifier).version; } catch (er) { return null; } } exports.diff = diff; function diff(version1, version2) { if (eq(version1, version2)) { return null; } else { var v1 = parse(version1); var v2 = parse(version2); if (v1.prerelease.length || v2.prerelease.length) { for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return 'pre'+key; } } } return 'prerelease'; } for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return key; } } } } } exports.compareIdentifiers = compareIdentifiers; var numeric = /^[0-9]+$/; function compareIdentifiers(a, b) { var anum = numeric.test(a); var bnum = numeric.test(b); if (anum && bnum) { a = +a; b = +b; } return (anum && !bnum) ? -1 : (bnum && !anum) ? 1 : a < b ? -1 : a > b ? 
1 : 0; } exports.rcompareIdentifiers = rcompareIdentifiers; function rcompareIdentifiers(a, b) { return compareIdentifiers(b, a); } exports.compare = compare; function compare(a, b, loose) { return new SemVer(a, loose).compare(b); } exports.compareLoose = compareLoose; function compareLoose(a, b) { return compare(a, b, true); } exports.rcompare = rcompare; function rcompare(a, b, loose) { return compare(b, a, loose); } exports.sort = sort; function sort(list, loose) { return list.sort(function(a, b) { return exports.compare(a, b, loose); }); } exports.rsort = rsort; function rsort(list, loose) { return list.sort(function(a, b) { return exports.rcompare(a, b, loose); }); } exports.gt = gt; function gt(a, b, loose) { return compare(a, b, loose) > 0; } exports.lt = lt; function lt(a, b, loose) { return compare(a, b, loose) < 0; } exports.eq = eq; function eq(a, b, loose) { return compare(a, b, loose) === 0; } exports.neq = neq; function neq(a, b, loose) { return compare(a, b, loose) !== 0; } exports.gte = gte; function gte(a, b, loose) { return compare(a, b, loose) >= 0; } exports.lte = lte; function lte(a, b, loose) { return compare(a, b, loose) <= 0; } exports.cmp = cmp; function cmp(a, op, b, loose) { var ret; switch (op) { case '===': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a === b; break; case '!==': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a !== b; break; case '': case '=': case '==': ret = eq(a, b, loose); break; case '!=': ret = neq(a, b, loose); break; case '>': ret = gt(a, b, loose); break; case '>=': ret = gte(a, b, loose); break; case '<': ret = lt(a, b, loose); break; case '<=': ret = lte(a, b, loose); break; default: throw new TypeError('Invalid operator: ' + op); } return ret; } exports.Comparator = Comparator; function Comparator(comp, loose) { if (comp instanceof Comparator) { if (comp.loose === loose) return comp; else comp = comp.value; } if (!(this instanceof Comparator)) return new Comparator(comp, loose); debug('comparator', comp, loose); this.loose = loose; this.parse(comp); if (this.semver === ANY) this.value = ''; else this.value = this.operator + this.semver.version; debug('comp', this); } var ANY = {}; Comparator.prototype.parse = function(comp) { var r = this.loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var m = comp.match(r); if (!m) throw new TypeError('Invalid comparator: ' + comp); this.operator = m[1]; if (this.operator === '=') this.operator = ''; // if it literally is just '>' or '' then allow anything. 
if (!m[2]) this.semver = ANY; else this.semver = new SemVer(m[2], this.loose); }; Comparator.prototype.inspect = function() { return '<SemVer Comparator "' + this + '">'; }; Comparator.prototype.toString = function() { return this.value; }; Comparator.prototype.test = function(version) { debug('Comparator.test', version, this.loose); if (this.semver === ANY) return true; if (typeof version === 'string') version = new SemVer(version, this.loose); return cmp(version, this.operator, this.semver, this.loose); }; exports.Range = Range; function Range(range, loose) { if ((range instanceof Range) && range.loose === loose) return range; if (!(this instanceof Range)) return new Range(range, loose); this.loose = loose; // First, split based on boolean or || this.raw = range; this.set = range.split(/\s*\|\|\s*/).map(function(range) { return this.parseRange(range.trim()); }, this).filter(function(c) { // throw out any that are not relevant for whatever reason return c.length; }); if (!this.set.length) { throw new TypeError('Invalid SemVer Range: ' + range); } this.format(); } Range.prototype.inspect = function() { return '<SemVer Range "' + this.range + '">'; }; Range.prototype.format = function() { this.range = this.set.map(function(comps) { return comps.join(' ').trim(); }).join('||').trim(); return this.range; }; Range.prototype.toString = function() { return this.range; }; Range.prototype.parseRange = function(range) { var loose = this.loose; range = range.trim(); debug('range', range, loose); // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE]; range = range.replace(hr, hyphenReplace); debug('hyphen replace', range); // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace); debug('comparator trim', range, re[COMPARATORTRIM]); // `~ 1.2.3` => `~1.2.3` range = range.replace(re[TILDETRIM], tildeTrimReplace); // `^ 1.2.3` => `^1.2.3` range = range.replace(re[CARETTRIM], caretTrimReplace); // normalize spaces range = range.split(/\s+/).join(' '); // At this point, the range is completely trimmed and // ready to be split into comparators. var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var set = range.split(' ').map(function(comp) { return parseComparator(comp, loose); }).join(' ').split(/\s+/); if (this.loose) { // in loose mode, throw out any that are not valid comparators set = set.filter(function(comp) { return !!comp.match(compRe); }); } set = set.map(function(comp) { return new Comparator(comp, loose); }); return set; }; // Mostly just for testing and legacy API reasons exports.toComparators = toComparators; function toComparators(range, loose) { return new Range(range, loose).set.map(function(comp) { return comp.map(function(c) { return c.value; }).join(' ').trim().split(' '); }); } // comprised of xranges, tildes, stars, and gtlt's at this point. // already replaced the hyphen ranges // turn into a set of JUST comparators. 
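// For example, per the tilde/caret/x-range rules implemented below:
//   "^1.2.3"  -> ">=1.2.3 <2.0.0"
//   "~1.2.3"  -> ">=1.2.3 <1.3.0"
//   "1.2.x"   -> ">=1.2.0 <1.3.0"
//   "*"       -> ""  (any version)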
function parseComparator(comp, loose) { debug('comp', comp); comp = replaceCarets(comp, loose); debug('caret', comp); comp = replaceTildes(comp, loose); debug('tildes', comp); comp = replaceXRanges(comp, loose); debug('xrange', comp); comp = replaceStars(comp, loose); debug('stars', comp); return comp; } function isX(id) { return !id || id.toLowerCase() === 'x' || id === '*'; } // ~, ~> --> * (any, kinda silly) // ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 // ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 // ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 // ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 // ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 function replaceTildes(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceTilde(comp, loose); }).join(' '); } function replaceTilde(comp, loose) { var r = loose ? re[TILDELOOSE] : re[TILDE]; return comp.replace(r, function(_, M, m, p, pr) { debug('tilde', comp, _, M, m, p, pr); var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) // ~1.2 == >=1.2.0- <1.3.0- ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else if (pr) { debug('replaceTilde pr', pr); if (pr.charAt(0) !== '-') pr = '-' + pr; ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else // ~1.2.3 == >=1.2.3 <1.3.0 ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; debug('tilde return', ret); return ret; }); } // ^ --> * (any, kinda silly) // ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 // ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 // ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 // ^1.2.3 --> >=1.2.3 <2.0.0 // ^1.2.0 --> >=1.2.0 <2.0.0 function replaceCarets(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceCaret(comp, loose); }).join(' '); } function replaceCaret(comp, loose) { debug('caret', comp, loose); var r = loose ? re[CARETLOOSE] : re[CARET]; return comp.replace(r, function(_, M, m, p, pr) { debug('caret', comp, _, M, m, p, pr); var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) { if (M === '0') ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0'; } else if (pr) { debug('replaceCaret pr', pr); if (pr.charAt(0) !== '-') pr = '-' + pr; if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + (+M + 1) + '.0.0'; } else { debug('no pr'); if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + ' <' + (+M + 1) + '.0.0'; } debug('caret return', ret); return ret; }); } function replaceXRanges(comp, loose) { debug('replaceXRanges', comp, loose); return comp.split(/\s+/).map(function(comp) { return replaceXRange(comp, loose); }).join(' '); } function replaceXRange(comp, loose) { comp = comp.trim(); var r = loose ? 
re[XRANGELOOSE] : re[XRANGE]; return comp.replace(r, function(ret, gtlt, M, m, p, pr) { debug('xRange', comp, ret, gtlt, M, m, p, pr); var xM = isX(M); var xm = xM || isX(m); var xp = xm || isX(p); var anyX = xp; if (gtlt === '=' && anyX) gtlt = ''; if (xM) { if (gtlt === '>' || gtlt === '<') { // nothing is allowed ret = '<0.0.0'; } else { // nothing is forbidden ret = '*'; } } else if (gtlt && anyX) { // replace X with 0 if (xm) m = 0; if (xp) p = 0; if (gtlt === '>') { // >1 => >=2.0.0 // >1.2 => >=1.3.0 // >1.2.3 => >= 1.2.4 gtlt = '>='; if (xm) { M = +M + 1; m = 0; p = 0; } else if (xp) { m = +m + 1; p = 0; } } else if (gtlt === '<=') { // <=0.7.x is actually <0.8.0, since any 0.7.x should // pass. Similarly, <=7.x is actually <8.0.0, etc. gtlt = '<' if (xm) M = +M + 1 else m = +m + 1 } ret = gtlt + M + '.' + m + '.' + p; } else if (xm) { ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; } else if (xp) { ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; } debug('xRange return', ret); return ret; }); } // Because * is AND-ed with everything else in the comparator, // and '' means "any version", just remove the *s entirely. function replaceStars(comp, loose) { debug('replaceStars', comp, loose); // Looseness is ignored here. star is always as loose as it gets! return comp.trim().replace(re[STAR], ''); } // This function is passed to string.replace(re[HYPHENRANGE]) // M, m, patch, prerelease, build // 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 // 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do // 1.2 - 3.4 => >=1.2.0 <3.5.0 function hyphenReplace($0, from, fM, fm, fp, fpr, fb, to, tM, tm, tp, tpr, tb) { if (isX(fM)) from = ''; else if (isX(fm)) from = '>=' + fM + '.0.0'; else if (isX(fp)) from = '>=' + fM + '.' + fm + '.0'; else from = '>=' + from; if (isX(tM)) to = ''; else if (isX(tm)) to = '<' + (+tM + 1) + '.0.0'; else if (isX(tp)) to = '<' + tM + '.' + (+tm + 1) + '.0'; else if (tpr) to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr; else to = '<=' + to; return (from + ' ' + to).trim(); } // if ANY of the sets match ALL of its comparators, then pass Range.prototype.test = function(version) { if (!version) return false; if (typeof version === 'string') version = new SemVer(version, this.loose); for (var i = 0; i < this.set.length; i++) { if (testSet(this.set[i], version)) return true; } return false; }; function testSet(set, version) { for (var i = 0; i < set.length; i++) { if (!set[i].test(version)) return false; } if (version.prerelease.length) { // Find the set of versions that are allowed to have prereleases // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 // That should allow `1.2.3-pr.2` to pass. // However, `1.2.4-alpha.notready` should NOT be allowed, // even though it's within the range set by the comparators. for (var i = 0; i < set.length; i++) { debug(set[i].semver); if (set[i].semver === ANY) return true; if (set[i].semver.prerelease.length > 0) { var allowed = set[i].semver; if (allowed.major === version.major && allowed.minor === version.minor && allowed.patch === version.patch) return true; } } // Version has a -pre, but it's not one of the ones we like. 
return false; } return true; } exports.satisfies = satisfies; function satisfies(version, range, loose) { try { range = new Range(range, loose); } catch (er) { return false; } return range.test(version); } exports.maxSatisfying = maxSatisfying; function maxSatisfying(versions, range, loose) { return versions.filter(function(version) { return satisfies(version, range, loose); }).sort(function(a, b) { return rcompare(a, b, loose); })[0] || null; } exports.validRange = validRange; function validRange(range, loose) { try { // Return '*' instead of '' so that truthiness works. // This will throw if it's invalid anyway return new Range(range, loose).range || '*'; } catch (er) { return null; } } // Determine if version is less than all the versions possible in the range exports.ltr = ltr; function ltr(version, range, loose) { return outside(version, range, '<', loose); } // Determine if version is greater than all the versions possible in the range. exports.gtr = gtr; function gtr(version, range, loose) { return outside(version, range, '>', loose); } exports.outside = outside; function outside(version, range, hilo, loose) { version = new SemVer(version, loose); range = new Range(range, loose); var gtfn, ltefn, ltfn, comp, ecomp; switch (hilo) { case '>': gtfn = gt; ltefn = lte; ltfn = lt; comp = '>'; ecomp = '>='; break; case '<': gtfn = lt; ltefn = gte; ltfn = gt; comp = '<'; ecomp = '<='; break; default: throw new TypeError('Must provide a hilo val of "<" or ">"'); } // If it satisifes the range it is not outside if (satisfies(version, range, loose)) { return false; } // From now on, variable terms are as if we're in "gtr" mode. // but note that everything is flipped for the "ltr" function. for (var i = 0; i < range.set.length; ++i) { var comparators = range.set[i]; var high = null; var low = null; comparators.forEach(function(comparator) { high = high || comparator; low = low || comparator; if (gtfn(comparator.semver, high.semver, loose)) { high = comparator; } else if (ltfn(comparator.semver, low.semver, loose)) { low = comparator; } }); // If the edge version comparator has a operator then our version // isn't outside it if (high.operator === comp || high.operator === ecomp) { return false; } // If the lowest version comparator has an operator and our version // is less than it then it isn't higher than the range if ((!low.operator || low.operator === comp) && ltefn(version, low.semver)) { return false; } else if (low.operator === ecomp && ltfn(version, low.semver)) { return false; } } return true; } // Use the define() function if we're in AMD land if (typeof define === 'function' && define.amd) define(exports); ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/semver.min.js�����������������������000644 �000766 �000024 �00000027750 12455173731 027314� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������(function(e){if(typeof module==="object"&&module.exports===e)e=module.exports=H;e.SEMVER_SPEC_VERSION="2.0.0";var r=e.re=[];var t=e.src=[];var n=0;var i=n++;t[i]="0|[1-9]\\d*";var s=n++;t[s]="[0-9]+";var a=n++;t[a]="\\d*[a-zA-Z-][a-zA-Z0-9-]*";var o=n++;t[o]="("+t[i]+")\\."+"("+t[i]+")\\."+"("+t[i]+")";var f=n++;t[f]="("+t[s]+")\\."+"("+t[s]+")\\."+"("+t[s]+")";var u=n++;t[u]="(?:"+t[i]+"|"+t[a]+")";var l=n++;t[l]="(?:"+t[s]+"|"+t[a]+")";var p=n++;t[p]="(?:-("+t[u]+"(?:\\."+t[u]+")*))";var c=n++;t[c]="(?:-?("+t[l]+"(?:\\."+t[l]+")*))";var h=n++;t[h]="[0-9A-Za-z-]+";var v=n++;t[v]="(?:\\+("+t[h]+"(?:\\."+t[h]+")*))";var m=n++;var g="v?"+t[o]+t[p]+"?"+t[v]+"?";t[m]="^"+g+"$";var w="[v=\\s]*"+t[f]+t[c]+"?"+t[v]+"?";var d=n++;t[d]="^"+w+"$";var y=n++;t[y]="((?:<|>)?=?)";var j=n++;t[j]=t[s]+"|x|X|\\*";var b=n++;t[b]=t[i]+"|x|X|\\*";var $=n++;t[$]="[v=\\s]*("+t[b]+")"+"(?:\\.("+t[b]+")"+"(?:\\.("+t[b]+")"+"(?:"+t[p]+")?"+t[v]+"?"+")?)?";var k=n++;t[k]="[v=\\s]*("+t[j]+")"+"(?:\\.("+t[j]+")"+"(?:\\.("+t[j]+")"+"(?:"+t[c]+")?"+t[v]+"?"+")?)?";var E=n++;t[E]="^"+t[y]+"\\s*"+t[$]+"$";var x=n++;t[x]="^"+t[y]+"\\s*"+t[k]+"$";var R=n++;t[R]="(?:~>?)";var S=n++;t[S]="(\\s*)"+t[R]+"\\s+";r[S]=new RegExp(t[S],"g");var V="$1~";var I=n++;t[I]="^"+t[R]+t[$]+"$";var T=n++;t[T]="^"+t[R]+t[k]+"$";var A=n++;t[A]="(?:\\^)";var C=n++;t[C]="(\\s*)"+t[A]+"\\s+";r[C]=new RegExp(t[C],"g");var M="$1^";var z=n++;t[z]="^"+t[A]+t[$]+"$";var N=n++;t[N]="^"+t[A]+t[k]+"$";var P=n++;t[P]="^"+t[y]+"\\s*("+w+")$|^$";var Z=n++;t[Z]="^"+t[y]+"\\s*("+g+")$|^$";var q=n++;t[q]="(\\s*)"+t[y]+"\\s*("+w+"|"+t[$]+")";r[q]=new RegExp(t[q],"g");var L="$1$2$3";var X=n++;t[X]="^\\s*("+t[$]+")"+"\\s+-\\s+"+"("+t[$]+")"+"\\s*$";var _=n++;t[_]="^\\s*("+t[k]+")"+"\\s+-\\s+"+"("+t[k]+")"+"\\s*$";var O=n++;t[O]="(<|>)?=?\\s*\\*";for(var B=0;B<n;B++){if(!r[B])r[B]=new RegExp(t[B])}e.parse=D;function D(e,t){var n=t?r[d]:r[m];return n.test(e)?new H(e,t):null}e.valid=F;function F(e,r){var t=D(e,r);return t?t.version:null}e.clean=G;function G(e,r){var t=D(e.trim().replace(/^[=v]+/,""),r);return t?t.version:null}e.SemVer=H;function H(e,t){if(e instanceof H){if(e.loose===t)return e;else e=e.version}else if(typeof e!=="string"){throw new TypeError("Invalid Version: "+e)}if(!(this instanceof H))return new H(e,t);this.loose=t;var n=e.trim().match(t?r[d]:r[m]);if(!n)throw new TypeError("Invalid Version: "+e);this.raw=e;this.major=+n[1];this.minor=+n[2];this.patch=+n[3];if(!n[4])this.prerelease=[];else this.prerelease=n[4].split(".").map(function(e){return/^[0-9]+$/.test(e)?+e:e});this.build=n[5]?n[5].split("."):[];this.format()}H.prototype.format=function(){this.version=this.major+"."+this.minor+"."+this.patch;if(this.prerelease.length)this.version+="-"+this.prerelease.join(".");return this.version};H.prototype.inspect=function(){return'<SemVer "'+this+'">'};H.prototype.toString=function(){return this.version};H.prototype.compare=function(e){if(!(e instanceof H))e=new H(e,this.loose);return this.compareMain(e)||this.comparePre(e)};H.prototype.compareMain=function(e){if(!(e instanceof H))e=new H(e,this.loose);return U(this.major,e.major)||U(this.minor,e.minor)||U(this.patch,e.patch)};H.prototype.comparePre=function(e){if(!(e instanceof H))e=new H(e,this.loose);if(this.prerelease.length&&!e.prerelease.length)return-1;else if(!this.prerelease.length&&e.prerelease.length)return 1;else 
if(!this.prerelease.length&&!e.prerelease.length)return 0;var r=0;do{var t=this.prerelease[r];var n=e.prerelease[r];if(t===undefined&&n===undefined)return 0;else if(n===undefined)return 1;else if(t===undefined)return-1;else if(t===n)continue;else return U(t,n)}while(++r)};H.prototype.inc=function(e,r){switch(e){case"premajor":this.prerelease.length=0;this.patch=0;this.minor=0;this.major++;this.inc("pre",r);break;case"preminor":this.prerelease.length=0;this.patch=0;this.minor++;this.inc("pre",r);break;case"prepatch":this.prerelease.length=0;this.inc("patch",r);this.inc("pre",r);break;case"prerelease":if(this.prerelease.length===0)this.inc("patch",r);this.inc("pre",r);break;case"major":if(this.minor!==0||this.patch!==0||this.prerelease.length===0)this.major++;this.minor=0;this.patch=0;this.prerelease=[];break;case"minor":if(this.patch!==0||this.prerelease.length===0)this.minor++;this.patch=0;this.prerelease=[];break;case"patch":if(this.prerelease.length===0)this.patch++;this.prerelease=[];break;case"pre":if(this.prerelease.length===0)this.prerelease=[0];else{var t=this.prerelease.length;while(--t>=0){if(typeof this.prerelease[t]==="number"){this.prerelease[t]++;t=-2}}if(t===-1)this.prerelease.push(0)}if(r){if(this.prerelease[0]===r){if(isNaN(this.prerelease[1]))this.prerelease=[r,0]}else this.prerelease=[r,0]}break;default:throw new Error("invalid increment argument: "+e)}this.format();return this};e.inc=J;function J(e,r,t,n){if(typeof t==="string"){n=t;t=undefined}try{return new H(e,t).inc(r,n).version}catch(i){return null}}e.diff=K;function K(e,r){if(ar(e,r)){return null}else{var t=D(e);var n=D(r);if(t.prerelease.length||n.prerelease.length){for(var i in t){if(i==="major"||i==="minor"||i==="patch"){if(t[i]!==n[i]){return"pre"+i}}}return"prerelease"}for(var i in t){if(i==="major"||i==="minor"||i==="patch"){if(t[i]!==n[i]){return i}}}}}e.compareIdentifiers=U;var Q=/^[0-9]+$/;function U(e,r){var t=Q.test(e);var n=Q.test(r);if(t&&n){e=+e;r=+r}return t&&!n?-1:n&&!t?1:e<r?-1:e>r?1:0}e.rcompareIdentifiers=W;function W(e,r){return U(r,e)}e.compare=Y;function Y(e,r,t){return new H(e,t).compare(r)}e.compareLoose=er;function er(e,r){return Y(e,r,true)}e.rcompare=rr;function rr(e,r,t){return Y(r,e,t)}e.sort=tr;function tr(r,t){return r.sort(function(r,n){return e.compare(r,n,t)})}e.rsort=nr;function nr(r,t){return r.sort(function(r,n){return e.rcompare(r,n,t)})}e.gt=ir;function ir(e,r,t){return Y(e,r,t)>0}e.lt=sr;function sr(e,r,t){return Y(e,r,t)<0}e.eq=ar;function ar(e,r,t){return Y(e,r,t)===0}e.neq=or;function or(e,r,t){return Y(e,r,t)!==0}e.gte=fr;function fr(e,r,t){return Y(e,r,t)>=0}e.lte=ur;function ur(e,r,t){return Y(e,r,t)<=0}e.cmp=lr;function lr(e,r,t,n){var i;switch(r){case"===":if(typeof e==="object")e=e.version;if(typeof t==="object")t=t.version;i=e===t;break;case"!==":if(typeof e==="object")e=e.version;if(typeof t==="object")t=t.version;i=e!==t;break;case"":case"=":case"==":i=ar(e,t,n);break;case"!=":i=or(e,t,n);break;case">":i=ir(e,t,n);break;case">=":i=fr(e,t,n);break;case"<":i=sr(e,t,n);break;case"<=":i=ur(e,t,n);break;default:throw new TypeError("Invalid operator: "+r)}return i}e.Comparator=pr;function pr(e,r){if(e instanceof pr){if(e.loose===r)return e;else e=e.value}if(!(this instanceof pr))return new pr(e,r);this.loose=r;this.parse(e);if(this.semver===cr)this.value="";else this.value=this.operator+this.semver.version}var cr={};pr.prototype.parse=function(e){var t=this.loose?r[P]:r[Z];var n=e.match(t);if(!n)throw new TypeError("Invalid comparator: 
"+e);this.operator=n[1];if(this.operator==="=")this.operator="";if(!n[2])this.semver=cr;else this.semver=new H(n[2],this.loose)};pr.prototype.inspect=function(){return'<SemVer Comparator "'+this+'">'};pr.prototype.toString=function(){return this.value};pr.prototype.test=function(e){if(this.semver===cr)return true;if(typeof e==="string")e=new H(e,this.loose);return lr(e,this.operator,this.semver,this.loose)};e.Range=hr;function hr(e,r){if(e instanceof hr&&e.loose===r)return e;if(!(this instanceof hr))return new hr(e,r);this.loose=r;this.raw=e;this.set=e.split(/\s*\|\|\s*/).map(function(e){return this.parseRange(e.trim())},this).filter(function(e){return e.length});if(!this.set.length){throw new TypeError("Invalid SemVer Range: "+e)}this.format()}hr.prototype.inspect=function(){return'<SemVer Range "'+this.range+'">'};hr.prototype.format=function(){this.range=this.set.map(function(e){return e.join(" ").trim()}).join("||").trim();return this.range};hr.prototype.toString=function(){return this.range};hr.prototype.parseRange=function(e){var t=this.loose;e=e.trim();var n=t?r[_]:r[X];e=e.replace(n,Er);e=e.replace(r[q],L);e=e.replace(r[S],V);e=e.replace(r[C],M);e=e.split(/\s+/).join(" ");var i=t?r[P]:r[Z];var s=e.split(" ").map(function(e){return mr(e,t)}).join(" ").split(/\s+/);if(this.loose){s=s.filter(function(e){return!!e.match(i)})}s=s.map(function(e){return new pr(e,t)});return s};e.toComparators=vr;function vr(e,r){return new hr(e,r).set.map(function(e){return e.map(function(e){return e.value}).join(" ").trim().split(" ")})}function mr(e,r){e=yr(e,r);e=wr(e,r);e=br(e,r);e=kr(e,r);return e}function gr(e){return!e||e.toLowerCase()==="x"||e==="*"}function wr(e,r){return e.trim().split(/\s+/).map(function(e){return dr(e,r)}).join(" ")}function dr(e,t){var n=t?r[T]:r[I];return e.replace(n,function(e,r,t,n,i){var s;if(gr(r))s="";else if(gr(t))s=">="+r+".0.0 <"+(+r+1)+".0.0";else if(gr(n))s=">="+r+"."+t+".0 <"+r+"."+(+t+1)+".0";else if(i){if(i.charAt(0)!=="-")i="-"+i;s=">="+r+"."+t+"."+n+i+" <"+r+"."+(+t+1)+".0"}else s=">="+r+"."+t+"."+n+" <"+r+"."+(+t+1)+".0";return s})}function yr(e,r){return e.trim().split(/\s+/).map(function(e){return jr(e,r)}).join(" ")}function jr(e,t){var n=t?r[N]:r[z];return e.replace(n,function(e,r,t,n,i){var s;if(gr(r))s="";else if(gr(t))s=">="+r+".0.0 <"+(+r+1)+".0.0";else if(gr(n)){if(r==="0")s=">="+r+"."+t+".0 <"+r+"."+(+t+1)+".0";else s=">="+r+"."+t+".0 <"+(+r+1)+".0.0"}else if(i){if(i.charAt(0)!=="-")i="-"+i;if(r==="0"){if(t==="0")s=">="+r+"."+t+"."+n+i+" <"+r+"."+t+"."+(+n+1);else s=">="+r+"."+t+"."+n+i+" <"+r+"."+(+t+1)+".0"}else s=">="+r+"."+t+"."+n+i+" <"+(+r+1)+".0.0"}else{if(r==="0"){if(t==="0")s=">="+r+"."+t+"."+n+" <"+r+"."+t+"."+(+n+1);else s=">="+r+"."+t+"."+n+" <"+r+"."+(+t+1)+".0"}else s=">="+r+"."+t+"."+n+" <"+(+r+1)+".0.0"}return s})}function br(e,r){return e.split(/\s+/).map(function(e){return $r(e,r)}).join(" ")}function $r(e,t){e=e.trim();var n=t?r[x]:r[E];return e.replace(n,function(e,r,t,n,i,s){var a=gr(t);var o=a||gr(n);var f=o||gr(i);var u=f;if(r==="="&&u)r="";if(a){if(r===">"||r==="<"){e="<0.0.0"}else{e="*"}}else if(r&&u){if(o)n=0;if(f)i=0;if(r===">"){r=">=";if(o){t=+t+1;n=0;i=0}else if(f){n=+n+1;i=0}}else if(r==="<="){r="<";if(o)t=+t+1;else n=+n+1}e=r+t+"."+n+"."+i}else if(o){e=">="+t+".0.0 <"+(+t+1)+".0.0"}else if(f){e=">="+t+"."+n+".0 <"+t+"."+(+n+1)+".0"}return e})}function kr(e,t){return e.trim().replace(r[O],"")}function Er(e,r,t,n,i,s,a,o,f,u,l,p,c){if(gr(t))r="";else if(gr(n))r=">="+t+".0.0";else if(gr(i))r=">="+t+"."+n+".0";else 
r=">="+r;if(gr(f))o="";else if(gr(u))o="<"+(+f+1)+".0.0";else if(gr(l))o="<"+f+"."+(+u+1)+".0";else if(p)o="<="+f+"."+u+"."+l+"-"+p;else o="<="+o;return(r+" "+o).trim()}hr.prototype.test=function(e){if(!e)return false;if(typeof e==="string")e=new H(e,this.loose);for(var r=0;r<this.set.length;r++){if(xr(this.set[r],e))return true}return false};function xr(e,r){for(var t=0;t<e.length;t++){if(!e[t].test(r))return false}if(r.prerelease.length){for(var t=0;t<e.length;t++){if(e[t].semver===cr)return true;if(e[t].semver.prerelease.length>0){var n=e[t].semver;if(n.major===r.major&&n.minor===r.minor&&n.patch===r.patch)return true}}return false}return true}e.satisfies=Rr;function Rr(e,r,t){try{r=new hr(r,t)}catch(n){return false}return r.test(e)}e.maxSatisfying=Sr;function Sr(e,r,t){return e.filter(function(e){return Rr(e,r,t)}).sort(function(e,r){return rr(e,r,t)})[0]||null}e.validRange=Vr;function Vr(e,r){try{return new hr(e,r).range||"*"}catch(t){return null}}e.ltr=Ir;function Ir(e,r,t){return Ar(e,r,"<",t)}e.gtr=Tr;function Tr(e,r,t){return Ar(e,r,">",t)}e.outside=Ar;function Ar(e,r,t,n){e=new H(e,n);r=new hr(r,n);var i,s,a,o,f;switch(t){case">":i=ir;s=ur;a=sr;o=">";f=">=";break;case"<":i=sr;s=fr;a=ir;o="<";f="<=";break;default:throw new TypeError('Must provide a hilo val of "<" or ">"')}if(Rr(e,r,n)){return false}for(var u=0;u<r.set.length;++u){var l=r.set[u];var p=null;var c=null;l.forEach(function(e){p=p||e;c=c||e;if(i(e.semver,p.semver,n)){p=e}else if(a(e.semver,c.semver,n)){c=e}});if(p.operator===o||p.operator===f){return false}if((!c.operator||c.operator===o)&&s(e,c.semver)){return false}else if(c.operator===f&&a(e,c.semver)){return false}}return true}if(typeof define==="function"&&define.amd)define(e)})(typeof exports==="object"?exports:typeof define==="function"&&define.amd?{}:semver={});������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/semver.min.js.gz��������������������000644 �000766 �000024 �00000006754 12455173731 027734� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �������������������������������������������������������������������������������������������������������������������������������������������������������������������������8Tks6""' 8۸ͫvꦑ -CmT@ҏXo]`Amgn<cwX,ٚ${J<6ypƘ]Ib5joHbX6gW.998r` <{ˬ^gמhƻXb!_&#̬rb ,9jޣyzuu>u�1ш6ؖ 'DzOOap;ոqvpSMwה`X\ Hn)@W#MBPb%1#h_"xzHY\T±Z>�9l9Po@kvzt:Wnrwp₀=dCV/nj}<=93\Wiy{\H)q& )az&U*Spt s`oܮw[wUܑ viF#EbMJoGvBȶuaY;)JDPr4.�>� !`OgrKBBW7|B~VoZo m =Z%^~&O +^jۨ׊_ }_ϷP?(Ӕ;Ҋ3$DjU ;T>.=<D ^b/G[bbl㿲01g/]}6^x;i $T "l݄ CJB4ص짂O,M'&k.b�蓀{!@M?op.o[>lgm˲7>.i+@q^nE1MD˃78D;EވAB>要6�B{PqB5,:,\E -#7K&1E¡pI}0e$v4j9M,+N0VF?m"XKVyƋOZVB-S�/*iYz=K�CgHH`Qhe�KeU@}1/msheR<t 5ŅX@%kʫp Tp7'}RMcN9$*Vu@GheG*˥9^pem~A�=i-_\֋fm4[&&WuѣO=(WȌ8OHYif34 -ar@R)/$ ?L)if8®S<xs*wX _oC0{fp�Ҵ:ܻrs6x6,GYKH=D mI~4i2*U3G DR1빖^HɈ#˔D#M<Ɣ9\c~6R RM ;1a:?¢c*,٩J]ՓP*TzD-[ ^S{㬶6QKʌl4HE%EUOUD0lZ& O\@]0U 痢"NSiKfP%~JI�\}YCU|OB_Ue #")߿$!eXs뛊vkG5~_.@n9-2ǰBw,ˊIٿͬ|xT fKƊ]7]ɒ iLҾ."QGt8CxH;}>8=HԈG!J0.`HkbApQċT2b2a`Q'c@#9�ND9`5^H@R c( z:j :3u<Ȁo)*Y~VrD,5ӵK|<EҥKPoܩmܽJ 'x(WoU[}E^L -ˁKQ.+iZ4vE"5-HQ=O/pBG*kxA[!oB 1&;V/8tG.̲~Zc@QA]֠h]ШtC1zO z5LrcM-*o> ,<ZsАu&°ͩݵ[ET%:mOШyY\ǭz]RmlU2bQ yg3cClzn Y9gkh\g!^_ٺ[?RjgRM;!:T]e*5(i]͜q6(J4$<5Xc+ݷkX6Y)ijJ0I"¾Ɣ//>n%sߊ_WF< Ы�AZprEPMY>Ft\C5lYkZ,f6.W&EkT³W0h(7pv?OW.a+\.haiiBl*nʚk<WsQ} #0 eFٴ*+cTgJTT{^c`9-j\KЖ ba &^嫛fw2^amu,ۗҳPԢ#k8r.W,4zW\b=Æ תg SwFʪ>.YgՈ~H(oo m QݎU\{LF0}-2DCH}Aìf3u iX> , /rH1#?L)Dg-hQ] 
t0+NtxDق)"A$$Bʠ>-FI6-*R AVfZ}E1n_ekGi;mE{"PN*V*iA (Nm;N4Ѵ>bJVIW`$sBzDqނ y׺öL= 1\&\A }KFbyLYQR,|3_g$|%%MۇM5% Sc}nOby S-9O8:zmZl*s1;2ݣ'_>0vq3U{Vǒ ~"ped^nؚǂ(Fri~R'�'$W+2APV%H;4(VUSOU>!B&ٞgtV'":w| 5_i*\{T G"/!rCn2>y ܴkP5f~5% @]|GnJ`]®K]09?Jcx`3lw&u^B?@ 3+`&&`Tsdav/�5!I n6cT54m6 8PdmS6T'jm;}ni0Ol?0m>/����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/semver/bin/semver��������������������������000755 �000766 �000024 �00000007774 12455173731 026676� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node // Standalone semver comparison program. // Exits successfully and prints matching version(s) if // any supplied version is valid and passes all tests. var argv = process.argv.slice(2) , versions = [] , range = [] , gt = [] , lt = [] , eq = [] , inc = null , version = require("../package.json").version , loose = false , identifier = undefined , semver = require("../semver") , reverse = false main() function main () { if (!argv.length) return help() while (argv.length) { var a = argv.shift() var i = a.indexOf('=') if (i !== -1) { a = a.slice(0, i) argv.unshift(a.slice(i + 1)) } switch (a) { case "-rv": case "-rev": case "--rev": case "--reverse": reverse = true break case "-l": case "--loose": loose = true break case "-v": case "--version": versions.push(argv.shift()) break case "-i": case "--inc": case "--increment": switch (argv[0]) { case "major": case "minor": case "patch": case "prerelease": case "premajor": case "preminor": case "prepatch": inc = argv.shift() break default: inc = "patch" break } break case "--preid": identifier = argv.shift() break case "-r": case "--range": range.push(argv.shift()) break case "-h": case "--help": case "-?": return help() default: versions.push(a) break } } versions = versions.filter(function (v) { return semver.valid(v, loose) }) if (!versions.length) return fail() if (inc && (versions.length !== 1 || range.length)) return failInc() for (var i = 0, l = range.length; i < l ; i ++) { versions = versions.filter(function (v) { return semver.satisfies(v, range[i], loose) }) if (!versions.length) return fail() } return success(versions) } function failInc () { console.error("--inc can only be used on a single version with no range") fail() } function fail () { process.exit(1) } function success () { var compare = reverse ? "rcompare" : "compare" versions.sort(function (a, b) { return semver[compare](a, b, loose) }).map(function (v) { return semver.clean(v, loose) }).map(function (v) { return inc ? semver.inc(v, inc, loose, identifier) : v }).forEach(function (v,i,_) { console.log(v) }) } function help () { console.log(["SemVer " + version ,"" ,"A JavaScript implementation of the http://semver.org/ specification" ,"Copyright Isaac Z. Schlueter" ,"" ,"Usage: semver [options] <version> [<version> [...]]" ,"Prints valid versions sorted by SemVer precedence" ,"" ,"Options:" ,"-r --range <range>" ," Print versions that match the specified range." ,"" ,"-i --increment [<level>]" ," Increment a version by the specified level. Level can" ," be one of: major, minor, patch, premajor, preminor," ," prepatch, or prerelease. Default level is 'patch'." ," Only one version may be specified." 
,"" ,"--preid <identifier>" ," Identifier to be used to prefix premajor, preminor," ," prepatch or prerelease version increments." ,"" ,"-l --loose" ," Interpret versions and ranges loosely" ,"" ,"Program exits successfully if any valid version satisfies" ,"all supplied ranges, and prints all satisfying versions." ,"" ,"If no satisfying versions are found, then exits failure." ,"" ,"Versions are printed in ascending order, so supplying" ,"multiple versions to the utility will just sort them." ].join("\n")) } ����iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/AUTHORS�����������������������������000644 �000766 �000024 �00000000353 12455173731 025710� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Authors sorted by whether or not they're me. Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me) Wayne Larsen <wayne@larsen.st> (http://github.com/wvl) ritch <skawful@gmail.com> Marcel Laverdet Yosef Dinerstein <yosefd@microsoft.com> �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/bin.js������������������������������000755 �000766 �000024 �00000001350 12455173731 025747� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var rimraf = require('./') var help = false var dashdash = false var args = process.argv.slice(2).filter(function(arg) { if (dashdash) return !!arg else if (arg === '--') dashdash = true else if (arg.match(/^(-+|\/)(h(elp)?|\?)$/)) help = true else return !!arg }); if (help || args.length === 0) { // If they didn't ask for help, then this is not a "success" var log = help ? console.log : console.error log('Usage: rimraf <path>') log('') log(' Deletes all files and folders at "path" recursively.') log('') log('Options:') log('') log(' -h, --help Display this usage info') process.exit(help ? 
0 : 1) } else { args.forEach(function(arg) { rimraf.sync(arg) }) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/LICENSE�����������������������������000644 �000766 �000024 �00000002104 12455173731 025641� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/package.json������������������������000644 �000766 �000024 �00000003216 12455173731 027127� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "rimraf", "version": "2.2.8", "main": "rimraf.js", "description": "A deep deletion module for node (like `rm -rf`)", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": { "type": "MIT", "url": "https://github.com/isaacs/rimraf/raw/master/LICENSE" }, "repository": { "type": "git", "url": "git://github.com/isaacs/rimraf.git" }, "scripts": { "test": "cd test && bash run.sh" }, "bin": { "rimraf": "./bin.js" }, "contributors": [ { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, { "name": "Wayne Larsen", "email": "wayne@larsen.st", "url": "http://github.com/wvl" }, { "name": "ritch", "email": "skawful@gmail.com" }, { "name": "Marcel Laverdet" }, { "name": "Yosef Dinerstein", "email": "yosefd@microsoft.com" } ], "bugs": { "url": "https://github.com/isaacs/rimraf/issues" }, "homepage": "https://github.com/isaacs/rimraf", "_id": "rimraf@2.2.8", "_shasum": "e439be2aaee327321952730f99a8929e4fc50582", "_from": "rimraf@latest", "_npmVersion": "1.4.10", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "e439be2aaee327321952730f99a8929e4fc50582", "tarball": "http://registry.npmjs.org/rimraf/-/rimraf-2.2.8.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/rimraf/-/rimraf-2.2.8.tgz", "readme": "ERROR: No README data found!" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/README.md���������������������������000644 �000766 �000024 �00000001510 12455173731 026113� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������`rm -rf` for node. Install with `npm install rimraf`, or just drop rimraf.js somewhere. ## API `rimraf(f, callback)` The callback will be called with an error if there is one. Certain errors are handled for you: * Windows: `EBUSY` and `ENOTEMPTY` - rimraf will back off a maximum of `opts.maxBusyTries` times before giving up. * `ENOENT` - If the file doesn't exist, rimraf will return successfully, since your desired outcome is already the case. ## rimraf.sync It can remove stuff synchronously, too. But that's not so good. Use the async API. It's better. ## CLI If installed with `npm install rimraf -g` it can be used as a global command `rimraf <path>` which is useful for cross platform support. ## mkdirp If you need to create a directory recursively, check out [mkdirp](https://github.com/substack/node-mkdirp). 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/rimraf/rimraf.js���������������������������000644 �000766 �000024 �00000013172 12455173731 026461� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = rimraf rimraf.sync = rimrafSync var assert = require("assert") var path = require("path") var fs = require("fs") // for EMFILE handling var timeout = 0 exports.EMFILE_MAX = 1000 exports.BUSYTRIES_MAX = 3 var isWindows = (process.platform === "win32") function defaults (options) { var methods = [ 'unlink', 'chmod', 'stat', 'rmdir', 'readdir' ] methods.forEach(function(m) { options[m] = options[m] || fs[m] m = m + 'Sync' options[m] = options[m] || fs[m] }) } function rimraf (p, options, cb) { if (typeof options === 'function') { cb = options options = {} } assert(p) assert(options) assert(typeof cb === 'function') defaults(options) if (!cb) throw new Error("No callback passed to rimraf()") var busyTries = 0 rimraf_(p, options, function CB (er) { if (er) { if (isWindows && (er.code === "EBUSY" || er.code === "ENOTEMPTY") && busyTries < exports.BUSYTRIES_MAX) { busyTries ++ var time = busyTries * 100 // try again, with the same exact callback as this one. return setTimeout(function () { rimraf_(p, options, CB) }, time) } // this one won't happen if graceful-fs is used. if (er.code === "EMFILE" && timeout < exports.EMFILE_MAX) { return setTimeout(function () { rimraf_(p, options, CB) }, timeout ++) } // already gone if (er.code === "ENOENT") er = null } timeout = 0 cb(er) }) } // Two possible strategies. // 1. Assume it's a file. unlink it, then do the dir stuff on EPERM or EISDIR // 2. Assume it's a directory. readdir, then do the file stuff on ENOTDIR // // Both result in an extra syscall when you guess wrong. However, there // are likely far more normal files in the world than directories. This // is based on the assumption that a the average number of files per // directory is >= 1. // // If anyone ever complains about this, then I guess the strategy could // be made configurable somehow. But until then, YAGNI. function rimraf_ (p, options, cb) { assert(p) assert(options) assert(typeof cb === 'function') options.unlink(p, function (er) { if (er) { if (er.code === "ENOENT") return cb(null) if (er.code === "EPERM") return (isWindows) ? fixWinEPERM(p, options, er, cb) : rmdir(p, options, er, cb) if (er.code === "EISDIR") return rmdir(p, options, er, cb) } return cb(er) }) } function fixWinEPERM (p, options, er, cb) { assert(p) assert(options) assert(typeof cb === 'function') if (er) assert(er instanceof Error) options.chmod(p, 666, function (er2) { if (er2) cb(er2.code === "ENOENT" ? null : er) else options.stat(p, function(er3, stats) { if (er3) cb(er3.code === "ENOENT" ? 
null : er) else if (stats.isDirectory()) rmdir(p, options, er, cb) else options.unlink(p, cb) }) }) } function fixWinEPERMSync (p, options, er) { assert(p) assert(options) if (er) assert(er instanceof Error) try { options.chmodSync(p, 666) } catch (er2) { if (er2.code === "ENOENT") return else throw er } try { var stats = options.statSync(p) } catch (er3) { if (er3.code === "ENOENT") return else throw er } if (stats.isDirectory()) rmdirSync(p, options, er) else options.unlinkSync(p) } function rmdir (p, options, originalEr, cb) { assert(p) assert(options) if (originalEr) assert(originalEr instanceof Error) assert(typeof cb === 'function') // try to rmdir first, and only readdir on ENOTEMPTY or EEXIST (SunOS) // if we guessed wrong, and it's not a directory, then // raise the original error. options.rmdir(p, function (er) { if (er && (er.code === "ENOTEMPTY" || er.code === "EEXIST" || er.code === "EPERM")) rmkids(p, options, cb) else if (er && er.code === "ENOTDIR") cb(originalEr) else cb(er) }) } function rmkids(p, options, cb) { assert(p) assert(options) assert(typeof cb === 'function') options.readdir(p, function (er, files) { if (er) return cb(er) var n = files.length if (n === 0) return options.rmdir(p, cb) var errState files.forEach(function (f) { rimraf(path.join(p, f), options, function (er) { if (errState) return if (er) return cb(errState = er) if (--n === 0) options.rmdir(p, cb) }) }) }) } // this looks simpler, and is strictly *faster*, but will // tie up the JavaScript thread and fail on excessively // deep directory trees. function rimrafSync (p, options) { options = options || {} defaults(options) assert(p) assert(options) try { options.unlinkSync(p) } catch (er) { if (er.code === "ENOENT") return if (er.code === "EPERM") return isWindows ? 
fixWinEPERMSync(p, options, er) : rmdirSync(p, options, er) if (er.code !== "EISDIR") throw er rmdirSync(p, options, er) } } function rmdirSync (p, options, originalEr) { assert(p) assert(options) if (originalEr) assert(originalEr instanceof Error) try { options.rmdirSync(p) } catch (er) { if (er.code === "ENOENT") return if (er.code === "ENOTDIR") throw originalEr if (er.code === "ENOTEMPTY" || er.code === "EEXIST" || er.code === "EPERM") rmkidsSync(p, options) } } function rmkidsSync (p, options) { assert(p) assert(options) options.readdirSync(p).forEach(function (f) { rimrafSync(path.join(p, f), options) }) options.rmdirSync(p, options) }
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/.npmignore
/node_modules/*
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/equation.gif
[binary GIF image data: screenshot of the exponential-backoff tuning equation referenced in retry's Readme.md]
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/example/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/index.js
module.exports = require('./lib/retry');
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/lib/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/License
Copyright (c) 2011: Tim Koschützki (tim@debuggable.com) Felix Geisendörfer (felix@debuggable.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/Makefile�����������������������������000644 �000766 �000024 �00000000077 12455173731 026170� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������SHELL := /bin/bash test: @node test/runner.js .PHONY: test �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/package.json�������������������������000644 �000766 �000024 �00000002265 12455173731 027017� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Tim Koschützki", "email": "tim@debuggable.com", "url": "http://debuggable.com/" }, "name": "retry", "description": "Abstraction for exponential and custom retry strategies for failed operations.", "version": "0.6.1", "homepage": "https://github.com/tim-kos/node-retry", "repository": { "type": "git", "url": "git://github.com/tim-kos/node-retry.git" }, "directories": { "lib": "./lib" }, "main": "index", "engines": { "node": "*" }, "dependencies": {}, "devDependencies": { "fake": "0.2.0", "far": "0.0.1" }, "bugs": { "url": "https://github.com/tim-kos/node-retry/issues" }, "_id": "retry@0.6.1", "_shasum": "fdc90eed943fde11b893554b8cc63d0e899ba918", "_from": "retry@>=0.6.1 <0.7.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "tim-kos", "email": "tim@debuggable.com" }, "maintainers": [ { "name": "tim-kos", "email": "tim@debuggable.com" } ], "dist": { "shasum": "fdc90eed943fde11b893554b8cc63d0e899ba918", 
"tarball": "http://registry.npmjs.org/retry/-/retry-0.6.1.tgz" }, "_resolved": "https://registry.npmjs.org/retry/-/retry-0.6.1.tgz" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/Readme.md����������������������������000644 �000766 �000024 �00000012735 12455173731 026253� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# retry Abstraction for exponential and custom retry strategies for failed operations. ## Installation npm install retry ## Current Status This module has been tested and is ready to be used. ## Tutorial The example below will retry a potentially failing `dns.resolve` operation `10` times using an exponential backoff strategy. With the default settings, this means the last attempt is made after `17 minutes and 3 seconds`. ``` javascript var dns = require('dns'); var retry = require('retry'); function faultTolerantResolve(address, cb) { var operation = retry.operation(); operation.attempt(function(currentAttempt) { dns.resolve(address, function(err, addresses) { if (operation.retry(err)) { return; } cb(err ? operation.mainError() : null, addresses); }); }); } faultTolerantResolve('nodejs.org', function(err, addresses) { console.log(err, addresses); }); ``` Of course you can also configure the factors that go into the exponential backoff. See the API documentation below for all available settings. currentAttempt is an int representing the number of attempts so far. ``` javascript var operation = retry.operation({ retries: 5, factor: 3, minTimeout: 1 * 1000, maxTimeout: 60 * 1000, randomize: true, }); ``` ## API ### retry.operation([options]) Creates a new `RetryOperation` object. See the `retry.timeouts()` function below for available `options`. ### retry.timeouts([options]) Returns an array of timeouts. All time `options` and return values are in milliseconds. If `options` is an array, a copy of that array is returned. `options` is a JS object that can contain any of the following keys: * `retries`: The maximum amount of times to retry the operation. Default is `10`. * `factor`: The exponential factor to use. Default is `2`. * `minTimeout`: The number of milliseconds before starting the first retry. Default is `1000`. * `maxTimeout`: The maximum number of milliseconds between two retries. Default is `Infinity`. * `randomize`: Randomizes the timeouts by multiplying with a factor between `1` to `2`. Default is `false`. The formula used to calculate the individual timeouts is: ``` var Math.min(random * minTimeout * Math.pow(factor, attempt), maxTimeout); ``` Have a look at [this article][article] for a better explanation of approach. If you want to tune your `factor` / `times` settings to attempt the last retry after a certain amount of time, you can use wolfram alpha. 
For example, in order to tune for `10` attempts in `5 minutes`, you can use
this equation:

![screenshot](https://github.com/tim-kos/node-retry/raw/master/equation.gif)

Explaining the various values from left to right:

* `k = 0 ... 9`: The `retries` value (10)
* `1000`: The `minTimeout` value in ms (1000)
* `x^k`: No need to change this, `x` will be your resulting factor
* `5 * 60 * 1000`: The desired total amount of time for retrying in ms (5 minutes)

To make this a little easier for you, use wolfram alpha to do the calculations:

[http://www.wolframalpha.com/input/?i=Sum%5B1000*x^k%2C+{k%2C+0%2C+9}%5D+%3D+5+*+60+*+1000]()

[article]: http://dthain.blogspot.com/2009/02/exponential-backoff-in-distributed.html

### new RetryOperation(timeouts)

Creates a new `RetryOperation` where `timeouts` is an array in which each value
is a timeout given in milliseconds.

#### retryOperation.errors()

Returns an array of all errors that have been passed to
`retryOperation.retry()` so far.

#### retryOperation.mainError()

A reference to the error object that occurred most frequently. Errors are
compared using the `error.message` property. If multiple error messages
occurred the same number of times, the last error object with that message is
returned. If no errors have occurred so far, the value is `null`.

#### retryOperation.attempt(fn, timeoutOps)

Defines the function `fn` that is to be retried and executes it for the first
time right away. The `fn` function can receive an optional `currentAttempt`
parameter that represents the number of attempts to execute `fn` so far.

Optionally defines `timeoutOps`, which is an object having a property `timeout`
in milliseconds and a property `cb` callback function. Whenever your retry
operation takes longer than `timeout` to execute, the timeout callback
function `cb` is called.

#### retryOperation.try(fn)

This is an alias for `retryOperation.attempt(fn)`. This is deprecated.

#### retryOperation.start(fn)

This is an alias for `retryOperation.attempt(fn)`. This is deprecated.

#### retryOperation.retry(error)

Returns `false` when no `error` value is given, or the maximum number of
retries has been reached. Otherwise it returns `true`, and retries the
operation after the timeout for the current attempt number.

#### retryOperation.attempts()

Returns an int representing the number of attempts it took to call `fn`
before it was successful.

## License

retry is licensed under the MIT license.

# Changelog

0.6.0 Introduced an optional `timeoutOps` parameter for the `attempt()`
function, which is an object having a property `timeout` in milliseconds and a
property `cb` callback function. Whenever your retry operation takes longer
than `timeout` to execute, the timeout callback function `cb` is called.

0.5.0 Some minor refactorings.

0.4.0 Changed `retryOperation.try()` to `retryOperation.attempt()`. Deprecated
the aliases `start()` and `try()` for it.

0.3.0 Added `retryOperation.start()`, which is an alias for
`retryOperation.try()`.

0.2.0 Added the `attempts()` function and a parameter to `retryOperation.try()`
representing the number of attempts it took to call `fn()`.
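To make the timeout settings documented above concrete, here is a minimal sketch (not part of the retry package itself) that prints the delays `retry.timeouts()` would produce for a given configuration; the helper name `previewTimeouts` and the sample option values are purely illustrative.

``` javascript
var retry = require('retry');

// Print the delay (in milliseconds) before each retry for a given
// configuration. retry.timeouts() applies the documented formula:
//   Math.min(random * minTimeout * Math.pow(factor, attempt), maxTimeout)
function previewTimeouts(opts) {
  var timeouts = retry.timeouts(opts); // one entry per retry, sorted ascending
  timeouts.forEach(function(ms, attempt) {
    console.log('retry %d after %d ms', attempt + 1, ms);
  });
  return timeouts;
}

// With the defaults (retries: 10, factor: 2, minTimeout: 1000) the last
// attempt happens after roughly 17 minutes, as described in the tutorial.
previewTimeouts({ retries: 5, factor: 3, minTimeout: 1 * 1000, maxTimeout: 60 * 1000 });
```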
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/lib/retry.js
var RetryOperation = require('./retry_operation');

exports.operation = function(options) {
  var timeouts = exports.timeouts(options);
  return new RetryOperation(timeouts);
};

exports.timeouts = function(options) {
  if (options instanceof Array) {
    return [].concat(options);
  }

  var opts = {
    retries: 10,
    factor: 2,
    minTimeout: 1 * 1000,
    maxTimeout: Infinity,
    randomize: false
  };
  for (var key in options) {
    opts[key] = options[key];
  }

  if (opts.minTimeout > opts.maxTimeout) {
    throw new Error('minTimeout is greater than maxTimeout');
  }

  var timeouts = [];
  for (var i = 0; i < opts.retries; i++) {
    timeouts.push(this._createTimeout(i, opts));
  }

  // sort the array numerically ascending
  timeouts.sort(function(a, b) {
    return a - b;
  });

  return timeouts;
};

exports._createTimeout = function(attempt, opts) {
  var random = (opts.randomize) ? (Math.random() + 1) : 1;

  var timeout = Math.round(random * opts.minTimeout * Math.pow(opts.factor, attempt));
  timeout = Math.min(timeout, opts.maxTimeout);

  return timeout;
};
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/lib/retry_operation.js
function RetryOperation(timeouts) {
  this._timeouts = timeouts;
  this._fn = null;
  this._errors = [];
  this._attempts = 1;
  this._operationTimeout = null;
  this._operationTimeoutCb = null;
  this._timeout = null;
}
module.exports = RetryOperation;

RetryOperation.prototype.retry = function(err) {
  if (this._timeout) {
    clearTimeout(this._timeout);
  }

  if (!err) {
    return false;
  }

  this._errors.push(err);

  var timeout = this._timeouts.shift();
  if (timeout === undefined) {
    return false;
  }

  this._attempts++;

  var self = this;
  setTimeout(function() {
    self._fn(self._attempts);

    if (self._operationTimeoutCb) {
      self._timeout = setTimeout(function() {
        self._operationTimeoutCb(self._attempts);
      }, self._operationTimeout);
    }
  }, timeout);

  return true;
};

RetryOperation.prototype.attempt = function(fn, timeoutOps) {
  this._fn = fn;
  if (timeoutOps) {
    if (timeoutOps.timeout) {
      this._operationTimeout = timeoutOps.timeout;
    }
    if (timeoutOps.cb) {
      this._operationTimeoutCb = timeoutOps.cb;
    }
  }

  this._fn(this._attempts);

  var self = this;
  if (this._operationTimeoutCb) {
    this._timeout = setTimeout(function() {
      self._operationTimeoutCb();
    }, self._operationTimeout);
  }
};

RetryOperation.prototype.try = function(fn) {
  console.log('Using RetryOperation.try() is deprecated');
  this.attempt(fn);
};

RetryOperation.prototype.start = function(fn) {
  console.log('Using RetryOperation.start() is deprecated');
  this.attempt(fn);
};

RetryOperation.prototype.start = RetryOperation.prototype.try;

RetryOperation.prototype.errors = function() {
  return this._errors;
};

RetryOperation.prototype.attempts = function() {
  return this._attempts;
};

RetryOperation.prototype.mainError = function() {
  if (this._errors.length === 0) {
    return null;
  }

  var counts = {};
  var mainError = null;
  var mainErrorCount = 0;

  for (var i = 0; i < this._errors.length; i++) {
    var error = this._errors[i];
    var message = error.message;
    var count = (counts[message] || 0) + 1;

    counts[message] = count;

    if (count >= mainErrorCount) {
      mainError = error;
      mainErrorCount = count;
    }
  }

  return mainError;
};
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/retry/example/dns.js
var dns = require('dns');
var retry = require('../lib/retry');

function faultTolerantResolve(address, cb) {
  var opts = {
    times: 2,
    factor: 2,
    minTimeout: 1 * 1000,
    maxTimeout: 2 * 1000,
    randomize: true
  };
  var operation = retry.operation(opts);

  operation.attempt(function(currentAttempt) {
    dns.resolve(address, function(err, addresses) {
      if (operation.retry(err)) {
        return;
      }

      cb(operation.mainError(), operation.errors(), addresses);
    });
  });
}

faultTolerantResolve('nodejs.org', function(err, errors, addresses) {
  console.warn('err:');
  console.log(err);

  console.warn('addresses:');
  console.log(addresses);
});
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/.eslintrc
{
  "env": {
    "node": true
  },
  "rules": {
    // Disallow semi-colons, unless needed to disambiguate statement
    "semi": [2, "never"],
    // Require strings to use single quotes
    "quotes": [2, "single"],
    // Require curly braces for all control statements
    "curly": 2,
    // Disallow using variables and functions before they've been defined
    "no-use-before-define": 2,
    // Allow any case for variable naming
    "camelcase": 0,
    // Disallow unused variables, except as function arguments
    "no-unused-vars": [2, {"args": "none"}],
    // Allow leading underscores for method names
    // REASON: we use underscores to denote private methods
    "no-underscore-dangle": 0
  }
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/.npmignore
tests
node_modules
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/.travis.yml
language: node_js
node_js:
  - "0.8"
  - "0.10"
before_install: npm install -g npm@~1.4.6
after_script: ./node_modules/.bin/istanbul cover ./node_modules/tape/bin/tape tests/test-*.js --report lcovonly && cat ./coverage/lcov.info | ./node_modules/coveralls/bin/coveralls.js --verbose
webhooks:
  urls: https://webhooks.gitter.im/e/237280ed4796c19cc626
  on_success: change  # options: [always|never|change] default: always
  on_failure: always  # options: [always|never|change] default: always
  on_start: false     # default: false
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/CHANGELOG.md
������������������������������������������������������������������������������������������������������������������������������������������������������������������������## Change Log ### v2.51.0 (2014/12/10) - [#1310](https://github.com/request/request/pull/1310) Revert changes introduced in https://github.com/request/request/pull/1282 (@simov) ### v2.50.0 (2014/12/09) - [#1308](https://github.com/request/request/pull/1308) Add browser test to keep track of browserify compability. (@eiriksm) - [#1299](https://github.com/request/request/pull/1299) Add optional support for jsonReviver (@poislagarde) - [#1277](https://github.com/request/request/pull/1277) Add Coveralls configuration (@simov) - [#1307](https://github.com/request/request/pull/1307) Upgrade form-data, add back browserify compability. Fixes #455. (@eiriksm) - [#1305](https://github.com/request/request/pull/1305) Fix typo in README.md (@LewisJEllis) - [#1288](https://github.com/request/request/pull/1288) Update README.md to explain custom file use case (@cliffcrosland) ### v2.49.0 (2014/11/28) - [#1295](https://github.com/request/request/pull/1295) fix(proxy): no-proxy false positive (@oliamb) - [#1292](https://github.com/request/request/pull/1292) Upgrade `caseless` to 0.8.1 (@mmalecki) - [#1276](https://github.com/request/request/pull/1276) Set transfer encoding for multipart/related to chunked by default (@simov) - [#1275](https://github.com/request/request/pull/1275) Fix multipart content-type headers detection (@simov) - [#1269](https://github.com/request/request/pull/1269) adds streams example for review (@tbuchok) - [#1238](https://github.com/request/request/pull/1238) Add examples README.md (@simov) ### v2.48.0 (2014/11/12) - [#1263](https://github.com/request/request/pull/1263) Fixed a syntax error / typo in README.md (@xna2) - [#1253](https://github.com/request/request/pull/1253) Add multipart chunked flag (@simov, @nylen) - [#1251](https://github.com/request/request/pull/1251) Clarify that defaults() does not modify global defaults (@nylen) - [#1250](https://github.com/request/request/pull/1250) Improve documentation for pool and maxSockets options (@nylen) - [#1237](https://github.com/request/request/pull/1237) Documenting error handling when using streams (@vmattos) - [#1244](https://github.com/request/request/pull/1244) Finalize changelog command (@nylen) - [#1241](https://github.com/request/request/pull/1241) Fix typo (@alexanderGugel) - [#1223](https://github.com/request/request/pull/1223) Show latest version number instead of "upcoming" in changelog (@nylen) - [#1236](https://github.com/request/request/pull/1236) Document how to use custom CA in README (#1229) (@hypesystem) - [#1228](https://github.com/request/request/pull/1228) Support for oauth with RSA-SHA1 signing (@nylen) - [#1216](https://github.com/request/request/pull/1216) Made json and multipart options coexist (@nylen, @simov) - [#1225](https://github.com/request/request/pull/1225) Allow header white/exclusive lists in any case. 
(@RReverser) ### v2.47.0 (2014/10/26) - [#1222](https://github.com/request/request/pull/1222) Move from mikeal/request to request/request (@nylen) - [#1220](https://github.com/request/request/pull/1220) update qs dependency to 2.3.1 (@FredKSchott) - [#1212](https://github.com/request/request/pull/1212) Improve tests/test-timeout.js (@nylen) - [#1219](https://github.com/request/request/pull/1219) remove old globalAgent workaround for node 0.4 (@request) - [#1214](https://github.com/request/request/pull/1214) Remove cruft left over from optional dependencies (@nylen) - [#1215](https://github.com/request/request/pull/1215) Add proxyHeaderExclusiveList option for proxy-only headers. (@RReverser) - [#1211](https://github.com/request/request/pull/1211) Allow 'Host' header instead of 'host' and remember case across redirects (@nylen) - [#1208](https://github.com/request/request/pull/1208) Improve release script (@nylen) - [#1213](https://github.com/request/request/pull/1213) Support for custom cookie store (@nylen, @mitsuru) - [#1197](https://github.com/request/request/pull/1197) Clean up some code around setting the agent (@FredKSchott) - [#1209](https://github.com/request/request/pull/1209) Improve multipart form append test (@simov) - [#1207](https://github.com/request/request/pull/1207) Update changelog (@nylen) - [#1185](https://github.com/request/request/pull/1185) Stream multipart/related bodies (@simov) ### v2.46.0 (2014/10/23) - [#1198](https://github.com/request/request/pull/1198) doc for TLS/SSL protocol options (@shawnzhu) - [#1200](https://github.com/request/request/pull/1200) Add a Gitter chat badge to README.md (@gitter-badger) - [#1196](https://github.com/request/request/pull/1196) Upgrade taper test reporter to v0.3.0 (@nylen) - [#1199](https://github.com/request/request/pull/1199) Fix lint error: undeclared var i (@nylen) - [#1191](https://github.com/request/request/pull/1191) Move self.proxy decision logic out of init and into a helper (@FredKSchott) - [#1190](https://github.com/request/request/pull/1190) Move _buildRequest() logic back into init (@FredKSchott) - [#1186](https://github.com/request/request/pull/1186) Support Smarter Unix URL Scheme (@FredKSchott) - [#1178](https://github.com/request/request/pull/1178) update form documentation for new usage (@FredKSchott) - [#1180](https://github.com/request/request/pull/1180) Enable no-mixed-requires linting rule (@nylen) - [#1184](https://github.com/request/request/pull/1184) Don't forward authorization header across redirects to different hosts (@nylen) - [#1183](https://github.com/request/request/pull/1183) Correct README about pre and postamble CRLF using multipart and not mult... 
(@netpoetica) - [#1179](https://github.com/request/request/pull/1179) Lint tests directory (@nylen) - [#1169](https://github.com/request/request/pull/1169) add metadata for form-data file field (@dotcypress) - [#1173](https://github.com/request/request/pull/1173) remove optional dependencies (@seanstrom) - [#1165](https://github.com/request/request/pull/1165) Cleanup event listeners and remove function creation from init (@FredKSchott) - [#1174](https://github.com/request/request/pull/1174) update the request.cookie docs to have a valid cookie example (@seanstrom) - [#1168](https://github.com/request/request/pull/1168) create a detach helper and use detach helper in replace of nextTick (@seanstrom) - [#1171](https://github.com/request/request/pull/1171) in post can send form data and use callback (@MiroRadenovic) - [#1159](https://github.com/request/request/pull/1159) accept charset for x-www-form-urlencoded content-type (@seanstrom) - [#1157](https://github.com/request/request/pull/1157) Update README.md: body with json=true (@Rob--W) - [#1164](https://github.com/request/request/pull/1164) Disable tests/test-timeout.js on Travis (@nylen) - [#1153](https://github.com/request/request/pull/1153) Document how to run a single test (@nylen) - [#1144](https://github.com/request/request/pull/1144) adds documentation for the "response" event within the streaming section (@tbuchok) - [#1162](https://github.com/request/request/pull/1162) Update eslintrc file to no longer allow past errors (@FredKSchott) - [#1155](https://github.com/request/request/pull/1155) Support/use self everywhere (@seanstrom) - [#1161](https://github.com/request/request/pull/1161) fix no-use-before-define lint warnings (@emkay) - [#1156](https://github.com/request/request/pull/1156) adding curly brackets to get rid of lint errors (@emkay) - [#1151](https://github.com/request/request/pull/1151) Fix localAddress test on OS X (@nylen) - [#1145](https://github.com/request/request/pull/1145) documentation: fix outdated reference to setCookieSync old name in README (@FredKSchott) - [#1131](https://github.com/request/request/pull/1131) Update pool documentation (@FredKSchott) - [#1143](https://github.com/request/request/pull/1143) Rewrite all tests to use tape (@nylen) - [#1137](https://github.com/request/request/pull/1137) Add ability to specifiy querystring lib in options. 
(@jgrund) - [#1138](https://github.com/request/request/pull/1138) allow hostname and port in place of host on uri (@cappslock) - [#1134](https://github.com/request/request/pull/1134) Fix multiple redirects and `self.followRedirect` (@blakeembrey) - [#1130](https://github.com/request/request/pull/1130) documentation fix: add note about npm test for contributing (@FredKSchott) - [#1120](https://github.com/request/request/pull/1120) Support/refactor request setup tunnel (@seanstrom) - [#1129](https://github.com/request/request/pull/1129) linting fix: convert double quote strings to use single quotes (@FredKSchott) - [#1124](https://github.com/request/request/pull/1124) linting fix: remove unneccesary semi-colons (@FredKSchott) ### v2.45.0 (2014/10/06) - [#1128](https://github.com/request/request/pull/1128) Add test for setCookie regression (@nylen) - [#1127](https://github.com/request/request/pull/1127) added tests around using objects as values in a query string (@bcoe) - [#1103](https://github.com/request/request/pull/1103) Support/refactor request constructor (@nylen, @seanstrom) - [#1119](https://github.com/request/request/pull/1119) add basic linting to request library (@FredKSchott) - [#1121](https://github.com/request/request/pull/1121) Revert "Explicitly use sync versions of cookie functions" (@nylen) - [#1118](https://github.com/request/request/pull/1118) linting fix: Restructure bad empty if statement (@FredKSchott) - [#1117](https://github.com/request/request/pull/1117) Fix a bad check for valid URIs (@FredKSchott) - [#1113](https://github.com/request/request/pull/1113) linting fix: space out operators (@FredKSchott) - [#1116](https://github.com/request/request/pull/1116) Fix typo in `noProxyHost` definition (@FredKSchott) - [#1114](https://github.com/request/request/pull/1114) linting fix: Added a `new` operator that was missing when creating and throwing a new error (@FredKSchott) - [#1096](https://github.com/request/request/pull/1096) No_proxy support (@samcday) - [#1107](https://github.com/request/request/pull/1107) linting-fix: remove unused variables (@FredKSchott) - [#1112](https://github.com/request/request/pull/1112) linting fix: Make return values consistent and more straitforward (@FredKSchott) - [#1111](https://github.com/request/request/pull/1111) linting fix: authPieces was getting redeclared (@FredKSchott) - [#1105](https://github.com/request/request/pull/1105) Use strict mode in request (@FredKSchott) - [#1110](https://github.com/request/request/pull/1110) linting fix: replace lazy '==' with more strict '===' (@FredKSchott) - [#1109](https://github.com/request/request/pull/1109) linting fix: remove function call from if-else conditional statement (@FredKSchott) - [#1102](https://github.com/request/request/pull/1102) Fix to allow setting a `requester` on recursive calls to `request.defaults` (@tikotzky) - [#1095](https://github.com/request/request/pull/1095) Tweaking engines in package.json (@pdehaan) - [#1082](https://github.com/request/request/pull/1082) Forward the socket event from the httpModule request (@seanstrom) - [#972](https://github.com/request/request/pull/972) Clarify gzip handling in the README (@kevinoid) - [#1089](https://github.com/request/request/pull/1089) Mention that encoding defaults to utf8, not Buffer (@stuartpb) - [#1088](https://github.com/request/request/pull/1088) Fix cookie example in README.md and make it more clear (@pipi32167) - [#1027](https://github.com/request/request/pull/1027) Add support for multipart form data in request 
options. (@crocket) - [#1076](https://github.com/request/request/pull/1076) use Request.abort() to abort the request when the request has timed-out (@seanstrom) - [#1068](https://github.com/request/request/pull/1068) add optional postamble required by .NET multipart requests (@netpoetica) ### v2.43.0 (2014/09/18) - [#1057](https://github.com/request/request/pull/1057) Defaults should not overwrite defined options (@davidwood) - [#1046](https://github.com/request/request/pull/1046) Propagate datastream errors, useful in case gzip fails. (@ZJONSSON, @Janpot) - [#1063](https://github.com/request/request/pull/1063) copy the input headers object #1060 (@finnp) - [#1031](https://github.com/request/request/pull/1031) Explicitly use sync versions of cookie functions (@ZJONSSON) - [#1056](https://github.com/request/request/pull/1056) Fix redirects when passing url.parse(x) as URL to convenience method (@nylen) ### v2.42.0 (2014/09/04) - [#1053](https://github.com/request/request/pull/1053) Fix #1051 Parse auth properly when using non-tunneling proxy (@isaacs) ### v2.41.0 (2014/09/04) - [#1050](https://github.com/request/request/pull/1050) Pass whitelisted headers to tunneling proxy. Organize all tunneling logic. (@isaacs, @Feldhacker) - [#1035](https://github.com/request/request/pull/1035) souped up nodei.co badge (@rvagg) - [#1048](https://github.com/request/request/pull/1048) Aws is now possible over a proxy (@steven-aerts) - [#1039](https://github.com/request/request/pull/1039) extract out helper functions to a helper file (@seanstrom) - [#1021](https://github.com/request/request/pull/1021) Support/refactor indexjs (@seanstrom) - [#1033](https://github.com/request/request/pull/1033) Improve and document debug options (@nylen) - [#1034](https://github.com/request/request/pull/1034) Fix readme headings (@nylen) - [#1030](https://github.com/request/request/pull/1030) Allow recursive request.defaults (@tikotzky) - [#1029](https://github.com/request/request/pull/1029) Fix a couple of typos (@nylen) - [#675](https://github.com/request/request/pull/675) Checking for SSL fault on connection before reading SSL properties (@VRMink) - [#989](https://github.com/request/request/pull/989) Added allowRedirect function. Should return true if redirect is allowed or false otherwise (@doronin) - [#1025](https://github.com/request/request/pull/1025) [fixes #1023] Set self._ended to true once response has ended (@mridgway) - [#1020](https://github.com/request/request/pull/1020) Add back removed debug metadata (@FredKSchott) - [#1008](https://github.com/request/request/pull/1008) Moving to module instead of cutomer buffer concatenation. (@mikeal) - [#770](https://github.com/request/request/pull/770) Added dependency badge for README file; (@timgluz) - [#1016](https://github.com/request/request/pull/1016) toJSON no longer results in an infinite loop, returns simple objects (@FredKSchott) - [#1018](https://github.com/request/request/pull/1018) Remove pre-0.4.4 HTTPS fix (@mmalecki) - [#1006](https://github.com/request/request/pull/1006) Migrate to caseless, fixes #1001 (@mikeal) - [#995](https://github.com/request/request/pull/995) Fix parsing array of objects (@sjonnet19) - [#999](https://github.com/request/request/pull/999) Fix fallback for browserify for optional modules. (@eiriksm) - [#996](https://github.com/request/request/pull/996) Wrong oauth signature when multiple same param keys exist [updated] (@bengl) ### v2.40.0 (2014/08/06) - [#992](https://github.com/request/request/pull/992) Fix security vulnerability. 
Update qs (@poeticninja) - [#988](https://github.com/request/request/pull/988) “--” -> “—” (@upisfree) - [#987](https://github.com/request/request/pull/987) Show optional modules as being loaded by the module that reqeusted them (@iarna) ### v2.39.0 (2014/07/24) - [#976](https://github.com/request/request/pull/976) Update README.md (@fosco-maestro) ### v2.38.0 (2014/07/22) - [#952](https://github.com/request/request/pull/952) Adding support to client certificate with proxy use case (@ofirshaked) - [#884](https://github.com/request/request/pull/884) Documented tough-cookie installation. (@wbyoung) - [#935](https://github.com/request/request/pull/935) Correct repository url (@fritx) - [#963](https://github.com/request/request/pull/963) Update changelog (@nylen) - [#960](https://github.com/request/request/pull/960) Support gzip with encoding on node pre-v0.9.4 (@kevinoid) - [#953](https://github.com/request/request/pull/953) Add async Content-Length computation when using form-data (@LoicMahieu) - [#844](https://github.com/request/request/pull/844) Add support for HTTP[S]_PROXY environment variables. Fixes #595. (@jvmccarthy) - [#946](https://github.com/request/request/pull/946) defaults: merge headers (@aj0strow) ### v2.37.0 (2014/07/07) - [#957](https://github.com/request/request/pull/957) Silence EventEmitter memory leak warning #311 (@watson) - [#955](https://github.com/request/request/pull/955) check for content-length header before setting it in nextTick (@camilleanne) - [#951](https://github.com/request/request/pull/951) Add support for gzip content decoding (@kevinoid) - [#949](https://github.com/request/request/pull/949) Manually enter querystring in form option (@charlespwd) - [#944](https://github.com/request/request/pull/944) Make request work with browserify (@eiriksm) - [#943](https://github.com/request/request/pull/943) New mime module (@eiriksm) - [#927](https://github.com/request/request/pull/927) Bump version of hawk dep. 
(@samccone) - [#907](https://github.com/request/request/pull/907) append secureOptions to poolKey (@medovob) ### v2.35.0 (2014/05/17) - [#901](https://github.com/request/request/pull/901) Fixes #555 (@pigulla) - [#897](https://github.com/request/request/pull/897) merge with default options (@vohof) - [#891](https://github.com/request/request/pull/891) fixes 857 - options object is mutated by calling request (@lalitkapoor) - [#869](https://github.com/request/request/pull/869) Pipefilter test (@tgohn) - [#866](https://github.com/request/request/pull/866) Fix typo (@dandv) - [#861](https://github.com/request/request/pull/861) Add support for RFC 6750 Bearer Tokens (@phedny) - [#809](https://github.com/request/request/pull/809) upgrade tunnel-proxy to 0.4.0 (@ksato9700) - [#850](https://github.com/request/request/pull/850) Fix word consistency in readme (@0xNobody) - [#810](https://github.com/request/request/pull/810) add some exposition to mpu example in README.md (@mikermcneil) - [#840](https://github.com/request/request/pull/840) improve error reporting for invalid protocols (@FND) - [#821](https://github.com/request/request/pull/821) added secureOptions back (@nw) - [#815](https://github.com/request/request/pull/815) Create changelog based on pull requests (@lalitkapoor) ### v2.34.0 (2014/02/18) - [#516](https://github.com/request/request/pull/516) UNIX Socket URL Support (@lyuzashi) - [#801](https://github.com/request/request/pull/801) 794 ignore cookie parsing and domain errors (@lalitkapoor) - [#802](https://github.com/request/request/pull/802) Added the Apache license to the package.json. (@keskival) - [#793](https://github.com/request/request/pull/793) Adds content-length calculation when submitting forms using form-data li... (@Juul) - [#785](https://github.com/request/request/pull/785) Provide ability to override content-type when `json` option used (@vvo) - [#781](https://github.com/request/request/pull/781) simpler isReadStream function (@joaojeronimo) ### v2.32.0 (2014/01/16) - [#767](https://github.com/request/request/pull/767) Use tough-cookie CookieJar sync API (@stash) - [#764](https://github.com/request/request/pull/764) Case-insensitive authentication scheme (@bobyrizov) - [#763](https://github.com/request/request/pull/763) Upgrade tough-cookie to 0.10.0 (@stash) - [#744](https://github.com/request/request/pull/744) Use Cookie.parse (@lalitkapoor) - [#757](https://github.com/request/request/pull/757) require aws-sign2 (@mafintosh) ### v2.31.0 (2014/01/08) - [#645](https://github.com/request/request/pull/645) update twitter api url to v1.1 (@mick) - [#746](https://github.com/request/request/pull/746) README: Markdown code highlight (@weakish) - [#745](https://github.com/request/request/pull/745) updating setCookie example to make it clear that the callback is required (@emkay) - [#742](https://github.com/request/request/pull/742) Add note about JSON output body type (@iansltx) - [#741](https://github.com/request/request/pull/741) README example is using old cookie jar api (@emkay) - [#736](https://github.com/request/request/pull/736) Fix callback arguments documentation (@mmalecki) ### v2.30.0 (2013/12/13) - [#732](https://github.com/request/request/pull/732) JSHINT: Creating global 'for' variable. Should be 'for (var ...'. 
(@Fritz-Lium) - [#730](https://github.com/request/request/pull/730) better HTTP DIGEST support (@dai-shi) - [#728](https://github.com/request/request/pull/728) Fix TypeError when calling request.cookie (@scarletmeow) ### v2.29.0 (2013/12/06) - [#727](https://github.com/request/request/pull/727) fix requester bug (@jchris) ### v2.28.0 (2013/12/04) - [#724](https://github.com/request/request/pull/724) README.md: add custom HTTP Headers example. (@tcort) - [#719](https://github.com/request/request/pull/719) Made a comment gender neutral. (@oztu) - [#715](https://github.com/request/request/pull/715) Request.multipart no longer crashes when header 'Content-type' present (@pastaclub) - [#710](https://github.com/request/request/pull/710) Fixing listing in callback part of docs. (@lukasz-zak) - [#696](https://github.com/request/request/pull/696) Edited README.md for formatting and clarity of phrasing (@Zearin) - [#694](https://github.com/request/request/pull/694) Typo in README (@VRMink) - [#690](https://github.com/request/request/pull/690) Handle blank password in basic auth. (@diversario) - [#682](https://github.com/request/request/pull/682) Optional dependencies (@Turbo87) - [#683](https://github.com/request/request/pull/683) Travis CI support (@Turbo87) - [#674](https://github.com/request/request/pull/674) change cookie module,to tough-cookie.please check it . (@sxyizhiren) - [#666](https://github.com/request/request/pull/666) make `ciphers` and `secureProtocol` to work in https request (@richarddong) - [#656](https://github.com/request/request/pull/656) Test case for #304. (@diversario) - [#662](https://github.com/request/request/pull/662) option.tunnel to explicitly disable tunneling (@seanmonstar) - [#659](https://github.com/request/request/pull/659) fix failure when running with NODE_DEBUG=request, and a test for that (@jrgm) - [#630](https://github.com/request/request/pull/630) Send random cnonce for HTTP Digest requests (@wprl) ### v2.27.0 (2013/08/15) - [#619](https://github.com/request/request/pull/619) decouple things a bit (@joaojeronimo) ### v2.26.0 (2013/08/07) - [#613](https://github.com/request/request/pull/613) Fixes #583, moved initialization of self.uri.pathname (@lexander) - [#605](https://github.com/request/request/pull/605) Only include ":" + pass in Basic Auth if it's defined (fixes #602) (@bendrucker) ### v2.24.0 (2013/07/23) - [#596](https://github.com/request/request/pull/596) Global agent is being used when pool is specified (@Cauldrath) - [#594](https://github.com/request/request/pull/594) Emit complete event when there is no callback (@RomainLK) - [#601](https://github.com/request/request/pull/601) Fixed a small typo (@michalstanko) ### v2.23.0 (2013/07/23) - [#589](https://github.com/request/request/pull/589) Prevent setting headers after they are sent (@geek) - [#587](https://github.com/request/request/pull/587) Global cookie jar disabled by default (@threepointone) ### v2.22.0 (2013/07/05) - [#544](https://github.com/request/request/pull/544) Update http-signature version. (@davidlehn) - [#581](https://github.com/request/request/pull/581) Fix spelling of "ignoring." 
(@bigeasy) - [#568](https://github.com/request/request/pull/568) use agentOptions to create agent when specified in request (@SamPlacette) - [#564](https://github.com/request/request/pull/564) Fix redirections (@criloz) - [#541](https://github.com/request/request/pull/541) The exported request function doesn't have an auth method (@tschaub) - [#542](https://github.com/request/request/pull/542) Expose Request class (@regality) ### v2.21.0 (2013/04/30) - [#536](https://github.com/request/request/pull/536) Allow explicitly empty user field for basic authentication. (@mikeando) - [#532](https://github.com/request/request/pull/532) fix typo (@fredericosilva) - [#497](https://github.com/request/request/pull/497) Added redirect event (@Cauldrath) - [#503](https://github.com/request/request/pull/503) Fix basic auth for passwords that contain colons (@tonistiigi) - [#521](https://github.com/request/request/pull/521) Improving test-localAddress.js (@noway421) - [#529](https://github.com/request/request/pull/529) dependencies versions bump (@jodaka) ### v2.17.0 (2013/04/22) - [#523](https://github.com/request/request/pull/523) Updating dependencies (@noway421) - [#520](https://github.com/request/request/pull/520) Fixing test-tunnel.js (@noway421) - [#519](https://github.com/request/request/pull/519) Update internal path state on post-creation QS changes (@jblebrun) - [#510](https://github.com/request/request/pull/510) Add HTTP Signature support. (@davidlehn) - [#502](https://github.com/request/request/pull/502) Fix POST (and probably other) requests that are retried after 401 Unauthorized (@nylen) - [#508](https://github.com/request/request/pull/508) Honor the .strictSSL option when using proxies (tunnel-agent) (@jhs) - [#512](https://github.com/request/request/pull/512) Make password optional to support the format: http://username@hostname/ (@pajato1) - [#513](https://github.com/request/request/pull/513) add 'localAddress' support (@yyfrankyy) - [#498](https://github.com/request/request/pull/498) Moving response emit above setHeaders on destination streams (@kenperkins) - [#490](https://github.com/request/request/pull/490) Empty response body (3-rd argument) must be passed to callback as an empty string (@Olegas) - [#479](https://github.com/request/request/pull/479) Changing so if Accept header is explicitly set, sending json does not ov... (@RoryH) - [#475](https://github.com/request/request/pull/475) Use `unescape` from `querystring` (@shimaore) - [#473](https://github.com/request/request/pull/473) V0.10 compat (@isaacs) - [#471](https://github.com/request/request/pull/471) Using querystring library from visionmedia (@kbackowski) - [#461](https://github.com/request/request/pull/461) Strip the UTF8 BOM from a UTF encoded response (@kppullin) - [#460](https://github.com/request/request/pull/460) hawk 0.10.0 (@hueniverse) - [#462](https://github.com/request/request/pull/462) if query params are empty, then request path shouldn't end with a '?' (merges cleanly now) (@jaipandya) - [#456](https://github.com/request/request/pull/456) hawk 0.9.0 (@hueniverse) - [#429](https://github.com/request/request/pull/429) Copy options before adding callback. 
(@nrn) - [#454](https://github.com/request/request/pull/454) Destroy the response if present when destroying the request (clean merge) (@mafintosh) - [#310](https://github.com/request/request/pull/310) Twitter Oauth Stuff Out of Date; Now Updated (@joemccann) - [#413](https://github.com/request/request/pull/413) rename googledoodle.png to .jpg (@nfriedly) - [#448](https://github.com/request/request/pull/448) Convenience method for PATCH (@mloar) - [#444](https://github.com/request/request/pull/444) protect against double callbacks on error path (@spollack) - [#433](https://github.com/request/request/pull/433) Added support for HTTPS cert & key (@mmalecki) - [#430](https://github.com/request/request/pull/430) Respect specified {Host,host} headers, not just {host} (@andrewschaaf) - [#415](https://github.com/request/request/pull/415) Fixed a typo. (@jerem) - [#338](https://github.com/request/request/pull/338) Add more auth options, including digest support (@nylen) - [#403](https://github.com/request/request/pull/403) Optimize environment lookup to happen once only (@mmalecki) - [#398](https://github.com/request/request/pull/398) Add more reporting to tests (@mmalecki) - [#388](https://github.com/request/request/pull/388) Ensure "safe" toJSON doesn't break EventEmitters (@othiym23) - [#381](https://github.com/request/request/pull/381) Resolving "Invalid signature. Expected signature base string: " (@landeiro) - [#380](https://github.com/request/request/pull/380) Fixes missing host header on retried request when using forever agent (@mac-) - [#376](https://github.com/request/request/pull/376) Headers lost on redirect (@kapetan) - [#375](https://github.com/request/request/pull/375) Fix for missing oauth_timestamp parameter (@jplock) - [#374](https://github.com/request/request/pull/374) Correct Host header for proxy tunnel CONNECT (@youurayy) - [#370](https://github.com/request/request/pull/370) Twitter reverse auth uses x_auth_mode not x_auth_type (@drudge) - [#369](https://github.com/request/request/pull/369) Don't remove x_auth_mode for Twitter reverse auth (@drudge) - [#344](https://github.com/request/request/pull/344) Make AWS auth signing find headers correctly (@nlf) - [#363](https://github.com/request/request/pull/363) rfc3986 on base_uri, now passes tests (@jeffmarshall) - [#362](https://github.com/request/request/pull/362) Running `rfc3986` on `base_uri` in `oauth.hmacsign` instead of just `encodeURIComponent` (@jeffmarshall) - [#361](https://github.com/request/request/pull/361) Don't create a Content-Length header if we already have it set (@danjenkins) - [#360](https://github.com/request/request/pull/360) Delete self._form along with everything else on redirect (@jgautier) - [#355](https://github.com/request/request/pull/355) stop sending erroneous headers on redirected requests (@azylman) - [#332](https://github.com/request/request/pull/332) Fix #296 - Only set Content-Type if body exists (@Marsup) - [#343](https://github.com/request/request/pull/343) Allow AWS to work in more situations, added a note in the README on its usage (@nlf) - [#320](https://github.com/request/request/pull/320) request.defaults() doesn't need to wrap jar() (@StuartHarris) - [#322](https://github.com/request/request/pull/322) Fix + test for piped into request bumped into redirect. 
#321 (@alexindigo) - [#326](https://github.com/request/request/pull/326) Do not try to remove listener from an undefined connection (@strk) - [#318](https://github.com/request/request/pull/318) Pass servername to tunneling secure socket creation (@isaacs) - [#317](https://github.com/request/request/pull/317) Workaround for #313 (@isaacs) - [#293](https://github.com/request/request/pull/293) Allow parser errors to bubble up to request (@mscdex) - [#290](https://github.com/request/request/pull/290) A test for #289 (@isaacs) - [#280](https://github.com/request/request/pull/280) Like in node.js print options if NODE_DEBUG contains the word request (@Filirom1) - [#207](https://github.com/request/request/pull/207) Fix #206 Change HTTP/HTTPS agent when redirecting between protocols (@isaacs) - [#214](https://github.com/request/request/pull/214) documenting additional behavior of json option (@jphaas) - [#272](https://github.com/request/request/pull/272) Boundary begins with CRLF? (@elspoono) - [#284](https://github.com/request/request/pull/284) Remove stray `console.log()` call in multipart generator. (@bcherry) - [#241](https://github.com/request/request/pull/241) Composability updates suggested by issue #239 (@polotek) - [#282](https://github.com/request/request/pull/282) OAuth Authorization header contains non-"oauth_" parameters (@jplock) - [#279](https://github.com/request/request/pull/279) fix tests with boundary by injecting boundry from header (@benatkin) - [#273](https://github.com/request/request/pull/273) Pipe back pressure issue (@mafintosh) - [#268](https://github.com/request/request/pull/268) I'm not OCD seriously (@TehShrike) - [#263](https://github.com/request/request/pull/263) Bug in OAuth key generation for sha1 (@nanodocumet) - [#265](https://github.com/request/request/pull/265) uncaughtException when redirected to invalid URI (@naholyr) - [#262](https://github.com/request/request/pull/262) JSON test should check for equality (@timshadel) - [#261](https://github.com/request/request/pull/261) Setting 'pool' to 'false' does NOT disable Agent pooling (@timshadel) - [#249](https://github.com/request/request/pull/249) Fix for the fix of your (closed) issue #89 where self.headers[content-length] is set to 0 for all methods (@sethbridges) - [#255](https://github.com/request/request/pull/255) multipart allow body === '' ( the empty string ) (@Filirom1) - [#260](https://github.com/request/request/pull/260) fixed just another leak of 'i' (@sreuter) - [#246](https://github.com/request/request/pull/246) Fixing the set-cookie header (@jeromegn) - [#243](https://github.com/request/request/pull/243) Dynamic boundary (@zephrax) - [#240](https://github.com/request/request/pull/240) don't error when null is passed for options (@polotek) - [#211](https://github.com/request/request/pull/211) Replace all occurrences of special chars in RFC3986 (@chriso) - [#224](https://github.com/request/request/pull/224) Multipart content-type change (@janjongboom) - [#217](https://github.com/request/request/pull/217) need to use Authorization (titlecase) header with Tumblr OAuth (@visnup) - [#203](https://github.com/request/request/pull/203) Fix cookie and redirect bugs and add auth support for HTTPS tunnel (@milewise) - [#199](https://github.com/request/request/pull/199) Tunnel (@isaacs) - [#198](https://github.com/request/request/pull/198) Bugfix on forever usage of util.inherits (@isaacs) - [#197](https://github.com/request/request/pull/197) Make ForeverAgent work with HTTPS (@isaacs) - 
[#193](https://github.com/request/request/pull/193) Fixes GH-119 (@goatslacker) - [#188](https://github.com/request/request/pull/188) Add abort support to the returned request (@itay) - [#176](https://github.com/request/request/pull/176) Querystring option (@csainty) - [#182](https://github.com/request/request/pull/182) Fix request.defaults to support (uri, options, callback) api (@twilson63) - [#180](https://github.com/request/request/pull/180) Modified the post, put, head and del shortcuts to support uri optional param (@twilson63) - [#179](https://github.com/request/request/pull/179) fix to add opts in .pipe(stream, opts) (@substack) - [#177](https://github.com/request/request/pull/177) Issue #173 Support uri as first and optional config as second argument (@twilson63) - [#170](https://github.com/request/request/pull/170) can't create a cookie in a wrapped request (defaults) (@fabianonunes) - [#168](https://github.com/request/request/pull/168) Picking off an EasyFix by adding some missing mimetypes. (@serby) - [#161](https://github.com/request/request/pull/161) Fix cookie jar/headers.cookie collision (#125) (@papandreou) - [#162](https://github.com/request/request/pull/162) Fix issue #159 (@dpetukhov) - [#90](https://github.com/request/request/pull/90) add option followAllRedirects to follow post/put redirects (@jroes) - [#148](https://github.com/request/request/pull/148) Retry Agent (@thejh) - [#146](https://github.com/request/request/pull/146) Multipart should respect content-type if previously set (@apeace) - [#144](https://github.com/request/request/pull/144) added "form" option to readme (@petejkim) - [#133](https://github.com/request/request/pull/133) Fixed cookies parsing (@afanasy) - [#135](https://github.com/request/request/pull/135) host vs hostname (@iangreenleaf) - [#132](https://github.com/request/request/pull/132) return the body as a Buffer when encoding is set to null (@jahewson) - [#112](https://github.com/request/request/pull/112) Support using a custom http-like module (@jhs) - [#104](https://github.com/request/request/pull/104) Cookie handling contains bugs (@janjongboom) - [#121](https://github.com/request/request/pull/121) Another patch for cookie handling regression (@jhurliman) - [#117](https://github.com/request/request/pull/117) Remove the global `i` (@3rd-Eden) - [#110](https://github.com/request/request/pull/110) Update to Iris Couch URL (@jhs) - [#86](https://github.com/request/request/pull/86) Can't post binary to multipart requests (@developmentseed) - [#105](https://github.com/request/request/pull/105) added test for proxy option. 
(@dominictarr) - [#102](https://github.com/request/request/pull/102) Implemented cookies - closes issue 82: https://github.com/mikeal/request/issues/82 (@alessioalex) - [#97](https://github.com/request/request/pull/97) Typo in previous pull causes TypeError in non-0.5.11 versions (@isaacs) - [#96](https://github.com/request/request/pull/96) Authless parsed url host support (@isaacs) - [#81](https://github.com/request/request/pull/81) Enhance redirect handling (@danmactough) - [#78](https://github.com/request/request/pull/78) Don't try to do strictSSL for non-ssl connections (@isaacs) - [#76](https://github.com/request/request/pull/76) Bug when a request fails and a timeout is set (@Marsup) - [#70](https://github.com/request/request/pull/70) add test script to package.json (@isaacs) - [#73](https://github.com/request/request/pull/73) Fix #71 Respect the strictSSL flag (@isaacs) - [#69](https://github.com/request/request/pull/69) Flatten chunked requests properly (@isaacs) - [#67](https://github.com/request/request/pull/67) fixed global variable leaks (@aheckmann) - [#66](https://github.com/request/request/pull/66) Do not overwrite established content-type headers for read stream deliver (@voodootikigod) - [#53](https://github.com/request/request/pull/53) Parse json: Issue #51 (@benatkin) - [#45](https://github.com/request/request/pull/45) Added timeout option (@mbrevoort) - [#35](https://github.com/request/request/pull/35) The "end" event isn't emitted for some responses (@voxpelli) - [#31](https://github.com/request/request/pull/31) Error on piping a request to a destination (@tobowers)��������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/CONTRIBUTING.md��������������������000644 �000766 �000024 �00000003435 12455173731 027305� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# This is an OPEN Open Source Project ----------------------------------------- ## What? Individuals making significant and valuable contributions are given commit-access to the project to contribute as they see fit. This project is more like an open wiki than a standard guarded open source project. ## Rules There are a few basic ground-rules for contributors: 1. **No `--force` pushes** or modifying the Git history in any way. 1. **Non-master branches** ought to be used for ongoing work. 1. **External API changes and significant modifications** ought to be subject to an **internal pull-request** to solicit feedback from other contributors. 1. Internal pull-requests to solicit feedback are *encouraged* for any other non-trivial contribution but left to the discretion of the contributor. 1. For significant changes wait a full 24 hours before merging so that active contributors who are distributed throughout the world have a chance to weigh in. 1. Contributors should attempt to adhere to the prevailing code-style. 1. Run `npm test` locally before submitting your PR, to catch any easy to miss style & testing issues. 
To diagnose test failures, there are two ways to run a single test file: - `node_modules/.bin/taper tests/test-file.js` - run using the default [`taper`](https://github.com/nylen/taper) test reporter. - `node tests/test-file.js` - view the raw [tap](https://testanything.org/) output. ## Releases Declaring formal releases remains the prerogative of the project maintainer. ## Changes to this arrangement This is an experiment and feedback is welcome! This document may also be subject to pull-requests or changes by contributors where you believe you have something valuable to add or change. ----------------------------------------- �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/disabled.appveyor.yml��������������000644 �000766 �000024 �00000001654 12455173731 031213� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# http://www.appveyor.com/docs/appveyor-yml # Fix line endings in Windows. (runs before repo cloning) init: - git config --global core.autocrlf input # Test against these versions of Node.js. environment: matrix: - nodejs_version: "0.10" - nodejs_version: "0.8" - nodejs_version: "0.11" # Allow failing jobs for bleeding-edge Node.js versions. matrix: allow_failures: - nodejs_version: "0.11" # Install scripts. (runs after repo cloning) install: # Get the latest stable version of Node 0.STABLE.latest - ps: Update-NodeJsInstallation (Get-NodeJsLatestBuild $env:nodejs_version) # Typical npm stuff. - npm install # Post-install test scripts. test_script: # Output useful info for debugging. - ps: "npm test # PowerShell" # Pass comment to PS for easier debugging - cmd: npm test # Don't actually build. build: off # Set build version format here instead of in the admin panel. version: "{build}" ������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/examples/��������������������������000755 �000766 �000024 �00000000000 12456115120 026652� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/index.js���������������������������000755 �000766 �000024 �00000012172 12455173731 026522� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2010-2012 Mikeal Rogers // // Licensed under the Apache License, Version 2.0 (the "License"); // you may not use this file except in compliance with the License. 
// You may obtain a copy of the License at // // http://www.apache.org/licenses/LICENSE-2.0 // // Unless required by applicable law or agreed to in writing, software // distributed under the License is distributed on an "AS IS" BASIS, // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. // See the License for the specific language governing permissions and // limitations under the License. 'use strict' var extend = require('util')._extend , cookies = require('./lib/cookies') , helpers = require('./lib/helpers') var isFunction = helpers.isFunction , constructObject = helpers.constructObject , filterForCallback = helpers.filterForCallback , constructOptionsFrom = helpers.constructOptionsFrom , paramsHaveRequestBody = helpers.paramsHaveRequestBody // organize params for patch, post, put, head, del function initParams(uri, options, callback) { callback = filterForCallback([options, callback]) options = constructOptionsFrom(uri, options) return constructObject() .extend({callback: callback}) .extend({options: options}) .extend({uri: options.uri}) .done() } function request (uri, options, callback) { if (typeof uri === 'undefined') { throw new Error('undefined is not a valid uri or options object.') } var params = initParams(uri, options, callback) options = params.options options.callback = params.callback options.uri = params.uri return new request.Request(options) } function requester(params) { if(typeof params.options._requester === 'function') { return params.options._requester } return request } request.get = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'GET' return requester(params)(params.uri || null, params.options, params.callback) } request.head = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'HEAD' if (paramsHaveRequestBody(params)) { throw new Error('HTTP HEAD requests MUST NOT include a request body.') } return requester(params)(params.uri || null, params.options, params.callback) } request.post = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'POST' return requester(params)(params.uri || null, params.options, params.callback) } request.put = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'PUT' return requester(params)(params.uri || null, params.options, params.callback) } request.patch = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'PATCH' return requester(params)(params.uri || null, params.options, params.callback) } request.del = function (uri, options, callback) { var params = initParams(uri, options, callback) params.options.method = 'DELETE' return requester(params)(params.uri || null, params.options, params.callback) } request.jar = function (store) { return cookies.jar(store) } request.cookie = function (str) { return cookies.parse(str) } request.defaults = function (options, requester) { var self = this var wrap = function (method) { var headerlessOptions = function (options) { options = extend({}, options) delete options.headers return options } var getHeaders = function (params, options) { return constructObject() .extend(options.headers) .extend(params.options.headers) .done() } return function (uri, opts, callback) { var params = initParams(uri, opts, callback) params.options = extend(headerlessOptions(options), params.options) if 
(options.headers) { params.options.headers = getHeaders(params, options) } if (isFunction(requester)) { if (method === self) { method = requester } else { params.options._requester = requester } } return method(params.options, params.callback) } } var defaults = wrap(self) defaults.get = wrap(self.get) defaults.patch = wrap(self.patch) defaults.post = wrap(self.post) defaults.put = wrap(self.put) defaults.head = wrap(self.head) defaults.del = wrap(self.del) defaults.cookie = wrap(self.cookie) defaults.jar = self.jar defaults.defaults = self.defaults return defaults } request.forever = function (agentOptions, optionsArg) { var options = constructObject() if (optionsArg) { options.extend(optionsArg) } if (agentOptions) { options.agentOptions = agentOptions } options.extend({forever: true}) return request.defaults(options.done()) } // Exports module.exports = request request.Request = require('./request') request.debug = process.env.NODE_DEBUG && /\brequest\b/.test(process.env.NODE_DEBUG) request.initParams = initParams ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/lib/�������������������������������000755 �000766 �000024 �00000000000 12456115120 025602� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/LICENSE����������������������������000644 �000766 �000024 �00000021664 12455173731 026065� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. 
"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS����������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/����������������������000755 �000766 �000024 �00000000000 12456115120 027511� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/package.json�����������������������000755 �000766 �000024 �00000004753 12455173731 027351� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "request", "description": "Simplified HTTP request client.", "tags": [ "http", "simple", "util", "utility" ], "version": "2.51.0", "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com" }, "repository": { "type": "git", "url": "https://github.com/request/request.git" }, "bugs": { "url": "http://github.com/request/request/issues" }, "license": "Apache-2.0", "engines": { "node": ">=0.8.0" }, "main": "index.js", "dependencies": { "bl": "~0.9.0", "caseless": "~0.8.0", "forever-agent": "~0.5.0", "form-data": "~0.2.0", "json-stringify-safe": "~5.0.0", "mime-types": "~1.0.1", "node-uuid": "~1.4.0", "qs": "~2.3.1", "tunnel-agent": "~0.4.0", "tough-cookie": ">=0.12.0", "http-signature": "~0.10.0", "oauth-sign": "~0.5.0", "hawk": "1.1.1", "aws-sign2": "~0.5.0", "stringstream": "~0.0.4", "combined-stream": "~0.0.5" }, "scripts": { "test": "npm run lint && node node_modules/.bin/taper tests/test-*.js && npm run test-browser", "test-browser": "browserify tests/browser/test.js -o tests/browser/test-browser.js && karma start tests/browser/karma.conf.js", "lint": "node 
node_modules/.bin/eslint lib/ *.js tests/ && echo Lint passed." }, "devDependencies": { "browserify": "~5.9.1", "coveralls": "~2.11.2", "eslint": "0.5.1", "function-bind": "~1.0.0", "istanbul": "~0.3.2", "karma": "~0.12.21", "karma-cli": "0.0.4", "karma-phantomjs-launcher": "~0.1.4", "karma-tap": "~1.0.1", "rimraf": "~2.2.8", "tape": "~3.0.0", "taper": "~0.3.0" }, "gitHead": "1c8aca6a9205df58660c676005fb8ec4603d5265", "homepage": "https://github.com/request/request", "_id": "request@2.51.0", "_shasum": "35d00bbecc012e55f907b1bd9e0dbd577bfef26e", "_from": "request@>=2.51.0 <2.52.0", "_npmVersion": "1.4.14", "_npmUser": { "name": "nylen", "email": "jnylen@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, { "name": "nylen", "email": "jnylen@gmail.com" }, { "name": "fredkschott", "email": "fkschott@gmail.com" } ], "dist": { "shasum": "35d00bbecc012e55f907b1bd9e0dbd577bfef26e", "tarball": "http://registry.npmjs.org/request/-/request-2.51.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/request/-/request-2.51.0.tgz", "readme": "ERROR: No README data found!" } ���������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/README.md��������������������������000644 �000766 �000024 �00000075721 12455173731 026342� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Request — Simplified HTTP client [![NPM](https://nodei.co/npm/request.png?downloads=true&downloadRank=true&stars=true)](https://nodei.co/npm/request/) [![Gitter](https://badges.gitter.im/Join Chat.svg)](https://gitter.im/request/request?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) ## Super simple to use Request is designed to be the simplest way possible to make http calls. It supports HTTPS and follows redirects by default. ```javascript var request = require('request'); request('http://www.google.com', function (error, response, body) { if (!error && response.statusCode == 200) { console.log(body) // Print the google web page. } }) ``` ## Streaming You can stream any response to a file stream. ```javascript request('http://google.com/doodle.png').pipe(fs.createWriteStream('doodle.png')) ``` You can also stream a file to a PUT or POST request. This method will also check the file extension against a mapping of file extensions to content-types (in this case `application/json`) and use the proper `content-type` in the PUT request (if the headers don’t already provide one). ```javascript fs.createReadStream('file.json').pipe(request.put('http://mysite.com/obj.json')) ``` Request can also `pipe` to itself. When doing so, `content-type` and `content-length` are preserved in the PUT headers. ```javascript request.get('http://google.com/img.png').pipe(request.put('http://mysite.com/img.png')) ``` Request emits a "response" event when a response is received. The `response` argument will be an instance of [http.IncomingMessage](http://nodejs.org/api/http.html#http_http_incomingmessage). 
```javascript request .get('http://google.com/img.png') .on('response', function(response) { console.log(response.statusCode) // 200 console.log(response.headers['content-type']) // 'image/png' }) .pipe(request.put('http://mysite.com/img.png')) ``` To easily handle errors when streaming requests, listen to the `error` event before piping: ```javascript request .get('http://mysite.com/doodle.png') .on('error', function(err) { console.log(err) }) .pipe(fs.createWriteStream('doodle.png')) ``` Now let’s get fancy. ```javascript http.createServer(function (req, resp) { if (req.url === '/doodle.png') { if (req.method === 'PUT') { req.pipe(request.put('http://mysite.com/doodle.png')) } else if (req.method === 'GET' || req.method === 'HEAD') { request.get('http://mysite.com/doodle.png').pipe(resp) } } }) ``` You can also `pipe()` from `http.ServerRequest` instances, as well as to `http.ServerResponse` instances. The HTTP method, headers, and entity-body data will be sent. Which means that, if you don't really care about security, you can do: ```javascript http.createServer(function (req, resp) { if (req.url === '/doodle.png') { var x = request('http://mysite.com/doodle.png') req.pipe(x) x.pipe(resp) } }) ``` And since `pipe()` returns the destination stream in ≥ Node 0.5.x you can do one line proxying. :) ```javascript req.pipe(request('http://mysite.com/doodle.png')).pipe(resp) ``` Also, none of this new functionality conflicts with requests previous features, it just expands them. ```javascript var r = request.defaults({'proxy':'http://localproxy.com'}) http.createServer(function (req, resp) { if (req.url === '/doodle.png') { r.get('http://google.com/doodle.png').pipe(resp) } }) ``` You can still use intermediate proxies, the requests will still follow HTTP forwards, etc. ## Proxies If you specify a `proxy` option, then the request (and any subsequent redirects) will be sent via a connection to the proxy server. If your endpoint is an `https` url, and you are using a proxy, then request will send a `CONNECT` request to the proxy server *first*, and then use the supplied connection to connect to the endpoint. That is, first it will make a request like: ``` HTTP/1.1 CONNECT endpoint-server.com:80 Host: proxy-server.com User-Agent: whatever user agent you specify ``` and then the proxy server make a TCP connection to `endpoint-server` on port `80`, and return a response that looks like: ``` HTTP/1.1 200 OK ``` At this point, the connection is left open, and the client is communicating directly with the `endpoint-server.com` machine. See [the wikipedia page on HTTP Tunneling](http://en.wikipedia.org/wiki/HTTP_tunnel) for more information. By default, when proxying `http` traffic, request will simply make a standard proxied `http` request. This is done by making the `url` section of the initial line of the request a fully qualified url to the endpoint. For example, it will make a single request that looks like: ``` HTTP/1.1 GET http://endpoint-server.com/some-url Host: proxy-server.com Other-Headers: all go here request body or whatever ``` Because a pure "http over http" tunnel offers no additional security or other features, it is generally simpler to go with a straightforward HTTP proxy in this case. However, if you would like to force a tunneling proxy, you may set the `tunnel` option to `true`. If you are using a tunneling proxy, you may set the `proxyHeaderWhiteList` to share certain headers with the proxy. 
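For instance, here is a minimal sketch of forcing a tunneled request through a proxy while whitelisting a couple of header names for it; the `proxy-server.com` and `endpoint-server.com` hosts are placeholders borrowed from the text above, not real servers:

```javascript
var request = require('request');

// Force a CONNECT tunnel through the proxy and only forward the
// whitelisted header names to it (placeholder URLs).
request({
  url: 'https://endpoint-server.com/some-url',
  proxy: 'http://proxy-server.com:8080',
  tunnel: true,
  proxyHeaderWhiteList: ['accept', 'user-agent'],
  headers: { 'User-Agent': 'request' }
}, function (error, response, body) {
  if (error) {
    return console.error('tunneled request failed:', error);
  }
  console.log(response.statusCode);
});
```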
You can also set the `proxyHeaderExclusiveList` to share certain headers only with the proxy and not with destination host. By default, this set is:

```
accept
accept-charset
accept-encoding
accept-language
accept-ranges
cache-control
content-encoding
content-language
content-length
content-location
content-md5
content-range
content-type
connection
date
expect
max-forwards
pragma
proxy-authorization
referer
te
transfer-encoding
user-agent
via
```

Note that, when using a tunneling proxy, the `proxy-authorization` header and any headers from custom `proxyHeaderExclusiveList` are *never* sent to the endpoint server, but only to the proxy server.

### Controlling proxy behaviour using environment variables

The following environment variables are respected by `request`:

* `HTTP_PROXY` / `http_proxy`
* `HTTPS_PROXY` / `https_proxy`
* `NO_PROXY` / `no_proxy`

When `HTTP_PROXY` / `http_proxy` are set, they will be used to proxy non-SSL requests that do not have an explicit `proxy` configuration option present. Similarly, `HTTPS_PROXY` / `https_proxy` will be respected for SSL requests that do not have an explicit `proxy` configuration option. It is valid to define a proxy in one of the environment variables, but then override it for a specific request, using the `proxy` configuration option. Furthermore, the `proxy` configuration option can be explicitly set to false / null to opt out of proxying altogether for that request.

`request` is also aware of the `NO_PROXY`/`no_proxy` environment variables. These variables provide a granular way to opt out of proxying, on a per-host basis. They should contain a comma-separated list of hosts to opt out of proxying. It is also possible to opt out of proxying when a particular destination port is used. Finally, the variable may be set to `*` to opt out of the implicit proxy configuration of the other environment variables.

Here are some examples of valid `no_proxy` values:

* `google.com` - don't proxy HTTP/HTTPS requests to Google.
* `google.com:443` - don't proxy HTTPS requests to Google, but *do* proxy HTTP requests to Google.
* `google.com:443, yahoo.com:80` - don't proxy HTTPS requests to Google, and don't proxy HTTP requests to Yahoo!
* `*` - ignore `https_proxy`/`http_proxy` environment variables altogether.

## UNIX Socket

`request` supports making requests to [UNIX Domain Sockets](http://en.wikipedia.org/wiki/Unix_domain_socket). To make one, use the following URL scheme:

```javascript
/* Pattern */ 'http://unix:SOCKET:PATH'
/* Example */ request.get('http://unix:/absolute/path/to/unix.socket:/request/path')
```

Note: The `SOCKET` path is assumed to be absolute to the root of the host file system.

## Forms

`request` supports `application/x-www-form-urlencoded` and `multipart/form-data` form uploads. For `multipart/related` refer to the `multipart` API.

#### application/x-www-form-urlencoded (URL-Encoded Forms)

URL-encoded forms are simple.

```javascript
request.post('http://service.com/upload', {form:{key:'value'}})
// or
request.post('http://service.com/upload').form({key:'value'})
// or
request.post({url:'http://service.com/upload', form: {key:'value'}}, function(err,httpResponse,body){ /* ... */ })
```

#### multipart/form-data (Multipart Form Uploads)

For `multipart/form-data` we use the [form-data](https://github.com/felixge/node-form-data) library by [@felixge](https://github.com/felixge). In most cases, you can pass your upload form data via the `formData` option.
```javascript
var formData = {
  // Pass a simple key-value pair
  my_field: 'my_value',
  // Pass data via Buffers
  my_buffer: new Buffer([1, 2, 3]),
  // Pass data via Streams
  my_file: fs.createReadStream(__dirname + '/unicycle.jpg'),
  // Pass multiple values with an Array
  attachments: [
    fs.createReadStream(__dirname + '/attachment1.jpg'),
    fs.createReadStream(__dirname + '/attachment2.jpg')
  ],
  // Pass optional meta-data with an 'options' object with style: {value: DATA, options: OPTIONS}
  // Use case: for some types of streams, you'll need to provide "file"-related information manually.
  // See the `form-data` README for more information about options: https://github.com/felixge/node-form-data
  custom_file: {
    value: fs.createReadStream('/dev/urandom'),
    options: {
      filename: 'topsecret.jpg',
      contentType: 'image/jpg'
    }
  }
};
request.post({url:'http://service.com/upload', formData: formData}, function optionalCallback(err, httpResponse, body) {
  if (err) {
    return console.error('upload failed:', err);
  }
  console.log('Upload successful! Server responded with:', body);
});
```

For advanced cases, you can access the form-data object itself via `r.form()`. This can be modified until the request is fired on the next cycle of the event-loop. (Note that calling `form()` will clear the currently set form data for that request.)

```javascript
// NOTE: Advanced use-case, for normal use see 'formData' usage above
var r = request.post('http://service.com/upload', function optionalCallback(err, httpResponse, body) {
  // ...
})
var form = r.form();
form.append('my_field', 'my_value');
form.append('my_buffer', new Buffer([1, 2, 3]));
form.append('custom_file', fs.createReadStream(__dirname + '/unicycle.jpg'), {filename: 'unicycle.jpg'});
```

See the [form-data README](https://github.com/felixge/node-form-data) for more information & examples.

#### multipart/related

Some variations in different HTTP implementations require a newline/CRLF before, after, or both before and after the boundary of a `multipart/related` request (using the multipart option). This has been observed in the .NET WebAPI version 4.0. You can turn on a boundary preambleCRLF or postamble by passing them as `true` to your request options.

```javascript
request({
  method: 'PUT',
  preambleCRLF: true,
  postambleCRLF: true,
  uri: 'http://service.com/upload',
  multipart: [
    {
      'content-type': 'application/json',
      body: JSON.stringify({foo: 'bar', _attachments: {'message.txt': {follows: true, length: 18, 'content_type': 'text/plain' }}})
    },
    { body: 'I am an attachment' },
    { body: fs.createReadStream('image.png') }
  ],
  // alternatively pass an object containing additional options
  multipart: {
    chunked: false,
    data: [
      {
        'content-type': 'application/json',
        body: JSON.stringify({foo: 'bar', _attachments: {'message.txt': {follows: true, length: 18, 'content_type': 'text/plain' }}})
      },
      { body: 'I am an attachment' }
    ]
  }
},
function (error, response, body) {
  if (error) {
    return console.error('upload failed:', error);
  }
  console.log('Upload successful! Server responded with:', body);
})
```

## HTTP Authentication

```javascript
request.get('http://some.server.com/').auth('username', 'password', false);
// or
request.get('http://some.server.com/', {
  'auth': {
    'user': 'username',
    'pass': 'password',
    'sendImmediately': false
  }
});
// or
request.get('http://some.server.com/').auth(null, null, true, 'bearerToken');
// or
request.get('http://some.server.com/', {
  'auth': {
    'bearer': 'bearerToken'
  }
});
```

If passed as an option, `auth` should be a hash containing values `user` || `username`, `pass` || `password`, and `sendImmediately` (optional). The method form takes parameters `auth(username, password, sendImmediately)`.

`sendImmediately` defaults to `true`, which causes a basic authentication header to be sent. If `sendImmediately` is `false`, then `request` will retry with a proper authentication header after receiving a `401` response from the server (which must contain a `WWW-Authenticate` header indicating the required authentication method).

Note that you can also implement basic authentication with a trick that uses the URL itself, as specified in [RFC 1738](http://www.ietf.org/rfc/rfc1738.txt). Simply pass the `user:password` before the host with an `@` sign.

```javascript
var username = 'username',
    password = 'password',
    url = 'http://' + username + ':' + password + '@some.server.com';

request({url: url}, function (error, response, body) {
  // Do more stuff with 'body' here
});
```

Digest authentication is supported, but it only works with `sendImmediately` set to `false`; otherwise `request` will send basic authentication on the initial request, which will probably cause the request to fail.

Bearer authentication is supported, and is activated when the `bearer` value is available. The value may be either a `String` or a `Function` returning a `String`. Using a function to supply the bearer token is particularly useful if used in conjunction with `defaults` to allow a single function to supply the last known token at the time of sending a request, or to compute one on the fly.

## OAuth Signing

[OAuth version 1.0](https://tools.ietf.org/html/rfc5849) is supported. The default signing algorithm is [HMAC-SHA1](https://tools.ietf.org/html/rfc5849#section-3.4.2):

```javascript
// Twitter OAuth
var qs = require('querystring')
  , oauth =
    { callback: 'http://mysite.com/callback/'
    , consumer_key: CONSUMER_KEY
    , consumer_secret: CONSUMER_SECRET
    }
  , url = 'https://api.twitter.com/oauth/request_token'
  ;
request.post({url:url, oauth:oauth}, function (e, r, body) {
  // Ideally, you would take the body in the response
  // and construct a URL that a user clicks on (like a sign in button).
  // The verifier is only available in the response after a user has
  // verified with twitter that they are authorizing your app.
  var access_token = qs.parse(body)
    , oauth =
      { consumer_key: CONSUMER_KEY
      , consumer_secret: CONSUMER_SECRET
      , token: access_token.oauth_token
      , verifier: access_token.oauth_verifier
      }
    , url = 'https://api.twitter.com/oauth/access_token'
    ;
  request.post({url:url, oauth:oauth}, function (e, r, body) {
    var perm_token = qs.parse(body)
      , oauth =
        { consumer_key: CONSUMER_KEY
        , consumer_secret: CONSUMER_SECRET
        , token: perm_token.oauth_token
        , token_secret: perm_token.oauth_token_secret
        }
      , url = 'https://api.twitter.com/1.1/users/show.json?'
, params = { screen_name: perm_token.screen_name , user_id: perm_token.user_id } ; url += qs.stringify(params) request.get({url:url, oauth:oauth, json:true}, function (e, r, user) { console.log(user) }) }) }) ``` For [RSA-SHA1 signing](https://tools.ietf.org/html/rfc5849#section-3.4.3), make the following changes to the OAuth options object: * Pass `signature_method : 'RSA-SHA1'` * Instead of `consumer_secret`, specify a `private_key` string in [PEM format](http://how2ssl.com/articles/working_with_pem_files/) ## Custom HTTP Headers HTTP Headers, such as `User-Agent`, can be set in the `options` object. In the example below, we call the github API to find out the number of stars and forks for the request repository. This requires a custom `User-Agent` header as well as https. ```javascript var request = require('request'); var options = { url: 'https://api.github.com/repos/request/request', headers: { 'User-Agent': 'request' } }; function callback(error, response, body) { if (!error && response.statusCode == 200) { var info = JSON.parse(body); console.log(info.stargazers_count + " Stars"); console.log(info.forks_count + " Forks"); } } request(options, callback); ``` ## TLS/SSL Protocol TLS/SSL Protocol options, such as `cert`, `key` and `passphrase`, can be set in the `agentOptions` property of the `options` object. In the example below, we call an API requires client side SSL certificate (in PEM format) with passphrase protected private key (in PEM format) and disable the SSLv3 protocol: ```javascript var fs = require('fs') , path = require('path') , certFile = path.resolve(__dirname, 'ssl/client.crt') , keyFile = path.resolve(__dirname, 'ssl/client.key') , request = require('request'); var options = { url: 'https://api.some-server.com/', agentOptions: { cert: fs.readFileSync(certFile), key: fs.readFileSync(keyFile), // Or use `pfx` property replacing `cert` and `key` when using private key, certificate and CA certs in PFX or PKCS12 format: // pfx: fs.readFileSync(pfxFilePath), passphrase: 'password', securityOptions: 'SSL_OP_NO_SSLv3' } }; request.get(options); ``` It is able to force using SSLv3 only by specifying `secureProtocol`: ```javascript request.get({ url: 'https://api.some-server.com/', agentOptions: { secureProtocol: 'SSLv3_method' } }); ``` It is possible to accept other certificates than those signed by generally allowed Certificate Authorities (CAs). This can be useful, for example, when using self-signed certificates. To allow a different certificate, you can specify the signing CA by adding the contents of the CA's certificate file to the `agentOptions`: ```javascript request.get({ url: 'https://api.some-server.com/', agentOptions: { ca: fs.readFileSync('ca.cert.pem') } }); ``` ## request(options, callback) The first argument can be either a `url` or an `options` object. The only required option is `uri`; all others are optional. * `uri` || `url` - fully qualified uri or a parsed url object from `url.parse()` * `qs` - object containing querystring values to be appended to the `uri` * `useQuerystring` - If true, use `querystring` to stringify and parse querystrings, otherwise use `qs` (default: `false`). Set this option to `true` if you need arrays to be serialized as `foo=bar&foo=baz` instead of the default `foo[0]=bar&foo[1]=baz`. * `method` - http method (default: `"GET"`) * `headers` - http headers (default: `{}`) * `body` - entity body for PATCH, POST and PUT requests. Must be a `Buffer` or `String`, unless `json` is `true`. 
If `json` is `true`, then `body` must be a JSON-serializable object. * `form` - when passed an object or a querystring, this sets `body` to a querystring representation of value, and adds `Content-type: application/x-www-form-urlencoded` header. When passed no options, a `FormData` instance is returned (and is piped to request). See "Forms" section above. * `formData` - Data to pass for a `multipart/form-data` request. See [Forms](#forms) section above. * `multipart` - array of objects which contain their own headers and `body` attributes. Sends a `multipart/related` request. See [Forms](#forms) section above. * Alternatively you can pass in an object `{chunked: false, data: []}` where `chunked` is used to specify whether the request is sent in [chunked transfer encoding](https://en.wikipedia.org/wiki/Chunked_transfer_encoding) (the default is `chunked: true`). In non-chunked requests, data items with body streams are not allowed. * `auth` - A hash containing values `user` || `username`, `pass` || `password`, and `sendImmediately` (optional). See documentation above. * `json` - sets `body` but to JSON representation of value and adds `Content-type: application/json` header. Additionally, parses the response body as JSON. * `jsonReviver` - a [reviver function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse) that will be passed to `JSON.parse()` when parsing a JSON response body. * `preambleCRLF` - append a newline/CRLF before the boundary of your `multipart/form-data` request. * `postambleCRLF` - append a newline/CRLF at the end of the boundary of your `multipart/form-data` request. * `followRedirect` - follow HTTP 3xx responses as redirects (default: `true`). This property can also be implemented as function which gets `response` object as a single argument and should return `true` if redirects should continue or `false` otherwise. * `followAllRedirects` - follow non-GET HTTP 3xx responses as redirects (default: `false`) * `maxRedirects` - the maximum number of redirects to follow (default: `10`) * `encoding` - Encoding to be used on `setEncoding` of response data. If `null`, the `body` is returned as a `Buffer`. Anything else **(including the default value of `undefined`)** will be passed as the [encoding](http://nodejs.org/api/buffer.html#buffer_buffer) parameter to `toString()` (meaning this is effectively `utf8` by default). * `pool` - An object describing which agents to use for the request. If this option is omitted the request will use the global agent (as long as [your options allow for it](request.js#L747)). Otherwise, request will search the pool for your custom agent. If no custom agent is found, a new agent will be created and added to the pool. * A `maxSockets` property can also be provided on the `pool` object to set the max number of sockets for all agents created (ex: `pool: {maxSockets: Infinity}`). * Note that if you are sending multiple requests in a loop and creating multiple new `pool` objects, `maxSockets` will not work as intended. To work around this, either use [`request.defaults`](#requestdefaultsoptions) with your pool options or create the pool object with the `maxSockets` property outside of the loop. * `timeout` - Integer containing the number of milliseconds to wait for a request to respond before aborting the request * `proxy` - An HTTP proxy to be used. 
Supports proxy Auth with Basic Auth, identical to support for the `url` parameter (by embedding the auth info in the `uri`) * `oauth` - Options for OAuth HMAC-SHA1 signing. See documentation above. * `hawk` - Options for [Hawk signing](https://github.com/hueniverse/hawk). The `credentials` key must contain the necessary signing info, [see hawk docs for details](https://github.com/hueniverse/hawk#usage-example). * `strictSSL` - If `true`, requires SSL certificates be valid. **Note:** to use your own certificate authority, you need to specify an agent that was created with that CA as an option. * `agentOptions` - Object containing user agent options. See documentation above. **Note:** [see tls API doc for TLS/SSL options](http://nodejs.org/api/tls.html#tls_tls_connect_options_callback). * `jar` - If `true` and `tough-cookie` is installed, remember cookies for future use (or define your custom cookie jar; see examples section) * `aws` - `object` containing AWS signing information. Should have the properties `key`, `secret`. Also requires the property `bucket`, unless you’re specifying your `bucket` as part of the path, or the request doesn’t use a bucket (i.e. GET Services) * `httpSignature` - Options for the [HTTP Signature Scheme](https://github.com/joyent/node-http-signature/blob/master/http_signing.md) using [Joyent's library](https://github.com/joyent/node-http-signature). The `keyId` and `key` properties must be specified. See the docs for other options. * `localAddress` - Local interface to bind for network connections. * `gzip` - If `true`, add an `Accept-Encoding` header to request compressed content encodings from the server (if not already present) and decode supported content encodings in the response. **Note:** Automatic decoding of the response content is performed on the body data returned through `request` (both through the `request` stream and passed to the callback function) but is not performed on the `response` stream (available from the `response` event) which is the unmodified `http.IncomingMessage` object which may contain compressed data. See example below. * `tunnel` - If `true`, then *always* use a tunneling proxy. If `false` (default), then tunneling will only be used if the destination is `https`, or if a previous request in the redirect chain used a tunneling proxy. * `proxyHeaderWhiteList` - A whitelist of headers to send to a tunneling proxy. * `proxyHeaderExclusiveList` - A whitelist of headers to send exclusively to a tunneling proxy and not to destination. The callback argument gets 3 arguments: 1. An `error` when applicable (usually from [`http.ClientRequest`](http://nodejs.org/api/http.html#http_class_http_clientrequest) object) 2. An [`http.IncomingMessage`](http://nodejs.org/api/http.html#http_http_incomingmessage) object 3. The third is the `response` body (`String` or `Buffer`, or JSON object if the `json` option is supplied) ## Convenience methods There are also shorthand methods for different HTTP METHODs and some other conveniences. ### request.defaults(options) This method **returns a wrapper** around the normal request API that defaults to whatever options you pass to it. **Note:** `request.defaults()` **does not** modify the global request API; instead, it **returns a wrapper** that has your default settings applied to it. **Note:** You can call `.defaults()` on the wrapper that is returned from `request.defaults` to add/override defaults that were previously defaulted. 
For example:

```javascript
// requests using baseRequest() will set the 'x-token' header
var baseRequest = request.defaults({
  headers: {'x-token': 'my-token'}
})

// requests using specialRequest() will include the 'x-token' header set in
// baseRequest and will also include the 'special' header
var specialRequest = baseRequest.defaults({
  headers: {special: 'special value'}
})
```

### request.put

Same as `request()`, but defaults to `method: "PUT"`.

```javascript
request.put(url)
```

### request.patch

Same as `request()`, but defaults to `method: "PATCH"`.

```javascript
request.patch(url)
```

### request.post

Same as `request()`, but defaults to `method: "POST"`.

```javascript
request.post(url)
```

### request.head

Same as `request()`, but defaults to `method: "HEAD"`.

```javascript
request.head(url)
```

### request.del

Same as `request()`, but defaults to `method: "DELETE"`.

```javascript
request.del(url)
```

### request.get

Same as `request()` (for uniformity).

```javascript
request.get(url)
```

### request.cookie

Function that creates a new cookie.

```javascript
request.cookie('key1=value1')
```

### request.jar()

Function that creates a new cookie jar.

```javascript
request.jar()
```

## Examples:

```javascript
var request = require('request')
  , rand = Math.floor(Math.random()*100000000).toString()
  ;
request(
  { method: 'PUT'
  , uri: 'http://mikeal.iriscouch.com/testjs/' + rand
  , multipart:
    [ { 'content-type': 'application/json'
      , body: JSON.stringify({foo: 'bar', _attachments: {'message.txt': {follows: true, length: 18, 'content_type': 'text/plain' }}})
      }
    , { body: 'I am an attachment' }
    ]
  }
  , function (error, response, body) {
      if(response.statusCode == 201){
        console.log('document saved as: http://mikeal.iriscouch.com/testjs/'+ rand)
      } else {
        console.log('error: '+ response.statusCode)
        console.log(body)
      }
    }
  )
```

For backwards-compatibility, response compression is not supported by default. To accept gzip-compressed responses, set the `gzip` option to `true`. Note that the body data passed through `request` is automatically decompressed while the response object is unmodified and will contain compressed data if the server sent a compressed response.

```javascript
var request = require('request')
request(
  { method: 'GET'
  , uri: 'http://www.google.com'
  , gzip: true
  }
  , function (error, response, body) {
      // body is the decompressed response body
      console.log('server encoded the data as: ' + (response.headers['content-encoding'] || 'identity'))
      console.log('the decoded data is: ' + body)
    }
  ).on('data', function(data) {
    // decompressed data as it is received
    console.log('decoded chunk: ' + data)
  })
  .on('response', function(response) {
    // unmodified http.IncomingMessage object
    response.on('data', function(data) {
      // compressed data as it is received
      console.log('received ' + data.length + ' bytes of compressed data')
    })
  })
```

Cookies are disabled by default (else, they would be used in subsequent requests). To enable cookies, set `jar` to `true` (either in `defaults` or `options`) and install `tough-cookie`.
```javascript var request = request.defaults({jar: true}) request('http://www.google.com', function () { request('http://images.google.com') }) ``` To use a custom cookie jar (instead of `request`’s global cookie jar), set `jar` to an instance of `request.jar()` (either in `defaults` or `options`) ```javascript var j = request.jar() var request = request.defaults({jar:j}) request('http://www.google.com', function () { request('http://images.google.com') }) ``` OR ```javascript var j = request.jar(); var cookie = request.cookie('key1=value1'); var url = 'http://www.google.com'; j.setCookie(cookie, url); request({url: url, jar: j}, function () { request('http://images.google.com') }) ``` To use a custom cookie store (such as a [`FileCookieStore`](https://github.com/mitsuru/tough-cookie-filestore) which supports saving to and restoring from JSON files), pass it as a parameter to `request.jar()`: ```javascript var FileCookieStore = require('tough-cookie-filestore'); // NOTE - currently the 'cookies.json' file must already exist! var j = request.jar(new FileCookieStore('cookies.json')); request = request.defaults({ jar : j }) request('http://www.google.com', function() { request('http://images.google.com') }) ``` The cookie store must be a [`tough-cookie`](https://github.com/goinstant/tough-cookie) store and it must support synchronous operations; see the [`CookieStore` API docs](https://github.com/goinstant/tough-cookie/#cookiestore-api) for details. To inspect your cookie jar after a request: ```javascript var j = request.jar() request({url: 'http://www.google.com', jar: j}, function () { var cookie_string = j.getCookieString(uri); // "key1=value1; key2=value2; ..." var cookies = j.getCookies(uri); // [{key: 'key1', value: 'value1', domain: "www.google.com", ...}, ...] }) ``` ## Debugging There are at least three ways to debug the operation of `request`: 1. Launch the node process like `NODE_DEBUG=request node script.js` (`lib,request,otherlib` works too). 2. Set `require('request').debug = true` at any time (this does the same thing as #1). 3. Use the [request-debug module](https://github.com/nylen/request-debug) to view request and response headers and bodies. �����������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/release.sh�������������������������000755 �000766 �000024 �00000002356 12455173731 027034� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh if [ -z "`which github-changes`" ]; then # specify version because github-changes "is under heavy development. 
Things # may break between releases" until 0.1.0 echo "First, do: [sudo] npm install -g github-changes@0.0.14" exit 1 fi if [ -d .git/refs/remotes/upstream ]; then remote=upstream else remote=origin fi # Increment v2.x.y -> v2.x+1.0 npm version minor || exit 1 # Generate changelog from pull requests github-changes -o request -r request \ --auth --verbose \ --file CHANGELOG.md \ --only-pulls --use-commit-body \ --date-format '(YYYY/MM/DD)' \ || exit 1 # Since the tag for the new version hasn't been pushed yet, any changes in it # will be marked as "upcoming" version="$(grep '"version"' package.json | cut -d'"' -f4)" sed -i -e "s/^### upcoming/### v$version/" CHANGELOG.md # This may fail if no changelog updates # TODO: would this ever actually happen? handle it better? git add CHANGELOG.md; git commit -m 'Update changelog' # Publish the new version to npm npm publish || exit 1 # Increment v2.x.0 -> v2.x.1 # For rationale, see: # https://github.com/request/oauth-sign/issues/10#issuecomment-58917018 npm version patch || exit 1 # Push back to the main repo git push $remote master --tags || exit 1 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/request.js�������������������������000644 �000766 �000024 �00000142336 12455173731 027106� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var http = require('http') , https = require('https') , url = require('url') , util = require('util') , stream = require('stream') , qs = require('qs') , querystring = require('querystring') , zlib = require('zlib') , helpers = require('./lib/helpers') , bl = require('bl') , oauth = require('oauth-sign') , hawk = require('hawk') , aws = require('aws-sign2') , httpSignature = require('http-signature') , uuid = require('node-uuid') , mime = require('mime-types') , tunnel = require('tunnel-agent') , stringstream = require('stringstream') , caseless = require('caseless') , ForeverAgent = require('forever-agent') , FormData = require('form-data') , cookies = require('./lib/cookies') , copy = require('./lib/copy') , debug = require('./lib/debug') , net = require('net') , CombinedStream = require('combined-stream') var safeStringify = helpers.safeStringify , md5 = helpers.md5 , isReadStream = helpers.isReadStream , toBase64 = helpers.toBase64 , defer = helpers.defer , globalCookieJar = cookies.jar() var globalPool = {} , isUrl = /^https?:/ var defaultProxyHeaderWhiteList = [ 'accept', 'accept-charset', 'accept-encoding', 'accept-language', 'accept-ranges', 'cache-control', 'content-encoding', 'content-language', 'content-length', 'content-location', 'content-md5', 'content-range', 'content-type', 'connection', 'date', 'expect', 'max-forwards', 'pragma', 'referer', 'te', 'transfer-encoding', 'user-agent', 'via' ] var defaultProxyHeaderExclusiveList = [ 'proxy-authorization' ] function filterForNonReserved(reserved, options) { // Filter out properties that are not reserved. // Reserved values are passed in at call site. 
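// (Added note: despite the wording above, this helper keeps only the option keys that are
// NOT in `reserved`; in practice the caller passes Object.keys(Request.prototype), so option
// keys that would shadow Request's prototype methods are dropped here.)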
var object = {} for (var i in options) { var notReserved = (reserved.indexOf(i) === -1) if (notReserved) { object[i] = options[i] } } return object } function filterOutReservedFunctions(reserved, options) { // Filter out properties that are functions and are reserved. // Reserved values are passed in at call site. var object = {} for (var i in options) { var isReserved = !(reserved.indexOf(i) === -1) var isFunction = (typeof options[i] === 'function') if (!(isReserved && isFunction)) { object[i] = options[i] } } return object } function constructProxyHost(uriObject) { var port = uriObject.portA , protocol = uriObject.protocol , proxyHost = uriObject.hostname + ':' if (port) { proxyHost += port } else if (protocol === 'https:') { proxyHost += '443' } else { proxyHost += '80' } return proxyHost } function constructProxyHeaderWhiteList(headers, proxyHeaderWhiteList) { var whiteList = proxyHeaderWhiteList .reduce(function (set, header) { set[header.toLowerCase()] = true return set }, {}) return Object.keys(headers) .filter(function (header) { return whiteList[header.toLowerCase()] }) .reduce(function (set, header) { set[header] = headers[header] return set }, {}) } function construcTunnelOptions(request) { var proxy = request.proxy var tunnelOptions = { proxy: { host: proxy.hostname, port: +proxy.port, proxyAuth: proxy.auth, headers: request.proxyHeaders }, rejectUnauthorized: request.rejectUnauthorized, headers: request.headers, ca: request.ca, cert: request.cert, key: request.key } return tunnelOptions } function constructTunnelFnName(uri, proxy) { var uriProtocol = (uri.protocol === 'https:' ? 'https' : 'http') var proxyProtocol = (proxy.protocol === 'https:' ? 'Https' : 'Http') return [uriProtocol, proxyProtocol].join('Over') } function getTunnelFn(request) { var uri = request.uri var proxy = request.proxy var tunnelFnName = constructTunnelFnName(uri, proxy) return tunnel[tunnelFnName] } // Decide the proper request proxy to use based on the request URI object and the // environmental variables (NO_PROXY, HTTP_PROXY, etc.) function getProxyFromURI(uri) { // respect NO_PROXY environment variables (see: http://lynx.isc.org/current/breakout/lynx_help/keystrokes/environments.html) var noProxy = process.env.NO_PROXY || process.env.no_proxy || null // easy case first - if NO_PROXY is '*' if (noProxy === '*') { return null } // otherwise, parse the noProxy value to see if it applies to the URL if (noProxy !== null) { var noProxyItem, hostname, port, noProxyItemParts, noProxyHost, noProxyPort, noProxyList // canonicalize the hostname, so that 'oogle.com' won't match 'google.com' hostname = uri.hostname.replace(/^\.*/, '.').toLowerCase() noProxyList = noProxy.split(',') for (var i = 0, len = noProxyList.length; i < len; i++) { noProxyItem = noProxyList[i].trim().toLowerCase() // no_proxy can be granular at the port level, which complicates things a bit. if (noProxyItem.indexOf(':') > -1) { noProxyItemParts = noProxyItem.split(':', 2) noProxyHost = noProxyItemParts[0].replace(/^\.*/, '.') noProxyPort = noProxyItemParts[1] port = uri.port || (uri.protocol === 'https:' ? '443' : '80') // we've found a match - ports are same and host ends with no_proxy entry. 
if (port === noProxyPort && hostname.indexOf(noProxyHost) === hostname.length - noProxyHost.length) { return null } } else { noProxyItem = noProxyItem.replace(/^\.*/, '.') var isMatchedAt = hostname.indexOf(noProxyItem) if (isMatchedAt > -1 && isMatchedAt === hostname.length - noProxyItem.length) { return null } } } } // check for HTTP(S)_PROXY environment variables if (uri.protocol === 'http:') { return process.env.HTTP_PROXY || process.env.http_proxy || null } else if (uri.protocol === 'https:') { return process.env.HTTPS_PROXY || process.env.https_proxy || process.env.HTTP_PROXY || process.env.http_proxy || null } // return null if all else fails (What uri protocol are you using then?) return null } // Function for properly handling a connection error function connectionErrorHandler(error) { var socket = this if (socket.res) { if (socket.res.request) { socket.res.request.emit('error', error) } else { socket.res.emit('error', error) } } else { socket._httpMessage.emit('error', error) } } // Return a simpler request object to allow serialization function requestToJSON() { var self = this return { uri: self.uri, method: self.method, headers: self.headers } } // Return a simpler response object to allow serialization function responseToJSON() { var self = this return { statusCode: self.statusCode, body: self.body, headers: self.headers, request: requestToJSON.call(self.request) } } function Request (options) { // if tunnel property of options was not given default to false // if given the method property in options, set property explicitMethod to true // extend the Request instance with any non-reserved properties // remove any reserved functions from the options object // set Request instance to be readable and writable // call init var self = this stream.Stream.call(self) var reserved = Object.keys(Request.prototype) var nonReserved = filterForNonReserved(reserved, options) stream.Stream.call(self) util._extend(self, nonReserved) options = filterOutReservedFunctions(reserved, options) self.readable = true self.writable = true if (typeof options.tunnel === 'undefined') { options.tunnel = false } if (options.method) { self.explicitMethod = true } self.canTunnel = options.tunnel !== false && tunnel self.init(options) } util.inherits(Request, stream.Stream) Request.prototype.setupTunnel = function () { // Set up the tunneling agent if necessary // Only send the proxy whitelisted header names. // Turn on tunneling for the rest of request. 
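// Illustrative note (not part of the original source; the hostnames are hypothetical):
// for uri https://example.com/ behind proxy http://proxy.local:8080, setupTunnel()
// builds proxyHeaders from the whitelist above plus host 'example.com:443'
// (constructProxyHost), selects tunnel.httpsOverHttp (constructTunnelFnName), and
// replaces self.agent with an agent that opens a CONNECT tunnel through the proxy.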
var self = this if (typeof self.proxy === 'string') { self.proxy = url.parse(self.proxy) } if (!self.proxy) { return false } if (!self.tunnel && self.uri.protocol !== 'https:') { return false } // Always include `defaultProxyHeaderExclusiveList` if (!self.proxyHeaderExclusiveList) { self.proxyHeaderExclusiveList = [] } var proxyHeaderExclusiveList = self.proxyHeaderExclusiveList.concat(defaultProxyHeaderExclusiveList) // Treat `proxyHeaderExclusiveList` as part of `proxyHeaderWhiteList` if (!self.proxyHeaderWhiteList) { self.proxyHeaderWhiteList = defaultProxyHeaderWhiteList } var proxyHeaderWhiteList = self.proxyHeaderWhiteList.concat(proxyHeaderExclusiveList) var proxyHost = constructProxyHost(self.uri) self.proxyHeaders = constructProxyHeaderWhiteList(self.headers, proxyHeaderWhiteList) self.proxyHeaders.host = proxyHost proxyHeaderExclusiveList.forEach(self.removeHeader, self) var tunnelFn = getTunnelFn(self) var tunnelOptions = construcTunnelOptions(self) self.agent = tunnelFn(tunnelOptions) self.tunnel = true return true } Request.prototype.init = function (options) { // init() contains all the code to setup the request object. // the actual outgoing request is not started until start() is called // this function is called from both the constructor and on redirect. var self = this if (!options) { options = {} } self.headers = self.headers ? copy(self.headers) : {} caseless.httpify(self, self.headers) if (!self.method) { self.method = options.method || 'GET' } self.localAddress = options.localAddress if (!self.qsLib) { self.qsLib = (options.useQuerystring ? querystring : qs) } debug(options) if (!self.pool && self.pool !== false) { self.pool = globalPool } self.dests = self.dests || [] self.__isRequestRequest = true // Protect against double callback if (!self._callback && self.callback) { self._callback = self.callback self.callback = function () { if (self._callbackCalled) { return // Print a warning maybe? } self._callbackCalled = true self._callback.apply(self, arguments) } self.on('error', self.callback.bind()) self.on('complete', self.callback.bind(self, null)) } // People use this property instead all the time, so support it if (!self.uri && self.url) { self.uri = self.url delete self.url } // A URI is needed by this point, throw if we haven't been able to get one if (!self.uri) { return self.emit('error', new Error('options.uri is a required argument')) } // If a string URI/URL was given, parse it into a URL object if(typeof self.uri === 'string') { self.uri = url.parse(self.uri) } // DEPRECATED: Warning for users of the old Unix Sockets URL Scheme if (self.uri.protocol === 'unix:') { return self.emit('error', new Error('`unix://` URL scheme is no longer supported. 
Please use the format `http://unix:SOCKET:PATH`')) } // Support Unix Sockets if(self.uri.host === 'unix') { // Get the socket & request paths from the URL var unixParts = self.uri.path.split(':') , host = unixParts[0] , path = unixParts[1] // Apply unix properties to request self.socketPath = host self.uri.pathname = path self.uri.path = path self.uri.host = host self.uri.hostname = host self.uri.isUnix = true } if (self.strictSSL === false) { self.rejectUnauthorized = false } if(!self.hasOwnProperty('proxy')) { self.proxy = getProxyFromURI(self.uri) } // Pass in `tunnel:true` to *always* tunnel through proxies self.tunnel = !!options.tunnel if (self.proxy) { self.setupTunnel() } if (!self.uri.pathname) {self.uri.pathname = '/'} if (!(self.uri.host || (self.uri.hostname && self.uri.port)) && !self.uri.isUnix) { // Invalid URI: it may generate lot of bad errors, like 'TypeError: Cannot call method `indexOf` of undefined' in CookieJar // Detect and reject it as soon as possible var faultyUri = url.format(self.uri) var message = 'Invalid URI "' + faultyUri + '"' if (Object.keys(options).length === 0) { // No option ? This can be the sign of a redirect // As this is a case where the user cannot do anything (they didn't call request directly with this URL) // they should be warned that it can be caused by a redirection (can save some hair) message += '. This can be caused by a crappy redirection.' } // This error was fatal return self.emit('error', new Error(message)) } self._redirectsFollowed = self._redirectsFollowed || 0 self.maxRedirects = (self.maxRedirects !== undefined) ? self.maxRedirects : 10 self.allowRedirect = (typeof self.followRedirect === 'function') ? self.followRedirect : function(response) { return true } self.followRedirects = (self.followRedirect !== undefined) ? !!self.followRedirect : true self.followAllRedirects = (self.followAllRedirects !== undefined) ? 
self.followAllRedirects : false if (self.followRedirects || self.followAllRedirects) { self.redirects = self.redirects || [] } self.setHost = false if (!self.hasHeader('host')) { var hostHeaderName = self.originalHostHeaderName || 'host' self.setHeader(hostHeaderName, self.uri.hostname) if (self.uri.port) { if ( !(self.uri.port === 80 && self.uri.protocol === 'http:') && !(self.uri.port === 443 && self.uri.protocol === 'https:') ) { self.setHeader(hostHeaderName, self.getHeader('host') + (':' + self.uri.port) ) } } self.setHost = true } self.jar(self._jar || options.jar) if (!self.uri.port) { if (self.uri.protocol === 'http:') {self.uri.port = 80} else if (self.uri.protocol === 'https:') {self.uri.port = 443} } if (self.proxy && !self.tunnel) { self.port = self.proxy.port self.host = self.proxy.hostname } else { self.port = self.uri.port self.host = self.uri.hostname } if (options.form) { self.form(options.form) } if (options.formData) { var formData = options.formData var requestForm = self.form() var appendFormValue = function (key, value) { if (value.hasOwnProperty('value') && value.hasOwnProperty('options')) { requestForm.append(key, value.value, value.options) } else { requestForm.append(key, value) } } for (var formKey in formData) { if (formData.hasOwnProperty(formKey)) { var formValue = formData[formKey] if (formValue instanceof Array) { for (var j = 0; j < formValue.length; j++) { appendFormValue(formKey, formValue[j]) } } else { appendFormValue(formKey, formValue) } } } } if (options.qs) { self.qs(options.qs) } if (self.uri.path) { self.path = self.uri.path } else { self.path = self.uri.pathname + (self.uri.search || '') } if (self.path.length === 0) { self.path = '/' } // Auth must happen last in case signing is dependent on other headers if (options.oauth) { self.oauth(options.oauth) } if (options.aws) { self.aws(options.aws) } if (options.hawk) { self.hawk(options.hawk) } if (options.httpSignature) { self.httpSignature(options.httpSignature) } if (options.auth) { if (Object.prototype.hasOwnProperty.call(options.auth, 'username')) { options.auth.user = options.auth.username } if (Object.prototype.hasOwnProperty.call(options.auth, 'password')) { options.auth.pass = options.auth.password } self.auth( options.auth.user, options.auth.pass, options.auth.sendImmediately, options.auth.bearer ) } if (self.gzip && !self.hasHeader('accept-encoding')) { self.setHeader('accept-encoding', 'gzip') } if (self.uri.auth && !self.hasHeader('authorization')) { var uriAuthPieces = self.uri.auth.split(':').map(function(item){ return querystring.unescape(item) }) self.auth(uriAuthPieces[0], uriAuthPieces.slice(1).join(':'), true) } if (!self.tunnel && self.proxy && self.proxy.auth && !self.hasHeader('proxy-authorization')) { var proxyAuthPieces = self.proxy.auth.split(':').map(function(item){ return querystring.unescape(item) }) var authHeader = 'Basic ' + toBase64(proxyAuthPieces.join(':')) self.setHeader('proxy-authorization', authHeader) } if (self.proxy && !self.tunnel) { self.path = (self.uri.protocol + '//' + self.uri.host + self.path) } if (options.json) { self.json(options.json) } if (options.multipart) { self.boundary = uuid() self.multipart(options.multipart) } if (self.body) { var length = 0 if (!Buffer.isBuffer(self.body)) { if (Array.isArray(self.body)) { for (var i = 0; i < self.body.length; i++) { length += self.body[i].length } } else { self.body = new Buffer(self.body) length = self.body.length } } else { length = self.body.length } if (length) { if 
(!self.hasHeader('content-length')) { self.setHeader('content-length', length) } } else { throw new Error('Argument error, options.body.') } } var protocol = self.proxy && !self.tunnel ? self.proxy.protocol : self.uri.protocol , defaultModules = {'http:':http, 'https:':https} , httpModules = self.httpModules || {} self.httpModule = httpModules[protocol] || defaultModules[protocol] if (!self.httpModule) { return self.emit('error', new Error('Invalid protocol: ' + protocol)) } if (options.ca) { self.ca = options.ca } if (!self.agent) { if (options.agentOptions) { self.agentOptions = options.agentOptions } if (options.agentClass) { self.agentClass = options.agentClass } else if (options.forever) { self.agentClass = protocol === 'http:' ? ForeverAgent : ForeverAgent.SSL } else { self.agentClass = self.httpModule.Agent } } if (self.pool === false) { self.agent = false } else { self.agent = self.agent || self.getNewAgent() } self.on('pipe', function (src) { if (self.ntick && self._started) { throw new Error('You cannot pipe to this stream after the outbound request has started.') } self.src = src if (isReadStream(src)) { if (!self.hasHeader('content-type')) { self.setHeader('content-type', mime.lookup(src.path)) } } else { if (src.headers) { for (var i in src.headers) { if (!self.hasHeader(i)) { self.setHeader(i, src.headers[i]) } } } if (self._json && !self.hasHeader('content-type')) { self.setHeader('content-type', 'application/json') } if (src.method && !self.explicitMethod) { self.method = src.method } } // self.on('pipe', function () { // console.error('You have already piped to this stream. Pipeing twice is likely to break the request.') // }) }) defer(function () { if (self._aborted) { return } var end = function () { if (self._form) { self._form.pipe(self) } if (self._multipart) { self._multipart.pipe(self) } if (self.body) { if (Array.isArray(self.body)) { self.body.forEach(function (part) { self.write(part) }) } else { self.write(self.body) } self.end() } else if (self.requestBodyStream) { console.warn('options.requestBodyStream is deprecated, please pass the request object to stream.pipe.') self.requestBodyStream.pipe(self) } else if (!self.src) { if (self.method !== 'GET' && typeof self.method !== 'undefined') { self.setHeader('content-length', 0) } self.end() } } if (self._form && !self.hasHeader('content-length')) { // Before ending the request, we had to compute the length of the whole form, asyncly self.setHeader(self._form.getHeaders()) self._form.getLength(function (err, length) { if (!err) { self.setHeader('content-length', length) } end() }) } else { end() } self.ntick = true }) } // Must call this when following a redirect from https to http or vice versa // Attempts to keep everything as identical as possible, but update the // httpModule, Tunneling agent, and/or Forever Agent in use. Request.prototype._updateProtocol = function () { var self = this var protocol = self.uri.protocol if (protocol === 'https:' || self.tunnel) { // previously was doing http, now doing https // if it's https, then we might need to tunnel now. if (self.proxy) { if (self.setupTunnel()) { return } } self.httpModule = https switch (self.agentClass) { case ForeverAgent: self.agentClass = ForeverAgent.SSL break case http.Agent: self.agentClass = https.Agent break default: // nothing we can do. Just hope for the best. return } // if there's an agent, we need to get a new one. 
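// Illustrative note (not part of the original source; the URLs are hypothetical):
// following a redirect from http://example.com/ to https://example.com/ lands in this
// branch; the agentClass has just been swapped (http.Agent -> https.Agent, ForeverAgent
// -> ForeverAgent.SSL), and the check below replaces any existing agent so the retried
// request negotiates TLS.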
if (self.agent) { self.agent = self.getNewAgent() } } else { // previously was doing https, now doing http self.httpModule = http switch (self.agentClass) { case ForeverAgent.SSL: self.agentClass = ForeverAgent break case https.Agent: self.agentClass = http.Agent break default: // nothing we can do. just hope for the best return } // if there's an agent, then get a new one. if (self.agent) { self.agent = null self.agent = self.getNewAgent() } } } Request.prototype.getNewAgent = function () { var self = this var Agent = self.agentClass var options = {} if (self.agentOptions) { for (var i in self.agentOptions) { options[i] = self.agentOptions[i] } } if (self.ca) { options.ca = self.ca } if (self.ciphers) { options.ciphers = self.ciphers } if (self.secureProtocol) { options.secureProtocol = self.secureProtocol } if (self.secureOptions) { options.secureOptions = self.secureOptions } if (typeof self.rejectUnauthorized !== 'undefined') { options.rejectUnauthorized = self.rejectUnauthorized } if (self.cert && self.key) { options.key = self.key options.cert = self.cert } var poolKey = '' // different types of agents are in different pools if (Agent !== self.httpModule.Agent) { poolKey += Agent.name } // ca option is only relevant if proxy or destination are https var proxy = self.proxy if (typeof proxy === 'string') { proxy = url.parse(proxy) } var isHttps = (proxy && proxy.protocol === 'https:') || this.uri.protocol === 'https:' if (isHttps) { if (options.ca) { if (poolKey) { poolKey += ':' } poolKey += options.ca } if (typeof options.rejectUnauthorized !== 'undefined') { if (poolKey) { poolKey += ':' } poolKey += options.rejectUnauthorized } if (options.cert) { poolKey += options.cert.toString('ascii') + options.key.toString('ascii') } if (options.ciphers) { if (poolKey) { poolKey += ':' } poolKey += options.ciphers } if (options.secureProtocol) { if (poolKey) { poolKey += ':' } poolKey += options.secureProtocol } if (options.secureOptions) { if (poolKey) { poolKey += ':' } poolKey += options.secureOptions } } if (self.pool === globalPool && !poolKey && Object.keys(options).length === 0 && self.httpModule.globalAgent) { // not doing anything special. Use the globalAgent return self.httpModule.globalAgent } // we're using a stored agent. Make sure it's protocol-specific poolKey = self.uri.protocol + poolKey // generate a new agent for this setting if none yet exists if (!self.pool[poolKey]) { self.pool[poolKey] = new Agent(options) // properly set maxSockets on new agents if (self.pool.maxSockets) { self.pool[poolKey].maxSockets = self.pool.maxSockets } } return self.pool[poolKey] } Request.prototype.start = function () { // start() is called once we are ready to send the outgoing HTTP request. // this is usually called on the first write(), end() or on nextTick() var self = this if (self._aborted) { return } self._started = true self.method = self.method || 'GET' self.href = self.uri.href if (self.src && self.src.stat && self.src.stat.size && !self.hasHeader('content-length')) { self.setHeader('content-length', self.src.stat.size) } if (self._aws) { self.aws(self._aws, true) } // We have a method named auth, which is completely different from the http.request // auth option. If we don't remove it, we're gonna have a bad time. 
var reqOptions = copy(self) delete reqOptions.auth debug('make request', self.uri.href) self.req = self.httpModule.request(reqOptions) if (self.timeout && !self.timeoutTimer) { self.timeoutTimer = setTimeout(function () { self.abort() var e = new Error('ETIMEDOUT') e.code = 'ETIMEDOUT' self.emit('error', e) }, self.timeout) // Set additional timeout on socket - in case if remote // server freeze after sending headers if (self.req.setTimeout) { // only works on node 0.6+ self.req.setTimeout(self.timeout, function () { if (self.req) { self.req.abort() var e = new Error('ESOCKETTIMEDOUT') e.code = 'ESOCKETTIMEDOUT' self.emit('error', e) } }) } } self.req.on('response', self.onRequestResponse.bind(self)) self.req.on('error', self.onRequestError.bind(self)) self.req.on('drain', function() { self.emit('drain') }) self.req.on('socket', function(socket) { self.emit('socket', socket) }) self.on('end', function() { if ( self.req.connection ) { self.req.connection.removeListener('error', connectionErrorHandler) } }) self.emit('request', self.req) } Request.prototype.onRequestError = function (error) { var self = this if (self._aborted) { return } if (self.req && self.req._reusedSocket && error.code === 'ECONNRESET' && self.agent.addRequestNoreuse) { self.agent = { addRequest: self.agent.addRequestNoreuse.bind(self.agent) } self.start() self.req.end() return } if (self.timeout && self.timeoutTimer) { clearTimeout(self.timeoutTimer) self.timeoutTimer = null } self.emit('error', error) } Request.prototype.onRequestResponse = function (response) { var self = this debug('onRequestResponse', self.uri.href, response.statusCode, response.headers) response.on('end', function() { debug('response end', self.uri.href, response.statusCode, response.headers) }) // The check on response.connection is a workaround for browserify. if (response.connection && response.connection.listeners('error').indexOf(connectionErrorHandler) === -1) { response.connection.setMaxListeners(0) response.connection.once('error', connectionErrorHandler) } if (self._aborted) { debug('aborted', self.uri.href) response.resume() return } if (self._paused) { response.pause() } else if (response.resume) { // response.resume should be defined, but check anyway before calling. Workaround for browserify. response.resume() } self.response = response response.request = self response.toJSON = responseToJSON // XXX This is different on 0.10, because SSL is strict by default if (self.httpModule === https && self.strictSSL && (!response.hasOwnProperty('client') || !response.client.authorized)) { debug('strict ssl error', self.uri.href) var sslErr = response.hasOwnProperty('client') ? response.client.authorizationError : self.uri.href + ' does not support SSL' self.emit('error', new Error('SSL Error: ' + sslErr)) return } // Save the original host before any redirect (if it changes, we need to // remove any authorization headers). Also remember the case of the header // name because lots of broken servers expect Host instead of host and we // want the caller to be able to specify this. self.originalHost = self.getHeader('host') if (!self.originalHostHeaderName) { self.originalHostHeaderName = self.hasHeader('host') } if (self.setHost) { self.removeHeader('host') } if (self.timeout && self.timeoutTimer) { clearTimeout(self.timeoutTimer) self.timeoutTimer = null } var targetCookieJar = (self._jar && self._jar.setCookie) ? self._jar : globalCookieJar var addCookie = function (cookie) { //set the cookie if it's domain in the href's domain. 
try { targetCookieJar.setCookie(cookie, self.uri.href, {ignoreError: true}) } catch (e) { self.emit('error', e) } } response.caseless = caseless(response.headers) if (response.caseless.has('set-cookie') && (!self._disableCookies)) { var headerName = response.caseless.has('set-cookie') if (Array.isArray(response.headers[headerName])) { response.headers[headerName].forEach(addCookie) } else { addCookie(response.headers[headerName]) } } var redirectTo = null if (response.statusCode >= 300 && response.statusCode < 400 && response.caseless.has('location')) { var location = response.caseless.get('location') debug('redirect', location) if (self.followAllRedirects) { redirectTo = location } else if (self.followRedirects) { switch (self.method) { case 'PATCH': case 'PUT': case 'POST': case 'DELETE': // Do not follow redirects break default: redirectTo = location break } } } else if (response.statusCode === 401 && self._hasAuth && !self._sentAuth) { var authHeader = response.caseless.get('www-authenticate') var authVerb = authHeader && authHeader.split(' ')[0].toLowerCase() debug('reauth', authVerb) switch (authVerb) { case 'basic': self.auth(self._user, self._pass, true) redirectTo = self.uri break case 'bearer': self.auth(null, null, true, self._bearer) redirectTo = self.uri break case 'digest': // TODO: More complete implementation of RFC 2617. // - check challenge.algorithm // - support algorithm="MD5-sess" // - handle challenge.domain // - support qop="auth-int" only // - handle Authentication-Info (not necessarily?) // - check challenge.stale (not necessarily?) // - increase nc (not necessarily?) // For reference: // http://tools.ietf.org/html/rfc2617#section-3 // https://github.com/bagder/curl/blob/master/lib/http_digest.c var challenge = {} var re = /([a-z0-9_-]+)=(?:"([^"]+)"|([a-z0-9_-]+))/gi for (;;) { var match = re.exec(authHeader) if (!match) { break } challenge[match[1]] = match[2] || match[3] } var ha1 = md5(self._user + ':' + challenge.realm + ':' + self._pass) var ha2 = md5(self.method + ':' + self.uri.path) var qop = /(^|,)\s*auth\s*($|,)/.test(challenge.qop) && 'auth' var nc = qop && '00000001' var cnonce = qop && uuid().replace(/-/g, '') var digestResponse = qop ? md5(ha1 + ':' + challenge.nonce + ':' + nc + ':' + cnonce + ':' + qop + ':' + ha2) : md5(ha1 + ':' + challenge.nonce + ':' + ha2) var authValues = { username: self._user, realm: challenge.realm, nonce: challenge.nonce, uri: self.uri.path, qop: qop, response: digestResponse, nc: nc, cnonce: cnonce, algorithm: challenge.algorithm, opaque: challenge.opaque } authHeader = [] for (var k in authValues) { if (authValues[k]) { if (k === 'qop' || k === 'nc' || k === 'algorithm') { authHeader.push(k + '=' + authValues[k]) } else { authHeader.push(k + '="' + authValues[k] + '"') } } } authHeader = 'Digest ' + authHeader.join(', ') self.setHeader('authorization', authHeader) self._sentAuth = true redirectTo = self.uri break } } if (redirectTo && self.allowRedirect.call(self, response)) { debug('redirect to', redirectTo) // ignore any potential response body. it cannot possibly be useful // to us at this point. if (self._paused) { response.resume() } if (self._redirectsFollowed >= self.maxRedirects) { self.emit('error', new Error('Exceeded maxRedirects. 
Probably stuck in a redirect loop ' + self.uri.href)) return } self._redirectsFollowed += 1 if (!isUrl.test(redirectTo)) { redirectTo = url.resolve(self.uri.href, redirectTo) } var uriPrev = self.uri self.uri = url.parse(redirectTo) // handle the case where we change protocol from https to http or vice versa if (self.uri.protocol !== uriPrev.protocol) { self._updateProtocol() } self.redirects.push( { statusCode : response.statusCode , redirectUri: redirectTo } ) if (self.followAllRedirects && response.statusCode !== 401 && response.statusCode !== 307) { self.method = 'GET' } // self.method = 'GET' // Force all redirects to use GET || commented out fixes #215 delete self.src delete self.req delete self.agent delete self._started if (response.statusCode !== 401 && response.statusCode !== 307) { // Remove parameters from the previous response, unless this is the second request // for a server that requires digest authentication. delete self.body delete self._form if (self.headers) { self.removeHeader('host') self.removeHeader('content-type') self.removeHeader('content-length') if (self.uri.hostname !== self.originalHost.split(':')[0]) { // Remove authorization if changing hostnames (but not if just // changing ports or protocols). This matches the behavior of curl: // https://github.com/bagder/curl/blob/6beb0eee/lib/http.c#L710 self.removeHeader('authorization') } } } self.emit('redirect') self.init() return // Ignore the rest of the response } else { self._redirectsFollowed = self._redirectsFollowed || 0 // Be a good stream and emit end when the response is finished. // Hack to emit end on close because of a core bug that never fires end response.on('close', function () { if (!self._ended) { self.response.emit('end') } }) response.on('end', function () { self._ended = true }) var dataStream if (self.gzip) { var contentEncoding = response.headers['content-encoding'] || 'identity' contentEncoding = contentEncoding.trim().toLowerCase() if (contentEncoding === 'gzip') { dataStream = zlib.createGunzip() response.pipe(dataStream) } else { // Since previous versions didn't check for Content-Encoding header, // ignore any invalid values to preserve backwards-compatibility if (contentEncoding !== 'identity') { debug('ignoring unrecognized Content-Encoding ' + contentEncoding) } dataStream = response } } else { dataStream = response } if (self.encoding) { if (self.dests.length !== 0) { console.error('Ignoring encoding parameter as this stream is being piped to another stream which makes the encoding option invalid.') } else if (dataStream.setEncoding) { dataStream.setEncoding(self.encoding) } else { // Should only occur on node pre-v0.9.4 (joyent/node@9b5abe5) with // zlib streams. // If/When support for 0.9.4 is dropped, this should be unnecessary. 
dataStream = dataStream.pipe(stringstream(self.encoding)) } } self.emit('response', response) self.dests.forEach(function (dest) { self.pipeDest(dest) }) dataStream.on('data', function (chunk) { self._destdata = true self.emit('data', chunk) }) dataStream.on('end', function (chunk) { self.emit('end', chunk) }) dataStream.on('error', function (error) { self.emit('error', error) }) dataStream.on('close', function () {self.emit('close')}) if (self.callback) { var buffer = bl() , strings = [] self.on('data', function (chunk) { if (Buffer.isBuffer(chunk)) { buffer.append(chunk) } else { strings.push(chunk) } }) self.on('end', function () { debug('end event', self.uri.href) if (self._aborted) { debug('aborted', self.uri.href) return } if (buffer.length) { debug('has body', self.uri.href, buffer.length) if (self.encoding === null) { // response.body = buffer // can't move to this until https://github.com/rvagg/bl/issues/13 response.body = buffer.slice() } else { response.body = buffer.toString(self.encoding) } } else if (strings.length) { // The UTF8 BOM [0xEF,0xBB,0xBF] is converted to [0xFE,0xFF] in the JS UTC16/UCS2 representation. // Strip this value out when the encoding is set to 'utf8', as upstream consumers won't expect it and it breaks JSON.parse(). if (self.encoding === 'utf8' && strings[0].length > 0 && strings[0][0] === '\uFEFF') { strings[0] = strings[0].substring(1) } response.body = strings.join('') } if (self._json) { try { response.body = JSON.parse(response.body, self._jsonReviver) } catch (e) {} } debug('emitting complete', self.uri.href) if(typeof response.body === 'undefined' && !self._json) { response.body = '' } self.emit('complete', response, response.body) }) } //if no callback else{ self.on('end', function () { if (self._aborted) { debug('aborted', self.uri.href) return } self.emit('complete', response) }) } } debug('finish init function', self.uri.href) } Request.prototype.abort = function () { var self = this self._aborted = true if (self.req) { self.req.abort() } else if (self.response) { self.response.abort() } self.emit('abort') } Request.prototype.pipeDest = function (dest) { var self = this var response = self.response // Called after the response is received if (dest.headers && !dest.headersSent) { if (response.caseless.has('content-type')) { var ctname = response.caseless.has('content-type') if (dest.setHeader) { dest.setHeader(ctname, response.headers[ctname]) } else { dest.headers[ctname] = response.headers[ctname] } } if (response.caseless.has('content-length')) { var clname = response.caseless.has('content-length') if (dest.setHeader) { dest.setHeader(clname, response.headers[clname]) } else { dest.headers[clname] = response.headers[clname] } } } if (dest.setHeader && !dest.headersSent) { for (var i in response.headers) { // If the response content is being decoded, the Content-Encoding header // of the response doesn't represent the piped content, so don't pass it. if (!self.gzip || i !== 'content-encoding') { dest.setHeader(i, response.headers[i]) } } dest.statusCode = response.statusCode } if (self.pipefilter) { self.pipefilter(response, dest) } } Request.prototype.qs = function (q, clobber) { var self = this var base if (!clobber && self.uri.query) { base = self.qsLib.parse(self.uri.query) } else { base = {} } for (var i in q) { base[i] = q[i] } if (self.qsLib.stringify(base) === ''){ return self } self.uri = url.parse(self.uri.href.split('?')[0] + '?' 
+ self.qsLib.stringify(base)) self.url = self.uri self.path = self.uri.path return self } Request.prototype.form = function (form) { var self = this if (form) { self.setHeader('content-type', 'application/x-www-form-urlencoded') self.body = (typeof form === 'string') ? form.toString('utf8') : self.qsLib.stringify(form).toString('utf8') return self } // create form-data object self._form = new FormData() return self._form } Request.prototype.multipart = function (multipart) { var self = this var chunked = (multipart instanceof Array) || (multipart.chunked === undefined) || multipart.chunked multipart = multipart.data || multipart var items = chunked ? new CombinedStream() : [] function add (part) { return chunked ? items.append(part) : items.push(new Buffer(part)) } if (chunked) { self.setHeader('transfer-encoding', 'chunked') } var headerName = self.hasHeader('content-type') if (!headerName || self.headers[headerName].indexOf('multipart') === -1) { self.setHeader('content-type', 'multipart/related; boundary=' + self.boundary) } else { self.setHeader(headerName, self.headers[headerName].split(';')[0] + '; boundary=' + self.boundary) } if (!multipart.forEach) { throw new Error('Argument error, options.multipart.') } if (self.preambleCRLF) { add('\r\n') } multipart.forEach(function (part) { var body = part.body if(typeof body === 'undefined') { throw new Error('Body attribute missing in multipart.') } var preamble = '--' + self.boundary + '\r\n' Object.keys(part).forEach(function (key) { if (key === 'body') { return } preamble += key + ': ' + part[key] + '\r\n' }) preamble += '\r\n' add(preamble) add(body) add('\r\n') }) add('--' + self.boundary + '--') if (self.postambleCRLF) { add('\r\n') } self[chunked ? '_multipart' : 'body'] = items return self } Request.prototype.json = function (val) { var self = this if (!self.hasHeader('accept')) { self.setHeader('accept', 'application/json') } self._json = true if (typeof val === 'boolean') { if (self.body !== undefined && self.getHeader('content-type') !== 'application/x-www-form-urlencoded') { self.body = safeStringify(self.body) if (!self.hasHeader('content-type')) { self.setHeader('content-type', 'application/json') } } } else { self.body = safeStringify(val) if (!self.hasHeader('content-type')) { self.setHeader('content-type', 'application/json') } } if (typeof self.jsonReviver === 'function') { self._jsonReviver = self.jsonReviver } return self } Request.prototype.getHeader = function (name, headers) { var self = this var result, re, match if (!headers) { headers = self.headers } Object.keys(headers).forEach(function (key) { if (key.length !== name.length) { return } re = new RegExp(name, 'i') match = key.match(re) if (match) { result = headers[key] } }) return result } var getHeader = Request.prototype.getHeader Request.prototype.auth = function (user, pass, sendImmediately, bearer) { var self = this if (bearer !== undefined) { self._bearer = bearer self._hasAuth = true if (sendImmediately || typeof sendImmediately === 'undefined') { if (typeof bearer === 'function') { bearer = bearer() } self.setHeader('authorization', 'Bearer ' + bearer) self._sentAuth = true } return self } if (typeof user !== 'string' || (pass !== undefined && typeof pass !== 'string')) { throw new Error('auth() received invalid user or password') } self._user = user self._pass = pass self._hasAuth = true var header = typeof pass !== 'undefined' ? 
user + ':' + pass : user if (sendImmediately || typeof sendImmediately === 'undefined') { self.setHeader('authorization', 'Basic ' + toBase64(header)) self._sentAuth = true } return self } Request.prototype.aws = function (opts, now) { var self = this if (!now) { self._aws = opts return self } var date = new Date() self.setHeader('date', date.toUTCString()) var auth = { key: opts.key , secret: opts.secret , verb: self.method.toUpperCase() , date: date , contentType: self.getHeader('content-type') || '' , md5: self.getHeader('content-md5') || '' , amazonHeaders: aws.canonicalizeHeaders(self.headers) } var path = self.uri.path if (opts.bucket && path) { auth.resource = '/' + opts.bucket + path } else if (opts.bucket && !path) { auth.resource = '/' + opts.bucket } else if (!opts.bucket && path) { auth.resource = path } else if (!opts.bucket && !path) { auth.resource = '/' } auth.resource = aws.canonicalizeResource(auth.resource) self.setHeader('authorization', aws.authorization(auth)) return self } Request.prototype.httpSignature = function (opts) { var self = this httpSignature.signRequest({ getHeader: function(header) { return getHeader(header, self.headers) }, setHeader: function(header, value) { self.setHeader(header, value) }, method: self.method, path: self.path }, opts) debug('httpSignature authorization', self.getHeader('authorization')) return self } Request.prototype.hawk = function (opts) { var self = this self.setHeader('Authorization', hawk.client.header(self.uri, self.method, opts).field) } Request.prototype.oauth = function (_oauth) { var self = this var form, query if (self.hasHeader('content-type') && self.getHeader('content-type').slice(0, 'application/x-www-form-urlencoded'.length) === 'application/x-www-form-urlencoded' ) { form = self.body } if (self.uri.query) { query = self.uri.query } var oa = {} for (var i in _oauth) { oa['oauth_' + i] = _oauth[i] } if ('oauth_realm' in oa) { delete oa.oauth_realm } if (!oa.oauth_version) { oa.oauth_version = '1.0' } if (!oa.oauth_timestamp) { oa.oauth_timestamp = Math.floor( Date.now() / 1000 ).toString() } if (!oa.oauth_nonce) { oa.oauth_nonce = uuid().replace(/-/g, '') } if (!oa.oauth_signature_method) { oa.oauth_signature_method = 'HMAC-SHA1' } var consumer_secret_or_private_key = oa.oauth_consumer_secret || oa.oauth_private_key delete oa.oauth_consumer_secret delete oa.oauth_private_key var token_secret = oa.oauth_token_secret delete oa.oauth_token_secret var baseurl = self.uri.protocol + '//' + self.uri.host + self.uri.pathname var params = self.qsLib.parse([].concat(query, form, self.qsLib.stringify(oa)).join('&')) var signature = oauth.sign( oa.oauth_signature_method, self.method, baseurl, params, consumer_secret_or_private_key, token_secret) var realm = _oauth.realm ? 'realm="' + _oauth.realm + '",' : '' var authHeader = 'OAuth ' + realm + Object.keys(oa).sort().map(function (i) {return i + '="' + oauth.rfc3986(oa[i]) + '"'}).join(',') authHeader += ',oauth_signature="' + oauth.rfc3986(signature) + '"' self.setHeader('Authorization', authHeader) return self } Request.prototype.jar = function (jar) { var self = this var cookies if (self._redirectsFollowed === 0) { self.originalCookieHeader = self.getHeader('cookie') } if (!jar) { // disable cookies cookies = false self._disableCookies = true } else { var targetCookieJar = (jar && jar.getCookieString) ? 
jar : globalCookieJar var urihref = self.uri.href // fetch cookies for the specified host if (targetCookieJar) { cookies = targetCookieJar.getCookieString(urihref) } } // if cookies are needed and the cookie string is not empty if (cookies && cookies.length) { if (self.originalCookieHeader) { // Don't overwrite existing Cookie header self.setHeader('cookie', self.originalCookieHeader + '; ' + cookies) } else { self.setHeader('cookie', cookies) } } self._jar = jar return self } // Stream API Request.prototype.pipe = function (dest, opts) { var self = this if (self.response) { if (self._destdata) { throw new Error('You cannot pipe after data has been emitted from the response.') } else if (self._ended) { throw new Error('You cannot pipe after the response has been ended.') } else { stream.Stream.prototype.pipe.call(self, dest, opts) self.pipeDest(dest) return dest } } else { self.dests.push(dest) stream.Stream.prototype.pipe.call(self, dest, opts) return dest } } Request.prototype.write = function () { var self = this if (!self._started) { self.start() } return self.req.write.apply(self.req, arguments) } Request.prototype.end = function (chunk) { var self = this if (chunk) { self.write(chunk) } if (!self._started) { self.start() } self.req.end() } Request.prototype.pause = function () { var self = this if (!self.response) { self._paused = true } else { self.response.pause.apply(self.response, arguments) } } Request.prototype.resume = function () { var self = this if (!self.response) { self._paused = false } else { self.response.resume.apply(self.response, arguments) } } Request.prototype.destroy = function () { var self = this if (!self._ended) { self.end() } else if (self.response) { self.response.destroy() } } Request.defaultProxyHeaderWhiteList = defaultProxyHeaderWhiteList.slice() Request.defaultProxyHeaderExclusiveList = defaultProxyHeaderExclusiveList.slice() // Exports Request.prototype.toJSON = requestToJSON module.exports = Request
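The file above exports the Request constructor directly. As a rough usage sketch (illustrative only: the URL is hypothetical, and in the published package this constructor is normally reached through the request() helper in index.js rather than required directly):

  var Request = require('./request')

  var req = new Request({
    uri: 'http://example.com/',   // parsed with url.parse() inside init()
    method: 'GET',
    gzip: true,                   // sets Accept-Encoding: gzip and gunzips the response body
    callback: function (err, response, body) {
      if (err) { return console.error(err) }
      console.log(response.statusCode, body.length)
    }
  })

  // Request is a classic duplex Stream, so the response can also be piped:
  // req.pipe(require('fs').createWriteStream('example.html'))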
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/.jshintrc

{ "node": true, "asi": true, "laxcomma": true }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/index.js

'use strict' var net = require('net') , tls = require('tls') , http = require('http') , https = require('https') , events = require('events') , assert = require('assert') , util = require('util') ; exports.httpOverHttp = httpOverHttp exports.httpsOverHttp = httpsOverHttp exports.httpOverHttps = httpOverHttps exports.httpsOverHttps = httpsOverHttps function httpOverHttp(options) { var agent = new TunnelingAgent(options) agent.request = http.request return agent } function httpsOverHttp(options) { var agent = new TunnelingAgent(options) agent.request = http.request agent.createSocket = createSecureSocket return agent } function httpOverHttps(options) { var agent = new TunnelingAgent(options) agent.request = https.request return agent } function httpsOverHttps(options) { var agent = new TunnelingAgent(options) agent.request = https.request agent.createSocket = createSecureSocket return agent } function TunnelingAgent(options) { var self = this self.options = options || {} self.proxyOptions = self.options.proxy || {} self.maxSockets = self.options.maxSockets || http.Agent.defaultMaxSockets self.requests = [] self.sockets = [] self.on('free', function onFree(socket, host, port) { for (var i = 0, len = self.requests.length; i < len; ++i) { var pending = self.requests[i] if (pending.host === host && pending.port === port) { // Detect the request to connect same origin server, // reuse the connection.
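// Illustrative note (not part of the original source; the values are hypothetical):
// if a queued entry looks like { host: 'example.com', port: 443, request: req2 },
// a socket freed by a finished request to the same host:port is handed to req2
// via pending.request.onSocket(socket) below, instead of opening a new CONNECT
// tunnel through the proxy.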
self.requests.splice(i, 1) pending.request.onSocket(socket) return } } socket.destroy() self.removeSocket(socket) }) } util.inherits(TunnelingAgent, events.EventEmitter) TunnelingAgent.prototype.addRequest = function addRequest(req, options) { var self = this // Legacy API: addRequest(req, host, port, path) if (typeof options === 'string') { options = { host: options, port: arguments[2], path: arguments[3] }; } if (self.sockets.length >= this.maxSockets) { // We are over limit so we'll add it to the queue. self.requests.push({host: options.host, port: options.port, request: req}) return } // If we are under maxSockets create a new one. self.createSocket({host: options.host, port: options.port, request: req}, function(socket) { socket.on('free', onFree) socket.on('close', onCloseOrRemove) socket.on('agentRemove', onCloseOrRemove) req.onSocket(socket) function onFree() { self.emit('free', socket, options.host, options.port) } function onCloseOrRemove(err) { self.removeSocket(socket) socket.removeListener('free', onFree) socket.removeListener('close', onCloseOrRemove) socket.removeListener('agentRemove', onCloseOrRemove) } }) } TunnelingAgent.prototype.createSocket = function createSocket(options, cb) { var self = this var placeholder = {} self.sockets.push(placeholder) var connectOptions = mergeOptions({}, self.proxyOptions, { method: 'CONNECT' , path: options.host + ':' + options.port , agent: false } ) if (connectOptions.proxyAuth) { connectOptions.headers = connectOptions.headers || {} connectOptions.headers['Proxy-Authorization'] = 'Basic ' + new Buffer(connectOptions.proxyAuth).toString('base64') } debug('making CONNECT request') var connectReq = self.request(connectOptions) connectReq.useChunkedEncodingByDefault = false // for v0.6 connectReq.once('response', onResponse) // for v0.6 connectReq.once('upgrade', onUpgrade) // for v0.6 connectReq.once('connect', onConnect) // for v0.7 or later connectReq.once('error', onError) connectReq.end() function onResponse(res) { // Very hacky. This is necessary to avoid http-parser leaks. res.upgrade = true } function onUpgrade(res, socket, head) { // Hacky. process.nextTick(function() { onConnect(res, socket, head) }) } function onConnect(res, socket, head) { connectReq.removeAllListeners() socket.removeAllListeners() if (res.statusCode === 200) { assert.equal(head.length, 0) debug('tunneling connection has been established') self.sockets[self.sockets.indexOf(placeholder)] = socket cb(socket) } else { debug('tunneling socket could not be established, statusCode=%d', res.statusCode) var error = new Error('tunneling socket could not be established, ' + 'statusCode=' + res.statusCode) error.code = 'ECONNRESET' options.request.emit('error', error) self.removeSocket(placeholder) } } function onError(cause) { connectReq.removeAllListeners() debug('tunneling socket could not be established, cause=%s\n', cause.message, cause.stack) var error = new Error('tunneling socket could not be established, ' + 'cause=' + cause.message) error.code = 'ECONNRESET' options.request.emit('error', error) self.removeSocket(placeholder) } } TunnelingAgent.prototype.removeSocket = function removeSocket(socket) { var pos = this.sockets.indexOf(socket) if (pos === -1) return this.sockets.splice(pos, 1) var pending = this.requests.shift() if (pending) { // If we have pending requests and a socket gets closed a new one // needs to be created to take over in the pool for the one that closed.
this.createSocket(pending, function(socket) { pending.request.onSocket(socket) }) } } function createSecureSocket(options, cb) { var self = this TunnelingAgent.prototype.createSocket.call(self, options, function(socket) { // 0 is dummy port for v0.6 var secureSocket = tls.connect(0, mergeOptions({}, self.options, { servername: options.host , socket: socket } )) cb(secureSocket) }) } function mergeOptions(target) { for (var i = 1, len = arguments.length; i < len; ++i) { var overrides = arguments[i] if (typeof overrides === 'object') { var keys = Object.keys(overrides) for (var j = 0, keyLen = keys.length; j < keyLen; ++j) { var k = keys[j] if (overrides[k] !== undefined) { target[k] = overrides[k] } } } } return target } var debug if (process.env.NODE_DEBUG && /\btunnel\b/.test(process.env.NODE_DEBUG)) { debug = function() { var args = Array.prototype.slice.call(arguments) if (typeof args[0] === 'string') { args[0] = 'TUNNEL: ' + args[0] } else { args.unshift('TUNNEL:') } console.error.apply(console, args) } } else { debug = function() {} } exports.debug = debug // for test

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/LICENSE

Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship.
For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS����������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/package.json��������������������000644 �000766 �000024 �00000002410 12455173731 034331� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "tunnel-agent", "description": "HTTP proxy tunneling agent. 
Formerly part of mikeal/request, now a standalone module.",
  "version": "0.4.0",
  "repository": {
    "url": "https://github.com/mikeal/tunnel-agent"
  },
  "main": "index.js",
  "dependencies": {},
  "devDependencies": {},
  "optionalDependencies": {},
  "engines": {
    "node": "*"
  },
  "bugs": {
    "url": "https://github.com/mikeal/tunnel-agent/issues"
  },
  "homepage": "https://github.com/mikeal/tunnel-agent",
  "_id": "tunnel-agent@0.4.0",
  "dist": {
    "shasum": "b1184e312ffbcf70b3b4c78e8c219de7ebb1c550",
    "tarball": "http://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.4.0.tgz"
  },
  "_from": "tunnel-agent@>=0.4.0 <0.5.0",
  "_npmVersion": "1.3.21",
  "_npmUser": {
    "name": "mikeal",
    "email": "mikeal.rogers@gmail.com"
  },
  "maintainers": [
    {
      "name": "mikeal",
      "email": "mikeal.rogers@gmail.com"
    }
  ],
  "directories": {},
  "_shasum": "b1184e312ffbcf70b3b4c78e8c219de7ebb1c550",
  "_resolved": "https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.4.0.tgz",
  "readme": "ERROR: No README data found!",
  "scripts": {}
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tunnel-agent/README.md000644 000766 000024 00000000161 12455173731 033402 0ustar00iojsstaff000000 000000 tunnel-agent
============

HTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.
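The bundled README stops at that one-line description, so here is a minimal usage sketch only, not part of the upstream docs. It assumes the `httpsOverHttp()` export and the `proxy` options shape defined near the top of this module's `index.js`; the proxy host, port, and credentials below are placeholders. The agent issues the `CONNECT` request shown in `TunnelingAgent.prototype.createSocket` above and, for HTTPS targets, wraps the tunneled socket with TLS via `createSecureSocket`.

``` javascript
// Illustrative sketch; proxy.example.com, 3128 and 'user:password' are placeholders.
var https = require('https')
var tunnel = require('tunnel-agent')

// Agent that CONNECTs through an HTTP proxy, then runs TLS over the tunnel.
var agent = tunnel.httpsOverHttp({
  proxy: {
    host: 'proxy.example.com', // placeholder proxy host
    port: 3128,                // placeholder proxy port
    proxyAuth: 'user:password' // optional; sent as a Proxy-Authorization: Basic header
  }
})

https.request({
  host: 'iojs.org',
  port: 443,
  path: '/',
  agent: agent                 // hand the request to the tunneling agent
}, function (res) {
  console.log('status:', res.statusCode)
}).end()
```

This mirrors how the parent `request` module wires a tunneling agent into `http.request`/`https.request`: any object with an `addRequest(req, options)` method can be passed as the `agent` option.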
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/.jshintrc000644 �000766 �000024 �00000003111 12455173731 033742� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "passfail" : false, "maxerr" : 100, "browser" : false, "node" : true, "rhino" : false, "couch" : false, "wsh" : false, "jquery" : false, "prototypejs" : false, "mootools" : false, "dojo" : false, "debug" : false, "devel" : false, "esnext" : true, "strict" : true, "globalstrict" : true, "asi" : false, "laxbreak" : false, "bitwise" : true, "boss" : false, "curly" : true, "eqeqeq" : false, "eqnull" : true, "evil" : false, "expr" : false, "forin" : false, "immed" : true, "lastsemic" : true, "latedef" : false, "loopfunc" : false, "noarg" : true, "regexp" : false, "regexdash" : false, "scripturl" : false, "shadow" : false, "supernew" : false, "undef" : true, "unused" : true, "newcap" : true, "noempty" : true, "nonew" : true, "nomen" : false, "onevar" : false, "onecase" : true, "plusplus" : false, "proto" : false, "sub" : true, "trailing" : true, "white" : false, "predef": [ "describe", "it", "before", "beforeEach", "after", "afterEach", "expect", "setTimeout", "clearTimeout" ], "maxlen": 0 } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/.npmignore����������������������000644 �000766 �000024 �00000000050 12455173731 034034� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node_modules/ .*.sw[nmop] npm-debug.log ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/.travis.yml���������������������000644 �000766 �000024 �00000000157 12455173731 034156� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.10" - "0.11" matrix: fast_finish: true allow_failures: - node_js: 0.11 �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/generate-pubsuffix.js�����������000644 �000766 �000024 �00000021116 12455173731 036204� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* * Copyright GoInstant, Inc. and other contributors. All rights reserved. * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ 'use strict'; var fs = require('fs'); var assert = require('assert'); var punycode = require('punycode'); fs.readFile('./public-suffix.txt', 'utf8', function(err,string) { if (err) { throw err; } var lines = string.split("\n"); process.nextTick(function() { processList(lines); }); }); var index = {}; var COMMENT = new RegExp('//.+'); function processList(lines) { while (lines.length) { var line = lines.shift(); line = line.replace(COMMENT,'').trim(); if (!line) { continue; } addToIndex(index,line); } pubSufTest(); var w = fs.createWriteStream('./lib/pubsuffix.js',{ flags: 'w', encoding: 'utf8', mode: parseInt('644',8) }); w.on('end', process.exit); w.write("/****************************************************\n"); w.write(" * AUTOMATICALLY GENERATED by generate-pubsuffix.js *\n"); w.write(" * DO NOT EDIT! 
*\n"); w.write(" ****************************************************/\n\n"); w.write("module.exports.getPublicSuffix = "); w.write(getPublicSuffix.toString()); w.write(";\n\n"); w.write("// The following generated structure is used under the MPL version 1.1\n"); w.write("// See public-suffix.txt for more information\n\n"); w.write("var index = module.exports.index = Object.freeze(\n"); w.write(JSON.stringify(index)); w.write(");\n\n"); w.write("// END of automatically generated file\n"); w.end(); } function addToIndex(index,line) { var prefix = ''; if (line.replace(/^(!|\*\.)/)) { prefix = RegExp.$1; line = line.slice(prefix.length); } line = prefix + punycode.toASCII(line); if (line.substr(0,1) == '!') { index[line.substr(1)] = false; } else { index[line] = true; } } // include the licence in the function since it gets written to pubsuffix.js function getPublicSuffix(domain) { /* * Copyright GoInstant, Inc. and other contributors. All rights reserved. * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ if (!domain) { return null; } if (domain.match(/^\./)) { return null; } domain = domain.toLowerCase(); var parts = domain.split('.').reverse(); var suffix = ''; var suffixLen = 0; for (var i=0; i<parts.length; i++) { var part = parts[i]; var starstr = '*'+suffix; var partstr = part+suffix; if (index[starstr]) { // star rule matches suffixLen = i+1; if (index[partstr] === false) { // exception rule matches (NB: false, not undefined) suffixLen--; } } else if (index[partstr]) { // exact match, not exception suffixLen = i+1; } suffix = '.'+part+suffix; } if (index['*'+suffix]) { // *.domain exists (e.g. *.kyoto.jp for domain='kyoto.jp'); return null; } if (suffixLen && parts.length > suffixLen) { return parts.slice(0,suffixLen+1).reverse().join('.'); } return null; } function checkPublicSuffix(give,get) { var got = getPublicSuffix(give); assert.equal(got, get, give+' should be '+(get==null?'NULL':get)+' but got '+got); } // pubSufTest() was converted to JavaScript from http://publicsuffix.org/list/test.txt function pubSufTest() { // For this function-scope and this function-scope ONLY: // Any copyright is dedicated to the Public Domain. // http://creativecommons.org/publicdomain/zero/1.0/ // NULL input. checkPublicSuffix(null, null); // Mixed case. checkPublicSuffix('COM', null); checkPublicSuffix('example.COM', 'example.com'); checkPublicSuffix('WwW.example.COM', 'example.com'); // Leading dot. 
checkPublicSuffix('.com', null); checkPublicSuffix('.example', null); checkPublicSuffix('.example.com', null); checkPublicSuffix('.example.example', null); // Unlisted TLD. checkPublicSuffix('example', null); checkPublicSuffix('example.example', null); checkPublicSuffix('b.example.example', null); checkPublicSuffix('a.b.example.example', null); // Listed, but non-Internet, TLD. checkPublicSuffix('local', null); checkPublicSuffix('example.local', null); checkPublicSuffix('b.example.local', null); checkPublicSuffix('a.b.example.local', null); // TLD with only 1 rule. checkPublicSuffix('biz', null); checkPublicSuffix('domain.biz', 'domain.biz'); checkPublicSuffix('b.domain.biz', 'domain.biz'); checkPublicSuffix('a.b.domain.biz', 'domain.biz'); // TLD with some 2-level rules. checkPublicSuffix('com', null); checkPublicSuffix('example.com', 'example.com'); checkPublicSuffix('b.example.com', 'example.com'); checkPublicSuffix('a.b.example.com', 'example.com'); checkPublicSuffix('uk.com', null); checkPublicSuffix('example.uk.com', 'example.uk.com'); checkPublicSuffix('b.example.uk.com', 'example.uk.com'); checkPublicSuffix('a.b.example.uk.com', 'example.uk.com'); checkPublicSuffix('test.ac', 'test.ac'); // TLD with only 1 (wildcard) rule. checkPublicSuffix('cy', null); checkPublicSuffix('c.cy', null); checkPublicSuffix('b.c.cy', 'b.c.cy'); checkPublicSuffix('a.b.c.cy', 'b.c.cy'); // More complex TLD. checkPublicSuffix('jp', null); checkPublicSuffix('test.jp', 'test.jp'); checkPublicSuffix('www.test.jp', 'test.jp'); checkPublicSuffix('ac.jp', null); checkPublicSuffix('test.ac.jp', 'test.ac.jp'); checkPublicSuffix('www.test.ac.jp', 'test.ac.jp'); checkPublicSuffix('kyoto.jp', null); checkPublicSuffix('c.kyoto.jp', null); checkPublicSuffix('b.c.kyoto.jp', 'b.c.kyoto.jp'); checkPublicSuffix('a.b.c.kyoto.jp', 'b.c.kyoto.jp'); checkPublicSuffix('pref.kyoto.jp', 'pref.kyoto.jp'); // Exception rule. checkPublicSuffix('www.pref.kyoto.jp', 'pref.kyoto.jp'); // Exception rule. checkPublicSuffix('city.kyoto.jp', 'city.kyoto.jp'); // Exception rule. checkPublicSuffix('www.city.kyoto.jp', 'city.kyoto.jp'); // Exception rule. // TLD with a wildcard rule and exceptions. checkPublicSuffix('om', null); checkPublicSuffix('test.om', null); checkPublicSuffix('b.test.om', 'b.test.om'); checkPublicSuffix('a.b.test.om', 'b.test.om'); checkPublicSuffix('songfest.om', 'songfest.om'); checkPublicSuffix('www.songfest.om', 'songfest.om'); // US K12. 
checkPublicSuffix('us', null); checkPublicSuffix('test.us', 'test.us'); checkPublicSuffix('www.test.us', 'test.us'); checkPublicSuffix('ak.us', null); checkPublicSuffix('test.ak.us', 'test.ak.us'); checkPublicSuffix('www.test.ak.us', 'test.ak.us'); checkPublicSuffix('k12.ak.us', null); checkPublicSuffix('test.k12.ak.us', 'test.k12.ak.us'); checkPublicSuffix('www.test.k12.ak.us', 'test.k12.ak.us'); } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/lib/�����000755 �000766 �000024 �00000000000 12456115120 032654� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/LICENSE��000644 �000766 �000024 �00000007213 12455173731 033131� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright GoInstant, Inc. and other contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. The following exceptions apply: === `pubSufTest()` of generate-pubsuffix.js is in the public domain. // Any copyright is dedicated to the Public Domain. // http://creativecommons.org/publicdomain/zero/1.0/ === `public-suffix.txt` was obtained from <https://mxr.mozilla.org/mozilla-central/source/netwerk/dns/effective_tld_names.dat?raw=1> via <http://publicsuffix.org>. 
That file contains the usual Mozilla triple-license, for which this project uses it under the terms of the MPL 1.1: // ***** BEGIN LICENSE BLOCK ***** // Version: MPL 1.1/GPL 2.0/LGPL 2.1 // // The contents of this file are subject to the Mozilla Public License Version // 1.1 (the "License"); you may not use this file except in compliance with // the License. You may obtain a copy of the License at // http://www.mozilla.org/MPL/ // // Software distributed under the License is distributed on an "AS IS" basis, // WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License // for the specific language governing rights and limitations under the // License. // // The Original Code is the Public Suffix List. // // The Initial Developer of the Original Code is // Jo Hermans <jo.hermans@gmail.com>. // Portions created by the Initial Developer are Copyright (C) 2007 // the Initial Developer. All Rights Reserved. // // Contributor(s): // Ruben Arakelyan <ruben@rubenarakelyan.com> // Gervase Markham <gerv@gerv.net> // Pamela Greene <pamg.bugs@gmail.com> // David Triendl <david@triendl.name> // Jothan Frakes <jothan@gmail.com> // The kind representatives of many TLD registries // // Alternatively, the contents of this file may be used under the terms of // either the GNU General Public License Version 2 or later (the "GPL"), or // the GNU Lesser General Public License Version 2.1 or later (the "LGPL"), // in which case the provisions of the GPL or the LGPL are applicable instead // of those above. If you wish to allow use of your version of this file only // under the terms of either the GPL or the LGPL, and not to allow others to // use your version of this file under the terms of the MPL, indicate your // decision by deleting the provisions above and replace them with the notice // and other provisions required by the GPL or the LGPL. If you do not delete // the provisions above, a recipient may use your version of this file under // the terms of any one of the MPL, the GPL or the LGPL. 
// // ***** END LICENSE BLOCK ***** �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/node_modules/�������������������000755 �000766 �000024 �00000000000 12456115120 034504� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/package.json��������������������000644 �000766 �000024 �00000053324 12455173731 034337� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "GoInstant Inc., a salesforce.com company" }, "license": "MIT", "name": "tough-cookie", "description": "RFC6265 Cookies and Cookie Jar for node.js", "keywords": [ "HTTP", "cookie", "cookies", "set-cookie", "cookiejar", "jar", "RFC6265", "RFC2965" ], "version": "0.12.1", "homepage": "https://github.com/goinstant/tough-cookie", "repository": { "type": "git", "url": "git://github.com/goinstant/tough-cookie.git" }, "bugs": { "url": "https://github.com/goinstant/tough-cookie/issues" }, "main": "./lib/cookie", "scripts": { "test": "vows test.js" }, "engines": { "node": ">=0.4.12" }, "dependencies": { "punycode": ">=0.2.0" }, "devDependencies": { "vows": "0.7.0", "async": ">=0.1.12" }, "readme": "[RFC6265](http://tools.ietf.org/html/rfc6265) Cookies and CookieJar for Node.js\n\n![Tough Cookie](http://www.goinstant.com.s3.amazonaws.com/tough-cookie.jpg)\n\n[![Build Status](https://travis-ci.org/goinstant/node-cookie.png?branch=master)](https://travis-ci.org/goinstant/node-cookie)\n\n[![NPM Stats](https://nodei.co/npm/tough-cookie.png?downloads=true&stars=true)](https://npmjs.org/package/tough-cookie)\n![NPM Downloads](https://nodei.co/npm-dl/tough-cookie.png?months=9)\n\n# Synopsis\n\n``` javascript\nvar tough = require('tough-cookie'); // note: not 'cookie', 'cookies' or 'node-cookie'\nvar Cookie = tough.Cookie;\nvar cookie = Cookie.parse(header);\ncookie.value = 'somethingdifferent';\nheader = cookie.toString();\n\nvar cookiejar = new tough.CookieJar();\ncookiejar.setCookie(cookie, 'http://currentdomain.example.com/path', cb);\n// ...\ncookiejar.getCookies('http://example.com/otherpath',function(err,cookies) {\n res.headers['cookie'] = cookies.join('; ');\n});\n```\n\n# Installation\n\nIt's _so_ easy!\n\n`npm install tough-cookie`\n\nRequires `punycode`, which should get installed automatically for you. Note that node.js v0.6.2+ bundles punycode by default.\n\nWhy the name? NPM modules `cookie`, `cookies` and `cookiejar` were already taken.\n\n# API\n\ntough\n=====\n\nFunctions on the module you get from `require('tough-cookie')`. 
All can be used as pure functions and don't need to be \"bound\".\n\nparseDate(string[,strict])\n-----------------\n\nParse a cookie date string into a `Date`. Parses according to RFC6265 Section 5.1.1, not `Date.parse()`. If strict is set to true then leading/trailing non-seperator characters around the time part will cause the parsing to fail (e.g. \"Thu, 01 Jan 1970 00:00:010 GMT\" has an extra trailing zero but Chrome, an assumedly RFC-compliant browser, treats this as valid).\n\nformatDate(date)\n----------------\n\nFormat a Date into a RFC1123 string (the RFC6265-recommended format).\n\ncanonicalDomain(str)\n--------------------\n\nTransforms a domain-name into a canonical domain-name. The canonical domain-name is a trimmed, lowercased, stripped-of-leading-dot and optionally punycode-encoded domain-name (Section 5.1.2 of RFC6265). For the most part, this function is idempotent (can be run again on its output without ill effects).\n\ndomainMatch(str,domStr[,canonicalize=true])\n-------------------------------------------\n\nAnswers \"does this real domain match the domain in a cookie?\". The `str` is the \"current\" domain-name and the `domStr` is the \"cookie\" domain-name. Matches according to RFC6265 Section 5.1.3, but it helps to think of it as a \"suffix match\".\n\nThe `canonicalize` parameter will run the other two paramters through `canonicalDomain` or not.\n\ndefaultPath(path)\n-----------------\n\nGiven a current request/response path, gives the Path apropriate for storing in a cookie. This is basically the \"directory\" of a \"file\" in the path, but is specified by Section 5.1.4 of the RFC.\n\nThe `path` parameter MUST be _only_ the pathname part of a URI (i.e. excludes the hostname, query, fragment, etc.). This is the `.pathname` property of node's `uri.parse()` output.\n\npathMatch(reqPath,cookiePath)\n-----------------------------\n\nAnswers \"does the request-path path-match a given cookie-path?\" as per RFC6265 Section 5.1.4. Returns a boolean.\n\nThis is essentially a prefix-match where `cookiePath` is a prefix of `reqPath`.\n\nparse(header[,strict=false])\n----------------------------\n\nalias for `Cookie.parse(header[,strict])`\n\nfromJSON(string)\n----------------\n\nalias for `Cookie.fromJSON(string)`\n\ngetPublicSuffix(hostname)\n-------------------------\n\nReturns the public suffix of this hostname. The public suffix is the shortest domain-name upon which a cookie can be set. Returns `null` if the hostname cannot have cookies set for it.\n\nFor example: `www.example.com` and `www.subdomain.example.com` both have public suffix `example.com`.\n\nFor further information, see http://publicsuffix.org/. This module derives its list from that site.\n\ncookieCompare(a,b)\n------------------\n\nFor use with `.sort()`, sorts a list of cookies into the recommended order given in the RFC (Section 5.4 step 2). Longest `.path`s go first, then sorted oldest to youngest.\n\n``` javascript\nvar cookies = [ /* unsorted array of Cookie objects */ ];\ncookies = cookies.sort(cookieCompare);\n```\n\npermuteDomain(domain)\n---------------------\n\nGenerates a list of all possible domains that `domainMatch()` the parameter. May be handy for implementing cookie stores.\n\n\npermutePath(path)\n-----------------\n\nGenerates a list of all possible paths that `pathMatch()` the parameter. 
May be handy for implementing cookie stores.\n\nCookie\n======\n\nCookie.parse(header[,strict=false])\n-----------------------------------\n\nParses a single Cookie or Set-Cookie HTTP header into a `Cookie` object. Returns `undefined` if the string can't be parsed. If in strict mode, returns `undefined` if the cookie doesn't follow the guidelines in section 4 of RFC6265. Generally speaking, strict mode can be used to validate your own generated Set-Cookie headers, but acting as a client you want to be lenient and leave strict mode off.\n\nHere's how to process the Set-Cookie header(s) on a node HTTP/HTTPS response:\n\n``` javascript\nif (res.headers['set-cookie'] instanceof Array)\n cookies = res.headers['set-cookie'].map(function (c) { return (Cookie.parse(c)); });\nelse\n cookies = [Cookie.parse(res.headers['set-cookie'])];\n```\n\nCookie.fromJSON(string)\n-----------------------\n\nConvert a JSON string to a `Cookie` object. Does a `JSON.parse()` and converts the `.created`, `.lastAccessed` and `.expires` properties into `Date` objects.\n\nProperties\n==========\n\n * _key_ - string - the name or key of the cookie (default \"\")\n * _value_ - string - the value of the cookie (default \"\")\n * _expires_ - `Date` - if set, the `Expires=` attribute of the cookie (defaults to the string `\"Infinity\"`). See `setExpires()`\n * _maxAge_ - seconds - if set, the `Max-Age=` attribute _in seconds_ of the cookie. May also be set to strings `\"Infinity\"` and `\"-Infinity\"` for non-expiry and immediate-expiry, respectively. See `setMaxAge()`\n * _domain_ - string - the `Domain=` attribute of the cookie\n * _path_ - string - the `Path=` of the cookie\n * _secure_ - boolean - the `Secure` cookie flag\n * _httpOnly_ - boolean - the `HttpOnly` cookie flag\n * _extensions_ - `Array` - any unrecognized cookie attributes as strings (even if equal-signs inside)\n\nAfter a cookie has been passed through `CookieJar.setCookie()` it will have the following additional attributes:\n\n * _hostOnly_ - boolean - is this a host-only cookie (i.e. no Domain field was set, but was instead implied)\n * _pathIsDefault_ - boolean - if true, there was no Path field on the cookie and `defaultPath()` was used to derive one.\n * _created_ - `Date` - when this cookie was added to the jar\n * _lastAccessed_ - `Date` - last time the cookie got accessed. Will affect cookie cleaning once implemented. Using `cookiejar.getCookies(...)` will update this attribute.\n\nConstruction([{options}])\n------------\n\nReceives an options object that can contain any Cookie properties, uses the default for unspecified properties.\n\n.toString()\n-----------\n\nencode to a Set-Cookie header value. The Expires cookie field is set using `formatDate()`, but is omitted entirely if `.expires` is `Infinity`.\n\n.cookieString()\n---------------\n\nencode to a Cookie header value (i.e. the `.key` and `.value` properties joined with '=').\n\n.setExpires(String)\n-------------------\n\nsets the expiry based on a date-string passed through `parseDate()`. If parseDate returns `null` (i.e. can't parse this date string), `.expires` is set to `\"Infinity\"` (a string) is set.\n\n.setMaxAge(number)\n-------------------\n\nsets the maxAge in seconds. 
Coerces `-Infinity` to `\"-Infinity\"` and `Infinity` to `\"Infinity\"` so it JSON serializes correctly.\n\n.expiryTime([now=Date.now()])\n-----------------------------\n\n.expiryDate([now=Date.now()])\n-----------------------------\n\nexpiryTime() Computes the absolute unix-epoch milliseconds that this cookie expires. expiryDate() works similarly, except it returns a `Date` object. Note that in both cases the `now` parameter should be milliseconds.\n\nMax-Age takes precedence over Expires (as per the RFC). The `.created` attribute -- or, by default, the `now` paramter -- is used to offset the `.maxAge` attribute.\n\nIf Expires (`.expires`) is set, that's returned.\n\nOtherwise, `expiryTime()` returns `Infinity` and `expiryDate()` returns a `Date` object for \"Tue, 19 Jan 2038 03:14:07 GMT\" (latest date that can be expressed by a 32-bit `time_t`; the common limit for most user-agents).\n\n.TTL([now=Date.now()])\n---------\n\ncompute the TTL relative to `now` (milliseconds). The same precedence rules as for `expiryTime`/`expiryDate` apply.\n\nThe \"number\" `Infinity` is returned for cookies without an explicit expiry and `0` is returned if the cookie is expired. Otherwise a time-to-live in milliseconds is returned.\n\n.canonicalizedDoman()\n---------------------\n\n.cdomain()\n----------\n\nreturn the canonicalized `.domain` field. This is lower-cased and punycode (RFC3490) encoded if the domain has any non-ASCII characters.\n\n.validate()\n-----------\n\nStatus: *IN PROGRESS*. Works for a few things, but is by no means comprehensive.\n\nvalidates cookie attributes for semantic correctness. Useful for \"lint\" checking any Set-Cookie headers you generate. For now, it returns a boolean, but eventually could return a reason string -- you can future-proof with this construct:\n\n``` javascript\nif (cookie.validate() === true) {\n // it's tasty\n} else {\n // yuck!\n}\n```\n\nCookieJar\n=========\n\nConstruction([store = new MemoryCookieStore()][, rejectPublicSuffixes])\n------------\n\nSimply use `new CookieJar()`. If you'd like to use a custom store, pass that to the constructor otherwise a `MemoryCookieStore` will be created and used.\n\n\nAttributes\n----------\n\n * _rejectPublicSuffixes_ - boolean - reject cookies with domains like \"com\" and \"co.uk\" (default: `true`)\n\nSince eventually this module would like to support database/remote/etc. CookieJars, continuation passing style is used for CookieJar methods.\n\n.setCookie(cookieOrString, currentUrl, [{options},] cb(err,cookie))\n-------------------------------------------------------------------\n\nAttempt to set the cookie in the cookie jar. If the operation fails, an error will be given to the callback `cb`, otherwise the cookie is passed through. The cookie will have updated `.created`, `.lastAccessed` and `.hostOnly` properties.\n\nThe `options` object can be omitted and can have the following properties:\n\n * _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies.\n * _secure_ - boolean - autodetect from url - indicates if this is a \"Secure\" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`.\n * _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies\n * _strict_ - boolean - default `false` - perform extra checks\n * _ignoreError_ - boolean - default `false` - silently ignore things like parse errors and invalid domains. 
CookieStore errors aren't ignored by this option.\n\nAs per the RFC, the `.hostOnly` property is set if there was no \"Domain=\" parameter in the cookie string (or `.domain` was null on the Cookie object). The `.domain` property is set to the fully-qualified hostname of `currentUrl` in this case. Matching this cookie requires an exact hostname match (not a `domainMatch` as per usual).\n\n.setCookieSync(cookieOrString, currentUrl, [{options}])\n-------------------------------------------------------\n\nSynchronous version of `setCookie`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).\n\n.storeCookie(cookie, [{options},] cb(err,cookie))\n-------------------------------------------------\n\n__REMOVED__ removed in lieu of the CookieStore API below\n\n.getCookies(currentUrl, [{options},] cb(err,cookies))\n-----------------------------------------------------\n\nRetrieve the list of cookies that can be sent in a Cookie header for the current url.\n\nIf an error is encountered, that's passed as `err` to the callback, otherwise an `Array` of `Cookie` objects is passed. The array is sorted with `cookieCompare()` unless the `{sort:false}` option is given.\n\nThe `options` object can be omitted and can have the following properties:\n\n * _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies.\n * _secure_ - boolean - autodetect from url - indicates if this is a \"Secure\" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`.\n * _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies\n * _expire_ - boolean - default `true` - perform expiry-time checking of cookies and asynchronously remove expired cookies from the store. Using `false` will return expired cookies and **not** remove them from the store (which is useful for replaying Set-Cookie headers, potentially).\n * _allPaths_ - boolean - default `false` - if `true`, do not scope cookies by path. The default uses RFC-compliant path scoping. **Note**: may not be supported by the CookieStore `fetchCookies` function (the default MemoryCookieStore supports it).\n\nThe `.lastAccessed` property of the returned cookies will have been updated.\n\n.getCookiesSync(currentUrl, [{options}])\n----------------------------------------\n\nSynchronous version of `getCookies`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).\n\n.getCookieString(...)\n---------------------\n\nAccepts the same options as `.getCookies()` but passes a string suitable for a Cookie header rather than an array to the callback. Simply maps the `Cookie` array via `.cookieString()`.\n\n.getCookieStringSync(...)\n-------------------------\n\nSynchronous version of `getCookieString`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).\n\n.getSetCookieStrings(...)\n-------------------------\n\nReturns an array of strings suitable for **Set-Cookie** headers. Accepts the same options as `.getCookies()`. Simply maps the cookie array via `.toString()`.\n\n.getSetCookieStringsSync(...)\n-----------------------------\n\nSynchronous version of `getSetCookieStrings`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).\n\nStore\n=====\n\nBase class for CookieJar stores.\n\n# CookieStore API\n\nThe storage model for each `CookieJar` instance can be replaced with a custom implementation. The default is `MemoryCookieStore` which can be found in the `lib/memstore.js` file. 
The API uses continuation-passing-style to allow for asynchronous stores.\n\nStores should inherit from the base `Store` class, which is available as `require('tough-cookie').Store`. Stores are asynchronous by default, but if `store.synchronous` is set, then the `*Sync` methods on the CookieJar can be used.\n\nAll `domain` parameters will have been normalized before calling.\n\nThe Cookie store must have all of the following methods.\n\nstore.findCookie(domain, path, key, cb(err,cookie))\n---------------------------------------------------\n\nRetrieve a cookie with the given domain, path and key (a.k.a. name). The RFC maintains that exactly one of these cookies should exist in a store. If the store is using versioning, this means that the latest/newest such cookie should be returned.\n\nCallback takes an error and the resulting `Cookie` object. If no cookie is found then `null` MUST be passed instead (i.e. not an error).\n\nstore.findCookies(domain, path, cb(err,cookies))\n------------------------------------------------\n\nLocates cookies matching the given domain and path. This is most often called in the context of `cookiejar.getCookies()` above.\n\nIf no cookies are found, the callback MUST be passed an empty array.\n\nThe resulting list will be checked for applicability to the current request according to the RFC (domain-match, path-match, http-only-flag, secure-flag, expiry, etc.), so it's OK to use an optimistic search algorithm when implementing this method. However, the search algorithm used SHOULD try to find cookies that `domainMatch()` the domain and `pathMatch()` the path in order to limit the amount of checking that needs to be done.\n\nAs of version 0.9.12, the `allPaths` option to `cookiejar.getCookies()` above will cause the path here to be `null`. If the path is `null`, path-matching MUST NOT be performed (i.e. domain-matching only).\n\nstore.putCookie(cookie, cb(err))\n--------------------------------\n\nAdds a new cookie to the store. The implementation SHOULD replace any existing cookie with the same `.domain`, `.path`, and `.key` properties -- depending on the nature of the implementation, it's possible that between the call to `fetchCookie` and `putCookie` that a duplicate `putCookie` can occur.\n\nThe `cookie` object MUST NOT be modified; the caller will have already updated the `.creation` and `.lastAccessed` properties.\n\nPass an error if the cookie cannot be stored.\n\nstore.updateCookie(oldCookie, newCookie, cb(err))\n-------------------------------------------------\n\nUpdate an existing cookie. The implementation MUST update the `.value` for a cookie with the same `domain`, `.path` and `.key`. The implementation SHOULD check that the old value in the store is equivalent to `oldCookie` - how the conflict is resolved is up to the store.\n\nThe `.lastAccessed` property will always be different between the two objects and `.created` will always be the same. Stores MAY ignore or defer the `.lastAccessed` change at the cost of affecting how cookies are sorted (or selected for deletion).\n\nStores may wish to optimize changing the `.value` of the cookie in the store versus storing a new cookie. 
If the implementation doesn't define this method a stub that calls `putCookie(newCookie,cb)` will be added to the store object.\n\nThe `newCookie` and `oldCookie` objects MUST NOT be modified.\n\nPass an error if the newCookie cannot be stored.\n\nstore.removeCookie(domain, path, key, cb(err))\n----------------------------------------------\n\nRemove a cookie from the store (see notes on `findCookie` about the uniqueness constraint).\n\nThe implementation MUST NOT pass an error if the cookie doesn't exist; only pass an error due to the failure to remove an existing cookie.\n\nstore.removeCookies(domain, path, cb(err))\n------------------------------------------\n\nRemoves matching cookies from the store. The `path` paramter is optional, and if missing means all paths in a domain should be removed.\n\nPass an error ONLY if removing any existing cookies failed.\n\n# TODO\n\n * _full_ RFC5890/RFC5891 canonicalization for domains in `cdomain()`\n * the optional `punycode` requirement implements RFC3492, but RFC6265 requires RFC5891\n * better tests for `validate()`?\n\n# Copyright and License\n\n(tl;dr: MIT with some MPL/1.1)\n\nCopyright 2012- GoInstant, Inc. and other contributors. All rights reserved.\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to\ndeal in the Software without restriction, including without limitation the\nrights to use, copy, modify, merge, publish, distribute, sublicense, and/or\nsell copies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\nFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\nIN THE SOFTWARE.\n\nPortions may be licensed under different licenses (in particular public-suffix.txt is MPL/1.1); please read the LICENSE file for full details.\n", "readmeFilename": "README.md", "_id": "tough-cookie@0.12.1", "dist": { "shasum": "8220c7e21abd5b13d96804254bd5a81ebf2c7d62", "tarball": "http://registry.npmjs.org/tough-cookie/-/tough-cookie-0.12.1.tgz" }, "_from": "tough-cookie@>=0.12.0", "_npmVersion": "1.3.11", "_npmUser": { "name": "goinstant", "email": "support@goinstant.com" }, "maintainers": [ { "name": "jstash", "email": "jeremy@goinstant.com" }, { "name": "goinstant", "email": "services@goinstant.com" } ], "directories": {}, "_shasum": "8220c7e21abd5b13d96804254bd5a81ebf2c7d62", "_resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-0.12.1.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/public-suffix.txt���������������000644 �000766 �000024 �00000212375 12455173731 035375� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// ***** BEGIN LICENSE BLOCK ***** // Version: MPL 1.1/GPL 2.0/LGPL 2.1 // // The contents of this file are subject to the Mozilla Public License Version // 1.1 (the "License"); you may not use this file except in compliance with // the License. You may obtain a copy of the License at // http://www.mozilla.org/MPL/ // // Software distributed under the License is distributed on an "AS IS" basis, // WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License // for the specific language governing rights and limitations under the // License. // // The Original Code is the Public Suffix List. // // The Initial Developer of the Original Code is // Jo Hermans <jo.hermans@gmail.com>. // Portions created by the Initial Developer are Copyright (C) 2007 // the Initial Developer. All Rights Reserved. // // Contributor(s): // Ruben Arakelyan <ruben@rubenarakelyan.com> // Gervase Markham <gerv@gerv.net> // Pamela Greene <pamg.bugs@gmail.com> // David Triendl <david@triendl.name> // Jothan Frakes <jothan@gmail.com> // The kind representatives of many TLD registries // // Alternatively, the contents of this file may be used under the terms of // either the GNU General Public License Version 2 or later (the "GPL"), or // the GNU Lesser General Public License Version 2.1 or later (the "LGPL"), // in which case the provisions of the GPL or the LGPL are applicable instead // of those above. 
If you wish to allow use of your version of this file only // under the terms of either the GPL or the LGPL, and not to allow others to // use your version of this file under the terms of the MPL, indicate your // decision by deleting the provisions above and replace them with the notice // and other provisions required by the GPL or the LGPL. If you do not delete // the provisions above, a recipient may use your version of this file under // the terms of any one of the MPL, the GPL or the LGPL. // // ***** END LICENSE BLOCK ***** // ===BEGIN ICANN DOMAINS=== // ac : http://en.wikipedia.org/wiki/.ac ac com.ac edu.ac gov.ac net.ac mil.ac org.ac // ad : http://en.wikipedia.org/wiki/.ad ad nom.ad // ae : http://en.wikipedia.org/wiki/.ae // see also: "Domain Name Eligibility Policy" at http://www.aeda.ae/eng/aepolicy.php ae co.ae net.ae org.ae sch.ae ac.ae gov.ae mil.ae // aero : see http://www.information.aero/index.php?id=66 aero accident-investigation.aero accident-prevention.aero aerobatic.aero aeroclub.aero aerodrome.aero agents.aero aircraft.aero airline.aero airport.aero air-surveillance.aero airtraffic.aero air-traffic-control.aero ambulance.aero amusement.aero association.aero author.aero ballooning.aero broker.aero caa.aero cargo.aero catering.aero certification.aero championship.aero charter.aero civilaviation.aero club.aero conference.aero consultant.aero consulting.aero control.aero council.aero crew.aero design.aero dgca.aero educator.aero emergency.aero engine.aero engineer.aero entertainment.aero equipment.aero exchange.aero express.aero federation.aero flight.aero freight.aero fuel.aero gliding.aero government.aero groundhandling.aero group.aero hanggliding.aero homebuilt.aero insurance.aero journal.aero journalist.aero leasing.aero logistics.aero magazine.aero maintenance.aero marketplace.aero media.aero microlight.aero modelling.aero navigation.aero parachuting.aero paragliding.aero passenger-association.aero pilot.aero press.aero production.aero recreation.aero repbody.aero res.aero research.aero rotorcraft.aero safety.aero scientist.aero services.aero show.aero skydiving.aero software.aero student.aero taxi.aero trader.aero trading.aero trainer.aero union.aero workinggroup.aero works.aero // af : http://www.nic.af/help.jsp af gov.af com.af org.af net.af edu.af // ag : http://www.nic.ag/prices.htm ag com.ag org.ag net.ag co.ag nom.ag // ai : http://nic.com.ai/ ai off.ai com.ai net.ai org.ai // al : http://www.ert.gov.al/ert_alb/faq_det.html?Id=31 al com.al edu.al gov.al mil.al net.al org.al // am : http://en.wikipedia.org/wiki/.am am // an : http://www.una.an/an_domreg/default.asp an com.an net.an org.an edu.an // ao : http://en.wikipedia.org/wiki/.ao // http://www.dns.ao/REGISTR.DOC ao ed.ao gv.ao og.ao co.ao pb.ao it.ao // aq : http://en.wikipedia.org/wiki/.aq aq // ar : http://en.wikipedia.org/wiki/.ar *.ar !congresodelalengua3.ar !educ.ar !gobiernoelectronico.ar !mecon.ar !nacion.ar !nic.ar !promocion.ar !retina.ar !uba.ar // arpa : http://en.wikipedia.org/wiki/.arpa // Confirmed by registry <iana-questions@icann.org> 2008-06-18 e164.arpa in-addr.arpa ip6.arpa iris.arpa uri.arpa urn.arpa // as : http://en.wikipedia.org/wiki/.as as gov.as // asia : http://en.wikipedia.org/wiki/.asia asia // at : http://en.wikipedia.org/wiki/.at // Confirmed by registry <it@nic.at> 2008-06-17 at ac.at co.at gv.at or.at // au : http://en.wikipedia.org/wiki/.au // http://www.auda.org.au/ // 2LDs com.au net.au org.au edu.au gov.au csiro.au asn.au id.au // Historic 2LDs (closed to new 
registration, but sites still exist) info.au conf.au oz.au // CGDNs - http://www.cgdn.org.au/ act.au nsw.au nt.au qld.au sa.au tas.au vic.au wa.au // 3LDs act.edu.au nsw.edu.au nt.edu.au qld.edu.au sa.edu.au tas.edu.au vic.edu.au wa.edu.au act.gov.au // Removed at request of Shae.Donelan@services.nsw.gov.au, 2010-03-04 // nsw.gov.au nt.gov.au qld.gov.au sa.gov.au tas.gov.au vic.gov.au wa.gov.au // aw : http://en.wikipedia.org/wiki/.aw aw com.aw // ax : http://en.wikipedia.org/wiki/.ax ax // az : http://en.wikipedia.org/wiki/.az az com.az net.az int.az gov.az org.az edu.az info.az pp.az mil.az name.az pro.az biz.az // ba : http://en.wikipedia.org/wiki/.ba ba org.ba net.ba edu.ba gov.ba mil.ba unsa.ba unbi.ba co.ba com.ba rs.ba // bb : http://en.wikipedia.org/wiki/.bb bb biz.bb com.bb edu.bb gov.bb info.bb net.bb org.bb store.bb // bd : http://en.wikipedia.org/wiki/.bd *.bd // be : http://en.wikipedia.org/wiki/.be // Confirmed by registry <tech@dns.be> 2008-06-08 be ac.be // bf : http://en.wikipedia.org/wiki/.bf bf gov.bf // bg : http://en.wikipedia.org/wiki/.bg // https://www.register.bg/user/static/rules/en/index.html bg a.bg b.bg c.bg d.bg e.bg f.bg g.bg h.bg i.bg j.bg k.bg l.bg m.bg n.bg o.bg p.bg q.bg r.bg s.bg t.bg u.bg v.bg w.bg x.bg y.bg z.bg 0.bg 1.bg 2.bg 3.bg 4.bg 5.bg 6.bg 7.bg 8.bg 9.bg // bh : http://en.wikipedia.org/wiki/.bh bh com.bh edu.bh net.bh org.bh gov.bh // bi : http://en.wikipedia.org/wiki/.bi // http://whois.nic.bi/ bi co.bi com.bi edu.bi or.bi org.bi // biz : http://en.wikipedia.org/wiki/.biz biz // bj : http://en.wikipedia.org/wiki/.bj bj asso.bj barreau.bj gouv.bj // bm : http://www.bermudanic.bm/dnr-text.txt bm com.bm edu.bm gov.bm net.bm org.bm // bn : http://en.wikipedia.org/wiki/.bn *.bn // bo : http://www.nic.bo/ bo com.bo edu.bo gov.bo gob.bo int.bo org.bo net.bo mil.bo tv.bo // br : http://registro.br/dominio/dpn.html // Updated by registry <fneves@registro.br> 2011-03-01 br adm.br adv.br agr.br am.br arq.br art.br ato.br b.br bio.br blog.br bmd.br can.br cim.br cng.br cnt.br com.br coop.br ecn.br edu.br emp.br eng.br esp.br etc.br eti.br far.br flog.br fm.br fnd.br fot.br fst.br g12.br ggf.br gov.br imb.br ind.br inf.br jor.br jus.br lel.br mat.br med.br mil.br mus.br net.br nom.br not.br ntr.br odo.br org.br ppg.br pro.br psc.br psi.br qsl.br radio.br rec.br slg.br srv.br taxi.br teo.br tmp.br trd.br tur.br tv.br vet.br vlog.br wiki.br zlg.br // bs : http://www.nic.bs/rules.html bs com.bs net.bs org.bs edu.bs gov.bs // bt : http://en.wikipedia.org/wiki/.bt bt com.bt edu.bt gov.bt net.bt org.bt // bv : No registrations at this time. // Submitted by registry <jarle@uninett.no> 2006-06-16 // bw : http://en.wikipedia.org/wiki/.bw // http://www.gobin.info/domainname/bw.doc // list of other 2nd level tlds ? bw co.bw org.bw // by : http://en.wikipedia.org/wiki/.by // http://tld.by/rules_2006_en.html // list of other 2nd level tlds ? by gov.by mil.by // Official information does not indicate that com.by is a reserved // second-level domain, but it's being used as one (see www.google.com.by and // www.yahoo.com.by, for example), so we list it here for safety's sake. 
com.by // http://hoster.by/ of.by // bz : http://en.wikipedia.org/wiki/.bz // http://www.belizenic.bz/ bz com.bz net.bz org.bz edu.bz gov.bz // ca : http://en.wikipedia.org/wiki/.ca ca // ca geographical names ab.ca bc.ca mb.ca nb.ca nf.ca nl.ca ns.ca nt.ca nu.ca on.ca pe.ca qc.ca sk.ca yk.ca // gc.ca: http://en.wikipedia.org/wiki/.gc.ca // see also: http://registry.gc.ca/en/SubdomainFAQ gc.ca // cat : http://en.wikipedia.org/wiki/.cat cat // cc : http://en.wikipedia.org/wiki/.cc cc // cd : http://en.wikipedia.org/wiki/.cd // see also: https://www.nic.cd/domain/insertDomain_2.jsp?act=1 cd gov.cd // cf : http://en.wikipedia.org/wiki/.cf cf // cg : http://en.wikipedia.org/wiki/.cg cg // ch : http://en.wikipedia.org/wiki/.ch ch // ci : http://en.wikipedia.org/wiki/.ci // http://www.nic.ci/index.php?page=charte ci org.ci or.ci com.ci co.ci edu.ci ed.ci ac.ci net.ci go.ci asso.ci aéroport.ci int.ci presse.ci md.ci gouv.ci // ck : http://en.wikipedia.org/wiki/.ck *.ck !www.ck // cl : http://en.wikipedia.org/wiki/.cl cl gov.cl gob.cl co.cl mil.cl // cm : http://en.wikipedia.org/wiki/.cm cm gov.cm // cn : http://en.wikipedia.org/wiki/.cn // Submitted by registry <tanyaling@cnnic.cn> 2008-06-11 cn ac.cn com.cn edu.cn gov.cn net.cn org.cn mil.cn 公司.cn 网络.cn 網絡.cn // cn geographic names ah.cn bj.cn cq.cn fj.cn gd.cn gs.cn gz.cn gx.cn ha.cn hb.cn he.cn hi.cn hl.cn hn.cn jl.cn js.cn jx.cn ln.cn nm.cn nx.cn qh.cn sc.cn sd.cn sh.cn sn.cn sx.cn tj.cn xj.cn xz.cn yn.cn zj.cn hk.cn mo.cn tw.cn // co : http://en.wikipedia.org/wiki/.co // Submitted by registry <tecnico@uniandes.edu.co> 2008-06-11 co arts.co com.co edu.co firm.co gov.co info.co int.co mil.co net.co nom.co org.co rec.co web.co // com : http://en.wikipedia.org/wiki/.com com // coop : http://en.wikipedia.org/wiki/.coop coop // cr : http://www.nic.cr/niccr_publico/showRegistroDominiosScreen.do cr ac.cr co.cr ed.cr fi.cr go.cr or.cr sa.cr // cu : http://en.wikipedia.org/wiki/.cu cu com.cu edu.cu org.cu net.cu gov.cu inf.cu // cv : http://en.wikipedia.org/wiki/.cv cv // cx : http://en.wikipedia.org/wiki/.cx // list of other 2nd level tlds ? 
cx gov.cx // cy : http://en.wikipedia.org/wiki/.cy *.cy // cz : http://en.wikipedia.org/wiki/.cz cz // de : http://en.wikipedia.org/wiki/.de // Confirmed by registry <ops@denic.de> (with technical // reservations) 2008-07-01 de // dj : http://en.wikipedia.org/wiki/.dj dj // dk : http://en.wikipedia.org/wiki/.dk // Confirmed by registry <robert@dk-hostmaster.dk> 2008-06-17 dk // dm : http://en.wikipedia.org/wiki/.dm dm com.dm net.dm org.dm edu.dm gov.dm // do : http://en.wikipedia.org/wiki/.do do art.do com.do edu.do gob.do gov.do mil.do net.do org.do sld.do web.do // dz : http://en.wikipedia.org/wiki/.dz dz com.dz org.dz net.dz gov.dz edu.dz asso.dz pol.dz art.dz // ec : http://www.nic.ec/reg/paso1.asp // Submitted by registry <vabboud@nic.ec> 2008-07-04 ec com.ec info.ec net.ec fin.ec k12.ec med.ec pro.ec org.ec edu.ec gov.ec gob.ec mil.ec // edu : http://en.wikipedia.org/wiki/.edu edu // ee : http://www.eenet.ee/EENet/dom_reeglid.html#lisa_B ee edu.ee gov.ee riik.ee lib.ee med.ee com.ee pri.ee aip.ee org.ee fie.ee // eg : http://en.wikipedia.org/wiki/.eg eg com.eg edu.eg eun.eg gov.eg mil.eg name.eg net.eg org.eg sci.eg // er : http://en.wikipedia.org/wiki/.er *.er // es : https://www.nic.es/site_ingles/ingles/dominios/index.html es com.es nom.es org.es gob.es edu.es // et : http://en.wikipedia.org/wiki/.et *.et // eu : http://en.wikipedia.org/wiki/.eu eu // fi : http://en.wikipedia.org/wiki/.fi fi // aland.fi : http://en.wikipedia.org/wiki/.ax // This domain is being phased out in favor of .ax. As there are still many // domains under aland.fi, we still keep it on the list until aland.fi is // completely removed. // TODO: Check for updates (expected to be phased out around Q1/2009) aland.fi // fj : http://en.wikipedia.org/wiki/.fj *.fj // fk : http://en.wikipedia.org/wiki/.fk *.fk // fm : http://en.wikipedia.org/wiki/.fm fm // fo : http://en.wikipedia.org/wiki/.fo fo // fr : http://www.afnic.fr/ // domaines descriptifs : http://www.afnic.fr/obtenir/chartes/nommage-fr/annexe-descriptifs fr com.fr asso.fr nom.fr prd.fr presse.fr tm.fr // domaines sectoriels : http://www.afnic.fr/obtenir/chartes/nommage-fr/annexe-sectoriels aeroport.fr assedic.fr avocat.fr avoues.fr cci.fr chambagri.fr chirurgiens-dentistes.fr experts-comptables.fr geometre-expert.fr gouv.fr greta.fr huissier-justice.fr medecin.fr notaires.fr pharmacien.fr port.fr veterinaire.fr // ga : http://en.wikipedia.org/wiki/.ga ga // gb : This registry is effectively dormant // Submitted by registry <Damien.Shaw@ja.net> 2008-06-12 // gd : http://en.wikipedia.org/wiki/.gd gd // ge : http://www.nic.net.ge/policy_en.pdf ge com.ge edu.ge gov.ge org.ge mil.ge net.ge pvt.ge // gf : http://en.wikipedia.org/wiki/.gf gf // gg : http://www.channelisles.net/applic/avextn.shtml gg co.gg org.gg net.gg sch.gg gov.gg // gh : http://en.wikipedia.org/wiki/.gh // see also: http://www.nic.gh/reg_now.php // Although domains directly at second level are not possible at the moment, // they have been possible for some time and may come back. 
gh com.gh edu.gh gov.gh org.gh mil.gh // gi : http://www.nic.gi/rules.html gi com.gi ltd.gi gov.gi mod.gi edu.gi org.gi // gl : http://en.wikipedia.org/wiki/.gl // http://nic.gl gl // gm : http://www.nic.gm/htmlpages%5Cgm-policy.htm gm // gn : http://psg.com/dns/gn/gn.txt // Submitted by registry <randy@psg.com> 2008-06-17 ac.gn com.gn edu.gn gov.gn org.gn net.gn // gov : http://en.wikipedia.org/wiki/.gov gov // gp : http://www.nic.gp/index.php?lang=en gp com.gp net.gp mobi.gp edu.gp org.gp asso.gp // gq : http://en.wikipedia.org/wiki/.gq gq // gr : https://grweb.ics.forth.gr/english/1617-B-2005.html // Submitted by registry <segred@ics.forth.gr> 2008-06-09 gr com.gr edu.gr net.gr org.gr gov.gr // gs : http://en.wikipedia.org/wiki/.gs gs // gt : http://www.gt/politicas.html *.gt !www.gt // gu : http://gadao.gov.gu/registration.txt *.gu // gw : http://en.wikipedia.org/wiki/.gw gw // gy : http://en.wikipedia.org/wiki/.gy // http://registry.gy/ gy co.gy com.gy net.gy // hk : https://www.hkdnr.hk // Submitted by registry <hk.tech@hkirc.hk> 2008-06-11 hk com.hk edu.hk gov.hk idv.hk net.hk org.hk 公司.hk 教育.hk 敎育.hk 政府.hk 個人.hk 个人.hk 箇人.hk 網络.hk 网络.hk 组織.hk 網絡.hk 网絡.hk 组织.hk 組織.hk 組织.hk // hm : http://en.wikipedia.org/wiki/.hm hm // hn : http://www.nic.hn/politicas/ps02,,05.html hn com.hn edu.hn org.hn net.hn mil.hn gob.hn // hr : http://www.dns.hr/documents/pdf/HRTLD-regulations.pdf hr iz.hr from.hr name.hr com.hr // ht : http://www.nic.ht/info/charte.cfm ht com.ht shop.ht firm.ht info.ht adult.ht net.ht pro.ht org.ht med.ht art.ht coop.ht pol.ht asso.ht edu.ht rel.ht gouv.ht perso.ht // hu : http://www.domain.hu/domain/English/sld.html // Confirmed by registry <pasztor@iszt.hu> 2008-06-12 hu co.hu info.hu org.hu priv.hu sport.hu tm.hu 2000.hu agrar.hu bolt.hu casino.hu city.hu erotica.hu erotika.hu film.hu forum.hu games.hu hotel.hu ingatlan.hu jogasz.hu konyvelo.hu lakas.hu media.hu news.hu reklam.hu sex.hu shop.hu suli.hu szex.hu tozsde.hu utazas.hu video.hu // id : http://en.wikipedia.org/wiki/.id // see also: https://register.pandi.or.id/ id ac.id co.id go.id mil.id net.id or.id sch.id web.id // ie : http://en.wikipedia.org/wiki/.ie ie gov.ie // il : http://en.wikipedia.org/wiki/.il *.il // im : https://www.nic.im/pdfs/imfaqs.pdf im co.im ltd.co.im plc.co.im net.im gov.im org.im nic.im ac.im // in : http://en.wikipedia.org/wiki/.in // see also: http://www.inregistry.in/policies/ // Please note, that nic.in is not an offical eTLD, but used by most // government institutions. in co.in firm.in net.in org.in gen.in ind.in nic.in ac.in edu.in res.in gov.in mil.in // info : http://en.wikipedia.org/wiki/.info info // int : http://en.wikipedia.org/wiki/.int // Confirmed by registry <iana-questions@icann.org> 2008-06-18 int eu.int // io : http://www.nic.io/rules.html // list of other 2nd level tlds ? 
io com.io // iq : http://www.cmc.iq/english/iq/iqregister1.htm iq gov.iq edu.iq mil.iq com.iq org.iq net.iq // ir : http://www.nic.ir/Terms_and_Conditions_ir,_Appendix_1_Domain_Rules // Also see http://www.nic.ir/Internationalized_Domain_Names // Two <iran>.ir entries added at request of <tech-team@nic.ir>, 2010-04-16 ir ac.ir co.ir gov.ir id.ir net.ir org.ir sch.ir // xn--mgba3a4f16a.ir (<iran>.ir, Persian YEH) ایران.ir // xn--mgba3a4fra.ir (<iran>.ir, Arabic YEH) ايران.ir // is : http://www.isnic.is/domain/rules.php // Confirmed by registry <marius@isgate.is> 2008-12-06 is net.is com.is edu.is gov.is org.is int.is // it : http://en.wikipedia.org/wiki/.it it gov.it edu.it // list of reserved geo-names : // http://www.nic.it/documenti/regolamenti-e-linee-guida/regolamento-assegnazione-versione-6.0.pdf // (There is also a list of reserved geo-names corresponding to Italian // municipalities : http://www.nic.it/documenti/appendice-c.pdf , but it is // not included here.) agrigento.it ag.it alessandria.it al.it ancona.it an.it aosta.it aoste.it ao.it arezzo.it ar.it ascoli-piceno.it ascolipiceno.it ap.it asti.it at.it avellino.it av.it bari.it ba.it andria-barletta-trani.it andriabarlettatrani.it trani-barletta-andria.it tranibarlettaandria.it barletta-trani-andria.it barlettatraniandria.it andria-trani-barletta.it andriatranibarletta.it trani-andria-barletta.it traniandriabarletta.it bt.it belluno.it bl.it benevento.it bn.it bergamo.it bg.it biella.it bi.it bologna.it bo.it bolzano.it bozen.it balsan.it alto-adige.it altoadige.it suedtirol.it bz.it brescia.it bs.it brindisi.it br.it cagliari.it ca.it caltanissetta.it cl.it campobasso.it cb.it carboniaiglesias.it carbonia-iglesias.it iglesias-carbonia.it iglesiascarbonia.it ci.it caserta.it ce.it catania.it ct.it catanzaro.it cz.it chieti.it ch.it como.it co.it cosenza.it cs.it cremona.it cr.it crotone.it kr.it cuneo.it cn.it dell-ogliastra.it dellogliastra.it ogliastra.it og.it enna.it en.it ferrara.it fe.it fermo.it fm.it firenze.it florence.it fi.it foggia.it fg.it forli-cesena.it forlicesena.it cesena-forli.it cesenaforli.it fc.it frosinone.it fr.it genova.it genoa.it ge.it gorizia.it go.it grosseto.it gr.it imperia.it im.it isernia.it is.it laquila.it aquila.it aq.it la-spezia.it laspezia.it sp.it latina.it lt.it lecce.it le.it lecco.it lc.it livorno.it li.it lodi.it lo.it lucca.it lu.it macerata.it mc.it mantova.it mn.it massa-carrara.it massacarrara.it carrara-massa.it carraramassa.it ms.it matera.it mt.it medio-campidano.it mediocampidano.it campidano-medio.it campidanomedio.it vs.it messina.it me.it milano.it milan.it mi.it modena.it mo.it monza.it monza-brianza.it monzabrianza.it monzaebrianza.it monzaedellabrianza.it monza-e-della-brianza.it mb.it napoli.it naples.it na.it novara.it no.it nuoro.it nu.it oristano.it or.it padova.it padua.it pd.it palermo.it pa.it parma.it pr.it pavia.it pv.it perugia.it pg.it pescara.it pe.it pesaro-urbino.it pesarourbino.it urbino-pesaro.it urbinopesaro.it pu.it piacenza.it pc.it pisa.it pi.it pistoia.it pt.it pordenone.it pn.it potenza.it pz.it prato.it po.it ragusa.it rg.it ravenna.it ra.it reggio-calabria.it reggiocalabria.it rc.it reggio-emilia.it reggioemilia.it re.it rieti.it ri.it rimini.it rn.it roma.it rome.it rm.it rovigo.it ro.it salerno.it sa.it sassari.it ss.it savona.it sv.it siena.it si.it siracusa.it sr.it sondrio.it so.it taranto.it ta.it tempio-olbia.it tempioolbia.it olbia-tempio.it olbiatempio.it ot.it teramo.it te.it terni.it tr.it torino.it turin.it to.it trapani.it tp.it 
trento.it trentino.it tn.it treviso.it tv.it trieste.it ts.it udine.it ud.it varese.it va.it venezia.it venice.it ve.it verbania.it vb.it vercelli.it vc.it verona.it vr.it vibo-valentia.it vibovalentia.it vv.it vicenza.it vi.it viterbo.it vt.it // je : http://www.channelisles.net/applic/avextn.shtml je co.je org.je net.je sch.je gov.je // jm : http://www.com.jm/register.html *.jm // jo : http://www.dns.jo/Registration_policy.aspx jo com.jo org.jo net.jo edu.jo sch.jo gov.jo mil.jo name.jo // jobs : http://en.wikipedia.org/wiki/.jobs jobs // jp : http://en.wikipedia.org/wiki/.jp // http://jprs.co.jp/en/jpdomain.html // Submitted by registry <yone@jprs.co.jp> 2008-06-11 // Updated by registry <yone@jprs.co.jp> 2008-12-04 jp // jp organizational type names ac.jp ad.jp co.jp ed.jp go.jp gr.jp lg.jp ne.jp or.jp // jp geographic type names // http://jprs.jp/doc/rule/saisoku-1.html *.aichi.jp *.akita.jp *.aomori.jp *.chiba.jp *.ehime.jp *.fukui.jp *.fukuoka.jp *.fukushima.jp *.gifu.jp *.gunma.jp *.hiroshima.jp *.hokkaido.jp *.hyogo.jp *.ibaraki.jp *.ishikawa.jp *.iwate.jp *.kagawa.jp *.kagoshima.jp *.kanagawa.jp *.kawasaki.jp *.kitakyushu.jp *.kobe.jp *.kochi.jp *.kumamoto.jp *.kyoto.jp *.mie.jp *.miyagi.jp *.miyazaki.jp *.nagano.jp *.nagasaki.jp *.nagoya.jp *.nara.jp *.niigata.jp *.oita.jp *.okayama.jp *.okinawa.jp *.osaka.jp *.saga.jp *.saitama.jp *.sapporo.jp *.sendai.jp *.shiga.jp *.shimane.jp *.shizuoka.jp *.tochigi.jp *.tokushima.jp *.tokyo.jp *.tottori.jp *.toyama.jp *.wakayama.jp *.yamagata.jp *.yamaguchi.jp *.yamanashi.jp *.yokohama.jp !metro.tokyo.jp !pref.aichi.jp !pref.akita.jp !pref.aomori.jp !pref.chiba.jp !pref.ehime.jp !pref.fukui.jp !pref.fukuoka.jp !pref.fukushima.jp !pref.gifu.jp !pref.gunma.jp !pref.hiroshima.jp !pref.hokkaido.jp !pref.hyogo.jp !pref.ibaraki.jp !pref.ishikawa.jp !pref.iwate.jp !pref.kagawa.jp !pref.kagoshima.jp !pref.kanagawa.jp !pref.kochi.jp !pref.kumamoto.jp !pref.kyoto.jp !pref.mie.jp !pref.miyagi.jp !pref.miyazaki.jp !pref.nagano.jp !pref.nagasaki.jp !pref.nara.jp !pref.niigata.jp !pref.oita.jp !pref.okayama.jp !pref.okinawa.jp !pref.osaka.jp !pref.saga.jp !pref.saitama.jp !pref.shiga.jp !pref.shimane.jp !pref.shizuoka.jp !pref.tochigi.jp !pref.tokushima.jp !pref.tottori.jp !pref.toyama.jp !pref.wakayama.jp !pref.yamagata.jp !pref.yamaguchi.jp !pref.yamanashi.jp !city.chiba.jp !city.fukuoka.jp !city.hiroshima.jp !city.kawasaki.jp !city.kitakyushu.jp !city.kobe.jp !city.kyoto.jp !city.nagoya.jp !city.niigata.jp !city.okayama.jp !city.osaka.jp !city.saitama.jp !city.sapporo.jp !city.sendai.jp !city.shizuoka.jp !city.yokohama.jp // ke : http://www.kenic.or.ke/index.php?option=com_content&task=view&id=117&Itemid=145 *.ke // kg : http://www.domain.kg/dmn_n.html kg org.kg net.kg com.kg edu.kg gov.kg mil.kg // kh : http://www.mptc.gov.kh/dns_registration.htm *.kh // ki : http://www.ki/dns/index.html ki edu.ki biz.ki net.ki org.ki gov.ki info.ki com.ki // km : http://en.wikipedia.org/wiki/.km // http://www.domaine.km/documents/charte.doc km org.km nom.km gov.km prd.km tm.km edu.km mil.km ass.km com.km // These are only mentioned as proposed suggestions at domaine.km, but // http://en.wikipedia.org/wiki/.km says they're available for registration: coop.km asso.km presse.km medecin.km notaires.km pharmaciens.km veterinaire.km gouv.km // kn : http://en.wikipedia.org/wiki/.kn // http://www.dot.kn/domainRules.html kn net.kn org.kn edu.kn gov.kn // kp : http://www.kcce.kp/en_index.php com.kp edu.kp gov.kp org.kp rep.kp tra.kp // kr : http://en.wikipedia.org/wiki/.kr // 
see also: http://domain.nida.or.kr/eng/registration.jsp kr ac.kr co.kr es.kr go.kr hs.kr kg.kr mil.kr ms.kr ne.kr or.kr pe.kr re.kr sc.kr // kr geographical names busan.kr chungbuk.kr chungnam.kr daegu.kr daejeon.kr gangwon.kr gwangju.kr gyeongbuk.kr gyeonggi.kr gyeongnam.kr incheon.kr jeju.kr jeonbuk.kr jeonnam.kr seoul.kr ulsan.kr // kw : http://en.wikipedia.org/wiki/.kw *.kw // ky : http://www.icta.ky/da_ky_reg_dom.php // Confirmed by registry <kysupport@perimeterusa.com> 2008-06-17 ky edu.ky gov.ky com.ky org.ky net.ky // kz : http://en.wikipedia.org/wiki/.kz // see also: http://www.nic.kz/rules/index.jsp kz org.kz edu.kz net.kz gov.kz mil.kz com.kz // la : http://en.wikipedia.org/wiki/.la // Submitted by registry <gavin.brown@nic.la> 2008-06-10 la int.la net.la info.la edu.la gov.la per.la com.la org.la // lb : http://en.wikipedia.org/wiki/.lb // Submitted by registry <randy@psg.com> 2008-06-17 com.lb edu.lb gov.lb net.lb org.lb // lc : http://en.wikipedia.org/wiki/.lc // see also: http://www.nic.lc/rules.htm lc com.lc net.lc co.lc org.lc edu.lc gov.lc // li : http://en.wikipedia.org/wiki/.li li // lk : http://www.nic.lk/seclevpr.html lk gov.lk sch.lk net.lk int.lk com.lk org.lk edu.lk ngo.lk soc.lk web.lk ltd.lk assn.lk grp.lk hotel.lk // lr : http://psg.com/dns/lr/lr.txt // Submitted by registry <randy@psg.com> 2008-06-17 com.lr edu.lr gov.lr org.lr net.lr // ls : http://en.wikipedia.org/wiki/.ls ls co.ls org.ls // lt : http://en.wikipedia.org/wiki/.lt lt // gov.lt : http://www.gov.lt/index_en.php gov.lt // lu : http://www.dns.lu/en/ lu // lv : http://www.nic.lv/DNS/En/generic.php lv com.lv edu.lv gov.lv org.lv mil.lv id.lv net.lv asn.lv conf.lv // ly : http://www.nic.ly/regulations.php ly com.ly net.ly gov.ly plc.ly edu.ly sch.ly med.ly org.ly id.ly // ma : http://en.wikipedia.org/wiki/.ma // http://www.anrt.ma/fr/admin/download/upload/file_fr782.pdf ma co.ma net.ma gov.ma org.ma ac.ma press.ma // mc : http://www.nic.mc/ mc tm.mc asso.mc // md : http://en.wikipedia.org/wiki/.md md // me : http://en.wikipedia.org/wiki/.me me co.me net.me org.me edu.me ac.me gov.me its.me priv.me // mg : http://www.nic.mg/tarif.htm mg org.mg nom.mg gov.mg prd.mg tm.mg edu.mg mil.mg com.mg // mh : http://en.wikipedia.org/wiki/.mh mh // mil : http://en.wikipedia.org/wiki/.mil mil // mk : http://en.wikipedia.org/wiki/.mk // see also: http://dns.marnet.net.mk/postapka.php mk com.mk org.mk net.mk edu.mk gov.mk inf.mk name.mk // ml : http://www.gobin.info/domainname/ml-template.doc // see also: http://en.wikipedia.org/wiki/.ml ml com.ml edu.ml gouv.ml gov.ml net.ml org.ml presse.ml // mm : http://en.wikipedia.org/wiki/.mm *.mm // mn : http://en.wikipedia.org/wiki/.mn mn gov.mn edu.mn org.mn // mo : http://www.monic.net.mo/ mo com.mo net.mo org.mo edu.mo gov.mo // mobi : http://en.wikipedia.org/wiki/.mobi mobi // mp : http://www.dot.mp/ // Confirmed by registry <dcamacho@saipan.com> 2008-06-17 mp // mq : http://en.wikipedia.org/wiki/.mq mq // mr : http://en.wikipedia.org/wiki/.mr mr gov.mr // ms : http://en.wikipedia.org/wiki/.ms ms // mt : https://www.nic.org.mt/dotmt/ *.mt // mu : http://en.wikipedia.org/wiki/.mu mu com.mu net.mu org.mu gov.mu ac.mu co.mu or.mu // museum : http://about.museum/naming/ // http://index.museum/ museum academy.museum agriculture.museum air.museum airguard.museum alabama.museum alaska.museum amber.museum ambulance.museum american.museum americana.museum americanantiques.museum americanart.museum amsterdam.museum and.museum annefrank.museum anthro.museum anthropology.museum 
antiques.museum aquarium.museum arboretum.museum archaeological.museum archaeology.museum architecture.museum art.museum artanddesign.museum artcenter.museum artdeco.museum arteducation.museum artgallery.museum arts.museum artsandcrafts.museum asmatart.museum assassination.museum assisi.museum association.museum astronomy.museum atlanta.museum austin.museum australia.museum automotive.museum aviation.museum axis.museum badajoz.museum baghdad.museum bahn.museum bale.museum baltimore.museum barcelona.museum baseball.museum basel.museum baths.museum bauern.museum beauxarts.museum beeldengeluid.museum bellevue.museum bergbau.museum berkeley.museum berlin.museum bern.museum bible.museum bilbao.museum bill.museum birdart.museum birthplace.museum bonn.museum boston.museum botanical.museum botanicalgarden.museum botanicgarden.museum botany.museum brandywinevalley.museum brasil.museum bristol.museum british.museum britishcolumbia.museum broadcast.museum brunel.museum brussel.museum brussels.museum bruxelles.museum building.museum burghof.museum bus.museum bushey.museum cadaques.museum california.museum cambridge.museum can.museum canada.museum capebreton.museum carrier.museum cartoonart.museum casadelamoneda.museum castle.museum castres.museum celtic.museum center.museum chattanooga.museum cheltenham.museum chesapeakebay.museum chicago.museum children.museum childrens.museum childrensgarden.museum chiropractic.museum chocolate.museum christiansburg.museum cincinnati.museum cinema.museum circus.museum civilisation.museum civilization.museum civilwar.museum clinton.museum clock.museum coal.museum coastaldefence.museum cody.museum coldwar.museum collection.museum colonialwilliamsburg.museum coloradoplateau.museum columbia.museum columbus.museum communication.museum communications.museum community.museum computer.museum computerhistory.museum comunicações.museum contemporary.museum contemporaryart.museum convent.museum copenhagen.museum corporation.museum correios-e-telecomunicações.museum corvette.museum costume.museum countryestate.museum county.museum crafts.museum cranbrook.museum creation.museum cultural.museum culturalcenter.museum culture.museum cyber.museum cymru.museum dali.museum dallas.museum database.museum ddr.museum decorativearts.museum delaware.museum delmenhorst.museum denmark.museum depot.museum design.museum detroit.museum dinosaur.museum discovery.museum dolls.museum donostia.museum durham.museum eastafrica.museum eastcoast.museum education.museum educational.museum egyptian.museum eisenbahn.museum elburg.museum elvendrell.museum embroidery.museum encyclopedic.museum england.museum entomology.museum environment.museum environmentalconservation.museum epilepsy.museum essex.museum estate.museum ethnology.museum exeter.museum exhibition.museum family.museum farm.museum farmequipment.museum farmers.museum farmstead.museum field.museum figueres.museum filatelia.museum film.museum fineart.museum finearts.museum finland.museum flanders.museum florida.museum force.museum fortmissoula.museum fortworth.museum foundation.museum francaise.museum frankfurt.museum franziskaner.museum freemasonry.museum freiburg.museum fribourg.museum frog.museum fundacio.museum furniture.museum gallery.museum garden.museum gateway.museum geelvinck.museum gemological.museum geology.museum georgia.museum giessen.museum glas.museum glass.museum gorge.museum grandrapids.museum graz.museum guernsey.museum halloffame.museum hamburg.museum handson.museum harvestcelebration.museum hawaii.museum health.museum 
heimatunduhren.museum hellas.museum helsinki.museum hembygdsforbund.museum heritage.museum histoire.museum historical.museum historicalsociety.museum historichouses.museum historisch.museum historisches.museum history.museum historyofscience.museum horology.museum house.museum humanities.museum illustration.museum imageandsound.museum indian.museum indiana.museum indianapolis.museum indianmarket.museum intelligence.museum interactive.museum iraq.museum iron.museum isleofman.museum jamison.museum jefferson.museum jerusalem.museum jewelry.museum jewish.museum jewishart.museum jfk.museum journalism.museum judaica.museum judygarland.museum juedisches.museum juif.museum karate.museum karikatur.museum kids.museum koebenhavn.museum koeln.museum kunst.museum kunstsammlung.museum kunstunddesign.museum labor.museum labour.museum lajolla.museum lancashire.museum landes.museum lans.museum läns.museum larsson.museum lewismiller.museum lincoln.museum linz.museum living.museum livinghistory.museum localhistory.museum london.museum losangeles.museum louvre.museum loyalist.museum lucerne.museum luxembourg.museum luzern.museum mad.museum madrid.museum mallorca.museum manchester.museum mansion.museum mansions.museum manx.museum marburg.museum maritime.museum maritimo.museum maryland.museum marylhurst.museum media.museum medical.museum medizinhistorisches.museum meeres.museum memorial.museum mesaverde.museum michigan.museum midatlantic.museum military.museum mill.museum miners.museum mining.museum minnesota.museum missile.museum missoula.museum modern.museum moma.museum money.museum monmouth.museum monticello.museum montreal.museum moscow.museum motorcycle.museum muenchen.museum muenster.museum mulhouse.museum muncie.museum museet.museum museumcenter.museum museumvereniging.museum music.museum national.museum nationalfirearms.museum nationalheritage.museum nativeamerican.museum naturalhistory.museum naturalhistorymuseum.museum naturalsciences.museum nature.museum naturhistorisches.museum natuurwetenschappen.museum naumburg.museum naval.museum nebraska.museum neues.museum newhampshire.museum newjersey.museum newmexico.museum newport.museum newspaper.museum newyork.museum niepce.museum norfolk.museum north.museum nrw.museum nuernberg.museum nuremberg.museum nyc.museum nyny.museum oceanographic.museum oceanographique.museum omaha.museum online.museum ontario.museum openair.museum oregon.museum oregontrail.museum otago.museum oxford.museum pacific.museum paderborn.museum palace.museum paleo.museum palmsprings.museum panama.museum paris.museum pasadena.museum pharmacy.museum philadelphia.museum philadelphiaarea.museum philately.museum phoenix.museum photography.museum pilots.museum pittsburgh.museum planetarium.museum plantation.museum plants.museum plaza.museum portal.museum portland.museum portlligat.museum posts-and-telecommunications.museum preservation.museum presidio.museum press.museum project.museum public.museum pubol.museum quebec.museum railroad.museum railway.museum research.museum resistance.museum riodejaneiro.museum rochester.museum rockart.museum roma.museum russia.museum saintlouis.museum salem.museum salvadordali.museum salzburg.museum sandiego.museum sanfrancisco.museum santabarbara.museum santacruz.museum santafe.museum saskatchewan.museum satx.museum savannahga.museum schlesisches.museum schoenbrunn.museum schokoladen.museum school.museum schweiz.museum science.museum scienceandhistory.museum scienceandindustry.museum sciencecenter.museum sciencecenters.museum science-fiction.museum 
sciencehistory.museum sciences.museum sciencesnaturelles.museum scotland.museum seaport.museum settlement.museum settlers.museum shell.museum sherbrooke.museum sibenik.museum silk.museum ski.museum skole.museum society.museum sologne.museum soundandvision.museum southcarolina.museum southwest.museum space.museum spy.museum square.museum stadt.museum stalbans.museum starnberg.museum state.museum stateofdelaware.museum station.museum steam.museum steiermark.museum stjohn.museum stockholm.museum stpetersburg.museum stuttgart.museum suisse.museum surgeonshall.museum surrey.museum svizzera.museum sweden.museum sydney.museum tank.museum tcm.museum technology.museum telekommunikation.museum television.museum texas.museum textile.museum theater.museum time.museum timekeeping.museum topology.museum torino.museum touch.museum town.museum transport.museum tree.museum trolley.museum trust.museum trustee.museum uhren.museum ulm.museum undersea.museum university.museum usa.museum usantiques.museum usarts.museum uscountryestate.museum usculture.museum usdecorativearts.museum usgarden.museum ushistory.museum ushuaia.museum uslivinghistory.museum utah.museum uvic.museum valley.museum vantaa.museum versailles.museum viking.museum village.museum virginia.museum virtual.museum virtuel.museum vlaanderen.museum volkenkunde.museum wales.museum wallonie.museum war.museum washingtondc.museum watchandclock.museum watch-and-clock.museum western.museum westfalen.museum whaling.museum wildlife.museum williamsburg.museum windmill.museum workshop.museum york.museum yorkshire.museum yosemite.museum youth.museum zoological.museum zoology.museum ירושלים.museum иком.museum // mv : http://en.wikipedia.org/wiki/.mv // "mv" included because, contra Wikipedia, google.mv exists. mv aero.mv biz.mv com.mv coop.mv edu.mv gov.mv info.mv int.mv mil.mv museum.mv name.mv net.mv org.mv pro.mv // mw : http://www.registrar.mw/ mw ac.mw biz.mw co.mw com.mw coop.mw edu.mw gov.mw int.mw museum.mw net.mw org.mw // mx : http://www.nic.mx/ // Submitted by registry <farias@nic.mx> 2008-06-19 mx com.mx org.mx gob.mx edu.mx net.mx // my : http://www.mynic.net.my/ my com.my net.my org.my gov.my edu.my mil.my name.my // mz : http://www.gobin.info/domainname/mz-template.doc *.mz // na : http://www.na-nic.com.na/ // http://www.info.na/domain/ na info.na pro.na name.na school.na or.na dr.na us.na mx.na ca.na in.na cc.na tv.na ws.na mobi.na co.na com.na org.na // name : has 2nd-level tlds, but there's no list of them name // nc : http://www.cctld.nc/ nc asso.nc // ne : http://en.wikipedia.org/wiki/.ne ne // net : http://en.wikipedia.org/wiki/.net net // nf : http://en.wikipedia.org/wiki/.nf nf com.nf net.nf per.nf rec.nf web.nf arts.nf firm.nf info.nf other.nf store.nf // ng : http://psg.com/dns/ng/ // Submitted by registry <randy@psg.com> 2008-06-17 ac.ng com.ng edu.ng gov.ng net.ng org.ng // ni : http://www.nic.ni/dominios.htm *.ni // nl : http://www.domain-registry.nl/ace.php/c,728,122,,,,Home.html // Confirmed by registry <Antoin.Verschuren@sidn.nl> (with technical // reservations) 2008-06-08 nl // BV.nl will be a registry for dutch BV's (besloten vennootschap) bv.nl // no : http://www.norid.no/regelverk/index.en.html // The Norwegian registry has declined to notify us of updates. The web pages // referenced below are the official source of the data. 
There is also an // announce mailing list: // https://postlister.uninett.no/sympa/info/norid-diskusjon no // Norid generic domains : http://www.norid.no/regelverk/vedlegg-c.en.html fhs.no vgs.no fylkesbibl.no folkebibl.no museum.no idrett.no priv.no // Non-Norid generic domains : http://www.norid.no/regelverk/vedlegg-d.en.html mil.no stat.no dep.no kommune.no herad.no // no geographical names : http://www.norid.no/regelverk/vedlegg-b.en.html // counties aa.no ah.no bu.no fm.no hl.no hm.no jan-mayen.no mr.no nl.no nt.no of.no ol.no oslo.no rl.no sf.no st.no svalbard.no tm.no tr.no va.no vf.no // primary and lower secondary schools per county gs.aa.no gs.ah.no gs.bu.no gs.fm.no gs.hl.no gs.hm.no gs.jan-mayen.no gs.mr.no gs.nl.no gs.nt.no gs.of.no gs.ol.no gs.oslo.no gs.rl.no gs.sf.no gs.st.no gs.svalbard.no gs.tm.no gs.tr.no gs.va.no gs.vf.no // cities akrehamn.no åkrehamn.no algard.no ålgård.no arna.no brumunddal.no bryne.no bronnoysund.no brønnøysund.no drobak.no drøbak.no egersund.no fetsund.no floro.no florø.no fredrikstad.no hokksund.no honefoss.no hønefoss.no jessheim.no jorpeland.no jørpeland.no kirkenes.no kopervik.no krokstadelva.no langevag.no langevåg.no leirvik.no mjondalen.no mjøndalen.no mo-i-rana.no mosjoen.no mosjøen.no nesoddtangen.no orkanger.no osoyro.no osøyro.no raholt.no råholt.no sandnessjoen.no sandnessjøen.no skedsmokorset.no slattum.no spjelkavik.no stathelle.no stavern.no stjordalshalsen.no stjørdalshalsen.no tananger.no tranby.no vossevangen.no // communities afjord.no åfjord.no agdenes.no al.no ål.no alesund.no ålesund.no alstahaug.no alta.no áltá.no alaheadju.no álaheadju.no alvdal.no amli.no åmli.no amot.no åmot.no andebu.no andoy.no andøy.no andasuolo.no ardal.no årdal.no aremark.no arendal.no ås.no aseral.no åseral.no asker.no askim.no askvoll.no askoy.no askøy.no asnes.no åsnes.no audnedaln.no aukra.no aure.no aurland.no aurskog-holand.no aurskog-høland.no austevoll.no austrheim.no averoy.no averøy.no balestrand.no ballangen.no balat.no bálát.no balsfjord.no bahccavuotna.no báhccavuotna.no bamble.no bardu.no beardu.no beiarn.no bajddar.no bájddar.no baidar.no báidár.no berg.no bergen.no berlevag.no berlevåg.no bearalvahki.no bearalváhki.no bindal.no birkenes.no bjarkoy.no bjarkøy.no bjerkreim.no bjugn.no bodo.no bodø.no badaddja.no bådåddjå.no budejju.no bokn.no bremanger.no bronnoy.no brønnøy.no bygland.no bykle.no barum.no bærum.no bo.telemark.no bø.telemark.no bo.nordland.no bø.nordland.no bievat.no bievát.no bomlo.no bømlo.no batsfjord.no båtsfjord.no bahcavuotna.no báhcavuotna.no dovre.no drammen.no drangedal.no dyroy.no dyrøy.no donna.no dønna.no eid.no eidfjord.no eidsberg.no eidskog.no eidsvoll.no eigersund.no elverum.no enebakk.no engerdal.no etne.no etnedal.no evenes.no evenassi.no evenášši.no evje-og-hornnes.no farsund.no fauske.no fuossko.no fuoisku.no fedje.no fet.no finnoy.no finnøy.no fitjar.no fjaler.no fjell.no flakstad.no flatanger.no flekkefjord.no flesberg.no flora.no fla.no flå.no folldal.no forsand.no fosnes.no frei.no frogn.no froland.no frosta.no frana.no fræna.no froya.no frøya.no fusa.no fyresdal.no forde.no førde.no gamvik.no gangaviika.no gáŋgaviika.no gaular.no gausdal.no gildeskal.no gildeskål.no giske.no gjemnes.no gjerdrum.no gjerstad.no gjesdal.no gjovik.no gjøvik.no gloppen.no gol.no gran.no grane.no granvin.no gratangen.no grimstad.no grong.no kraanghke.no kråanghke.no grue.no gulen.no hadsel.no halden.no halsa.no hamar.no hamaroy.no habmer.no hábmer.no hapmir.no hápmir.no hammerfest.no hammarfeasta.no hámmárfeasta.no 
haram.no hareid.no harstad.no hasvik.no aknoluokta.no ákŋoluokta.no hattfjelldal.no aarborte.no haugesund.no hemne.no hemnes.no hemsedal.no heroy.more-og-romsdal.no herøy.møre-og-romsdal.no heroy.nordland.no herøy.nordland.no hitra.no hjartdal.no hjelmeland.no hobol.no hobøl.no hof.no hol.no hole.no holmestrand.no holtalen.no holtålen.no hornindal.no horten.no hurdal.no hurum.no hvaler.no hyllestad.no hagebostad.no hægebostad.no hoyanger.no høyanger.no hoylandet.no høylandet.no ha.no hå.no ibestad.no inderoy.no inderøy.no iveland.no jevnaker.no jondal.no jolster.no jølster.no karasjok.no karasjohka.no kárášjohka.no karlsoy.no galsa.no gálsá.no karmoy.no karmøy.no kautokeino.no guovdageaidnu.no klepp.no klabu.no klæbu.no kongsberg.no kongsvinger.no kragero.no kragerø.no kristiansand.no kristiansund.no krodsherad.no krødsherad.no kvalsund.no rahkkeravju.no ráhkkerávju.no kvam.no kvinesdal.no kvinnherad.no kviteseid.no kvitsoy.no kvitsøy.no kvafjord.no kvæfjord.no giehtavuoatna.no kvanangen.no kvænangen.no navuotna.no návuotna.no kafjord.no kåfjord.no gaivuotna.no gáivuotna.no larvik.no lavangen.no lavagis.no loabat.no loabát.no lebesby.no davvesiida.no leikanger.no leirfjord.no leka.no leksvik.no lenvik.no leangaviika.no leaŋgaviika.no lesja.no levanger.no lier.no lierne.no lillehammer.no lillesand.no lindesnes.no lindas.no lindås.no lom.no loppa.no lahppi.no láhppi.no lund.no lunner.no luroy.no lurøy.no luster.no lyngdal.no lyngen.no ivgu.no lardal.no lerdal.no lærdal.no lodingen.no lødingen.no lorenskog.no lørenskog.no loten.no løten.no malvik.no masoy.no måsøy.no muosat.no muosát.no mandal.no marker.no marnardal.no masfjorden.no meland.no meldal.no melhus.no meloy.no meløy.no meraker.no meråker.no moareke.no moåreke.no midsund.no midtre-gauldal.no modalen.no modum.no molde.no moskenes.no moss.no mosvik.no malselv.no målselv.no malatvuopmi.no málatvuopmi.no namdalseid.no aejrie.no namsos.no namsskogan.no naamesjevuemie.no nååmesjevuemie.no laakesvuemie.no nannestad.no narvik.no narviika.no naustdal.no nedre-eiker.no nes.akershus.no nes.buskerud.no nesna.no nesodden.no nesseby.no unjarga.no unjárga.no nesset.no nissedal.no nittedal.no nord-aurdal.no nord-fron.no nord-odal.no norddal.no nordkapp.no davvenjarga.no davvenjárga.no nordre-land.no nordreisa.no raisa.no ráisa.no nore-og-uvdal.no notodden.no naroy.no nærøy.no notteroy.no nøtterøy.no odda.no oksnes.no øksnes.no oppdal.no oppegard.no oppegård.no orkdal.no orland.no ørland.no orskog.no ørskog.no orsta.no ørsta.no os.hedmark.no os.hordaland.no osen.no osteroy.no osterøy.no ostre-toten.no østre-toten.no overhalla.no ovre-eiker.no øvre-eiker.no oyer.no øyer.no oygarden.no øygarden.no oystre-slidre.no øystre-slidre.no porsanger.no porsangu.no porsáŋgu.no porsgrunn.no radoy.no radøy.no rakkestad.no rana.no ruovat.no randaberg.no rauma.no rendalen.no rennebu.no rennesoy.no rennesøy.no rindal.no ringebu.no ringerike.no ringsaker.no rissa.no risor.no risør.no roan.no rollag.no rygge.no ralingen.no rælingen.no rodoy.no rødøy.no romskog.no rømskog.no roros.no røros.no rost.no røst.no royken.no røyken.no royrvik.no røyrvik.no rade.no råde.no salangen.no siellak.no saltdal.no salat.no sálát.no sálat.no samnanger.no sande.more-og-romsdal.no sande.møre-og-romsdal.no sande.vestfold.no sandefjord.no sandnes.no sandoy.no sandøy.no sarpsborg.no sauda.no sauherad.no sel.no selbu.no selje.no seljord.no sigdal.no siljan.no sirdal.no skaun.no skedsmo.no ski.no skien.no skiptvet.no skjervoy.no skjervøy.no skierva.no skiervá.no skjak.no skjåk.no skodje.no 
skanland.no skånland.no skanit.no skánit.no smola.no smøla.no snillfjord.no snasa.no snåsa.no snoasa.no snaase.no snåase.no sogndal.no sokndal.no sola.no solund.no songdalen.no sortland.no spydeberg.no stange.no stavanger.no steigen.no steinkjer.no stjordal.no stjørdal.no stokke.no stor-elvdal.no stord.no stordal.no storfjord.no omasvuotna.no strand.no stranda.no stryn.no sula.no suldal.no sund.no sunndal.no surnadal.no sveio.no svelvik.no sykkylven.no sogne.no søgne.no somna.no sømna.no sondre-land.no søndre-land.no sor-aurdal.no sør-aurdal.no sor-fron.no sør-fron.no sor-odal.no sør-odal.no sor-varanger.no sør-varanger.no matta-varjjat.no mátta-várjjat.no sorfold.no sørfold.no sorreisa.no sørreisa.no sorum.no sørum.no tana.no deatnu.no time.no tingvoll.no tinn.no tjeldsund.no dielddanuorri.no tjome.no tjøme.no tokke.no tolga.no torsken.no tranoy.no tranøy.no tromso.no tromsø.no tromsa.no romsa.no trondheim.no troandin.no trysil.no trana.no træna.no trogstad.no trøgstad.no tvedestrand.no tydal.no tynset.no tysfjord.no divtasvuodna.no divttasvuotna.no tysnes.no tysvar.no tysvær.no tonsberg.no tønsberg.no ullensaker.no ullensvang.no ulvik.no utsira.no vadso.no vadsø.no cahcesuolo.no čáhcesuolo.no vaksdal.no valle.no vang.no vanylven.no vardo.no vardø.no varggat.no várggát.no vefsn.no vaapste.no vega.no vegarshei.no vegårshei.no vennesla.no verdal.no verran.no vestby.no vestnes.no vestre-slidre.no vestre-toten.no vestvagoy.no vestvågøy.no vevelstad.no vik.no vikna.no vindafjord.no volda.no voss.no varoy.no værøy.no vagan.no vågan.no voagat.no vagsoy.no vågsøy.no vaga.no vågå.no valer.ostfold.no våler.østfold.no valer.hedmark.no våler.hedmark.no // np : http://www.mos.com.np/register.html *.np // nr : http://cenpac.net.nr/dns/index.html // Confirmed by registry <technician@cenpac.net.nr> 2008-06-17 nr biz.nr info.nr gov.nr edu.nr org.nr net.nr com.nr // nu : http://en.wikipedia.org/wiki/.nu nu // nz : http://en.wikipedia.org/wiki/.nz *.nz // om : http://en.wikipedia.org/wiki/.om *.om !mediaphone.om !nawrastelecom.om !nawras.om !omanmobile.om !omanpost.om !omantel.om !rakpetroleum.om !siemens.om !songfest.om !statecouncil.om // org : http://en.wikipedia.org/wiki/.org org // pa : http://www.nic.pa/ // Some additional second level "domains" resolve directly as hostnames, such as // pannet.pa, so we add a rule for "pa". 
pa ac.pa gob.pa com.pa org.pa sld.pa edu.pa net.pa ing.pa abo.pa med.pa nom.pa // pe : https://www.nic.pe/InformeFinalComision.pdf pe edu.pe gob.pe nom.pe mil.pe org.pe com.pe net.pe // pf : http://www.gobin.info/domainname/formulaire-pf.pdf pf com.pf org.pf edu.pf // pg : http://en.wikipedia.org/wiki/.pg *.pg // ph : http://www.domains.ph/FAQ2.asp // Submitted by registry <jed@email.com.ph> 2008-06-13 ph com.ph net.ph org.ph gov.ph edu.ph ngo.ph mil.ph i.ph // pk : http://pk5.pknic.net.pk/pk5/msgNamepk.PK pk com.pk net.pk edu.pk org.pk fam.pk biz.pk web.pk gov.pk gob.pk gok.pk gon.pk gop.pk gos.pk info.pk // pl : http://www.dns.pl/english/ pl // NASK functional domains (nask.pl / dns.pl) : http://www.dns.pl/english/dns-funk.html aid.pl agro.pl atm.pl auto.pl biz.pl com.pl edu.pl gmina.pl gsm.pl info.pl mail.pl miasta.pl media.pl mil.pl net.pl nieruchomosci.pl nom.pl org.pl pc.pl powiat.pl priv.pl realestate.pl rel.pl sex.pl shop.pl sklep.pl sos.pl szkola.pl targi.pl tm.pl tourism.pl travel.pl turystyka.pl // ICM functional domains (icm.edu.pl) 6bone.pl art.pl mbone.pl // Government domains (administred by ippt.gov.pl) gov.pl uw.gov.pl um.gov.pl ug.gov.pl upow.gov.pl starostwo.gov.pl so.gov.pl sr.gov.pl po.gov.pl pa.gov.pl // other functional domains ngo.pl irc.pl usenet.pl // NASK geographical domains : http://www.dns.pl/english/dns-regiony.html augustow.pl babia-gora.pl bedzin.pl beskidy.pl bialowieza.pl bialystok.pl bielawa.pl bieszczady.pl boleslawiec.pl bydgoszcz.pl bytom.pl cieszyn.pl czeladz.pl czest.pl dlugoleka.pl elblag.pl elk.pl glogow.pl gniezno.pl gorlice.pl grajewo.pl ilawa.pl jaworzno.pl jelenia-gora.pl jgora.pl kalisz.pl kazimierz-dolny.pl karpacz.pl kartuzy.pl kaszuby.pl katowice.pl kepno.pl ketrzyn.pl klodzko.pl kobierzyce.pl kolobrzeg.pl konin.pl konskowola.pl kutno.pl lapy.pl lebork.pl legnica.pl lezajsk.pl limanowa.pl lomza.pl lowicz.pl lubin.pl lukow.pl malbork.pl malopolska.pl mazowsze.pl mazury.pl mielec.pl mielno.pl mragowo.pl naklo.pl nowaruda.pl nysa.pl olawa.pl olecko.pl olkusz.pl olsztyn.pl opoczno.pl opole.pl ostroda.pl ostroleka.pl ostrowiec.pl ostrowwlkp.pl pila.pl pisz.pl podhale.pl podlasie.pl polkowice.pl pomorze.pl pomorskie.pl prochowice.pl pruszkow.pl przeworsk.pl pulawy.pl radom.pl rawa-maz.pl rybnik.pl rzeszow.pl sanok.pl sejny.pl siedlce.pl slask.pl slupsk.pl sosnowiec.pl stalowa-wola.pl skoczow.pl starachowice.pl stargard.pl suwalki.pl swidnica.pl swiebodzin.pl swinoujscie.pl szczecin.pl szczytno.pl tarnobrzeg.pl tgory.pl turek.pl tychy.pl ustka.pl walbrzych.pl warmia.pl warszawa.pl waw.pl wegrow.pl wielun.pl wlocl.pl wloclawek.pl wodzislaw.pl wolomin.pl wroclaw.pl zachpomor.pl zagan.pl zarow.pl zgora.pl zgorzelec.pl // TASK geographical domains (www.task.gda.pl/uslugi/dns) gda.pl gdansk.pl gdynia.pl med.pl sopot.pl // other geographical domains gliwice.pl krakow.pl poznan.pl wroc.pl zakopane.pl // pm : http://www.afnic.fr/medias/documents/AFNIC-naming-policy2012.pdf pm // pn : http://www.government.pn/PnRegistry/policies.htm pn gov.pn co.pn org.pn edu.pn net.pn // pr : http://www.nic.pr/index.asp?f=1 pr com.pr net.pr org.pr gov.pr edu.pr isla.pr pro.pr biz.pr info.pr name.pr // these aren't mentioned on nic.pr, but on http://en.wikipedia.org/wiki/.pr est.pr prof.pr ac.pr // pro : http://www.nic.pro/support_faq.htm pro aca.pro bar.pro cpa.pro jur.pro law.pro med.pro eng.pro // ps : http://en.wikipedia.org/wiki/.ps // http://www.nic.ps/registration/policy.html#reg ps edu.ps gov.ps sec.ps plo.ps com.ps org.ps net.ps // pt : 
http://online.dns.pt/dns/start_dns pt net.pt gov.pt org.pt edu.pt int.pt publ.pt com.pt nome.pt // pw : http://en.wikipedia.org/wiki/.pw pw co.pw ne.pw or.pw ed.pw go.pw belau.pw // py : http://www.nic.py/faq_a.html#faq_b *.py // qa : http://domains.qa/en/ qa com.qa edu.qa gov.qa mil.qa name.qa net.qa org.qa sch.qa // re : http://www.afnic.re/obtenir/chartes/nommage-re/annexe-descriptifs re com.re asso.re nom.re // ro : http://www.rotld.ro/ ro com.ro org.ro tm.ro nt.ro nom.ro info.ro rec.ro arts.ro firm.ro store.ro www.ro // rs : http://en.wikipedia.org/wiki/.rs rs co.rs org.rs edu.rs ac.rs gov.rs in.rs // ru : http://www.cctld.ru/ru/docs/aktiv_8.php // Industry domains ru ac.ru com.ru edu.ru int.ru net.ru org.ru pp.ru // Geographical domains adygeya.ru altai.ru amur.ru arkhangelsk.ru astrakhan.ru bashkiria.ru belgorod.ru bir.ru bryansk.ru buryatia.ru cbg.ru chel.ru chelyabinsk.ru chita.ru chukotka.ru chuvashia.ru dagestan.ru dudinka.ru e-burg.ru grozny.ru irkutsk.ru ivanovo.ru izhevsk.ru jar.ru joshkar-ola.ru kalmykia.ru kaluga.ru kamchatka.ru karelia.ru kazan.ru kchr.ru kemerovo.ru khabarovsk.ru khakassia.ru khv.ru kirov.ru koenig.ru komi.ru kostroma.ru krasnoyarsk.ru kuban.ru kurgan.ru kursk.ru lipetsk.ru magadan.ru mari.ru mari-el.ru marine.ru mordovia.ru mosreg.ru msk.ru murmansk.ru nalchik.ru nnov.ru nov.ru novosibirsk.ru nsk.ru omsk.ru orenburg.ru oryol.ru palana.ru penza.ru perm.ru pskov.ru ptz.ru rnd.ru ryazan.ru sakhalin.ru samara.ru saratov.ru simbirsk.ru smolensk.ru spb.ru stavropol.ru stv.ru surgut.ru tambov.ru tatarstan.ru tom.ru tomsk.ru tsaritsyn.ru tsk.ru tula.ru tuva.ru tver.ru tyumen.ru udm.ru udmurtia.ru ulan-ude.ru vladikavkaz.ru vladimir.ru vladivostok.ru volgograd.ru vologda.ru voronezh.ru vrn.ru vyatka.ru yakutia.ru yamal.ru yaroslavl.ru yekaterinburg.ru yuzhno-sakhalinsk.ru // More geographical domains amursk.ru baikal.ru cmw.ru fareast.ru jamal.ru kms.ru k-uralsk.ru kustanai.ru kuzbass.ru magnitka.ru mytis.ru nakhodka.ru nkz.ru norilsk.ru oskol.ru pyatigorsk.ru rubtsovsk.ru snz.ru syzran.ru vdonsk.ru zgrad.ru // State domains gov.ru mil.ru // Technical domains test.ru // rw : http://www.nic.rw/cgi-bin/policy.pl rw gov.rw net.rw edu.rw ac.rw com.rw co.rw int.rw mil.rw gouv.rw // sa : http://www.nic.net.sa/ sa com.sa net.sa org.sa gov.sa med.sa pub.sa edu.sa sch.sa // sb : http://www.sbnic.net.sb/ // Submitted by registry <lee.humphries@telekom.com.sb> 2008-06-08 sb com.sb edu.sb gov.sb net.sb org.sb // sc : http://www.nic.sc/ sc com.sc gov.sc net.sc org.sc edu.sc // sd : http://www.isoc.sd/sudanic.isoc.sd/billing_pricing.htm // Submitted by registry <admin@isoc.sd> 2008-06-17 sd com.sd net.sd org.sd edu.sd med.sd gov.sd info.sd // se : http://en.wikipedia.org/wiki/.se // Submitted by registry <Patrik.Wallstrom@iis.se> 2008-06-24 se a.se ac.se b.se bd.se brand.se c.se d.se e.se f.se fh.se fhsk.se fhv.se g.se h.se i.se k.se komforb.se kommunalforbund.se komvux.se l.se lanbib.se m.se n.se naturbruksgymn.se o.se org.se p.se parti.se pp.se press.se r.se s.se sshn.se t.se tm.se u.se w.se x.se y.se z.se // sg : http://www.nic.net.sg/sub_policies_agreement/2ld.html sg com.sg net.sg org.sg gov.sg edu.sg per.sg // sh : http://www.nic.sh/rules.html // list of 2nd level domains ? sh // si : http://en.wikipedia.org/wiki/.si si // sj : No registrations at this time. // Submitted by registry <jarle@uninett.no> 2008-06-16 // sk : http://en.wikipedia.org/wiki/.sk // list of 2nd level domains ? 
sk // sl : http://www.nic.sl // Submitted by registry <adam@neoip.com> 2008-06-12 sl com.sl net.sl edu.sl gov.sl org.sl // sm : http://en.wikipedia.org/wiki/.sm sm // sn : http://en.wikipedia.org/wiki/.sn sn art.sn com.sn edu.sn gouv.sn org.sn perso.sn univ.sn // so : http://www.soregistry.com/ so com.so net.so org.so // sr : http://en.wikipedia.org/wiki/.sr sr // st : http://www.nic.st/html/policyrules/ st co.st com.st consulado.st edu.st embaixada.st gov.st mil.st net.st org.st principe.st saotome.st store.st // su : http://en.wikipedia.org/wiki/.su su // sv : http://www.svnet.org.sv/svpolicy.html *.sv // sy : http://en.wikipedia.org/wiki/.sy // see also: http://www.gobin.info/domainname/sy.doc sy edu.sy gov.sy net.sy mil.sy com.sy org.sy // sz : http://en.wikipedia.org/wiki/.sz // http://www.sispa.org.sz/ sz co.sz ac.sz org.sz // tc : http://en.wikipedia.org/wiki/.tc tc // td : http://en.wikipedia.org/wiki/.td td // tel: http://en.wikipedia.org/wiki/.tel // http://www.telnic.org/ tel // tf : http://en.wikipedia.org/wiki/.tf tf // tg : http://en.wikipedia.org/wiki/.tg // http://www.nic.tg/nictg/index.php implies no reserved 2nd-level domains, // although this contradicts wikipedia. tg // th : http://en.wikipedia.org/wiki/.th // Submitted by registry <krit@thains.co.th> 2008-06-17 th ac.th co.th go.th in.th mi.th net.th or.th // tj : http://www.nic.tj/policy.htm tj ac.tj biz.tj co.tj com.tj edu.tj go.tj gov.tj int.tj mil.tj name.tj net.tj nic.tj org.tj test.tj web.tj // tk : http://en.wikipedia.org/wiki/.tk tk // tl : http://en.wikipedia.org/wiki/.tl tl gov.tl // tm : http://www.nic.tm/rules.html // list of 2nd level tlds ? tm // tn : http://en.wikipedia.org/wiki/.tn // http://whois.ati.tn/ tn com.tn ens.tn fin.tn gov.tn ind.tn intl.tn nat.tn net.tn org.tn info.tn perso.tn tourism.tn edunet.tn rnrt.tn rns.tn rnu.tn mincom.tn agrinet.tn defense.tn turen.tn // to : http://en.wikipedia.org/wiki/.to // Submitted by registry <egullich@colo.to> 2008-06-17 to com.to gov.to net.to org.to edu.to mil.to // tr : http://en.wikipedia.org/wiki/.tr *.tr !nic.tr // Used by government in the TRNC // http://en.wikipedia.org/wiki/.nc.tr gov.nc.tr // travel : http://en.wikipedia.org/wiki/.travel travel // tt : http://www.nic.tt/ tt co.tt com.tt org.tt net.tt biz.tt info.tt pro.tt int.tt coop.tt jobs.tt mobi.tt travel.tt museum.tt aero.tt name.tt gov.tt edu.tt // tv : http://en.wikipedia.org/wiki/.tv // Not listing any 2LDs as reserved since none seem to exist in practice, // Wikipedia notwithstanding. 
tv // tw : http://en.wikipedia.org/wiki/.tw tw edu.tw gov.tw mil.tw com.tw net.tw org.tw idv.tw game.tw ebiz.tw club.tw 網路.tw 組織.tw 商業.tw // tz : http://en.wikipedia.org/wiki/.tz // Submitted by registry <randy@psg.com> 2008-06-17 // Updated from http://www.tznic.or.tz/index.php/domains.html 2010-10-25 ac.tz co.tz go.tz mil.tz ne.tz or.tz sc.tz // ua : http://www.nic.net.ua/ ua com.ua edu.ua gov.ua in.ua net.ua org.ua // ua geo-names cherkassy.ua chernigov.ua chernovtsy.ua ck.ua cn.ua crimea.ua cv.ua dn.ua dnepropetrovsk.ua donetsk.ua dp.ua if.ua ivano-frankivsk.ua kh.ua kharkov.ua kherson.ua khmelnitskiy.ua kiev.ua kirovograd.ua km.ua kr.ua ks.ua kv.ua lg.ua lugansk.ua lutsk.ua lviv.ua mk.ua nikolaev.ua od.ua odessa.ua pl.ua poltava.ua rovno.ua rv.ua sebastopol.ua sumy.ua te.ua ternopil.ua uzhgorod.ua vinnica.ua vn.ua zaporizhzhe.ua zp.ua zhitomir.ua zt.ua // Private registries in .ua co.ua pp.ua // ug : http://www.registry.co.ug/ ug co.ug ac.ug sc.ug go.ug ne.ug or.ug // uk : http://en.wikipedia.org/wiki/.uk *.uk *.sch.uk !bl.uk !british-library.uk !icnet.uk !jet.uk !mod.uk !nel.uk !nhs.uk !nic.uk !nls.uk !national-library-scotland.uk !parliament.uk !police.uk // us : http://en.wikipedia.org/wiki/.us us dni.us fed.us isa.us kids.us nsn.us // us geographic names ak.us al.us ar.us as.us az.us ca.us co.us ct.us dc.us de.us fl.us ga.us gu.us hi.us ia.us id.us il.us in.us ks.us ky.us la.us ma.us md.us me.us mi.us mn.us mo.us ms.us mt.us nc.us nd.us ne.us nh.us nj.us nm.us nv.us ny.us oh.us ok.us or.us pa.us pr.us ri.us sc.us sd.us tn.us tx.us ut.us vi.us vt.us va.us wa.us wi.us wv.us wy.us // The registrar notes several more specific domains available in each state, // such as state.*.us, dst.*.us, etc., but resolution of these is somewhat // haphazard; in some states these domains resolve as addresses, while in others // only subdomains are available, or even nothing at all. We include the // most common ones where it's clear that different sites are different // entities. 
k12.ak.us k12.al.us k12.ar.us k12.as.us k12.az.us k12.ca.us k12.co.us k12.ct.us k12.dc.us k12.de.us k12.fl.us k12.ga.us k12.gu.us // k12.hi.us Hawaii has a state-wide DOE login: bug 614565 k12.ia.us k12.id.us k12.il.us k12.in.us k12.ks.us k12.ky.us k12.la.us k12.ma.us k12.md.us k12.me.us k12.mi.us k12.mn.us k12.mo.us k12.ms.us k12.mt.us k12.nc.us k12.nd.us k12.ne.us k12.nh.us k12.nj.us k12.nm.us k12.nv.us k12.ny.us k12.oh.us k12.ok.us k12.or.us k12.pa.us k12.pr.us k12.ri.us k12.sc.us k12.sd.us k12.tn.us k12.tx.us k12.ut.us k12.vi.us k12.vt.us k12.va.us k12.wa.us k12.wi.us k12.wv.us k12.wy.us cc.ak.us cc.al.us cc.ar.us cc.as.us cc.az.us cc.ca.us cc.co.us cc.ct.us cc.dc.us cc.de.us cc.fl.us cc.ga.us cc.gu.us cc.hi.us cc.ia.us cc.id.us cc.il.us cc.in.us cc.ks.us cc.ky.us cc.la.us cc.ma.us cc.md.us cc.me.us cc.mi.us cc.mn.us cc.mo.us cc.ms.us cc.mt.us cc.nc.us cc.nd.us cc.ne.us cc.nh.us cc.nj.us cc.nm.us cc.nv.us cc.ny.us cc.oh.us cc.ok.us cc.or.us cc.pa.us cc.pr.us cc.ri.us cc.sc.us cc.sd.us cc.tn.us cc.tx.us cc.ut.us cc.vi.us cc.vt.us cc.va.us cc.wa.us cc.wi.us cc.wv.us cc.wy.us lib.ak.us lib.al.us lib.ar.us lib.as.us lib.az.us lib.ca.us lib.co.us lib.ct.us lib.dc.us lib.de.us lib.fl.us lib.ga.us lib.gu.us lib.hi.us lib.ia.us lib.id.us lib.il.us lib.in.us lib.ks.us lib.ky.us lib.la.us lib.ma.us lib.md.us lib.me.us lib.mi.us lib.mn.us lib.mo.us lib.ms.us lib.mt.us lib.nc.us lib.nd.us lib.ne.us lib.nh.us lib.nj.us lib.nm.us lib.nv.us lib.ny.us lib.oh.us lib.ok.us lib.or.us lib.pa.us lib.pr.us lib.ri.us lib.sc.us lib.sd.us lib.tn.us lib.tx.us lib.ut.us lib.vi.us lib.vt.us lib.va.us lib.wa.us lib.wi.us lib.wv.us lib.wy.us // k12.ma.us contains school districts in Massachusetts. The 4LDs are // managed indepedently except for private (PVT), charter (CHTR) and // parochial (PAROCH) schools. Those are delegated dorectly to the // 5LD operators. <k12-ma-hostmaster _ at _ rsuc.gweep.net> pvt.k12.ma.us chtr.k12.ma.us paroch.k12.ma.us // uy : http://www.antel.com.uy/ *.uy // uz : http://www.reg.uz/registerr.html // are there other 2nd level tlds ? uz com.uz co.uz // va : http://en.wikipedia.org/wiki/.va va // vc : http://en.wikipedia.org/wiki/.vc // Submitted by registry <kshah@ca.afilias.info> 2008-06-13 vc com.vc net.vc org.vc gov.vc mil.vc edu.vc // ve : http://registro.nic.ve/nicve/registro/index.html *.ve // vg : http://en.wikipedia.org/wiki/.vg vg // vi : http://www.nic.vi/newdomainform.htm // http://www.nic.vi/Domain_Rules/body_domain_rules.html indicates some other // TLDs are "reserved", such as edu.vi and gov.vi, but doesn't actually say they // are available for registration (which they do not seem to be). vi co.vi com.vi k12.vi net.vi org.vi // vn : https://www.dot.vn/vnnic/vnnic/domainregistration.jsp vn com.vn net.vn org.vn edu.vn gov.vn int.vn ac.vn biz.vn info.vn name.vn pro.vn health.vn // vu : http://en.wikipedia.org/wiki/.vu // list of 2nd level tlds ? 
vu // wf : http://www.afnic.fr/medias/documents/AFNIC-naming-policy2012.pdf wf // ws : http://en.wikipedia.org/wiki/.ws // http://samoanic.ws/index.dhtml ws com.ws net.ws org.ws gov.ws edu.ws // yt : http://www.afnic.fr/medias/documents/AFNIC-naming-policy2012.pdf yt // IDN ccTLDs // Please sort by ISO 3166 ccTLD, then punicode string // when submitting patches and follow this format: // <Punicode> ("<english word>" <language>) : <ISO 3166 ccTLD> // [optional sponsoring org] // <URL> // xn--mgbaam7a8h ("Emerat" Arabic) : AE //http://nic.ae/english/arabicdomain/rules.jsp امارات // xn--54b7fta0cc ("Bangla" Bangla) : BD বাংলা // xn--fiqs8s ("China" Chinese-Han-Simplified <.Zhonggou>) : CN // CNNIC // http://cnnic.cn/html/Dir/2005/10/11/3218.htm 中国 // xn--fiqz9s ("China" Chinese-Han-Traditional <.Zhonggou>) : CN // CNNIC // http://cnnic.cn/html/Dir/2005/10/11/3218.htm 中國 // xn--lgbbat1ad8j ("Algeria / Al Jazair" Arabic) : DZ الجزائر // xn--wgbh1c ("Egypt" Arabic .masr) : EG // http://www.dotmasr.eg/ مصر // xn--node ("ge" Georgian (Mkhedruli)) : GE გე // xn--j6w193g ("Hong Kong" Chinese-Han) : HK // https://www2.hkirc.hk/register/rules.jsp 香港 // xn--h2brj9c ("Bharat" Devanagari) : IN // India भारत // xn--mgbbh1a71e ("Bharat" Arabic) : IN // India بھارت // xn--fpcrj9c3d ("Bharat" Telugu) : IN // India భారత్ // xn--gecrj9c ("Bharat" Gujarati) : IN // India ભારત // xn--s9brj9c ("Bharat" Gurmukhi) : IN // India ਭਾਰਤ // xn--45brj9c ("Bharat" Bengali) : IN // India ভারত // xn--xkc2dl3a5ee0h ("India" Tamil) : IN // India இந்தியா // xn--mgba3a4f16a ("Iran" Persian) : IR ایران // xn--mgba3a4fra ("Iran" Arabic) : IR ايران //xn--mgbayh7gpa ("al-Ordon" Arabic) JO //National Information Technology Center (NITC) //Royal Scientific Society, Al-Jubeiha الاردن // xn--3e0b707e ("Republic of Korea" Hangul) : KR 한국 // xn--fzc2c9e2c ("Lanka" Sinhalese-Sinhala) : LK // http://nic.lk ලංකා // xn--xkc2al3hye2a ("Ilangai" Tamil) : LK // http://nic.lk இலங்கை // xn--mgbc0a9azcg ("Morocco / al-Maghrib" Arabic) : MA المغرب // xn--mgb9awbf ("Oman" Arabic) : OM عمان // xn--ygbi2ammx ("Falasteen" Arabic) : PS // The Palestinian National Internet Naming Authority (PNINA) // http://www.pnina.ps فلسطين // xn--90a3ac ("srb" Cyrillic) : RS срб // xn--p1ai ("rf" Russian-Cyrillic) : RU // http://www.cctld.ru/en/docs/rulesrf.php рф // xn--wgbl6a ("Qatar" Arabic) : QA // http://www.ict.gov.qa/ قطر // xn--mgberp4a5d4ar ("AlSaudiah" Arabic) : SA // http://www.nic.net.sa/ السعودية // xn--mgberp4a5d4a87g ("AlSaudiah" Arabic) variant : SA السعودیة // xn--mgbqly7c0a67fbc ("AlSaudiah" Arabic) variant : SA السعودیۃ // xn--mgbqly7cvafr ("AlSaudiah" Arabic) variant : SA السعوديه // xn--ogbpf8fl ("Syria" Arabic) : SY سورية // xn--mgbtf8fl ("Syria" Arabic) variant : SY سوريا // xn--yfro4i67o Singapore ("Singapore" Chinese-Han) : SG 新加坡 // xn--clchc0ea0b2g2a9gcd ("Singapore" Tamil) : SG சிங்கப்பூர் // xn--o3cw4h ("Thai" Thai) : TH // http://www.thnic.co.th ไทย // xn--pgbs0dh ("Tunis") : TN // http://nic.tn تونس // xn--kpry57d ("Taiwan" Chinese-Han-Traditional) : TW // http://www.twnic.net/english/dn/dn_07a.htm 台灣 // xn--kprw13d ("Taiwan" Chinese-Han-Simplified) : TW // http://www.twnic.net/english/dn/dn_07a.htm 台湾 // xn--nnx388a ("Taiwan") variant : TW 臺灣 // xn--j1amh ("ukr" Cyrillic) : UA укр // xn--mgb2ddes ("AlYemen" Arabic) : YE اليمن // xxx : http://icmregistry.com xxx // ye : http://www.y.net.ye/services/domain_name.htm *.ye // za : http://www.zadna.org.za/slds.html *.za // zm : http://en.wikipedia.org/wiki/.zm *.zm // zw : 
http://en.wikipedia.org/wiki/.zw *.zw // ===END ICANN DOMAINS=== // ===BEGIN PRIVATE DOMAINS=== // info.at : http://www.info.at/ biz.at info.at // priv.at : http://www.nic.priv.at/ // Submitted by registry <lendl@nic.at> 2008-06-09 priv.at // co.ca : http://registry.co.ca co.ca // CentralNic : http://www.centralnic.com/names/domains // Confirmed by registry <gavin.brown@centralnic.com> 2008-06-09 ar.com br.com cn.com de.com eu.com gb.com gr.com hu.com jpn.com kr.com no.com qc.com ru.com sa.com se.com uk.com us.com uy.com za.com gb.net jp.net se.net uk.net ae.org us.org com.de // Opera Software, A.S.A. // Requested by Yngve Pettersen <yngve@opera.com> 2009-11-26 operaunite.com // Google, Inc. // Requested by Eduardo Vela <evn@google.com> 2010-09-06 appspot.com // iki.fi : Submitted by Hannu Aronsson <haa@iki.fi> 2009-11-05 iki.fi // c.la : http://www.c.la/ c.la // ZaNiC : http://www.za.net/ // Confirmed by registry <hostmaster@nic.za.net> 2009-10-03 za.net za.org // CoDNS B.V. // Added 2010-05-23. co.nl co.no // Mainseek Sp. z o.o. : http://www.co.pl/ co.pl // DynDNS.com : http://www.dyndns.com/services/dns/dyndns/ dyndns-at-home.com dyndns-at-work.com dyndns-blog.com dyndns-free.com dyndns-home.com dyndns-ip.com dyndns-mail.com dyndns-office.com dyndns-pics.com dyndns-remote.com dyndns-server.com dyndns-web.com dyndns-wiki.com dyndns-work.com dyndns.biz dyndns.info dyndns.org dyndns.tv at-band-camp.net ath.cx barrel-of-knowledge.info barrell-of-knowledge.info better-than.tv blogdns.com blogdns.net blogdns.org blogsite.org boldlygoingnowhere.org broke-it.net buyshouses.net cechire.com dnsalias.com dnsalias.net dnsalias.org dnsdojo.com dnsdojo.net dnsdojo.org does-it.net doesntexist.com doesntexist.org dontexist.com dontexist.net dontexist.org doomdns.com doomdns.org dvrdns.org dyn-o-saur.com dynalias.com dynalias.net dynalias.org dynathome.net dyndns.ws endofinternet.net endofinternet.org endoftheinternet.org est-a-la-maison.com est-a-la-masion.com est-le-patron.com est-mon-blogueur.com for-better.biz for-more.biz for-our.info for-some.biz for-the.biz forgot.her.name forgot.his.name from-ak.com from-al.com from-ar.com from-az.net from-ca.com from-co.net from-ct.com from-dc.com from-de.com from-fl.com from-ga.com from-hi.com from-ia.com from-id.com from-il.com from-in.com from-ks.com from-ky.com from-la.net from-ma.com from-md.com from-me.org from-mi.com from-mn.com from-mo.com from-ms.com from-mt.com from-nc.com from-nd.com from-ne.com from-nh.com from-nj.com from-nm.com from-nv.com from-ny.net from-oh.com from-ok.com from-or.com from-pa.com from-pr.com from-ri.com from-sc.com from-sd.com from-tn.com from-tx.com from-ut.com from-va.com from-vt.com from-wa.com from-wi.com from-wv.com from-wy.com ftpaccess.cc fuettertdasnetz.de game-host.org game-server.cc getmyip.com gets-it.net go.dyndns.org gotdns.com gotdns.org groks-the.info groks-this.info ham-radio-op.net here-for-more.info hobby-site.com hobby-site.org home.dyndns.org homedns.org homeftp.net homeftp.org homeip.net homelinux.com homelinux.net homelinux.org homeunix.com homeunix.net homeunix.org iamallama.com in-the-band.net is-a-anarchist.com is-a-blogger.com is-a-bookkeeper.com is-a-bruinsfan.org is-a-bulls-fan.com is-a-candidate.org is-a-caterer.com is-a-celticsfan.org is-a-chef.com is-a-chef.net is-a-chef.org is-a-conservative.com is-a-cpa.com is-a-cubicle-slave.com is-a-democrat.com is-a-designer.com is-a-doctor.com is-a-financialadvisor.com is-a-geek.com is-a-geek.net is-a-geek.org is-a-green.com is-a-guru.com is-a-hard-worker.com 
is-a-hunter.com is-a-knight.org is-a-landscaper.com is-a-lawyer.com is-a-liberal.com is-a-libertarian.com is-a-linux-user.org is-a-llama.com is-a-musician.com is-a-nascarfan.com is-a-nurse.com is-a-painter.com is-a-patsfan.org is-a-personaltrainer.com is-a-photographer.com is-a-player.com is-a-republican.com is-a-rockstar.com is-a-socialist.com is-a-soxfan.org is-a-student.com is-a-teacher.com is-a-techie.com is-a-therapist.com is-an-accountant.com is-an-actor.com is-an-actress.com is-an-anarchist.com is-an-artist.com is-an-engineer.com is-an-entertainer.com is-by.us is-certified.com is-found.org is-gone.com is-into-anime.com is-into-cars.com is-into-cartoons.com is-into-games.com is-leet.com is-lost.org is-not-certified.com is-saved.org is-slick.com is-uberleet.com is-very-bad.org is-very-evil.org is-very-good.org is-very-nice.org is-very-sweet.org is-with-theband.com isa-geek.com isa-geek.net isa-geek.org isa-hockeynut.com issmarterthanyou.com isteingeek.de istmein.de kicks-ass.net kicks-ass.org knowsitall.info land-4-sale.us lebtimnetz.de leitungsen.de likes-pie.com likescandy.com merseine.nu mine.nu misconfused.org mypets.ws myphotos.cc neat-url.com office-on-the.net on-the-web.tv podzone.net podzone.org readmyblog.org saves-the-whales.com scrapper-site.net scrapping.cc selfip.biz selfip.com selfip.info selfip.net selfip.org sells-for-less.com sells-for-u.com sells-it.net sellsyourhome.org servebbs.com servebbs.net servebbs.org serveftp.net serveftp.org servegame.org shacknet.nu simple-url.com space-to-rent.com stuff-4-sale.org stuff-4-sale.us teaches-yoga.com thruhere.net traeumtgerade.de webhop.biz webhop.info webhop.net webhop.org worse-than.tv writesthisblog.com // ===END PRIVATE DOMAINS=== �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/README.md000644 �000766 �000024 �00000047367 12455173731 033421� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������[RFC6265](http://tools.ietf.org/html/rfc6265) Cookies and CookieJar for Node.js ![Tough Cookie](http://www.goinstant.com.s3.amazonaws.com/tough-cookie.jpg) [![Build Status](https://travis-ci.org/goinstant/node-cookie.png?branch=master)](https://travis-ci.org/goinstant/node-cookie) [![NPM Stats](https://nodei.co/npm/tough-cookie.png?downloads=true&stars=true)](https://npmjs.org/package/tough-cookie) ![NPM Downloads](https://nodei.co/npm-dl/tough-cookie.png?months=9) # Synopsis ``` javascript var tough = require('tough-cookie'); // note: not 'cookie', 'cookies' or 'node-cookie' var Cookie = tough.Cookie; var cookie = Cookie.parse(header); cookie.value = 'somethingdifferent'; header = cookie.toString(); var cookiejar = new tough.CookieJar(); cookiejar.setCookie(cookie, 'http://currentdomain.example.com/path', cb); // ... cookiejar.getCookies('http://example.com/otherpath',function(err,cookies) { res.headers['cookie'] = cookies.join('; '); }); ``` # Installation It's _so_ easy! 
`npm install tough-cookie`

Requires `punycode`, which should get installed automatically for you. Note that node.js v0.6.2+ bundles punycode by default.

Why the name? NPM modules `cookie`, `cookies` and `cookiejar` were already taken.

# API

tough
=====

Functions on the module you get from `require('tough-cookie')`. All can be used as pure functions and don't need to be "bound".

parseDate(string[,strict])
--------------------------

Parse a cookie date string into a `Date`. Parses according to RFC6265 Section 5.1.1, not `Date.parse()`. If strict is set to true then leading/trailing non-separator characters around the time part will cause the parsing to fail (e.g. "Thu, 01 Jan 1970 00:00:010 GMT" has an extra trailing zero, but Chrome, an assumedly RFC-compliant browser, treats this as valid).

formatDate(date)
----------------

Format a Date into an RFC1123 string (the RFC6265-recommended format).

canonicalDomain(str)
--------------------

Transforms a domain-name into a canonical domain-name. The canonical domain-name is a trimmed, lowercased, stripped-of-leading-dot and optionally punycode-encoded domain-name (Section 5.1.2 of RFC6265). For the most part, this function is idempotent (it can be run again on its output without ill effects).

domainMatch(str,domStr[,canonicalize=true])
-------------------------------------------

Answers "does this real domain match the domain in a cookie?". The `str` is the "current" domain-name and the `domStr` is the "cookie" domain-name. Matches according to RFC6265 Section 5.1.3, but it helps to think of it as a "suffix match". The `canonicalize` parameter controls whether the other two parameters are first run through `canonicalDomain`.

defaultPath(path)
-----------------

Given a current request/response path, gives the Path appropriate for storing in a cookie. This is basically the "directory" of a "file" in the path, but is specified by Section 5.1.4 of the RFC. The `path` parameter MUST be _only_ the pathname part of a URI (i.e. it excludes the hostname, query, fragment, etc.). This is the `.pathname` property of node's `uri.parse()` output.

pathMatch(reqPath,cookiePath)
-----------------------------

Answers "does the request-path path-match a given cookie-path?" as per RFC6265 Section 5.1.4. Returns a boolean. This is essentially a prefix-match where `cookiePath` is a prefix of `reqPath`.

parse(header[,strict=false])
----------------------------

alias for `Cookie.parse(header[,strict])`

fromJSON(string)
----------------

alias for `Cookie.fromJSON(string)`

getPublicSuffix(hostname)
-------------------------

Returns the public suffix of this hostname. The public suffix is the shortest domain-name upon which a cookie can be set. Returns `null` if the hostname cannot have cookies set for it. For example: `www.example.com` and `www.subdomain.example.com` both have public suffix `example.com`. For further information, see http://publicsuffix.org/. This module derives its list from that site.

cookieCompare(a,b)
------------------

For use with `.sort()`, sorts a list of cookies into the recommended order given in the RFC (Section 5.4 step 2). Longest `.path`s go first, then sorted oldest to youngest.

``` javascript
var cookies = [ /* unsorted array of Cookie objects */ ];
cookies = cookies.sort(cookieCompare);
```

permuteDomain(domain)
---------------------

Generates a list of all possible domains that `domainMatch()` the parameter. May be handy for implementing cookie stores.

permutePath(path)
-----------------

Generates a list of all possible paths that `pathMatch()` the parameter.
May be handy for implementing cookie stores.

Cookie
======

Cookie.parse(header[,strict=false])
-----------------------------------

Parses a single Cookie or Set-Cookie HTTP header into a `Cookie` object. Returns `undefined` if the string can't be parsed. If in strict mode, returns `undefined` if the cookie doesn't follow the guidelines in section 4 of RFC6265. Generally speaking, strict mode can be used to validate your own generated Set-Cookie headers, but acting as a client you want to be lenient and leave strict mode off.

Here's how to process the Set-Cookie header(s) on a node HTTP/HTTPS response:

``` javascript
if (res.headers['set-cookie'] instanceof Array)
  cookies = res.headers['set-cookie'].map(function (c) { return (Cookie.parse(c)); });
else
  cookies = [Cookie.parse(res.headers['set-cookie'])];
```

Cookie.fromJSON(string)
-----------------------

Convert a JSON string to a `Cookie` object. Does a `JSON.parse()` and converts the `.created`, `.lastAccessed` and `.expires` properties into `Date` objects.

Properties
==========

* _key_ - string - the name or key of the cookie (default "")
* _value_ - string - the value of the cookie (default "")
* _expires_ - `Date` - if set, the `Expires=` attribute of the cookie (defaults to the string `"Infinity"`). See `setExpires()`
* _maxAge_ - seconds - if set, the `Max-Age=` attribute _in seconds_ of the cookie. May also be set to strings `"Infinity"` and `"-Infinity"` for non-expiry and immediate-expiry, respectively. See `setMaxAge()`
* _domain_ - string - the `Domain=` attribute of the cookie
* _path_ - string - the `Path=` of the cookie
* _secure_ - boolean - the `Secure` cookie flag
* _httpOnly_ - boolean - the `HttpOnly` cookie flag
* _extensions_ - `Array` - any unrecognized cookie attributes as strings (even if they contain equal signs)

After a cookie has been passed through `CookieJar.setCookie()` it will have the following additional attributes:

* _hostOnly_ - boolean - is this a host-only cookie (i.e. no Domain field was set, but was instead implied)
* _pathIsDefault_ - boolean - if true, there was no Path field on the cookie and `defaultPath()` was used to derive one.
* _created_ - `Date` - when this cookie was added to the jar
* _lastAccessed_ - `Date` - last time the cookie got accessed. Will affect cookie cleaning once implemented. Using `cookiejar.getCookies(...)` will update this attribute.

Construction([{options}])
------------

Receives an options object that can contain any Cookie properties; defaults are used for unspecified properties.

.toString()
-----------

encode to a Set-Cookie header value. The Expires cookie field is set using `formatDate()`, but is omitted entirely if `.expires` is `Infinity`.

.cookieString()
---------------

encode to a Cookie header value (i.e. the `.key` and `.value` properties joined with '=').

.setExpires(String)
-------------------

sets the expiry based on a date-string passed through `parseDate()`. If parseDate returns `null` (i.e. it can't parse this date string), `.expires` is set to the string `"Infinity"`.

.setMaxAge(number)
-------------------

sets the maxAge in seconds. Coerces `-Infinity` to `"-Infinity"` and `Infinity` to `"Infinity"` so it JSON serializes correctly.

.expiryTime([now=Date.now()])
-----------------------------

.expiryDate([now=Date.now()])
-----------------------------

expiryTime() computes the absolute unix-epoch milliseconds at which this cookie expires. expiryDate() works similarly, except it returns a `Date` object.
Note that in both cases the `now` parameter should be milliseconds.

Max-Age takes precedence over Expires (as per the RFC). The `.created` attribute -- or, by default, the `now` parameter -- is used to offset the `.maxAge` attribute.

If Expires (`.expires`) is set, that's returned. Otherwise, `expiryTime()` returns `Infinity` and `expiryDate()` returns a `Date` object for "Tue, 19 Jan 2038 03:14:07 GMT" (the latest date that can be expressed by a 32-bit `time_t`; the common limit for most user-agents).

.TTL([now=Date.now()])
---------

compute the TTL relative to `now` (milliseconds). The same precedence rules as for `expiryTime`/`expiryDate` apply. The "number" `Infinity` is returned for cookies without an explicit expiry and `0` is returned if the cookie is expired. Otherwise a time-to-live in milliseconds is returned.

.canonicalizedDomain()
----------------------

.cdomain()
----------

return the canonicalized `.domain` field. This is lower-cased and punycode (RFC3490) encoded if the domain has any non-ASCII characters.

.validate()
-----------

Status: *IN PROGRESS*. Works for a few things, but is by no means comprehensive.

validates cookie attributes for semantic correctness. Useful for "lint" checking any Set-Cookie headers you generate. For now, it returns a boolean, but eventually could return a reason string -- you can future-proof with this construct:

``` javascript
if (cookie.validate() === true) {
  // it's tasty
} else {
  // yuck!
}
```

CookieJar
=========

Construction([store = new MemoryCookieStore()][, rejectPublicSuffixes])
------------

Simply use `new CookieJar()`. If you'd like to use a custom store, pass that to the constructor; otherwise a `MemoryCookieStore` will be created and used.

Attributes
----------

* _rejectPublicSuffixes_ - boolean - reject cookies with domains like "com" and "co.uk" (default: `true`)

Since eventually this module would like to support database/remote/etc. CookieJars, continuation passing style is used for CookieJar methods.

.setCookie(cookieOrString, currentUrl, [{options},] cb(err,cookie))
-------------------------------------------------------------------

Attempt to set the cookie in the cookie jar. If the operation fails, an error will be given to the callback `cb`, otherwise the cookie is passed through. The cookie will have updated `.created`, `.lastAccessed` and `.hostOnly` properties.

The `options` object can be omitted and can have the following properties:

* _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies.
* _secure_ - boolean - autodetect from url - indicates if this is a "Secure" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`.
* _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies
* _strict_ - boolean - default `false` - perform extra checks
* _ignoreError_ - boolean - default `false` - silently ignore things like parse errors and invalid domains. CookieStore errors aren't ignored by this option.

As per the RFC, the `.hostOnly` property is set if there was no "Domain=" parameter in the cookie string (or `.domain` was null on the Cookie object). The `.domain` property is set to the fully-qualified hostname of `currentUrl` in this case. Matching this cookie requires an exact hostname match (not a `domainMatch` as per usual).
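As a concrete illustration of `.setCookie()` with an explicit `options` object, here is a minimal sketch (the URL and the cookie line are invented for the example):

``` javascript
var tough = require('tough-cookie');
var jar = new tough.CookieJar();

// Store a cookie as if it had been received five minutes ago over a non-HTTP
// API; an HttpOnly cookie would be rejected here because http is false.
jar.setCookie('pref=blue; Domain=example.org; Path=/; Max-Age=3600',
              'https://www.example.org/account',
              { http: false, now: new Date(Date.now() - 5 * 60 * 1000) },
              function(err, cookie) {
  if (err) { return console.error(err.message); }
  console.log(cookie.hostOnly);   // false, since an explicit Domain= was given
  console.log(cookie.toString()); // serialized back to a Set-Cookie style value
});
```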
.setCookieSync(cookieOrString, currentUrl, [{options}])
-------------------------------------------------------

Synchronous version of `setCookie`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).

.storeCookie(cookie, [{options},] cb(err,cookie))
-------------------------------------------------

__REMOVED__ removed in favor of the CookieStore API below

.getCookies(currentUrl, [{options},] cb(err,cookies))
-----------------------------------------------------

Retrieve the list of cookies that can be sent in a Cookie header for the current url.

If an error is encountered, that's passed as `err` to the callback, otherwise an `Array` of `Cookie` objects is passed. The array is sorted with `cookieCompare()` unless the `{sort:false}` option is given.

The `options` object can be omitted and can have the following properties:

* _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies.
* _secure_ - boolean - autodetect from url - indicates if this is a "Secure" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`.
* _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies
* _expire_ - boolean - default `true` - perform expiry-time checking of cookies and asynchronously remove expired cookies from the store. Using `false` will return expired cookies and **not** remove them from the store (which is useful for replaying Set-Cookie headers, potentially).
* _allPaths_ - boolean - default `false` - if `true`, do not scope cookies by path. The default uses RFC-compliant path scoping. **Note**: may not be supported by the CookieStore `fetchCookies` function (the default MemoryCookieStore supports it).

The `.lastAccessed` property of the returned cookies will have been updated.

.getCookiesSync(currentUrl, [{options}])
----------------------------------------

Synchronous version of `getCookies`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).

.getCookieString(...)
---------------------

Accepts the same options as `.getCookies()` but passes a string suitable for a Cookie header rather than an array to the callback. Simply maps the `Cookie` array via `.cookieString()`.

.getCookieStringSync(...)
-------------------------

Synchronous version of `getCookieString`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).

.getSetCookieStrings(...)
-------------------------

Returns an array of strings suitable for **Set-Cookie** headers. Accepts the same options as `.getCookies()`. Simply maps the cookie array via `.toString()`.

.getSetCookieStringsSync(...)
-----------------------------

Synchronous version of `getSetCookieStrings`; only works with synchronous stores (e.g. the default `MemoryCookieStore`).

Store
=====

Base class for CookieJar stores.

# CookieStore API

The storage model for each `CookieJar` instance can be replaced with a custom implementation. The default is `MemoryCookieStore`, which can be found in the `lib/memstore.js` file.

The API uses continuation-passing-style to allow for asynchronous stores. Stores should inherit from the base `Store` class, which is available as `require('tough-cookie').Store`. Stores are asynchronous by default, but if `store.synchronous` is set, then the `*Sync` methods on the CookieJar can be used.

All `domain` parameters will have been normalized before calling.

The Cookie store must have all of the following methods.
store.findCookie(domain, path, key, cb(err,cookie))
---------------------------------------------------

Retrieve a cookie with the given domain, path and key (a.k.a. name). The RFC maintains that exactly one of these cookies should exist in a store. If the store is using versioning, this means that the latest/newest such cookie should be returned.

Callback takes an error and the resulting `Cookie` object. If no cookie is found then `null` MUST be passed instead (i.e. not an error).

store.findCookies(domain, path, cb(err,cookies))
------------------------------------------------

Locates cookies matching the given domain and path. This is most often called in the context of `cookiejar.getCookies()` above.

If no cookies are found, the callback MUST be passed an empty array.

The resulting list will be checked for applicability to the current request according to the RFC (domain-match, path-match, http-only-flag, secure-flag, expiry, etc.), so it's OK to use an optimistic search algorithm when implementing this method. However, the search algorithm used SHOULD try to find cookies that `domainMatch()` the domain and `pathMatch()` the path in order to limit the amount of checking that needs to be done.

As of version 0.9.12, the `allPaths` option to `cookiejar.getCookies()` above will cause the path here to be `null`. If the path is `null`, path-matching MUST NOT be performed (i.e. domain-matching only).

store.putCookie(cookie, cb(err))
--------------------------------

Adds a new cookie to the store. The implementation SHOULD replace any existing cookie with the same `.domain`, `.path`, and `.key` properties -- depending on the nature of the implementation, it's possible that between the call to `fetchCookie` and `putCookie` a duplicate `putCookie` can occur.

The `cookie` object MUST NOT be modified; the caller will have already updated the `.creation` and `.lastAccessed` properties.

Pass an error if the cookie cannot be stored.

store.updateCookie(oldCookie, newCookie, cb(err))
-------------------------------------------------

Update an existing cookie. The implementation MUST update the `.value` for a cookie with the same `.domain`, `.path` and `.key`. The implementation SHOULD check that the old value in the store is equivalent to `oldCookie` - how the conflict is resolved is up to the store.

The `.lastAccessed` property will always be different between the two objects and `.created` will always be the same. Stores MAY ignore or defer the `.lastAccessed` change at the cost of affecting how cookies are sorted (or selected for deletion).

Stores may wish to optimize changing the `.value` of the cookie in the store versus storing a new cookie. If the implementation doesn't define this method, a stub that calls `putCookie(newCookie,cb)` will be added to the store object.

The `newCookie` and `oldCookie` objects MUST NOT be modified.

Pass an error if the newCookie cannot be stored.

store.removeCookie(domain, path, key, cb(err))
----------------------------------------------

Remove a cookie from the store (see notes on `findCookie` about the uniqueness constraint).

The implementation MUST NOT pass an error if the cookie doesn't exist; only pass an error due to the failure to remove an existing cookie.

store.removeCookies(domain, path, cb(err))
------------------------------------------

Removes matching cookies from the store. The `path` parameter is optional, and if missing means all paths in a domain should be removed.

Pass an error ONLY if removing any existing cookies failed.
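To make the store contract above concrete, here is a minimal synchronous in-memory sketch. This is *not* the bundled `MemoryCookieStore`; the `TinyStore` name and its internal index layout are invented for illustration, and error handling is kept to the bare minimum the API requires.

``` javascript
var util = require('util');
var tough = require('tough-cookie');
var Store = tough.Store;
var permuteDomain = tough.permuteDomain;
var pathMatch = tough.pathMatch;

function TinyStore() {
  Store.call(this);
  this.synchronous = true;   // allows the *Sync methods on CookieJar
  this.idx = {};             // domain -> path -> key -> Cookie
}
util.inherits(TinyStore, Store);

TinyStore.prototype.findCookie = function(domain, path, key, cb) {
  var paths = this.idx[domain] || {};
  var keys = paths[path] || {};
  cb(null, keys[key] || null);            // null, not an error, when missing
};

TinyStore.prototype.findCookies = function(domain, path, cb) {
  var idx = this.idx;
  var results = [];
  // permuteDomain returns null for unknown/invalid domains, hence the fallback
  var domains = permuteDomain(domain) || [domain];
  domains.forEach(function(d) {
    var paths = idx[d];
    if (!paths) { return; }
    Object.keys(paths).forEach(function(p) {
      // a null path means the allPaths option was used: skip path-matching
      if (path === null || pathMatch(path, p)) {
        Object.keys(paths[p]).forEach(function(k) {
          results.push(paths[p][k]);
        });
      }
    });
  });
  cb(null, results);
};

TinyStore.prototype.putCookie = function(cookie, cb) {
  this.idx[cookie.domain] = this.idx[cookie.domain] || {};
  this.idx[cookie.domain][cookie.path] = this.idx[cookie.domain][cookie.path] || {};
  this.idx[cookie.domain][cookie.path][cookie.key] = cookie;  // replaces any duplicate
  cb(null);
};

// updateCookie is omitted: a stub that calls putCookie(newCookie,cb) is added for us.

TinyStore.prototype.removeCookie = function(domain, path, key, cb) {
  if (this.idx[domain] && this.idx[domain][path]) {
    delete this.idx[domain][path][key];   // no error if it didn't exist
  }
  cb(null);
};

TinyStore.prototype.removeCookies = function(domain, path, cb) {
  if (this.idx[domain]) {
    if (path) {
      delete this.idx[domain][path];
    } else {
      delete this.idx[domain];
    }
  }
  cb(null);
};
```

A jar backed by such a store would then be constructed with `new tough.CookieJar(new TinyStore())`; because `synchronous` is set, the `*Sync` CookieJar methods also work with it.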
# TODO

* _full_ RFC5890/RFC5891 canonicalization for domains in `cdomain()`
  * the optional `punycode` requirement implements RFC3492, but RFC6265 requires RFC5891
* better tests for `validate()`?

# Copyright and License

(tl;dr: MIT with some MPL/1.1)

Copyright 2012- GoInstant, Inc. and other contributors. All rights reserved.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Portions may be licensed under different licenses (in particular public-suffix.txt is MPL/1.1); please read the LICENSE file for full details.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/test.js000644 000766 000024 00000153630 12455173731 033446 0ustar00iojsstaff000000 000000

/*
 * Copyright GoInstant, Inc. and other contributors. All rights reserved.
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to
 * deal in the Software without restriction, including without limitation the
 * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
 * sell copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
 * IN THE SOFTWARE.
*/ 'use strict'; var vows = require('vows'); var assert = require('assert'); var async = require('async'); // NOTE use require("tough-cookie") in your own code: var tough = require('./lib/cookie'); var Cookie = tough.Cookie; var CookieJar = tough.CookieJar; function dateVows(table) { var theVows = { }; Object.keys(table).forEach(function(date) { var expect = table[date]; theVows[date] = function() { var got = tough.parseDate(date) ? 'valid' : 'invalid'; assert.equal(got, expect ? 'valid' : 'invalid'); }; }); return { "date parsing": theVows }; } function matchVows(func,table) { var theVows = {}; table.forEach(function(item) { var str = item[0]; var dom = item[1]; var expect = item[2]; var label = str+(expect?" matches ":" doesn't match ")+dom; theVows[label] = function() { assert.equal(func(str,dom),expect); }; }); return theVows; } function defaultPathVows(table) { var theVows = {}; table.forEach(function(item) { var str = item[0]; var expect = item[1]; var label = str+" gives "+expect; theVows[label] = function() { assert.equal(tough.defaultPath(str),expect); }; }); return theVows; } var atNow = Date.now(); function at(offset) { return {now: new Date(atNow+offset)}; } vows.describe('Cookie Jar') .addBatch({ "all defined": function() { assert.ok(Cookie); assert.ok(CookieJar); }, }) .addBatch( dateVows({ "Wed, 09 Jun 2021 10:18:14 GMT": true, "Wed, 09 Jun 2021 22:18:14 GMT": true, "Tue, 18 Oct 2011 07:42:42.123 GMT": true, "18 Oct 2011 07:42:42 GMT": true, "8 Oct 2011 7:42:42 GMT": true, "8 Oct 2011 7:2:42 GMT": false, "Oct 18 2011 07:42:42 GMT": true, "Tue Oct 18 2011 07:05:03 GMT+0000 (GMT)": true, "09 Jun 2021 10:18:14 GMT": true, "99 Jix 3038 48:86:72 ZMT": false, '01 Jan 1970 00:00:00 GMT': true, '01 Jan 1600 00:00:00 GMT': false, // before 1601 '01 Jan 1601 00:00:00 GMT': true, '10 Feb 81 13:00:00 GMT': true, // implicit year 'Thu, 01 Jan 1970 00:00:010 GMT': true, // strange time, non-strict OK 'Thu, 17-Apr-2014 02:12:29 GMT': true, // dashes 'Thu, 17-Apr-2014 02:12:29 UTC': true, // dashes and UTC }) ) .addBatch({ "strict date parse of Thu, 01 Jan 1970 00:00:010 GMT": { topic: function() { return tough.parseDate('Thu, 01 Jan 1970 00:00:010 GMT', true) ? 
true : false; }, "invalid": function(date) { assert.equal(date,false); }, } }) .addBatch({ "formatting": { "a simple cookie": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; return c; }, "validates": function(c) { assert.ok(c.validate()); }, "to string": function(c) { assert.equal(c.toString(), 'a=b'); }, }, "a cookie with spaces in the value": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'beta gamma'; return c; }, "doesn't validate": function(c) { assert.ok(!c.validate()); }, "'garbage in, garbage out'": function(c) { assert.equal(c.toString(), 'a=beta gamma'); }, }, "with an empty value and HttpOnly": { topic: function() { var c = new Cookie(); c.key = 'a'; c.httpOnly = true; return c; }, "to string": function(c) { assert.equal(c.toString(), 'a=; HttpOnly'); } }, "with an expiry": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.setExpires("Oct 18 2011 07:05:03 GMT"); return c; }, "validates": function(c) { assert.ok(c.validate()); }, "to string": function(c) { assert.equal(c.toString(), 'a=b; Expires=Tue, 18 Oct 2011 07:05:03 GMT'); }, "to short string": function(c) { assert.equal(c.cookieString(), 'a=b'); }, }, "with a max-age": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.setExpires("Oct 18 2011 07:05:03 GMT"); c.maxAge = 12345; return c; }, "validates": function(c) { assert.ok(c.validate()); // mabe this one *shouldn't*? }, "to string": function(c) { assert.equal(c.toString(), 'a=b; Expires=Tue, 18 Oct 2011 07:05:03 GMT; Max-Age=12345'); }, }, "with a bunch of things": function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.setExpires("Oct 18 2011 07:05:03 GMT"); c.maxAge = 12345; c.domain = 'example.com'; c.path = '/foo'; c.secure = true; c.httpOnly = true; c.extensions = ['MyExtension']; assert.equal(c.toString(), 'a=b; Expires=Tue, 18 Oct 2011 07:05:03 GMT; Max-Age=12345; Domain=example.com; Path=/foo; Secure; HttpOnly; MyExtension'); }, "a host-only cookie": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.hostOnly = true; c.domain = 'shouldnt-stringify.example.com'; c.path = '/should-stringify'; return c; }, "validates": function(c) { assert.ok(c.validate()); }, "to string": function(c) { assert.equal(c.toString(), 'a=b; Path=/should-stringify'); }, }, "minutes are '10'": { topic: function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.expires = new Date(1284113410000); return c; }, "validates": function(c) { assert.ok(c.validate()); }, "to string": function(c) { var str = c.toString(); assert.notEqual(str, 'a=b; Expires=Fri, 010 Sep 2010 010:010:010 GMT'); assert.equal(str, 'a=b; Expires=Fri, 10 Sep 2010 10:10:10 GMT'); }, } } }) .addBatch({ "TTL with max-age": function() { var c = new Cookie(); c.maxAge = 123; assert.equal(c.TTL(), 123000); assert.equal(c.expiryTime(new Date(9000000)), 9123000); }, "TTL with zero max-age": function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.maxAge = 0; // should be treated as "earliest representable" assert.equal(c.TTL(), 0); assert.equal(c.expiryTime(new Date(9000000)), -Infinity); assert.ok(!c.validate()); // not valid, really: non-zero-digit *DIGIT }, "TTL with negative max-age": function() { var c = new Cookie(); c.key = 'a'; c.value = 'b'; c.maxAge = -1; // should be treated as "earliest representable" assert.equal(c.TTL(), 0); assert.equal(c.expiryTime(new Date(9000000)), -Infinity); assert.ok(!c.validate()); // not valid, really: non-zero-digit *DIGIT }, "TTL with max-age and 
expires": function() { var c = new Cookie(); c.maxAge = 123; c.expires = new Date(Date.now()+9000); assert.equal(c.TTL(), 123000); assert.ok(c.isPersistent()); }, "TTL with expires": function() { var c = new Cookie(); var now = Date.now(); c.expires = new Date(now+9000); assert.equal(c.TTL(now), 9000); assert.equal(c.expiryTime(), c.expires.getTime()); }, "TTL with old expires": function() { var c = new Cookie(); c.setExpires('17 Oct 2010 00:00:00 GMT'); assert.ok(c.TTL() < 0); assert.ok(c.isPersistent()); }, "default TTL": { topic: function() { return new Cookie(); }, "is Infinite-future": function(c) { assert.equal(c.TTL(), Infinity) }, "is a 'session' cookie": function(c) { assert.ok(!c.isPersistent()) }, }, }).addBatch({ "Parsing": { "simple": { topic: function() { return Cookie.parse('a=bcd',true) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'a') }, "value": function(c) { assert.equal(c.value, 'bcd') }, "no path": function(c) { assert.equal(c.path, null) }, "no domain": function(c) { assert.equal(c.domain, null) }, "no extensions": function(c) { assert.ok(!c.extensions) }, }, "with expiry": { topic: function() { return Cookie.parse('a=bcd; Expires=Tue, 18 Oct 2011 07:05:03 GMT',true) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'a') }, "value": function(c) { assert.equal(c.value, 'bcd') }, "has expires": function(c) { assert.ok(c.expires !== Infinity, 'expiry is infinite when it shouldn\'t be'); assert.equal(c.expires.getTime(), 1318921503000); }, }, "with expiry and path": { topic: function() { return Cookie.parse('abc="xyzzy!"; Expires=Tue, 18 Oct 2011 07:05:03 GMT; Path=/aBc',true) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'abc') }, "value": function(c) { assert.equal(c.value, 'xyzzy!') }, "has expires": function(c) { assert.ok(c.expires !== Infinity, 'expiry is infinite when it shouldn\'t be'); assert.equal(c.expires.getTime(), 1318921503000); }, "has path": function(c) { assert.equal(c.path, '/aBc'); }, "no httponly or secure": function(c) { assert.ok(!c.httpOnly); assert.ok(!c.secure); }, }, "with everything": { topic: function() { return Cookie.parse('abc="xyzzy!"; Expires=Tue, 18 Oct 2011 07:05:03 GMT; Path=/aBc; Domain=example.com; Secure; HTTPOnly; Max-Age=1234; Foo=Bar; Baz', true) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'abc') }, "value": function(c) { assert.equal(c.value, 'xyzzy!') }, "has expires": function(c) { assert.ok(c.expires !== Infinity, 'expiry is infinite when it shouldn\'t be'); assert.equal(c.expires.getTime(), 1318921503000); }, "has path": function(c) { assert.equal(c.path, '/aBc'); }, "has domain": function(c) { assert.equal(c.domain, 'example.com'); }, "has httponly": function(c) { assert.equal(c.httpOnly, true); }, "has secure": function(c) { assert.equal(c.secure, true); }, "has max-age": function(c) { assert.equal(c.maxAge, 1234); }, "has extensions": function(c) { assert.ok(c.extensions); assert.equal(c.extensions[0], 'Foo=Bar'); assert.equal(c.extensions[1], 'Baz'); }, }, "invalid expires": { "strict": function() { assert.ok(!Cookie.parse("a=b; Expires=xyzzy", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; Expires=xyzzy"); assert.ok(c); assert.equal(c.expires, Infinity); }, }, "zero max-age": { "strict": function() { assert.ok(!Cookie.parse("a=b; Max-Age=0", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; Max-Age=0"); 
assert.ok(c); assert.equal(c.maxAge, 0); }, }, "negative max-age": { "strict": function() { assert.ok(!Cookie.parse("a=b; Max-Age=-1", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; Max-Age=-1"); assert.ok(c); assert.equal(c.maxAge, -1); }, }, "empty domain": { "strict": function() { assert.ok(!Cookie.parse("a=b; domain=", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; domain="); assert.ok(c); assert.equal(c.domain, null); }, }, "dot domain": { "strict": function() { assert.ok(!Cookie.parse("a=b; domain=.", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; domain=."); assert.ok(c); assert.equal(c.domain, null); }, }, "uppercase domain": { "strict lowercases": function() { var c = Cookie.parse("a=b; domain=EXAMPLE.COM"); assert.ok(c); assert.equal(c.domain, 'example.com'); }, "non-strict lowercases": function() { var c = Cookie.parse("a=b; domain=EXAMPLE.COM"); assert.ok(c); assert.equal(c.domain, 'example.com'); }, }, "trailing dot in domain": { topic: function() { return Cookie.parse("a=b; Domain=example.com.", true) || null; }, "has the domain": function(c) { assert.equal(c.domain,"example.com.") }, "but doesn't validate": function(c) { assert.equal(c.validate(),false) }, }, "empty path": { "strict": function() { assert.ok(!Cookie.parse("a=b; path=", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; path="); assert.ok(c); assert.equal(c.path, null); }, }, "no-slash path": { "strict": function() { assert.ok(!Cookie.parse("a=b; path=xyzzy", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; path=xyzzy"); assert.ok(c); assert.equal(c.path, null); }, }, "trailing semi-colons after path": { topic: function () { return [ "a=b; path=/;", "c=d;;;;" ]; }, "strict": function (t) { assert.ok(!Cookie.parse(t[0], true)); assert.ok(!Cookie.parse(t[1], true)); }, "non-strict": function (t) { var c1 = Cookie.parse(t[0]); var c2 = Cookie.parse(t[1]); assert.ok(c1); assert.ok(c2); assert.equal(c1.path, '/'); } }, "secure-with-value": { "strict": function() { assert.ok(!Cookie.parse("a=b; Secure=xyzzy", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; Secure=xyzzy"); assert.ok(c); assert.equal(c.secure, true); }, }, "httponly-with-value": { "strict": function() { assert.ok(!Cookie.parse("a=b; HttpOnly=xyzzy", true)) }, "non-strict": function() { var c = Cookie.parse("a=b; HttpOnly=xyzzy"); assert.ok(c); assert.equal(c.httpOnly, true); }, }, "garbage": { topic: function() { return Cookie.parse("\x08", true) || null; }, "doesn't parse": function(c) { assert.equal(c,null) }, }, "public suffix domain": { topic: function() { return Cookie.parse("a=b; domain=kyoto.jp", true) || null; }, "parses fine": function(c) { assert.ok(c); assert.equal(c.domain, 'kyoto.jp'); }, "but fails validation": function(c) { assert.ok(c); assert.ok(!c.validate()); }, }, "Ironically, Google 'GAPS' cookie has very little whitespace": { topic: function() { return Cookie.parse("GAPS=1:A1aaaaAaAAa1aaAaAaaAAAaaa1a11a:aaaAaAaAa-aaaA1-;Path=/;Expires=Thu, 17-Apr-2014 02:12:29 GMT;Secure;HttpOnly"); }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'GAPS') }, "value": function(c) { assert.equal(c.value, '1:A1aaaaAaAAa1aaAaAaaAAAaaa1a11a:aaaAaAaAa-aaaA1-') }, "path": function(c) { assert.notEqual(c.path, '/;Expires'); // BUG assert.equal(c.path, '/'); }, "expires": function(c) { assert.notEqual(c.expires, Infinity); assert.equal(c.expires.getTime(), 1397700749000); }, "secure": function(c) { assert.ok(c.secure) }, 
"httponly": function(c) { assert.ok(c.httpOnly) }, }, "lots of equal signs": { topic: function() { return Cookie.parse("queryPref=b=c&d=e; Path=/f=g; Expires=Thu, 17 Apr 2014 02:12:29 GMT; HttpOnly"); }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'queryPref') }, "value": function(c) { assert.equal(c.value, 'b=c&d=e') }, "path": function(c) { assert.equal(c.path, '/f=g'); }, "expires": function(c) { assert.notEqual(c.expires, Infinity); assert.equal(c.expires.getTime(), 1397700749000); }, "httponly": function(c) { assert.ok(c.httpOnly) }, }, "spaces in value": { "strict": { topic: function() { return Cookie.parse('a=one two three',true) || null; }, "did not parse": function(c) { assert.isNull(c) }, }, "non-strict": { topic: function() { return Cookie.parse('a=one two three',false) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'a') }, "value": function(c) { assert.equal(c.value, 'one two three') }, "no path": function(c) { assert.equal(c.path, null) }, "no domain": function(c) { assert.equal(c.domain, null) }, "no extensions": function(c) { assert.ok(!c.extensions) }, }, }, "quoted spaces in value": { "strict": { topic: function() { return Cookie.parse('a="one two three"',true) || null; }, "did not parse": function(c) { assert.isNull(c) }, }, "non-strict": { topic: function() { return Cookie.parse('a="one two three"',false) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'a') }, "value": function(c) { assert.equal(c.value, 'one two three') }, "no path": function(c) { assert.equal(c.path, null) }, "no domain": function(c) { assert.equal(c.domain, null) }, "no extensions": function(c) { assert.ok(!c.extensions) }, } }, "non-ASCII in value": { "strict": { topic: function() { return Cookie.parse('farbe=weiß',true) || null; }, "did not parse": function(c) { assert.isNull(c) }, }, "non-strict": { topic: function() { return Cookie.parse('farbe=weiß',false) || null; }, "parsed": function(c) { assert.ok(c) }, "key": function(c) { assert.equal(c.key, 'farbe') }, "value": function(c) { assert.equal(c.value, 'weiß') }, "no path": function(c) { assert.equal(c.path, null) }, "no domain": function(c) { assert.equal(c.domain, null) }, "no extensions": function(c) { assert.ok(!c.extensions) }, }, }, } }) .addBatch({ "domain normalization": { "simple": function() { var c = new Cookie(); c.domain = "EXAMPLE.com"; assert.equal(c.canonicalizedDomain(), "example.com"); }, "extra dots": function() { var c = new Cookie(); c.domain = ".EXAMPLE.com"; assert.equal(c.cdomain(), "example.com"); }, "weird trailing dot": function() { var c = new Cookie(); c.domain = "EXAMPLE.ca."; assert.equal(c.canonicalizedDomain(), "example.ca."); }, "weird internal dots": function() { var c = new Cookie(); c.domain = "EXAMPLE...ca."; assert.equal(c.canonicalizedDomain(), "example...ca."); }, "IDN": function() { var c = new Cookie(); c.domain = "δοκιμή.δοκιμή"; // "test.test" in greek assert.equal(c.canonicalizedDomain(), "xn--jxalpdlp.xn--jxalpdlp"); } } }) .addBatch({ "Domain Match":matchVows(tough.domainMatch, [ // str, dom, expect ["example.com", "example.com", true], ["eXaMpLe.cOm", "ExAmPlE.CoM", true], ["no.ca", "yes.ca", false], ["wwwexample.com", "example.com", false], ["www.example.com", "example.com", true], ["example.com", "www.example.com", false], ["www.subdom.example.com", "example.com", true], ["www.subdom.example.com", "subdom.example.com", true], ["example.com", "example.com.", false], // 
RFC6265 S4.1.2.3 ["192.168.0.1", "168.0.1", false], // S5.1.3 "The string is a host name" [null, "example.com", null], ["example.com", null, null], [null, null, null], [undefined, undefined, null], ]) }) .addBatch({ "default-path": defaultPathVows([ [null,"/"], ["/","/"], ["/file","/"], ["/dir/file","/dir"], ["noslash","/"], ]) }) .addBatch({ "Path-Match": matchVows(tough.pathMatch, [ // request, cookie, match ["/","/",true], ["/dir","/",true], ["/","/dir",false], ["/dir/","/dir/", true], ["/dir/file","/dir/",true], ["/dir/file","/dir",true], ["/directory","/dir",false], ]) }) .addBatch({ "Cookie Sorting": { topic: function() { var cookies = []; var now = Date.now(); cookies.push(Cookie.parse("a=0; Domain=example.com")); cookies.push(Cookie.parse("b=1; Domain=www.example.com")); cookies.push(Cookie.parse("c=2; Domain=example.com; Path=/pathA")); cookies.push(Cookie.parse("d=3; Domain=www.example.com; Path=/pathA")); cookies.push(Cookie.parse("e=4; Domain=example.com; Path=/pathA/pathB")); cookies.push(Cookie.parse("f=5; Domain=www.example.com; Path=/pathA/pathB")); // force a stable creation time consistent with the order above since // some may have been created at now + 1ms. var i = cookies.length; cookies.forEach(function(cookie) { cookie.creation = new Date(now - 100*(i--)); }); // weak shuffle: cookies = cookies.sort(function(){return Math.random()-0.5}); cookies = cookies.sort(tough.cookieCompare); return cookies; }, "got": function(cookies) { assert.lengthOf(cookies, 6); var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['e','f','c','d','a','b']); }, } }) .addBatch({ "CookieJar": { "Setting a basic cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com; Path=/"); assert.strictEqual(c.hostOnly, null); assert.instanceOf(c.creation, Date); assert.strictEqual(c.lastAccessed, null); c.creation = new Date(Date.now()-10000); cj.setCookie(c, 'http://example.com/index.html', this.callback); }, "works": function(c) { assert.instanceOf(c,Cookie) }, // C is for Cookie, good enough for me "gets timestamped": function(c) { assert.ok(c.creation); assert.ok(Date.now() - c.creation.getTime() < 5000); // recently stamped assert.ok(c.lastAccessed); assert.equal(c.creation, c.lastAccessed); assert.equal(c.TTL(), Infinity); assert.ok(!c.isPersistent()); }, }, "Setting a no-path cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com"); assert.strictEqual(c.hostOnly, null); assert.instanceOf(c.creation, Date); assert.strictEqual(c.lastAccessed, null); c.creation = new Date(Date.now()-10000); cj.setCookie(c, 'http://example.com/index.html', this.callback); }, "domain": function(c) { assert.equal(c.domain, 'example.com') }, "path is /": function(c) { assert.equal(c.path, '/') }, "path was derived": function(c) { assert.strictEqual(c.pathIsDefault, true) }, }, "Setting a cookie already marked as host-only": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com"); assert.strictEqual(c.hostOnly, null); assert.instanceOf(c.creation, Date); assert.strictEqual(c.lastAccessed, null); c.creation = new Date(Date.now()-10000); c.hostOnly = true; cj.setCookie(c, 'http://example.com/index.html', this.callback); }, "domain": function(c) { assert.equal(c.domain, 'example.com') }, "still hostOnly": function(c) { assert.strictEqual(c.hostOnly, true) }, }, "Setting a session cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b"); 
assert.strictEqual(c.path, null); cj.setCookie(c, 'http://www.example.com/dir/index.html', this.callback); }, "works": function(c) { assert.instanceOf(c,Cookie) }, "gets the domain": function(c) { assert.equal(c.domain, 'www.example.com') }, "gets the default path": function(c) { assert.equal(c.path, '/dir') }, "is 'hostOnly'": function(c) { assert.ok(c.hostOnly) }, }, "Setting wrong domain cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=fooxample.com; Path=/"); cj.setCookie(c, 'http://example.com/index.html', this.callback); }, "fails": function(err,c) { assert.ok(err.message.match(/domain/i)); assert.ok(!c); }, }, "Setting sub-domain cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=www.example.com; Path=/"); cj.setCookie(c, 'http://example.com/index.html', this.callback); }, "fails": function(err,c) { assert.ok(err.message.match(/domain/i)); assert.ok(!c); }, }, "Setting super-domain cookie": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com; Path=/"); cj.setCookie(c, 'http://www.app.example.com/index.html', this.callback); }, "success": function(err,c) { assert.ok(!err); assert.equal(c.domain, 'example.com'); }, }, "Setting a sub-path cookie on a super-domain": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com; Path=/subpath"); assert.strictEqual(c.hostOnly, null); assert.instanceOf(c.creation, Date); assert.strictEqual(c.lastAccessed, null); c.creation = new Date(Date.now()-10000); cj.setCookie(c, 'http://www.example.com/index.html', this.callback); }, "domain is super-domain": function(c) { assert.equal(c.domain, 'example.com') }, "path is /subpath": function(c) { assert.equal(c.path, '/subpath') }, "path was NOT derived": function(c) { assert.strictEqual(c.pathIsDefault, null) }, }, "Setting HttpOnly cookie over non-HTTP API": { topic: function() { var cj = new CookieJar(); var c = Cookie.parse("a=b; Domain=example.com; Path=/; HttpOnly"); cj.setCookie(c, 'http://example.com/index.html', {http:false}, this.callback); }, "fails": function(err,c) { assert.match(err.message, /HttpOnly/i); assert.ok(!c); }, }, }, "Cookie Jar store eight cookies": { topic: function() { var cj = new CookieJar(); var ex = 'http://example.com/index.html'; var tasks = []; tasks.push(function(next) { cj.setCookie('a=1; Domain=example.com; Path=/',ex,at(0),next); }); tasks.push(function(next) { cj.setCookie('b=2; Domain=example.com; Path=/; HttpOnly',ex,at(1000),next); }); tasks.push(function(next) { cj.setCookie('c=3; Domain=example.com; Path=/; Secure',ex,at(2000),next); }); tasks.push(function(next) { // path cj.setCookie('d=4; Domain=example.com; Path=/foo',ex,at(3000),next); }); tasks.push(function(next) { // host only cj.setCookie('e=5',ex,at(4000),next); }); tasks.push(function(next) { // other domain cj.setCookie('f=6; Domain=nodejs.org; Path=/','http://nodejs.org',at(5000),next); }); tasks.push(function(next) { // expired cj.setCookie('g=7; Domain=example.com; Path=/; Expires=Tue, 18 Oct 2011 00:00:00 GMT',ex,at(6000),next); }); tasks.push(function(next) { // expired via Max-Age cj.setCookie('h=8; Domain=example.com; Path=/; Max-Age=1',ex,next); }); var cb = this.callback; async.parallel(tasks, function(err,results){ setTimeout(function() { cb(err,cj,results); }, 2000); // so that 'h=8' expires }); }, "setup ok": function(err,cj,results) { assert.ok(!err); assert.ok(cj); assert.ok(results); }, "then retrieving for 
http://nodejs.org": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://nodejs.org',this.callback); }, "get a nodejs cookie": function(cookies) { assert.lengthOf(cookies, 1); var cookie = cookies[0]; assert.equal(cookie.domain, 'nodejs.org'); }, }, "then retrieving for https://example.com": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('https://example.com',{secure:true},this.callback); }, "get a secure example cookie with others": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['a','b','c','e']); }, }, "then retrieving for https://example.com (missing options)": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('https://example.com',this.callback); }, "get a secure example cookie with others": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['a','b','c','e']); }, }, "then retrieving for http://example.com": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://example.com',this.callback); }, "get a bunch of cookies": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['a','b','e']); }, }, "then retrieving for http://EXAMPlE.com": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://EXAMPlE.com',this.callback); }, "get a bunch of cookies": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['a','b','e']); }, }, "then retrieving for http://example.com, non-HTTP": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://example.com',{http:false},this.callback); }, "get a bunch of cookies": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['a','e']); }, }, "then retrieving for http://example.com/foo/bar": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://example.com/foo/bar',this.callback); }, "get a bunch of cookies": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['d','a','b','e']); }, }, "then retrieving for http://example.com as a string": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookieString('http://example.com',this.callback); }, "get a single string": function(cookieHeader) { assert.equal(cookieHeader, "a=1; b=2; e=5"); }, }, "then retrieving for http://example.com as a set-cookie header": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getSetCookieStrings('http://example.com',this.callback); }, "get a single string": function(cookieHeaders) { assert.lengthOf(cookieHeaders, 3); assert.equal(cookieHeaders[0], "a=1; Domain=example.com; Path=/"); assert.equal(cookieHeaders[1], "b=2; Domain=example.com; Path=/; HttpOnly"); assert.equal(cookieHeaders[2], "e=5; Path=/"); }, }, "then retrieving for http://www.example.com/": { topic: function(cj,oldResults) { assert.ok(oldResults); cj.getCookies('http://www.example.com/foo/bar',this.callback); }, "get a bunch of cookies": function(cookies) { var names = cookies.map(function(c) {return c.key}); assert.deepEqual(names, ['d','a','b']); // note lack of 'e' }, }, }, "Repeated names": { topic: function() { var cb = this.callback; var cj = new CookieJar(); var ex = 'http://www.example.com/'; var sc = cj.setCookie; var tasks = []; var now = Date.now(); tasks.push(sc.bind(cj,'aaaa=xxxx',ex,at(0))); tasks.push(sc.bind(cj,'aaaa=1111; 
Domain=www.example.com',ex,at(1000))); tasks.push(sc.bind(cj,'aaaa=2222; Domain=example.com',ex,at(2000))); tasks.push(sc.bind(cj,'aaaa=3333; Domain=www.example.com; Path=/pathA',ex,at(3000))); async.series(tasks,function(err,results) { results = results.filter(function(e) {return e !== undefined}); cb(err,{cj:cj, cookies:results, now:now}); }); }, "all got set": function(err,t) { assert.lengthOf(t.cookies,4); }, "then getting 'em back": { topic: function(t) { var cj = t.cj; cj.getCookies('http://www.example.com/pathA',this.callback); }, "there's just three": function (err,cookies) { var vals = cookies.map(function(c) {return c.value}); // may break with sorting; sorting should put 3333 first due to longest path: assert.deepEqual(vals, ['3333','1111','2222']); } }, }, "CookieJar setCookie errors": { "public-suffix domain": { topic: function() { var cj = new CookieJar(); cj.setCookie('i=9; Domain=kyoto.jp; Path=/','kyoto.jp',this.callback); }, "errors": function(err,cookie) { assert.ok(err); assert.ok(!cookie); assert.match(err.message, /public suffix/i); }, }, "wrong domain": { topic: function() { var cj = new CookieJar(); cj.setCookie('j=10; Domain=google.com; Path=/','google.ca',this.callback); }, "errors": function(err,cookie) { assert.ok(err); assert.ok(!cookie); assert.match(err.message, /not in this host's domain/i); }, }, "old cookie is HttpOnly": { topic: function() { var cb = this.callback; var next = function (err,c) { c = null; return cb(err,cj); }; var cj = new CookieJar(); cj.setCookie('k=11; Domain=example.ca; Path=/; HttpOnly','http://example.ca',{http:true},next); }, "initial cookie is set": function(err,cj) { assert.ok(!err); assert.ok(cj); }, "but when trying to overwrite": { topic: function(cj) { var cb = this.callback; var next = function(err,c) { c = null; cb(null,err); }; cj.setCookie('k=12; Domain=example.ca; Path=/','http://example.ca',{http:false},next); }, "it's an error": function(err) { assert.ok(err); }, "then, checking the original": { topic: function(ignored,cj) { assert.ok(cj instanceof CookieJar); cj.getCookies('http://example.ca',{http:true},this.callback); }, "cookie has original value": function(err,cookies) { assert.equal(err,null); assert.lengthOf(cookies, 1); assert.equal(cookies[0].value,11); }, }, }, }, }, }) .addBatch({ "JSON": { "serialization": { topic: function() { var c = Cookie.parse('alpha=beta; Domain=example.com; Path=/foo; Expires=Tue, 19 Jan 2038 03:14:07 GMT; HttpOnly'); return JSON.stringify(c); }, "gives a string": function(str) { assert.equal(typeof str, "string"); }, "date is in ISO format": function(str) { assert.match(str, /"expires":"2038-01-19T03:14:07\.000Z"/, 'expires is in ISO format'); }, }, "deserialization": { topic: function() { var json = '{"key":"alpha","value":"beta","domain":"example.com","path":"/foo","expires":"2038-01-19T03:14:07.000Z","httpOnly":true,"lastAccessed":2000000000123}'; return Cookie.fromJSON(json); }, "works": function(c) { assert.ok(c); }, "key": function(c) { assert.equal(c.key, "alpha") }, "value": function(c) { assert.equal(c.value, "beta") }, "domain": function(c) { assert.equal(c.domain, "example.com") }, "path": function(c) { assert.equal(c.path, "/foo") }, "httpOnly": function(c) { assert.strictEqual(c.httpOnly, true) }, "secure": function(c) { assert.strictEqual(c.secure, false) }, "hostOnly": function(c) { assert.strictEqual(c.hostOnly, null) }, "expires is a date object": function(c) { assert.equal(c.expires.getTime(), 2147483647000); }, "lastAccessed is a date object": function(c) { 
assert.equal(c.lastAccessed.getTime(), 2000000000123); }, "creation defaulted": function(c) { assert.ok(c.creation.getTime()); } }, "null deserialization": { topic: function() { return Cookie.fromJSON(null); }, "is null": function(cookie) { assert.equal(cookie,null); }, }, }, "expiry deserialization": { "Infinity": { topic: Cookie.fromJSON.bind(null, '{"expires":"Infinity"}'), "is infinite": function(c) { assert.strictEqual(c.expires, "Infinity"); assert.equal(c.expires, Infinity); }, }, }, "maxAge serialization": { topic: function() { return function(toSet) { var c = new Cookie(); c.key = 'foo'; c.value = 'bar'; c.setMaxAge(toSet); return JSON.stringify(c); }; }, "zero": { topic: function(f) { return f(0) }, "looks good": function(str) { assert.match(str, /"maxAge":0/); }, }, "Infinity": { topic: function(f) { return f(Infinity) }, "looks good": function(str) { assert.match(str, /"maxAge":"Infinity"/); }, }, "-Infinity": { topic: function(f) { return f(-Infinity) }, "looks good": function(str) { assert.match(str, /"maxAge":"-Infinity"/); }, }, "null": { topic: function(f) { return f(null) }, "looks good": function(str) { assert.match(str, /"maxAge":null/); }, }, }, "maxAge deserialization": { "number": { topic: Cookie.fromJSON.bind(null,'{"key":"foo","value":"bar","maxAge":123}'), "is the number": function(c) { assert.strictEqual(c.maxAge, 123); }, }, "null": { topic: Cookie.fromJSON.bind(null,'{"key":"foo","value":"bar","maxAge":null}'), "is null": function(c) { assert.strictEqual(c.maxAge, null); }, }, "less than zero": { topic: Cookie.fromJSON.bind(null,'{"key":"foo","value":"bar","maxAge":-123}'), "is -123": function(c) { assert.strictEqual(c.maxAge, -123); }, }, "Infinity": { topic: Cookie.fromJSON.bind(null,'{"key":"foo","value":"bar","maxAge":"Infinity"}'), "is inf-as-string": function(c) { assert.strictEqual(c.maxAge, "Infinity"); }, }, "-Infinity": { topic: Cookie.fromJSON.bind(null,'{"key":"foo","value":"bar","maxAge":"-Infinity"}'), "is inf-as-string": function(c) { assert.strictEqual(c.maxAge, "-Infinity"); }, }, } }) .addBatch({ "permuteDomain": { "base case": { topic: tough.permuteDomain.bind(null,'example.com'), "got the domain": function(list) { assert.deepEqual(list, ['example.com']); }, }, "two levels": { topic: tough.permuteDomain.bind(null,'foo.bar.example.com'), "got three things": function(list) { assert.deepEqual(list, ['example.com','bar.example.com','foo.bar.example.com']); }, }, "invalid domain": { topic: tough.permuteDomain.bind(null,'foo.bar.example.localduhmain'), "got three things": function(list) { assert.equal(list, null); }, }, }, "permutePath": { "base case": { topic: tough.permutePath.bind(null,'/'), "just slash": function(list) { assert.deepEqual(list,['/']); }, }, "single case": { topic: tough.permutePath.bind(null,'/foo'), "two things": function(list) { assert.deepEqual(list,['/foo','/']); }, "path matching": function(list) { list.forEach(function(e) { assert.ok(tough.pathMatch('/foo',e)); }); }, }, "double case": { topic: tough.permutePath.bind(null,'/foo/bar'), "four things": function(list) { assert.deepEqual(list,['/foo/bar','/foo','/']); }, "path matching": function(list) { list.forEach(function(e) { assert.ok(tough.pathMatch('/foo/bar',e)); }); }, }, "trailing slash": { topic: tough.permutePath.bind(null,'/foo/bar/'), "three things": function(list) { assert.deepEqual(list,['/foo/bar','/foo','/']); }, "path matching": function(list) { list.forEach(function(e) { assert.ok(tough.pathMatch('/foo/bar/',e)); }); }, }, } }) .addBatch({ "Issue 1": { 
topic: function() { var cj = new CookieJar(); cj.setCookie('hello=world; path=/some/path/', 'http://domain/some/path/file', function(err,cookie) { this.callback(err,{cj:cj, cookie:cookie}); }.bind(this)); }, "stored a cookie": function(t) { assert.ok(t.cookie); }, "cookie's path was modified to remove unnecessary slash": function(t) { assert.equal(t.cookie.path, '/some/path'); }, "getting it back": { topic: function(t) { t.cj.getCookies('http://domain/some/path/file', function(err,cookies) { this.callback(err, {cj:t.cj, cookies:cookies||[]}); }.bind(this)); }, "got one cookie": function(t) { assert.lengthOf(t.cookies, 1); }, "it's the right one": function(t) { var c = t.cookies[0]; assert.equal(c.key, 'hello'); assert.equal(c.value, 'world'); }, } } }) .addBatch({ "expiry option": { topic: function() { var cb = this.callback; var cj = new CookieJar(); cj.setCookie('near=expiry; Domain=example.com; Path=/; Max-Age=1','http://www.example.com',at(-1), function(err,cookie) { cb(err, {cj:cj, cookie:cookie}); }); }, "set the cookie": function(t) { assert.ok(t.cookie, "didn't set?!"); assert.equal(t.cookie.key, 'near'); }, "then, retrieving": { topic: function(t) { var cb = this.callback; setTimeout(function() { t.cj.getCookies('http://www.example.com', {http:true, expire:false}, function(err,cookies) { t.cookies = cookies; cb(err,t); }); },2000); }, "got the cookie": function(t) { assert.lengthOf(t.cookies, 1); assert.equal(t.cookies[0].key, 'near'); }, } } }) .addBatch({ "trailing semi-colon set into cj": { topic: function () { var cb = this.callback; var cj = new CookieJar(); var ex = 'http://www.example.com'; var tasks = []; tasks.push(function(next) { cj.setCookie('broken_path=testme; path=/;',ex,at(-1),next); }); tasks.push(function(next) { cj.setCookie('b=2; Path=/;;;;',ex,at(-1),next); }); async.parallel(tasks, function (err, cookies) { cb(null, { cj: cj, cookies: cookies }); }); }, "check number of cookies": function (t) { assert.lengthOf(t.cookies, 2, "didn't set"); }, "check *broken_path* was set properly": function (t) { assert.equal(t.cookies[0].key, "broken_path"); assert.equal(t.cookies[0].value, "testme"); assert.equal(t.cookies[0].path, "/"); }, "check *b* was set properly": function (t) { assert.equal(t.cookies[1].key, "b"); assert.equal(t.cookies[1].value, "2"); assert.equal(t.cookies[1].path, "/"); }, "retrieve the cookie": { topic: function (t) { var cb = this.callback; t.cj.getCookies('http://www.example.com', {}, function (err, cookies) { t.cookies = cookies; cb(err, t); }); }, "get the cookie": function(t) { assert.lengthOf(t.cookies, 2); assert.equal(t.cookies[0].key, 'broken_path'); assert.equal(t.cookies[0].value, 'testme'); assert.equal(t.cookies[1].key, "b"); assert.equal(t.cookies[1].value, "2"); assert.equal(t.cookies[1].path, "/"); }, }, } }) .addBatch({ "Constructor":{ topic: function () { return new Cookie({ key: 'test', value: 'b', maxAge: 60 }); }, 'check for key property': function (c) { assert.ok(c); assert.equal(c.key, 'test'); }, 'check for value property': function (c) { assert.equal(c.value, 'b'); }, 'check for maxAge': function (c) { assert.equal(c.maxAge, 60); }, 'check for default values for unspecified properties': function (c) { assert.equal(c.expires, "Infinity"); assert.equal(c.secure, false); assert.equal(c.httpOnly, false); } } }) .addBatch({ "allPaths option": { topic: function() { var cj = new CookieJar(); var tasks = []; tasks.push(cj.setCookie.bind(cj, 'nopath_dom=qq; Path=/; Domain=example.com', 'http://example.com', {})); 
tasks.push(cj.setCookie.bind(cj, 'path_dom=qq; Path=/foo; Domain=example.com', 'http://example.com', {})); tasks.push(cj.setCookie.bind(cj, 'nopath_host=qq; Path=/', 'http://www.example.com', {})); tasks.push(cj.setCookie.bind(cj, 'path_host=qq; Path=/foo', 'http://www.example.com', {})); tasks.push(cj.setCookie.bind(cj, 'other=qq; Path=/', 'http://other.example.com/', {})); tasks.push(cj.setCookie.bind(cj, 'other2=qq; Path=/foo', 'http://other.example.com/foo', {})); var cb = this.callback; async.parallel(tasks, function(err,results) { cb(err, {cj:cj, cookies: results}); }); }, "all set": function(t) { assert.equal(t.cookies.length, 6); assert.ok(t.cookies.every(function(c) { return !!c })); }, "getting without allPaths": { topic: function(t) { var cb = this.callback; var cj = t.cj; cj.getCookies('http://www.example.com/', {}, function(err,cookies) { cb(err, {cj:cj, cookies:cookies}); }); }, "found just two cookies": function(t) { assert.equal(t.cookies.length, 2); }, "all are path=/": function(t) { assert.ok(t.cookies.every(function(c) { return c.path === '/' })); }, "no 'other' cookies": function(t) { assert.ok(!t.cookies.some(function(c) { return (/^other/).test(c.name) })); }, }, "getting without allPaths for /foo": { topic: function(t) { var cb = this.callback; var cj = t.cj; cj.getCookies('http://www.example.com/foo', {}, function(err,cookies) { cb(err, {cj:cj, cookies:cookies}); }); }, "found four cookies": function(t) { assert.equal(t.cookies.length, 4); }, "no 'other' cookies": function(t) { assert.ok(!t.cookies.some(function(c) { return (/^other/).test(c.name) })); }, }, "getting with allPaths:true": { topic: function(t) { var cb = this.callback; var cj = t.cj; cj.getCookies('http://www.example.com/', {allPaths:true}, function(err,cookies) { cb(err, {cj:cj, cookies:cookies}); }); }, "found four cookies": function(t) { assert.equal(t.cookies.length, 4); }, "no 'other' cookies": function(t) { assert.ok(!t.cookies.some(function(c) { return (/^other/).test(c.name) })); }, }, } }) .addBatch({ "remove cookies": { topic: function() { var jar = new CookieJar(); var cookie = Cookie.parse("a=b; Domain=example.com; Path=/"); var cookie2 = Cookie.parse("a=b; Domain=foo.com; Path=/"); var cookie3 = Cookie.parse("foo=bar; Domain=foo.com; Path=/"); jar.setCookie(cookie, 'http://example.com/index.html', function(){}); jar.setCookie(cookie2, 'http://foo.com/index.html', function(){}); jar.setCookie(cookie3, 'http://foo.com/index.html', function(){}); return jar; }, "all from matching domain": function(jar){ jar.store.removeCookies('example.com',null, function(err) { assert(err == null); jar.store.findCookies('example.com', null, function(err, cookies){ assert(err == null); assert(cookies != null); assert(cookies.length === 0, 'cookie was not removed'); }); jar.store.findCookies('foo.com', null, function(err, cookies){ assert(err == null); assert(cookies != null); assert(cookies.length === 2, 'cookies should not have been removed'); }); }); }, "from cookie store matching domain and key": function(jar){ jar.store.removeCookie('foo.com', '/', 'foo', function(err) { assert(err == null); jar.store.findCookies('foo.com', null, function(err, cookies){ assert(err == null); assert(cookies != null); assert(cookies.length === 1, 'cookie was not removed correctly'); assert(cookies[0].key === 'a', 'wrong cookie was removed'); }); }); } } }) .addBatch({ "Synchronous CookieJar": { "setCookieSync": { topic: function() { var jar = new CookieJar(); var cookie = Cookie.parse("a=b; Domain=example.com; Path=/"); 
cookie = jar.setCookieSync(cookie, 'http://example.com/index.html'); return cookie; }, "returns a copy of the cookie": function(cookie) { assert.instanceOf(cookie, Cookie); } }, "setCookieSync strict parse error": { topic: function() { var jar = new CookieJar(); var opts = { strict: true }; try { jar.setCookieSync("farbe=weiß", 'http://example.com/index.html', opts); return false; } catch (e) { return e; } }, "throws the error": function(err) { assert.instanceOf(err, Error); assert.equal(err.message, "Cookie failed to parse"); } }, "getCookiesSync": { topic: function() { var jar = new CookieJar(); var url = 'http://example.com/index.html'; jar.setCookieSync("a=b; Domain=example.com; Path=/", url); jar.setCookieSync("c=d; Domain=example.com; Path=/", url); return jar.getCookiesSync(url); }, "returns the cookie array": function(err, cookies) { assert.ok(!err); assert.ok(Array.isArray(cookies)); assert.lengthOf(cookies, 2); cookies.forEach(function(cookie) { assert.instanceOf(cookie, Cookie); }); } }, "getCookieStringSync": { topic: function() { var jar = new CookieJar(); var url = 'http://example.com/index.html'; jar.setCookieSync("a=b; Domain=example.com; Path=/", url); jar.setCookieSync("c=d; Domain=example.com; Path=/", url); return jar.getCookieStringSync(url); }, "returns the cookie header string": function(err, str) { assert.ok(!err); assert.typeOf(str, 'string'); } }, "getSetCookieStringsSync": { topic: function() { var jar = new CookieJar(); var url = 'http://example.com/index.html'; jar.setCookieSync("a=b; Domain=example.com; Path=/", url); jar.setCookieSync("c=d; Domain=example.com; Path=/", url); return jar.getSetCookieStringsSync(url); }, "returns the cookie header string": function(err, headers) { assert.ok(!err); assert.ok(Array.isArray(headers)); assert.lengthOf(headers, 2); headers.forEach(function(header) { assert.typeOf(header, 'string'); }); } }, } }) .addBatch({ "Synchronous API on async CookieJar": { topic: function() { return new tough.Store(); }, "setCookieSync": { topic: function(store) { var jar = new CookieJar(store); try { jar.setCookieSync("a=b", 'http://example.com/index.html'); return false; } catch(e) { return e; } }, "fails": function(err) { assert.instanceOf(err, Error); assert.equal(err.message, 'CookieJar store is not synchronous; use async API instead.'); } }, "getCookiesSync": { topic: function(store) { var jar = new CookieJar(store); try { jar.getCookiesSync('http://example.com/index.html'); return false; } catch(e) { return e; } }, "fails": function(err) { assert.instanceOf(err, Error); assert.equal(err.message, 'CookieJar store is not synchronous; use async API instead.'); } }, "getCookieStringSync": { topic: function(store) { var jar = new CookieJar(store); try { jar.getCookieStringSync('http://example.com/index.html'); return false; } catch(e) { return e; } }, "fails": function(err) { assert.instanceOf(err, Error); assert.equal(err.message, 'CookieJar store is not synchronous; use async API instead.'); } }, "getSetCookieStringsSync": { topic: function(store) { var jar = new CookieJar(store); try { jar.getSetCookieStringsSync('http://example.com/index.html'); return false; } catch(e) { return e; } }, "fails": function(err) { assert.instanceOf(err, Error); assert.equal(err.message, 'CookieJar store is not synchronous; use async API instead.'); } }, } }) .export(module); 
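
For orientation, here is a minimal usage sketch of the synchronous CookieJar API that the suite above exercises. It assumes the package is installed and loaded as `require('tough-cookie')` (the suite itself loads the library from its local path); the URL and cookie strings are illustrative only.

```js
// Minimal sketch (not part of the bundled test suite): exercises the same
// synchronous CookieJar API covered by the tests above.
// Assumption: tough-cookie is installed and exposed as require('tough-cookie').
var tough = require('tough-cookie');

var jar = new tough.CookieJar();            // uses the in-memory store by default
var url = 'http://example.com/index.html';  // illustrative request URL

// Parse and store two Set-Cookie strings against the request URL.
jar.setCookieSync('a=b; Domain=example.com; Path=/', url);
jar.setCookieSync('c=d; Domain=example.com; Path=/', url);

// Render the matching cookies as a single Cookie header value.
console.log(jar.getCookieStringSync(url));  // e.g. "a=b; c=d"

// Or get the Cookie objects back for inspection.
jar.getCookiesSync(url).forEach(function(cookie) {
  console.log(cookie.key, cookie.value, cookie.path);
});
```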

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/node_modules/punycode/LICENSE-MIT.txt

Copyright Mathias Bynens <https://mathiasbynens.be/>

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/node_modules/punycode/package.json

{
  "name": "punycode",
  "version": "1.3.2",
  "description": "A robust Punycode converter that fully complies to RFC 3492 and RFC 5891, and works on nearly all JavaScript platforms.",
  "homepage": "https://mths.be/punycode",
  "main": "punycode.js",
  "keywords": [
    "punycode",
    "unicode",
    "idn",
    "idna",
    "dns",
    "url",
    "domain"
  ],
  "license": "MIT",
  "author": {
    "name": "Mathias Bynens",
    "url": "https://mathiasbynens.be/"
  },
  "contributors": [
    {
      "name": "Mathias Bynens",
      "url": "https://mathiasbynens.be/"
    },
    {
      "name": "John-David Dalton",
      "url": "http://allyoucanleet.com/"
    }
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/bestiejs/punycode.js.git"
  },
  "bugs": {
    "url": "https://github.com/bestiejs/punycode.js/issues"
  },
  "files": [
    "LICENSE-MIT.txt",
    "punycode.js"
  ],
  "scripts": {
    "test": "node tests/tests.js"
  },
  "devDependencies": {
    "coveralls": "^2.10.1",
    "grunt": "^0.4.5",
    "grunt-contrib-uglify": "^0.5.0",
    "grunt-shell": "^0.7.0",
    "istanbul": "^0.2.13",
    "qunit-extras": "^1.2.0",
    "qunitjs": "~1.11.0",
    "requirejs": "^2.1.14"
  },
  "gitHead": "38c8d3131a82567bfef18da09f7f4db68c84f8a3",
  "_id": "punycode@1.3.2",
  "_shasum": "9653a036fb7c1ee42342f2325cceefea3926c48d",
  "_from": "punycode@>=0.2.0",
  "_npmVersion": "1.4.28",
  "_npmUser": {
    "name": "mathias",
    "email": "mathias@qiwi.be"
  },
  "maintainers": [
    {
      "name": "mathias",
      "email": "mathias@qiwi.be"
    },
    {
      "name": "reconbot",
      "email": "wizard@roborooter.com"
    }
  ],
  "dist": {
    "shasum": "9653a036fb7c1ee42342f2325cceefea3926c48d",
    "tarball": "http://registry.npmjs.org/punycode/-/punycode-1.3.2.tgz"
  },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/punycode/-/punycode-1.3.2.tgz",
  "readme": "ERROR: No README data found!"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/node_modules/punycode/punycode.js

/*!
https://mths.be/punycode v1.3.2 by @mathias */ ;(function(root) { /** Detect free variables */ var freeExports = typeof exports == 'object' && exports && !exports.nodeType && exports; var freeModule = typeof module == 'object' && module && !module.nodeType && module; var freeGlobal = typeof global == 'object' && global; if ( freeGlobal.global === freeGlobal || freeGlobal.window === freeGlobal || freeGlobal.self === freeGlobal ) { root = freeGlobal; } /** * The `punycode` object. * @name punycode * @type Object */ var punycode, /** Highest positive signed 32-bit float value */ maxInt = 2147483647, // aka. 0x7FFFFFFF or 2^31-1 /** Bootstring parameters */ base = 36, tMin = 1, tMax = 26, skew = 38, damp = 700, initialBias = 72, initialN = 128, // 0x80 delimiter = '-', // '\x2D' /** Regular expressions */ regexPunycode = /^xn--/, regexNonASCII = /[^\x20-\x7E]/, // unprintable ASCII chars + non-ASCII chars regexSeparators = /[\x2E\u3002\uFF0E\uFF61]/g, // RFC 3490 separators /** Error messages */ errors = { 'overflow': 'Overflow: input needs wider integers to process', 'not-basic': 'Illegal input >= 0x80 (not a basic code point)', 'invalid-input': 'Invalid input' }, /** Convenience shortcuts */ baseMinusTMin = base - tMin, floor = Math.floor, stringFromCharCode = String.fromCharCode, /** Temporary variable */ key; /*--------------------------------------------------------------------------*/ /** * A generic error utility function. * @private * @param {String} type The error type. * @returns {Error} Throws a `RangeError` with the applicable error message. */ function error(type) { throw RangeError(errors[type]); } /** * A generic `Array#map` utility function. * @private * @param {Array} array The array to iterate over. * @param {Function} callback The function that gets called for every array * item. * @returns {Array} A new array of values returned by the callback function. */ function map(array, fn) { var length = array.length; var result = []; while (length--) { result[length] = fn(array[length]); } return result; } /** * A simple `Array#map`-like wrapper to work with domain name strings or email * addresses. * @private * @param {String} domain The domain name or email address. * @param {Function} callback The function that gets called for every * character. * @returns {Array} A new string of characters returned by the callback * function. */ function mapDomain(string, fn) { var parts = string.split('@'); var result = ''; if (parts.length > 1) { // In email addresses, only the domain name should be punycoded. Leave // the local part (i.e. everything up to `@`) intact. result = parts[0] + '@'; string = parts[1]; } // Avoid `split(regex)` for IE8 compatibility. See #17. string = string.replace(regexSeparators, '\x2E'); var labels = string.split('.'); var encoded = map(labels, fn).join('.'); return result + encoded; } /** * Creates an array containing the numeric code points of each Unicode * character in the string. While JavaScript uses UCS-2 internally, * this function will convert a pair of surrogate halves (each of which * UCS-2 exposes as separate characters) into a single code point, * matching UTF-16. * @see `punycode.ucs2.encode` * @see <https://mathiasbynens.be/notes/javascript-encoding> * @memberOf punycode.ucs2 * @name decode * @param {String} string The Unicode input string (UCS-2). * @returns {Array} The new array of code points. 
*/ function ucs2decode(string) { var output = [], counter = 0, length = string.length, value, extra; while (counter < length) { value = string.charCodeAt(counter++); if (value >= 0xD800 && value <= 0xDBFF && counter < length) { // high surrogate, and there is a next character extra = string.charCodeAt(counter++); if ((extra & 0xFC00) == 0xDC00) { // low surrogate output.push(((value & 0x3FF) << 10) + (extra & 0x3FF) + 0x10000); } else { // unmatched surrogate; only append this code unit, in case the next // code unit is the high surrogate of a surrogate pair output.push(value); counter--; } } else { output.push(value); } } return output; } /** * Creates a string based on an array of numeric code points. * @see `punycode.ucs2.decode` * @memberOf punycode.ucs2 * @name encode * @param {Array} codePoints The array of numeric code points. * @returns {String} The new Unicode string (UCS-2). */ function ucs2encode(array) { return map(array, function(value) { var output = ''; if (value > 0xFFFF) { value -= 0x10000; output += stringFromCharCode(value >>> 10 & 0x3FF | 0xD800); value = 0xDC00 | value & 0x3FF; } output += stringFromCharCode(value); return output; }).join(''); } /** * Converts a basic code point into a digit/integer. * @see `digitToBasic()` * @private * @param {Number} codePoint The basic numeric code point value. * @returns {Number} The numeric value of a basic code point (for use in * representing integers) in the range `0` to `base - 1`, or `base` if * the code point does not represent a value. */ function basicToDigit(codePoint) { if (codePoint - 48 < 10) { return codePoint - 22; } if (codePoint - 65 < 26) { return codePoint - 65; } if (codePoint - 97 < 26) { return codePoint - 97; } return base; } /** * Converts a digit/integer into a basic code point. * @see `basicToDigit()` * @private * @param {Number} digit The numeric value of a basic code point. * @returns {Number} The basic code point whose value (when used for * representing integers) is `digit`, which needs to be in the range * `0` to `base - 1`. If `flag` is non-zero, the uppercase form is * used; else, the lowercase form is used. The behavior is undefined * if `flag` is non-zero and `digit` has no uppercase form. */ function digitToBasic(digit, flag) { // 0..25 map to ASCII a..z or A..Z // 26..35 map to ASCII 0..9 return digit + 22 + 75 * (digit < 26) - ((flag != 0) << 5); } /** * Bias adaptation function as per section 3.4 of RFC 3492. * http://tools.ietf.org/html/rfc3492#section-3.4 * @private */ function adapt(delta, numPoints, firstTime) { var k = 0; delta = firstTime ? floor(delta / damp) : delta >> 1; delta += floor(delta / numPoints); for (/* no initialization */; delta > baseMinusTMin * tMax >> 1; k += base) { delta = floor(delta / baseMinusTMin); } return floor(k + (baseMinusTMin + 1) * delta / (delta + skew)); } /** * Converts a Punycode string of ASCII-only symbols to a string of Unicode * symbols. * @memberOf punycode * @param {String} input The Punycode string of ASCII-only symbols. * @returns {String} The resulting string of Unicode symbols. */ function decode(input) { // Don't use UCS-2 var output = [], inputLength = input.length, out, i = 0, n = initialN, bias = initialBias, basic, j, index, oldi, w, k, digit, t, /** Cached calculation results */ baseMinusT; // Handle the basic code points: let `basic` be the number of input code // points before the last delimiter, or `0` if there is none, then copy // the first basic code points to the output. 
basic = input.lastIndexOf(delimiter); if (basic < 0) { basic = 0; } for (j = 0; j < basic; ++j) { // if it's not a basic code point if (input.charCodeAt(j) >= 0x80) { error('not-basic'); } output.push(input.charCodeAt(j)); } // Main decoding loop: start just after the last delimiter if any basic code // points were copied; start at the beginning otherwise. for (index = basic > 0 ? basic + 1 : 0; index < inputLength; /* no final expression */) { // `index` is the index of the next character to be consumed. // Decode a generalized variable-length integer into `delta`, // which gets added to `i`. The overflow checking is easier // if we increase `i` as we go, then subtract off its starting // value at the end to obtain `delta`. for (oldi = i, w = 1, k = base; /* no condition */; k += base) { if (index >= inputLength) { error('invalid-input'); } digit = basicToDigit(input.charCodeAt(index++)); if (digit >= base || digit > floor((maxInt - i) / w)) { error('overflow'); } i += digit * w; t = k <= bias ? tMin : (k >= bias + tMax ? tMax : k - bias); if (digit < t) { break; } baseMinusT = base - t; if (w > floor(maxInt / baseMinusT)) { error('overflow'); } w *= baseMinusT; } out = output.length + 1; bias = adapt(i - oldi, out, oldi == 0); // `i` was supposed to wrap around from `out` to `0`, // incrementing `n` each time, so we'll fix that now: if (floor(i / out) > maxInt - n) { error('overflow'); } n += floor(i / out); i %= out; // Insert `n` at position `i` of the output output.splice(i++, 0, n); } return ucs2encode(output); } /** * Converts a string of Unicode symbols (e.g. a domain name label) to a * Punycode string of ASCII-only symbols. * @memberOf punycode * @param {String} input The string of Unicode symbols. * @returns {String} The resulting Punycode string of ASCII-only symbols. */ function encode(input) { var n, delta, handledCPCount, basicLength, bias, j, m, q, k, t, currentValue, output = [], /** `inputLength` will hold the number of code points in `input`. */ inputLength, /** Cached calculation results */ handledCPCountPlusOne, baseMinusT, qMinusT; // Convert the input in UCS-2 to Unicode input = ucs2decode(input); // Cache the length inputLength = input.length; // Initialize the state n = initialN; delta = 0; bias = initialBias; // Handle the basic code points for (j = 0; j < inputLength; ++j) { currentValue = input[j]; if (currentValue < 0x80) { output.push(stringFromCharCode(currentValue)); } } handledCPCount = basicLength = output.length; // `handledCPCount` is the number of code points that have been handled; // `basicLength` is the number of basic code points. // Finish the basic string - if it is not empty - with a delimiter if (basicLength) { output.push(delimiter); } // Main encoding loop: while (handledCPCount < inputLength) { // All non-basic code points < n have been handled already. 
Find the next // larger one: for (m = maxInt, j = 0; j < inputLength; ++j) { currentValue = input[j]; if (currentValue >= n && currentValue < m) { m = currentValue; } } // Increase `delta` enough to advance the decoder's <n,i> state to <m,0>, // but guard against overflow handledCPCountPlusOne = handledCPCount + 1; if (m - n > floor((maxInt - delta) / handledCPCountPlusOne)) { error('overflow'); } delta += (m - n) * handledCPCountPlusOne; n = m; for (j = 0; j < inputLength; ++j) { currentValue = input[j]; if (currentValue < n && ++delta > maxInt) { error('overflow'); } if (currentValue == n) { // Represent delta as a generalized variable-length integer for (q = delta, k = base; /* no condition */; k += base) { t = k <= bias ? tMin : (k >= bias + tMax ? tMax : k - bias); if (q < t) { break; } qMinusT = q - t; baseMinusT = base - t; output.push( stringFromCharCode(digitToBasic(t + qMinusT % baseMinusT, 0)) ); q = floor(qMinusT / baseMinusT); } output.push(stringFromCharCode(digitToBasic(q, 0))); bias = adapt(delta, handledCPCountPlusOne, handledCPCount == basicLength); delta = 0; ++handledCPCount; } } ++delta; ++n; } return output.join(''); } /** * Converts a Punycode string representing a domain name or an email address * to Unicode. Only the Punycoded parts of the input will be converted, i.e. * it doesn't matter if you call it on a string that has already been * converted to Unicode. * @memberOf punycode * @param {String} input The Punycoded domain name or email address to * convert to Unicode. * @returns {String} The Unicode representation of the given Punycode * string. */ function toUnicode(input) { return mapDomain(input, function(string) { return regexPunycode.test(string) ? decode(string.slice(4).toLowerCase()) : string; }); } /** * Converts a Unicode string representing a domain name or an email address to * Punycode. Only the non-ASCII parts of the domain name will be converted, * i.e. it doesn't matter if you call it with a domain that's already in * ASCII. * @memberOf punycode * @param {String} input The domain name or email address to convert, as a * Unicode string. * @returns {String} The Punycode representation of the given domain name or * email address. */ function toASCII(input) { return mapDomain(input, function(string) { return regexNonASCII.test(string) ? 'xn--' + encode(string) : string; }); } /*--------------------------------------------------------------------------*/ /** Define the public API */ punycode = { /** * A string representing the current Punycode.js version number. * @memberOf punycode * @type String */ 'version': '1.3.2', /** * An object of methods to convert from JavaScript's internal character * representation (UCS-2) to Unicode code points, and back. 
* @see <https://mathiasbynens.be/notes/javascript-encoding> * @memberOf punycode * @type Object */ 'ucs2': { 'decode': ucs2decode, 'encode': ucs2encode }, 'decode': decode, 'encode': encode, 'toASCII': toASCII, 'toUnicode': toUnicode }; /** Expose `punycode` */ // Some AMD build optimizers, like r.js, check for specific condition patterns // like the following: if ( typeof define == 'function' && typeof define.amd == 'object' && define.amd ) { define('punycode', function() { return punycode; }); } else if (freeExports && freeModule) { if (module.exports == freeExports) { // in Node.js or RingoJS v0.8.0+ freeModule.exports = punycode; } else { // in Narwhal or RingoJS v0.7.0- for (key in punycode) { punycode.hasOwnProperty(key) && (freeExports[key] = punycode[key]); } } } else { // in Rhino or a web browser root.punycode = punycode; } }(this));

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/node_modules/punycode/README.md

# Punycode.js

[![Build status](https://travis-ci.org/bestiejs/punycode.js.svg?branch=master)](https://travis-ci.org/bestiejs/punycode.js) [![Code coverage status](http://img.shields.io/coveralls/bestiejs/punycode.js/master.svg)](https://coveralls.io/r/bestiejs/punycode.js) [![Dependency status](https://gemnasium.com/bestiejs/punycode.js.svg)](https://gemnasium.com/bestiejs/punycode.js)

A robust Punycode converter that fully complies to [RFC 3492](http://tools.ietf.org/html/rfc3492) and [RFC 5891](http://tools.ietf.org/html/rfc5891), and works on nearly all JavaScript platforms.

This JavaScript library is the result of comparing, optimizing and documenting different open-source implementations of the Punycode algorithm:

* [The C example code from RFC 3492](http://tools.ietf.org/html/rfc3492#appendix-C)
* [`punycode.c` by _Markus W. Scherer_ (IBM)](http://opensource.apple.com/source/ICU/ICU-400.42/icuSources/common/punycode.c)
* [`punycode.c` by _Ben Noordhuis_](https://github.com/bnoordhuis/punycode/blob/master/punycode.c)
* [JavaScript implementation by _some_](http://stackoverflow.com/questions/183485/can-anyone-recommend-a-good-free-javascript-for-punycode-to-unicode-conversion/301287#301287)
* [`punycode.js` by _Ben Noordhuis_](https://github.com/joyent/node/blob/426298c8c1c0d5b5224ac3658c41e7c2a3fe9377/lib/punycode.js) (note: [not fully compliant](https://github.com/joyent/node/issues/2072))

This project is [bundled](https://github.com/joyent/node/blob/master/lib/punycode.js) with [Node.js v0.6.2+](https://github.com/joyent/node/compare/975f1930b1...61e796decc).

## Installation

Via [npm](http://npmjs.org/) (only required for Node.js releases older than v0.6.2):

```bash
npm install punycode
```

Via [Bower](http://bower.io/):

```bash
bower install punycode
```

Via [Component](https://github.com/component/component):

```bash
component install bestiejs/punycode.js
```

In a browser:

```html
<script src="punycode.js"></script>
```

In [Narwhal](http://narwhaljs.org/), [Node.js](http://nodejs.org/), and [RingoJS](http://ringojs.org/):

```js
var punycode = require('punycode');
```

In [Rhino](http://www.mozilla.org/rhino/):

```js
load('punycode.js');
```

Using an AMD loader like [RequireJS](http://requirejs.org/):

```js
require(
  {
    'paths': {
      'punycode': 'path/to/punycode'
    }
  },
  ['punycode'],
  function(punycode) {
    console.log(punycode);
  }
);
```

## API

### `punycode.decode(string)`

Converts a Punycode string of ASCII symbols to a string of Unicode symbols.

```js
// decode domain name parts
punycode.decode('maana-pta'); // 'mañana'
punycode.decode('--dqo34k'); // '☃-⌘'
```

### `punycode.encode(string)`

Converts a string of Unicode symbols to a Punycode string of ASCII symbols.

```js
// encode domain name parts
punycode.encode('mañana'); // 'maana-pta'
punycode.encode('☃-⌘'); // '--dqo34k'
```

### `punycode.toUnicode(input)`

Converts a Punycode string representing a domain name or an email address to Unicode. Only the Punycoded parts of the input will be converted, i.e. it doesn’t matter if you call it on a string that has already been converted to Unicode.

```js
// decode domain names
punycode.toUnicode('xn--maana-pta.com'); // → 'mañana.com'
punycode.toUnicode('xn----dqo34k.com'); // → '☃-⌘.com'

// decode email addresses
punycode.toUnicode('джумла@xn--p-8sbkgc5ag7bhce.xn--ba-lmcq'); // → 'джумла@джpумлатест.bрфa'
```

### `punycode.toASCII(input)`

Converts a Unicode string representing a domain name or an email address to Punycode. Only the non-ASCII parts of the input will be converted, i.e. it doesn’t matter if you call it with a domain that's already in ASCII.

```js
// encode domain names
punycode.toASCII('mañana.com'); // → 'xn--maana-pta.com'
punycode.toASCII('☃-⌘.com'); // → 'xn----dqo34k.com'

// encode email addresses
punycode.toASCII('джумла@джpумлатест.bрфa'); // → 'джумла@xn--p-8sbkgc5ag7bhce.xn--ba-lmcq'
```

### `punycode.ucs2`

#### `punycode.ucs2.decode(string)`

Creates an array containing the numeric code point values of each Unicode symbol in the string. While [JavaScript uses UCS-2 internally](https://mathiasbynens.be/notes/javascript-encoding), this function will convert a pair of surrogate halves (each of which UCS-2 exposes as separate characters) into a single code point, matching UTF-16.

```js
punycode.ucs2.decode('abc'); // → [0x61, 0x62, 0x63]
// surrogate pair for U+1D306 TETRAGRAM FOR CENTRE:
punycode.ucs2.decode('\uD834\uDF06'); // → [0x1D306]
```

#### `punycode.ucs2.encode(codePoints)`

Creates a string based on an array of numeric code point values.

```js
punycode.ucs2.encode([0x61, 0x62, 0x63]); // → 'abc'
punycode.ucs2.encode([0x1D306]); // → '\uD834\uDF06'
```

### `punycode.version`

A string representing the current Punycode.js version number.

## Unit tests & code coverage

After cloning this repository, run `npm install --dev` to install the dependencies needed for Punycode.js development and testing. You may want to install Istanbul _globally_ using `npm install istanbul -g`. Once that’s done, you can run the unit tests in Node using `npm test` or `node tests/tests.js`.
To run the tests in Rhino, Ringo, Narwhal, PhantomJS, and web browsers as well, use `grunt test`. To generate the code coverage report, use `grunt cover`.

Feel free to fork if you see possible improvements!

## Author

| [![twitter/mathias](https://gravatar.com/avatar/24e08a9ea84deb17ae121074d0f17125?s=70)](https://twitter.com/mathias "Follow @mathias on Twitter") |
|---|
| [Mathias Bynens](https://mathiasbynens.be/) |

## Contributors

| [![twitter/jdalton](https://gravatar.com/avatar/299a3d891ff1920b69c364d061007043?s=70)](https://twitter.com/jdalton "Follow @jdalton on Twitter") |
|---|
| [John-David Dalton](http://allyoucanleet.com/) |

## License

Punycode.js is available under the [MIT](https://mths.be/mit) license.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/lib/cookie.js

/* * Copyright GoInstant, Inc. and other contributors. All rights reserved. * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE.
*/ 'use strict'; var net = require('net'); var urlParse = require('url').parse; var pubsuffix = require('./pubsuffix'); var Store = require('./store').Store; var punycode; try { punycode = require('punycode'); } catch(e) { console.warn("cookie: can't load punycode; won't use punycode for domain normalization"); } var DATE_DELIM = /[\x09\x20-\x2F\x3B-\x40\x5B-\x60\x7B-\x7E]/; // From RFC2616 S2.2: var TOKEN = /[\x21\x23-\x26\x2A\x2B\x2D\x2E\x30-\x39\x41-\x5A\x5E-\x7A\x7C\x7E]/; // From RFC6265 S4.1.1 // note that it excludes \x3B ";" var COOKIE_OCTET = /[\x21\x23-\x2B\x2D-\x3A\x3C-\x5B\x5D-\x7E]/; var COOKIE_OCTETS = new RegExp('^'+COOKIE_OCTET.source+'$'); // The name/key cannot be empty but the value can (S5.2): var COOKIE_PAIR_STRICT = new RegExp('^('+TOKEN.source+'+)=("?)('+COOKIE_OCTET.source+'*)\\2$'); var COOKIE_PAIR = /^([^=\s]+)\s*=\s*("?)\s*(.*)\s*\2\s*$/; // RFC6265 S4.1.1 defines extension-av as 'any CHAR except CTLs or ";"' // Note ';' is \x3B var NON_CTL_SEMICOLON = /[\x20-\x3A\x3C-\x7E]+/; var EXTENSION_AV = NON_CTL_SEMICOLON; var PATH_VALUE = NON_CTL_SEMICOLON; // Used for checking whether or not there is a trailing semi-colon var TRAILING_SEMICOLON = /;+$/; /* RFC6265 S5.1.1.5: * [fail if] the day-of-month-value is less than 1 or greater than 31 */ var DAY_OF_MONTH = /^(0?[1-9]|[12][0-9]|3[01])$/; /* RFC6265 S5.1.1.5: * [fail if] * * the hour-value is greater than 23, * * the minute-value is greater than 59, or * * the second-value is greater than 59. */ var TIME = /(0?[0-9]|1[0-9]|2[0-3]):([0-5][0-9]):([0-5][0-9])/; var STRICT_TIME = /^(0?[0-9]|1[0-9]|2[0-3]):([0-5][0-9]):([0-5][0-9])$/; var MONTH = /^(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)$/i; var MONTH_TO_NUM = { jan:0, feb:1, mar:2, apr:3, may:4, jun:5, jul:6, aug:7, sep:8, oct:9, nov:10, dec:11 }; var NUM_TO_MONTH = [ 'Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec' ]; var NUM_TO_DAY = [ 'Sun','Mon','Tue','Wed','Thu','Fri','Sat' ]; var YEAR = /^([1-9][0-9]{1,3})$/; // 2 to 4 digits var MAX_TIME = 2147483647000; // 31-bit max var MIN_TIME = 0; // 31-bit min // RFC6265 S5.1.1 date parser: function parseDate(str,strict) { if (!str) { return; } var found_time, found_dom, found_month, found_year; /* RFC6265 S5.1.1: * 2. Process each date-token sequentially in the order the date-tokens * appear in the cookie-date */ var tokens = str.split(DATE_DELIM); if (!tokens) { return; } var date = new Date(); date.setMilliseconds(0); for (var i=0; i<tokens.length; i++) { var token = tokens[i].trim(); if (!token.length) { continue; } var result; /* 2.1. If the found-time flag is not set and the token matches the time * production, set the found-time flag and set the hour- value, * minute-value, and second-value to the numbers denoted by the digits in * the date-token, respectively. Skip the remaining sub-steps and continue * to the next date-token. */ if (!found_time) { result = (strict ? STRICT_TIME : TIME).exec(token); if (result) { found_time = true; date.setUTCHours(result[1]); date.setUTCMinutes(result[2]); date.setUTCSeconds(result[3]); continue; } } /* 2.2. If the found-day-of-month flag is not set and the date-token matches * the day-of-month production, set the found-day-of- month flag and set * the day-of-month-value to the number denoted by the date-token. Skip * the remaining sub-steps and continue to the next date-token. */ if (!found_dom) { result = DAY_OF_MONTH.exec(token); if (result) { found_dom = true; date.setUTCDate(result[1]); continue; } } /* 2.3. 
If the found-month flag is not set and the date-token matches the * month production, set the found-month flag and set the month-value to * the month denoted by the date-token. Skip the remaining sub-steps and * continue to the next date-token. */ if (!found_month) { result = MONTH.exec(token); if (result) { found_month = true; date.setUTCMonth(MONTH_TO_NUM[result[1].toLowerCase()]); continue; } } /* 2.4. If the found-year flag is not set and the date-token matches the year * production, set the found-year flag and set the year-value to the number * denoted by the date-token. Skip the remaining sub-steps and continue to * the next date-token. */ if (!found_year) { result = YEAR.exec(token); if (result) { var year = result[0]; /* From S5.1.1: * 3. If the year-value is greater than or equal to 70 and less * than or equal to 99, increment the year-value by 1900. * 4. If the year-value is greater than or equal to 0 and less * than or equal to 69, increment the year-value by 2000. */ if (70 <= year && year <= 99) { year += 1900; } else if (0 <= year && year <= 69) { year += 2000; } if (year < 1601) { return; // 5. ... the year-value is less than 1601 } found_year = true; date.setUTCFullYear(year); continue; } } } if (!(found_time && found_dom && found_month && found_year)) { return; // 5. ... at least one of the found-day-of-month, found-month, found- // year, or found-time flags is not set, } return date; } function formatDate(date) { var d = date.getUTCDate(); d = d >= 10 ? d : '0'+d; var h = date.getUTCHours(); h = h >= 10 ? h : '0'+h; var m = date.getUTCMinutes(); m = m >= 10 ? m : '0'+m; var s = date.getUTCSeconds(); s = s >= 10 ? s : '0'+s; return NUM_TO_DAY[date.getUTCDay()] + ', ' + d+' '+ NUM_TO_MONTH[date.getUTCMonth()] +' '+ date.getUTCFullYear() +' '+ h+':'+m+':'+s+' GMT'; } // S5.1.2 Canonicalized Host Names function canonicalDomain(str) { if (str == null) { return null; } str = str.trim().replace(/^\./,''); // S4.1.2.3 & S5.2.3: ignore leading . // convert to IDN if any non-ASCII characters if (punycode && /[^\u0001-\u007f]/.test(str)) { str = punycode.toASCII(str); } return str.toLowerCase(); } // S5.1.3 Domain Matching function domainMatch(str, domStr, canonicalize) { if (str == null || domStr == null) { return null; } if (canonicalize !== false) { str = canonicalDomain(str); domStr = canonicalDomain(domStr); } /* * "The domain string and the string are identical. (Note that both the * domain string and the string will have been canonicalized to lower case at * this point)" */ if (str == domStr) { return true; } /* "All of the following [three] conditions hold:" (order adjusted from the RFC) */ /* "* The string is a host name (i.e., not an IP address)." */ if (net.isIP(str)) { return false; } /* "* The domain string is a suffix of the string" */ var idx = str.indexOf(domStr); if (idx <= 0) { return false; // it's a non-match (-1) or prefix (0) } // e.g "a.b.c".indexOf("b.c") === 2 // 5 === 3+2 if (str.length !== domStr.length + idx) { // it's not a suffix return false; } /* "* The last character of the string that is not included in the domain * string is a %x2E (".") character." */ if (str.substr(idx-1,1) !== '.') { return false; } return true; } // RFC6265 S5.1.4 Paths and Path-Match /* * "The user agent MUST use an algorithm equivalent to the following algorithm * to compute the default-path of a cookie:" * * Assumption: the path (and not query part or absolute uri) is passed in. */ function defaultPath(path) { // "2. 
If the uri-path is empty or if the first character of the uri-path is not // a %x2F ("/") character, output %x2F ("/") and skip the remaining steps. if (!path || path.substr(0,1) !== "/") { return "/"; } // "3. If the uri-path contains no more than one %x2F ("/") character, output // %x2F ("/") and skip the remaining step." if (path === "/") { return path; } var rightSlash = path.lastIndexOf("/"); if (rightSlash === 0) { return "/"; } // "4. Output the characters of the uri-path from the first character up to, // but not including, the right-most %x2F ("/")." return path.slice(0, rightSlash); } /* * "A request-path path-matches a given cookie-path if at least one of the * following conditions holds:" */ function pathMatch(reqPath,cookiePath) { // "o The cookie-path and the request-path are identical." if (cookiePath === reqPath) { return true; } var idx = reqPath.indexOf(cookiePath); if (idx === 0) { // "o The cookie-path is a prefix of the request-path, and the last // character of the cookie-path is %x2F ("/")." if (cookiePath.substr(-1) === "/") { return true; } // " o The cookie-path is a prefix of the request-path, and the first // character of the request-path that is not included in the cookie- path // is a %x2F ("/") character." if (reqPath.substr(cookiePath.length,1) === "/") { return true; } } return false; } function parse(str, strict) { str = str.trim(); // S4.1.1 Trailing semi-colons are not part of the specification. // If we are not in strict mode we remove the trailing semi-colons. var semiColonCheck = TRAILING_SEMICOLON.exec(str); if (semiColonCheck) { if (strict) { return; } str = str.slice(0, semiColonCheck.index); } // We use a regex to parse the "name-value-pair" part of S5.2 var firstSemi = str.indexOf(';'); // S5.2 step 1 var pairRx = strict ? COOKIE_PAIR_STRICT : COOKIE_PAIR; var result = pairRx.exec(firstSemi === -1 ? str : str.substr(0,firstSemi)); // Rx satisfies the "the name string is empty" and "lacks a %x3D ("=")" // constraints as well as trimming any whitespace. if (!result) { return; } var c = new Cookie(); c.key = result[1]; // the regexp should trim() already c.value = result[3]; // [2] is quotes or empty-string if (firstSemi === -1) { return c; } // S5.2.3 "unparsed-attributes consist of the remainder of the set-cookie-string // (including the %x3B (";") in question)." plus later on in the same section // "discard the first ";" and trim". var unparsed = str.slice(firstSemi).replace(/^\s*;\s*/,'').trim(); // "If the unparsed-attributes string is empty, skip the rest of these // steps." if (unparsed.length === 0) { return c; } /* * S5.2 says that when looping over the items "[p]rocess the attribute-name * and attribute-value according to the requirements in the following * subsections" for every item. Plus, for many of the individual attributes * in S5.3 it says to use the "attribute-value of the last attribute in the * cookie-attribute-list". Therefore, in this implementation, we overwrite * the previous value. 
*/ var cookie_avs = unparsed.split(/\s*;\s*/); while (cookie_avs.length) { var av = cookie_avs.shift(); if (strict && !EXTENSION_AV.test(av)) { return; } var av_sep = av.indexOf('='); var av_key, av_value; if (av_sep === -1) { av_key = av; av_value = null; } else { av_key = av.substr(0,av_sep); av_value = av.substr(av_sep+1); } av_key = av_key.trim().toLowerCase(); if (av_value) { av_value = av_value.trim(); } switch(av_key) { case 'expires': // S5.2.1 if (!av_value) {if(strict){return;}else{break;} } var exp = parseDate(av_value,strict); // "If the attribute-value failed to parse as a cookie date, ignore the // cookie-av." if (exp == null) { if(strict){return;}else{break;} } c.expires = exp; // over and underflow not realistically a concern: V8's getTime() seems to // store something larger than a 32-bit time_t (even with 32-bit node) break; case 'max-age': // S5.2.2 if (!av_value) { if(strict){return;}else{break;} } // "If the first character of the attribute-value is not a DIGIT or a "-" // character ...[or]... If the remainder of attribute-value contains a // non-DIGIT character, ignore the cookie-av." if (!/^-?[0-9]+$/.test(av_value)) { if(strict){return;}else{break;} } var delta = parseInt(av_value,10); if (strict && delta <= 0) { return; // S4.1.1 } // "If delta-seconds is less than or equal to zero (0), let expiry-time // be the earliest representable date and time." c.setMaxAge(delta); break; case 'domain': // S5.2.3 // "If the attribute-value is empty, the behavior is undefined. However, // the user agent SHOULD ignore the cookie-av entirely." if (!av_value) { if(strict){return;}else{break;} } // S5.2.3 "Let cookie-domain be the attribute-value without the leading %x2E // (".") character." var domain = av_value.trim().replace(/^\./,''); if (!domain) { if(strict){return;}else{break;} } // see "is empty" above // "Convert the cookie-domain to lower case." c.domain = domain.toLowerCase(); break; case 'path': // S5.2.4 /* * "If the attribute-value is empty or if the first character of the * attribute-value is not %x2F ("/"): * Let cookie-path be the default-path. * Otherwise: * Let cookie-path be the attribute-value." * * We'll represent the default-path as null since it depends on the * context of the parsing. */ if (!av_value || av_value.substr(0,1) != "/") { if(strict){return;}else{break;} } c.path = av_value; break; case 'secure': // S5.2.5 /* * "If the attribute-name case-insensitively matches the string "Secure", * the user agent MUST append an attribute to the cookie-attribute-list * with an attribute-name of Secure and an empty attribute-value." */ if (av_value != null) { if(strict){return;} } c.secure = true; break; case 'httponly': // S5.2.6 -- effectively the same as 'secure' if (av_value != null) { if(strict){return;} } c.httpOnly = true; break; default: c.extensions = c.extensions || []; c.extensions.push(av); break; } } // ensure a default date for sorting: c.creation = new Date(); return c; } function fromJSON(str) { if (!str) { return null; } var obj; try { obj = JSON.parse(str); } catch (e) { return null; } var c = new Cookie(); for (var i=0; i<numCookieProperties; i++) { var prop = cookieProperties[i]; if (obj[prop] == null) { continue; } if (prop === 'expires' || prop === 'creation' || prop === 'lastAccessed') { c[prop] = obj[prop] == "Infinity" ? 
"Infinity" : new Date(obj[prop]); } else { c[prop] = obj[prop]; } } // ensure a default date for sorting: c.creation = c.creation || new Date(); return c; } /* Section 5.4 part 2: * "* Cookies with longer paths are listed before cookies with * shorter paths. * * * Among cookies that have equal-length path fields, cookies with * earlier creation-times are listed before cookies with later * creation-times." */ function cookieCompare(a,b) { // descending for length: b CMP a var deltaLen = (b.path ? b.path.length : 0) - (a.path ? a.path.length : 0); if (deltaLen !== 0) { return deltaLen; } // ascending for time: a CMP b return (a.creation ? a.creation.getTime() : MAX_TIME) - (b.creation ? b.creation.getTime() : MAX_TIME); } // Gives the permutation of all possible domainMatch()es of a given domain. The // array is in shortest-to-longest order. Handy for indexing. function permuteDomain(domain) { var pubSuf = pubsuffix.getPublicSuffix(domain); if (!pubSuf) { return null; } if (pubSuf == domain) { return [domain]; } var prefix = domain.slice(0,-(pubSuf.length+1)); // ".example.com" var parts = prefix.split('.').reverse(); var cur = pubSuf; var permutations = [cur]; while (parts.length) { cur = parts.shift()+'.'+cur; permutations.push(cur); } return permutations; } // Gives the permutation of all possible pathMatch()es of a given path. The // array is in longest-to-shortest order. Handy for indexing. function permutePath(path) { if (path === '/') { return ['/']; } if (path.lastIndexOf('/') === path.length-1) { path = path.substr(0,path.length-1); } var permutations = [path]; while (path.length > 1) { var lindex = path.lastIndexOf('/'); if (lindex === 0) { break; } path = path.substr(0,lindex); permutations.push(path); } permutations.push('/'); return permutations; } function Cookie (opts) { if (typeof opts !== "object") { return; } Object.keys(opts).forEach(function (key) { if (Cookie.prototype.hasOwnProperty(key)) { this[key] = opts[key] || Cookie.prototype[key]; } }.bind(this)); } Cookie.parse = parse; Cookie.fromJSON = fromJSON; Cookie.prototype.key = ""; Cookie.prototype.value = ""; // the order in which the RFC has them: Cookie.prototype.expires = "Infinity"; // coerces to literal Infinity Cookie.prototype.maxAge = null; // takes precedence over expires for TTL Cookie.prototype.domain = null; Cookie.prototype.path = null; Cookie.prototype.secure = false; Cookie.prototype.httpOnly = false; Cookie.prototype.extensions = null; // set by the CookieJar: Cookie.prototype.hostOnly = null; // boolean when set Cookie.prototype.pathIsDefault = null; // boolean when set Cookie.prototype.creation = null; // Date when set; defaulted by Cookie.parse Cookie.prototype.lastAccessed = null; // Date when set var cookieProperties = Object.freeze(Object.keys(Cookie.prototype).map(function(p) { if (p instanceof Function) { return; } return p; })); var numCookieProperties = cookieProperties.length; Cookie.prototype.inspect = function inspect() { var now = Date.now(); return 'Cookie="'+this.toString() + '; hostOnly='+(this.hostOnly != null ? this.hostOnly : '?') + '; aAge='+(this.lastAccessed ? (now-this.lastAccessed.getTime())+'ms' : '?') + '; cAge='+(this.creation ? 
Cookie.prototype.inspect = function inspect() {
  var now = Date.now();
  return 'Cookie="'+this.toString() +
    '; hostOnly='+(this.hostOnly != null ? this.hostOnly : '?') +
    '; aAge='+(this.lastAccessed ? (now-this.lastAccessed.getTime())+'ms' : '?') +
    '; cAge='+(this.creation ? (now-this.creation.getTime())+'ms' : '?') +
    '"';
};

Cookie.prototype.validate = function validate() {
  if (!COOKIE_OCTETS.test(this.value)) {
    return false;
  }
  if (this.expires != Infinity && !(this.expires instanceof Date) && !parseDate(this.expires,true)) {
    return false;
  }
  if (this.maxAge != null && this.maxAge <= 0) {
    return false; // "Max-Age=" non-zero-digit *DIGIT
  }
  if (this.path != null && !PATH_VALUE.test(this.path)) {
    return false;
  }

  var cdomain = this.cdomain();
  if (cdomain) {
    if (cdomain.match(/\.$/)) {
      return false; // S4.1.2.3 suggests that this is bad. domainMatch() tests confirm this
    }
    var suffix = pubsuffix.getPublicSuffix(cdomain);
    if (suffix == null) { // it's a public suffix
      return false;
    }
  }
  return true;
};

Cookie.prototype.setExpires = function setExpires(exp) {
  if (exp instanceof Date) {
    this.expires = exp;
  } else {
    this.expires = parseDate(exp) || "Infinity";
  }
};

Cookie.prototype.setMaxAge = function setMaxAge(age) {
  if (age === Infinity || age === -Infinity) {
    this.maxAge = age.toString(); // so JSON.stringify() works
  } else {
    this.maxAge = age;
  }
};

// gives Cookie header format
Cookie.prototype.cookieString = function cookieString() {
  var val = this.value;
  if (val == null) {
    val = '';
  }
  return this.key+'='+val;
};

// gives Set-Cookie header format
Cookie.prototype.toString = function toString() {
  var str = this.cookieString();

  if (this.expires != Infinity) {
    if (this.expires instanceof Date) {
      str += '; Expires='+formatDate(this.expires);
    } else {
      str += '; Expires='+this.expires;
    }
  }

  if (this.maxAge != null && this.maxAge != Infinity) {
    str += '; Max-Age='+this.maxAge;
  }

  if (this.domain && !this.hostOnly) {
    str += '; Domain='+this.domain;
  }
  if (this.path) {
    str += '; Path='+this.path;
  }

  if (this.secure) {
    str += '; Secure';
  }
  if (this.httpOnly) {
    str += '; HttpOnly';
  }
  if (this.extensions) {
    this.extensions.forEach(function(ext) {
      str += '; '+ext;
    });
  }

  return str;
};

// TTL() partially replaces the "expiry-time" parts of S5.3 step 3 (setCookie()
// elsewhere)
// S5.3 says to give the "latest representable date" for which we use Infinity
// For "expired" we use 0
Cookie.prototype.TTL = function TTL(now) {
  /* RFC6265 S4.1.2.2 If a cookie has both the Max-Age and the Expires
   * attribute, the Max-Age attribute has precedence and controls the
   * expiration date of the cookie.
   * (Concurs with S5.3 step 3)
   */
  if (this.maxAge != null) {
    return this.maxAge<=0 ? 0 : this.maxAge*1000;
  }

  var expires = this.expires;
  if (expires != Infinity) {
    if (!(expires instanceof Date)) {
      expires = parseDate(expires) || Infinity;
    }

    if (expires == Infinity) {
      return Infinity;
    }

    return expires.getTime() - (now || Date.now());
  }

  return Infinity;
};
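// Usage sketch (not part of the original source): Max-Age takes precedence
// over Expires, so a freshly parsed cookie behaves roughly like this:
//
//   var c = Cookie.parse('id=a3fWa; Max-Age=60; Secure');
//   c.TTL();          // => 60000 (milliseconds, derived from Max-Age)
//   c.cookieString(); // => 'id=a3fWa'
//   c.toString();     // => 'id=a3fWa; Max-Age=60; Secure'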
// expiryTime() replaces the "expiry-time" parts of S5.3 step 3 (setCookie()
// elsewhere)
Cookie.prototype.expiryTime = function expiryTime(now) {
  if (this.maxAge != null) {
    var relativeTo = this.creation || now || new Date();
    var age = (this.maxAge <= 0) ? -Infinity : this.maxAge*1000;
    return relativeTo.getTime() + age;
  }

  if (this.expires == Infinity) {
    return Infinity;
  }
  return this.expires.getTime();
};

// expiryDate() replaces the "expiry-time" parts of S5.3 step 3 (setCookie()
// elsewhere), except it returns a Date
Cookie.prototype.expiryDate = function expiryDate(now) {
  var millisec = this.expiryTime(now);
  if (millisec == Infinity) {
    return new Date(MAX_TIME);
  } else if (millisec == -Infinity) {
    return new Date(MIN_TIME);
  } else {
    return new Date(millisec);
  }
};

// This replaces the "persistent-flag" parts of S5.3 step 3
Cookie.prototype.isPersistent = function isPersistent() {
  return (this.maxAge != null || this.expires != Infinity);
};

// Mostly S5.1.2 and S5.2.3:
Cookie.prototype.cdomain =
Cookie.prototype.canonicalizedDomain = function canonicalizedDomain() {
  if (this.domain == null) {
    return null;
  }
  return canonicalDomain(this.domain);
};

var memstore;

function CookieJar(store, rejectPublicSuffixes) {
  if (rejectPublicSuffixes != null) {
    this.rejectPublicSuffixes = rejectPublicSuffixes;
  }
  if (!store) {
    memstore = memstore || require('./memstore');
    store = new memstore.MemoryCookieStore();
  }
  this.store = store;
}
CookieJar.prototype.store = null;
CookieJar.prototype.rejectPublicSuffixes = true;

var CAN_BE_SYNC = [];

CAN_BE_SYNC.push('setCookie');
CookieJar.prototype.setCookie = function(cookie, url, options, cb) {
  var err;
  var context = (url instanceof Object) ? url : urlParse(url);
  if (options instanceof Function) {
    cb = options;
    options = {};
  }

  var host = canonicalDomain(context.hostname);

  // S5.3 step 1
  if (!(cookie instanceof Cookie)) {
    cookie = Cookie.parse(cookie, options.strict === true);
  }
  if (!cookie) {
    err = new Error("Cookie failed to parse");
    return cb(options.ignoreError ? null : err);
  }

  // S5.3 step 2
  var now = options.now || new Date(); // will assign later to save effort in the face of errors

  // S5.3 step 3: NOOP; persistent-flag and expiry-time is handled by getCookie()

  // S5.3 step 4: NOOP; domain is null by default

  // S5.3 step 5: public suffixes
  if (this.rejectPublicSuffixes && cookie.domain) {
    var suffix = pubsuffix.getPublicSuffix(cookie.cdomain());
    if (suffix == null) { // e.g. "com"
      err = new Error("Cookie has domain set to a public suffix");
      return cb(options.ignoreError ? null : err);
    }
  }

  // S5.3 step 6:
  if (cookie.domain) {
    if (!domainMatch(host, cookie.cdomain(), false)) {
      err = new Error("Cookie not in this host's domain. Cookie:"+cookie.cdomain()+" Request:"+host);
      return cb(options.ignoreError ? null : err);
    }

    if (cookie.hostOnly == null) { // don't reset if already set
      cookie.hostOnly = false;
    }

  } else {
    cookie.hostOnly = true;
    cookie.domain = host;
  }

  // S5.3 step 7: "Otherwise, set the cookie's path to the default-path of the
  // request-uri"
  if (!cookie.path) {
    cookie.path = defaultPath(context.pathname);
    cookie.pathIsDefault = true;
  } else {
    if (cookie.path.length > 1 && cookie.path.substr(-1) == '/') {
      cookie.path = cookie.path.slice(0,-1);
    }
  }

  // S5.3 step 8: NOOP; secure attribute

  // S5.3 step 9: NOOP; httpOnly attribute

  // S5.3 step 10
  if (options.http === false && cookie.httpOnly) {
    err = new Error("Cookie is HttpOnly and this isn't an HTTP API");
    return cb(options.ignoreError ? null : err);
  }
  var store = this.store;

  if (!store.updateCookie) {
    store.updateCookie = function(oldCookie, newCookie, cb) {
      this.putCookie(newCookie, cb);
    };
  }

  function withCookie(err, oldCookie) {
    if (err) {
      return cb(err);
    }

    var next = function(err) {
      if (err) {
        return cb(err);
      } else {
        cb(null, cookie);
      }
    };

    if (oldCookie) {
      // S5.3 step 11 - "If the cookie store contains a cookie with the same name,
      // domain, and path as the newly created cookie:"
      if (options.http === false && oldCookie.httpOnly) { // step 11.2
        err = new Error("old Cookie is HttpOnly and this isn't an HTTP API");
        return cb(options.ignoreError ? null : err);
      }
      cookie.creation = oldCookie.creation; // step 11.3
      cookie.lastAccessed = now;
      // Step 11.4 (delete cookie) is implied by just setting the new one:
      store.updateCookie(oldCookie, cookie, next); // step 12

    } else {
      cookie.creation = cookie.lastAccessed = now;
      store.putCookie(cookie, next); // step 12
    }
  }

  store.findCookie(cookie.domain, cookie.path, cookie.key, withCookie);
};

// RFC6265 S5.4
CAN_BE_SYNC.push('getCookies');
CookieJar.prototype.getCookies = function(url, options, cb) {
  var context = (url instanceof Object) ? url : urlParse(url);
  if (options instanceof Function) {
    cb = options;
    options = {};
  }

  var host = canonicalDomain(context.hostname);
  var path = context.pathname || '/';

  var secure = options.secure;
  if (secure == null && context.protocol &&
      (context.protocol == 'https:' || context.protocol == 'wss:'))
  {
    secure = true;
  }

  var http = options.http;
  if (http == null) {
    http = true;
  }

  var now = options.now || Date.now();
  var expireCheck = options.expire !== false;
  var allPaths = !!options.allPaths;
  var store = this.store;

  function matchingCookie(c) {
    // "Either:
    //   The cookie's host-only-flag is true and the canonicalized
    //   request-host is identical to the cookie's domain.
    // Or:
    //   The cookie's host-only-flag is false and the canonicalized
    //   request-host domain-matches the cookie's domain."
    if (c.hostOnly) {
      if (c.domain != host) {
        return false;
      }
    } else {
      if (!domainMatch(host, c.domain, false)) {
        return false;
      }
    }

    // "The request-uri's path path-matches the cookie's path."
    if (!allPaths && !pathMatch(path, c.path)) {
      return false;
    }

    // "If the cookie's secure-only-flag is true, then the request-uri's
    // scheme must denote a "secure" protocol"
    if (c.secure && !secure) {
      return false;
    }

    // "If the cookie's http-only-flag is true, then exclude the cookie if the
    // cookie-string is being generated for a "non-HTTP" API"
    if (c.httpOnly && !http) {
      return false;
    }

    // deferred from S5.3
    // non-RFC: allow retention of expired cookies by choice
    if (expireCheck && c.expiryTime() <= now) {
      store.removeCookie(c.domain, c.path, c.key, function(){}); // result ignored
      return false;
    }

    return true;
  }

  store.findCookies(host, allPaths ? null : path, function(err,cookies) {
    if (err) {
      return cb(err);
    }

    cookies = cookies.filter(matchingCookie);

    // sorting of S5.4 part 2
    if (options.sort !== false) {
      cookies = cookies.sort(cookieCompare);
    }

    // S5.4 part 3
    var now = new Date();
    cookies.forEach(function(c) {
      c.lastAccessed = now;
    });
    // TODO persist lastAccessed

    cb(null,cookies);
  });
};
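// Usage sketch (not part of the original source): setCookie() and getCookies()
// above are the asynchronous core of the jar; the *Sync variants generated by
// syncWrap() further below require a synchronous store such as the default
// MemoryCookieStore. Assuming the module is required as `tough-cookie`:
//
//   var tough = require('tough-cookie');
//   var jar = new tough.CookieJar(); // MemoryCookieStore by default
//   jar.setCookie('k=v; Path=/', 'https://example.com/', function(err, cookie) {
//     jar.getCookies('https://example.com/foo', function(err, cookies) {
//       // cookies is an array of matching Cookie objects, longest path first
//     });
//   });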
CAN_BE_SYNC.push('getCookieString');
CookieJar.prototype.getCookieString = function(/*..., cb*/) {
  var args = Array.prototype.slice.call(arguments,0);
  var cb = args.pop();
  var next = function(err,cookies) {
    if (err) {
      cb(err);
    } else {
      cb(null, cookies.map(function(c){
        return c.cookieString();
      }).join('; '));
    }
  };
  args.push(next);
  this.getCookies.apply(this,args);
};

CAN_BE_SYNC.push('getSetCookieStrings');
CookieJar.prototype.getSetCookieStrings = function(/*..., cb*/) {
  var args = Array.prototype.slice.call(arguments,0);
  var cb = args.pop();
  var next = function(err,cookies) {
    if (err) {
      cb(err);
    } else {
      cb(null, cookies.map(function(c){
        return c.toString();
      }));
    }
  };
  args.push(next);
  this.getCookies.apply(this,args);
};

// Use a closure to provide a true imperative API for synchronous stores.
function syncWrap(method) {
  return function() {
    if (!this.store.synchronous) {
      throw new Error('CookieJar store is not synchronous; use async API instead.');
    }

    var args = Array.prototype.slice.call(arguments);
    var syncErr, syncResult;
    args.push(function syncCb(err, result) {
      syncErr = err;
      syncResult = result;
    });
    this[method].apply(this, args);

    if (syncErr) {
      throw syncErr;
    }
    return syncResult;
  };
}

// wrap all declared CAN_BE_SYNC methods in the sync wrapper
CAN_BE_SYNC.forEach(function(method) {
  CookieJar.prototype[method+'Sync'] = syncWrap(method);
});

module.exports = {
  CookieJar: CookieJar,
  Cookie: Cookie,
  Store: Store,
  parseDate: parseDate,
  formatDate: formatDate,
  parse: parse,
  fromJSON: fromJSON,
  domainMatch: domainMatch,
  defaultPath: defaultPath,
  pathMatch: pathMatch,
  getPublicSuffix: pubsuffix.getPublicSuffix,
  cookieCompare: cookieCompare,
  permuteDomain: permuteDomain,
  permutePath: permutePath,
  canonicalDomain: canonicalDomain,
};

lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/lib/memstore.js

'use strict';
var tough = require('./cookie');
var Store = require('./store').Store;
var permuteDomain = tough.permuteDomain;
var permutePath = tough.permutePath;
var util = require('util');

function MemoryCookieStore() {
  Store.call(this);
  this.idx = {};
}
util.inherits(MemoryCookieStore, Store);
exports.MemoryCookieStore = MemoryCookieStore;

MemoryCookieStore.prototype.idx = null;
MemoryCookieStore.prototype.synchronous = true;

// force a default depth:
MemoryCookieStore.prototype.inspect = function() {
  return "{ idx: "+util.inspect(this.idx, false, 2)+' }';
};
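// Structure sketch (not part of the original source): the in-memory index is a
// plain object keyed by domain, then path, then cookie key, so after storing a
// single cookie it looks roughly like:
//
//   { 'example.com': { '/': { session: /* Cookie */ } } }
//
// The findCookie()/putCookie() methods below walk exactly this
// domain -> path -> key chain.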
MemoryCookieStore.prototype.findCookie = function(domain, path, key, cb) {
  if (!this.idx[domain]) {
    return cb(null,undefined);
  }
  if (!this.idx[domain][path]) {
    return cb(null,undefined);
  }
  return cb(null,this.idx[domain][path][key]||null);
};

MemoryCookieStore.prototype.findCookies = function(domain, path, cb) {
  var results = [];
  if (!domain) {
    return cb(null,[]);
  }

  var pathMatcher;
  if (!path) {
    // null or '/' means "all paths"
    pathMatcher = function matchAll(domainIndex) {
      for (var curPath in domainIndex) {
        var pathIndex = domainIndex[curPath];
        for (var key in pathIndex) {
          results.push(pathIndex[key]);
        }
      }
    };

  } else if (path === '/') {
    pathMatcher = function matchSlash(domainIndex) {
      var pathIndex = domainIndex['/'];
      if (!pathIndex) {
        return;
      }
      for (var key in pathIndex) {
        results.push(pathIndex[key]);
      }
    };

  } else {
    var paths = permutePath(path) || [path];
    pathMatcher = function matchRFC(domainIndex) {
      paths.forEach(function(curPath) {
        var pathIndex = domainIndex[curPath];
        if (!pathIndex) {
          return;
        }
        for (var key in pathIndex) {
          results.push(pathIndex[key]);
        }
      });
    };
  }

  var domains = permuteDomain(domain) || [domain];
  var idx = this.idx;
  domains.forEach(function(curDomain) {
    var domainIndex = idx[curDomain];
    if (!domainIndex) {
      return;
    }
    pathMatcher(domainIndex);
  });

  cb(null,results);
};

MemoryCookieStore.prototype.putCookie = function(cookie, cb) {
  if (!this.idx[cookie.domain]) {
    this.idx[cookie.domain] = {};
  }
  if (!this.idx[cookie.domain][cookie.path]) {
    this.idx[cookie.domain][cookie.path] = {};
  }
  this.idx[cookie.domain][cookie.path][cookie.key] = cookie;
  cb(null);
};

MemoryCookieStore.prototype.updateCookie = function updateCookie(oldCookie, newCookie, cb) {
  // updateCookie() may avoid updating cookies that are identical. For example,
  // lastAccessed may not be important to some stores and an equality
  // comparison could exclude that field.
  this.putCookie(newCookie,cb);
};

MemoryCookieStore.prototype.removeCookie = function removeCookie(domain, path, key, cb) {
  if (this.idx[domain] && this.idx[domain][path] && this.idx[domain][path][key]) {
    delete this.idx[domain][path][key];
  }
  cb(null);
};

MemoryCookieStore.prototype.removeCookies = function removeCookies(domain, path, cb) {
  if (this.idx[domain]) {
    if (path) {
      delete this.idx[domain][path];
    } else {
      delete this.idx[domain];
    }
  }
  return cb(null);
};

lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/lib/pubsuffix.js

/****************************************************
 * AUTOMATICALLY GENERATED by generate-pubsuffix.js *
 *                   DO NOT EDIT!                   *
 ****************************************************/
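// Usage sketch (not part of the original source): getPublicSuffix() below
// returns the registrable domain (public suffix plus one label) or null when
// the input is itself a public suffix. Assuming the generated index includes
// the "com" rule and the "*.kyoto.jp" wildcard rule, calls behave roughly like:
//
//   getPublicSuffix('www.example.com'); // => 'example.com'
//   getPublicSuffix('com');             // => null (itself a public suffix)
//   getPublicSuffix('foo.kyoto.jp');    // => null (covered by "*.kyoto.jp")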
module.exports.getPublicSuffix = function getPublicSuffix(domain) {
  /*
   * Copyright GoInstant, Inc. and other contributors. All rights reserved.
   * Permission is hereby granted, free of charge, to any person obtaining a copy
   * of this software and associated documentation files (the "Software"), to
   * deal in the Software without restriction, including without limitation the
   * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
   * sell copies of the Software, and to permit persons to whom the Software is
   * furnished to do so, subject to the following conditions:
   *
   * The above copyright notice and this permission notice shall be included in
   * all copies or substantial portions of the Software.
   *
   * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
   * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
   * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
   * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
   * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
   * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
   * IN THE SOFTWARE.
   */
  if (!domain) return null;
  if (domain.match(/^\./)) return null;

  domain = domain.toLowerCase();
  var parts = domain.split('.').reverse();

  var suffix = '';
  var suffixLen = 0;
  for (var i=0; i<parts.length; i++) {
    var part = parts[i];
    var starstr = '*'+suffix;
    var partstr = part+suffix;

    if (index[starstr]) { // star rule matches
      suffixLen = i+1;
      if (index[partstr] === false) { // exception rule matches (NB: false, not undefined)
        suffixLen--;
      }
    } else if (index[partstr]) { // exact match, not exception
      suffixLen = i+1;
    }

    suffix = '.'+part+suffix;
  }

  if (index['*'+suffix]) { // *.domain exists (e.g. *.kyoto.jp for domain='kyoto.jp');
    return null;
  }

  if (suffixLen && parts.length > suffixLen) {
    return parts.slice(0,suffixLen+1).reverse().join('.');
  }

  return null;
};

// The following generated structure is used under the MPL version 1.1
// See public-suffix.txt for more information
var index = module.exports.index = Object.freeze(
{"ac":true,"com.ac":true,"edu.ac":true,"gov.ac":true,"net.ac":true,"mil.ac":true,"org.ac":true,"ad":true,"nom.ad":true,"ae":true,"co.ae":true,"net.ae":true,"org.ae":true,"sch.ae":true,"ac.ae":true,"gov.ae":true,"mil.ae":true,"aero":true,"accident-investigation.aero":true,"accident-prevention.aero":true,"aerobatic.aero":true,"aeroclub.aero":true,"aerodrome.aero":true,"agents.aero":true,"aircraft.aero":true,"airline.aero":true,"airport.aero":true,"air-surveillance.aero":true,"airtraffic.aero":true,"air-traffic-control.aero":true,"ambulance.aero":true,"amusement.aero":true,"association.aero":true,"author.aero":true,"ballooning.aero":true,"broker.aero":true,"caa.aero":true,"cargo.aero":true,"catering.aero":true,"certification.aero":true,"championship.aero":true,"charter.aero":true,"civilaviation.aero":true,"club.aero":true,"conference.aero":true,"consultant.aero":true,"consulting.aero":true,"control.aero":true,"council.aero":true,"crew.aero":true,"design.aero":true,"dgca.aero":true,"educator.aero":true,"emergency.aero":true,"engine.aero":true,"engineer.aero":true,"entertainment.aero":true,"equipment.aero":true,"exchange.aero":true,"express.aero":true,"federation.aero":true,"flight.aero":true,"freight.aero":true,"fuel.aero":true,"gliding.aero":true,"government.aero":true,"groundhandling.aero":true,"group.aero":true,"hanggliding.aero":true,"homebuilt.aero":true,"insurance.aero":true,"journal.aero":true,"journalist.aero":true,"leasing.aero":true,"logistics.aero":true,"magazine.aero":true,"maintenance.aero":true,"marketplace.aero":true,"media.aero":true,"microlight.aero":true,"modelling.aero":true,"navigation.aero":true,"parachuting.aero":true,"paragliding.aero":true,"passenger-association.aero":true,"pilot.aero":true,"press.aero":true,"production.aero":true,"recreation.aero":true,"repbody.aero":true,"res.aero":true,"research.aero":true,"rotorcraft.aero":true,"safety.aero":true,"scientist.aero":true,"services.aero":true,"show.aero":true,"skydiving.aero":true,"software.aero":true,"student.aero":true,"taxi.aero":true,"trader.aero":true,"trading.aero":true,"trainer.aero":true,"union.aero":true,"workinggroup.aero":true,"works.aero":true,"af":true,"gov.af":true,"com.af":true,"org.af":true,"net.af":true,"edu.af":true,"ag":true,"com.ag":true,"org.ag":true,"net.ag":true,"co.ag":true,"nom.ag":true,"ai":true,"off.ai":true,"com.ai":true,"net.ai":true,"org.ai":true,"al":true,"com.al":true,"edu.al":true,"gov.al":true,"mil.al":true,"net.al":true,"org.al":true,"am":true,"an":true,"com.an":true,"net.an":true,"org.an":true,"edu.an":true,"ao":true,"ed.ao":true,"gv.ao":true,"og.ao":true,"co.ao":true,"pb.ao":true,"it.ao":true,"aq":true,"*.ar":true,"congresodelalengua3.ar":false,"educ.ar":false,"gobiernoelectronico.ar":false,"mecon.ar":false,"nacion.ar":false,"nic.ar":false,"promocion.ar":false,"retina.ar":false,"uba.ar":false,"e164.arpa":true,"in-addr.arpa":true,"ip6.arpa":true,"iris.arpa":true,"uri.arpa":true,"urn.arpa":true,"as":true,"gov.as":true,"asia":true,"at":true,"ac.at":true,"co.at":true,"gv.at":true,"or.at":true,"com.au":true,"net.au":true,"org.au":true,"edu.au":true,"gov.au":true,"csiro.au":true,"asn.au":true,"id.au":true,"info.au":true,"conf.au":true,"oz.au":true,"act.au":true,"nsw.au":true,"nt.au":true,"qld.au":true,"sa.au":true,"tas.au":true,"vic.au":true,"wa.au":true,"act.edu.au":true,"nsw.edu.au":true,"nt.edu.au":true,"qld.edu.au":true,"sa.edu.au":true,"tas.edu.au":true,"vic.edu.au":true,"wa.edu.au":true,"act.gov.au":true,"nt.gov.au":true,"qld.gov.au":true,"sa.gov.au":true,"tas.gov.au":true,"vic.gov.au
":true,"wa.gov.au":true,"aw":true,"com.aw":true,"ax":true,"az":true,"com.az":true,"net.az":true,"int.az":true,"gov.az":true,"org.az":true,"edu.az":true,"info.az":true,"pp.az":true,"mil.az":true,"name.az":true,"pro.az":true,"biz.az":true,"ba":true,"org.ba":true,"net.ba":true,"edu.ba":true,"gov.ba":true,"mil.ba":true,"unsa.ba":true,"unbi.ba":true,"co.ba":true,"com.ba":true,"rs.ba":true,"bb":true,"biz.bb":true,"com.bb":true,"edu.bb":true,"gov.bb":true,"info.bb":true,"net.bb":true,"org.bb":true,"store.bb":true,"*.bd":true,"be":true,"ac.be":true,"bf":true,"gov.bf":true,"bg":true,"a.bg":true,"b.bg":true,"c.bg":true,"d.bg":true,"e.bg":true,"f.bg":true,"g.bg":true,"h.bg":true,"i.bg":true,"j.bg":true,"k.bg":true,"l.bg":true,"m.bg":true,"n.bg":true,"o.bg":true,"p.bg":true,"q.bg":true,"r.bg":true,"s.bg":true,"t.bg":true,"u.bg":true,"v.bg":true,"w.bg":true,"x.bg":true,"y.bg":true,"z.bg":true,"0.bg":true,"1.bg":true,"2.bg":true,"3.bg":true,"4.bg":true,"5.bg":true,"6.bg":true,"7.bg":true,"8.bg":true,"9.bg":true,"bh":true,"com.bh":true,"edu.bh":true,"net.bh":true,"org.bh":true,"gov.bh":true,"bi":true,"co.bi":true,"com.bi":true,"edu.bi":true,"or.bi":true,"org.bi":true,"biz":true,"bj":true,"asso.bj":true,"barreau.bj":true,"gouv.bj":true,"bm":true,"com.bm":true,"edu.bm":true,"gov.bm":true,"net.bm":true,"org.bm":true,"*.bn":true,"bo":true,"com.bo":true,"edu.bo":true,"gov.bo":true,"gob.bo":true,"int.bo":true,"org.bo":true,"net.bo":true,"mil.bo":true,"tv.bo":true,"br":true,"adm.br":true,"adv.br":true,"agr.br":true,"am.br":true,"arq.br":true,"art.br":true,"ato.br":true,"b.br":true,"bio.br":true,"blog.br":true,"bmd.br":true,"can.br":true,"cim.br":true,"cng.br":true,"cnt.br":true,"com.br":true,"coop.br":true,"ecn.br":true,"edu.br":true,"emp.br":true,"eng.br":true,"esp.br":true,"etc.br":true,"eti.br":true,"far.br":true,"flog.br":true,"fm.br":true,"fnd.br":true,"fot.br":true,"fst.br":true,"g12.br":true,"ggf.br":true,"gov.br":true,"imb.br":true,"ind.br":true,"inf.br":true,"jor.br":true,"jus.br":true,"lel.br":true,"mat.br":true,"med.br":true,"mil.br":true,"mus.br":true,"net.br":true,"nom.br":true,"not.br":true,"ntr.br":true,"odo.br":true,"org.br":true,"ppg.br":true,"pro.br":true,"psc.br":true,"psi.br":true,"qsl.br":true,"radio.br":true,"rec.br":true,"slg.br":true,"srv.br":true,"taxi.br":true,"teo.br":true,"tmp.br":true,"trd.br":true,"tur.br":true,"tv.br":true,"vet.br":true,"vlog.br":true,"wiki.br":true,"zlg.br":true,"bs":true,"com.bs":true,"net.bs":true,"org.bs":true,"edu.bs":true,"gov.bs":true,"bt":true,"com.bt":true,"edu.bt":true,"gov.bt":true,"net.bt":true,"org.bt":true,"bw":true,"co.bw":true,"org.bw":true,"by":true,"gov.by":true,"mil.by":true,"com.by":true,"of.by":true,"bz":true,"com.bz":true,"net.bz":true,"org.bz":true,"edu.bz":true,"gov.bz":true,"ca":true,"ab.ca":true,"bc.ca":true,"mb.ca":true,"nb.ca":true,"nf.ca":true,"nl.ca":true,"ns.ca":true,"nt.ca":true,"nu.ca":true,"on.ca":true,"pe.ca":true,"qc.ca":true,"sk.ca":true,"yk.ca":true,"gc.ca":true,"cat":true,"cc":true,"cd":true,"gov.cd":true,"cf":true,"cg":true,"ch":true,"ci":true,"org.ci":true,"or.ci":true,"com.ci":true,"co.ci":true,"edu.ci":true,"ed.ci":true,"ac.ci":true,"net.ci":true,"go.ci":true,"asso.ci":true,"xn--aroport-bya.ci":true,"int.ci":true,"presse.ci":true,"md.ci":true,"gouv.ci":true,"*.ck":true,"www.ck":false,"cl":true,"gov.cl":true,"gob.cl":true,"co.cl":true,"mil.cl":true,"cm":true,"gov.cm":true,"cn":true,"ac.cn":true,"com.cn":true,"edu.cn":true,"gov.cn":true,"net.cn":true,"org.cn":true,"mil.cn":true,"xn--55qx5d.cn":true,"xn--io0a7i.cn":true,"xn--
od0alg.cn":true,"ah.cn":true,"bj.cn":true,"cq.cn":true,"fj.cn":true,"gd.cn":true,"gs.cn":true,"gz.cn":true,"gx.cn":true,"ha.cn":true,"hb.cn":true,"he.cn":true,"hi.cn":true,"hl.cn":true,"hn.cn":true,"jl.cn":true,"js.cn":true,"jx.cn":true,"ln.cn":true,"nm.cn":true,"nx.cn":true,"qh.cn":true,"sc.cn":true,"sd.cn":true,"sh.cn":true,"sn.cn":true,"sx.cn":true,"tj.cn":true,"xj.cn":true,"xz.cn":true,"yn.cn":true,"zj.cn":true,"hk.cn":true,"mo.cn":true,"tw.cn":true,"co":true,"arts.co":true,"com.co":true,"edu.co":true,"firm.co":true,"gov.co":true,"info.co":true,"int.co":true,"mil.co":true,"net.co":true,"nom.co":true,"org.co":true,"rec.co":true,"web.co":true,"com":true,"coop":true,"cr":true,"ac.cr":true,"co.cr":true,"ed.cr":true,"fi.cr":true,"go.cr":true,"or.cr":true,"sa.cr":true,"cu":true,"com.cu":true,"edu.cu":true,"org.cu":true,"net.cu":true,"gov.cu":true,"inf.cu":true,"cv":true,"cx":true,"gov.cx":true,"*.cy":true,"cz":true,"de":true,"dj":true,"dk":true,"dm":true,"com.dm":true,"net.dm":true,"org.dm":true,"edu.dm":true,"gov.dm":true,"do":true,"art.do":true,"com.do":true,"edu.do":true,"gob.do":true,"gov.do":true,"mil.do":true,"net.do":true,"org.do":true,"sld.do":true,"web.do":true,"dz":true,"com.dz":true,"org.dz":true,"net.dz":true,"gov.dz":true,"edu.dz":true,"asso.dz":true,"pol.dz":true,"art.dz":true,"ec":true,"com.ec":true,"info.ec":true,"net.ec":true,"fin.ec":true,"k12.ec":true,"med.ec":true,"pro.ec":true,"org.ec":true,"edu.ec":true,"gov.ec":true,"gob.ec":true,"mil.ec":true,"edu":true,"ee":true,"edu.ee":true,"gov.ee":true,"riik.ee":true,"lib.ee":true,"med.ee":true,"com.ee":true,"pri.ee":true,"aip.ee":true,"org.ee":true,"fie.ee":true,"eg":true,"com.eg":true,"edu.eg":true,"eun.eg":true,"gov.eg":true,"mil.eg":true,"name.eg":true,"net.eg":true,"org.eg":true,"sci.eg":true,"*.er":true,"es":true,"com.es":true,"nom.es":true,"org.es":true,"gob.es":true,"edu.es":true,"*.et":true,"eu":true,"fi":true,"aland.fi":true,"*.fj":true,"*.fk":true,"fm":true,"fo":true,"fr":true,"com.fr":true,"asso.fr":true,"nom.fr":true,"prd.fr":true,"presse.fr":true,"tm.fr":true,"aeroport.fr":true,"assedic.fr":true,"avocat.fr":true,"avoues.fr":true,"cci.fr":true,"chambagri.fr":true,"chirurgiens-dentistes.fr":true,"experts-comptables.fr":true,"geometre-expert.fr":true,"gouv.fr":true,"greta.fr":true,"huissier-justice.fr":true,"medecin.fr":true,"notaires.fr":true,"pharmacien.fr":true,"port.fr":true,"veterinaire.fr":true,"ga":true,"gd":true,"ge":true,"com.ge":true,"edu.ge":true,"gov.ge":true,"org.ge":true,"mil.ge":true,"net.ge":true,"pvt.ge":true,"gf":true,"gg":true,"co.gg":true,"org.gg":true,"net.gg":true,"sch.gg":true,"gov.gg":true,"gh":true,"com.gh":true,"edu.gh":true,"gov.gh":true,"org.gh":true,"mil.gh":true,"gi":true,"com.gi":true,"ltd.gi":true,"gov.gi":true,"mod.gi":true,"edu.gi":true,"org.gi":true,"gl":true,"gm":true,"ac.gn":true,"com.gn":true,"edu.gn":true,"gov.gn":true,"org.gn":true,"net.gn":true,"gov":true,"gp":true,"com.gp":true,"net.gp":true,"mobi.gp":true,"edu.gp":true,"org.gp":true,"asso.gp":true,"gq":true,"gr":true,"com.gr":true,"edu.gr":true,"net.gr":true,"org.gr":true,"gov.gr":true,"gs":true,"*.gt":true,"www.gt":false,"*.gu":true,"gw":true,"gy":true,"co.gy":true,"com.gy":true,"net.gy":true,"hk":true,"com.hk":true,"edu.hk":true,"gov.hk":true,"idv.hk":true,"net.hk":true,"org.hk":true,"xn--55qx5d.hk":true,"xn--wcvs22d.hk":true,"xn--lcvr32d.hk":true,"xn--mxtq1m.hk":true,"xn--gmqw5a.hk":true,"xn--ciqpn.hk":true,"xn--gmq050i.hk":true,"xn--zf0avx.hk":true,"xn--io0a7i.hk":true,"xn--mk0axi.hk":true,"xn--od0alg.hk":true,"xn--od0aq3b.
hk":true,"xn--tn0ag.hk":true,"xn--uc0atv.hk":true,"xn--uc0ay4a.hk":true,"hm":true,"hn":true,"com.hn":true,"edu.hn":true,"org.hn":true,"net.hn":true,"mil.hn":true,"gob.hn":true,"hr":true,"iz.hr":true,"from.hr":true,"name.hr":true,"com.hr":true,"ht":true,"com.ht":true,"shop.ht":true,"firm.ht":true,"info.ht":true,"adult.ht":true,"net.ht":true,"pro.ht":true,"org.ht":true,"med.ht":true,"art.ht":true,"coop.ht":true,"pol.ht":true,"asso.ht":true,"edu.ht":true,"rel.ht":true,"gouv.ht":true,"perso.ht":true,"hu":true,"co.hu":true,"info.hu":true,"org.hu":true,"priv.hu":true,"sport.hu":true,"tm.hu":true,"2000.hu":true,"agrar.hu":true,"bolt.hu":true,"casino.hu":true,"city.hu":true,"erotica.hu":true,"erotika.hu":true,"film.hu":true,"forum.hu":true,"games.hu":true,"hotel.hu":true,"ingatlan.hu":true,"jogasz.hu":true,"konyvelo.hu":true,"lakas.hu":true,"media.hu":true,"news.hu":true,"reklam.hu":true,"sex.hu":true,"shop.hu":true,"suli.hu":true,"szex.hu":true,"tozsde.hu":true,"utazas.hu":true,"video.hu":true,"id":true,"ac.id":true,"co.id":true,"go.id":true,"mil.id":true,"net.id":true,"or.id":true,"sch.id":true,"web.id":true,"ie":true,"gov.ie":true,"*.il":true,"im":true,"co.im":true,"ltd.co.im":true,"plc.co.im":true,"net.im":true,"gov.im":true,"org.im":true,"nic.im":true,"ac.im":true,"in":true,"co.in":true,"firm.in":true,"net.in":true,"org.in":true,"gen.in":true,"ind.in":true,"nic.in":true,"ac.in":true,"edu.in":true,"res.in":true,"gov.in":true,"mil.in":true,"info":true,"int":true,"eu.int":true,"io":true,"com.io":true,"iq":true,"gov.iq":true,"edu.iq":true,"mil.iq":true,"com.iq":true,"org.iq":true,"net.iq":true,"ir":true,"ac.ir":true,"co.ir":true,"gov.ir":true,"id.ir":true,"net.ir":true,"org.ir":true,"sch.ir":true,"xn--mgba3a4f16a.ir":true,"xn--mgba3a4fra.ir":true,"is":true,"net.is":true,"com.is":true,"edu.is":true,"gov.is":true,"org.is":true,"int.is":true,"it":true,"gov.it":true,"edu.it":true,"agrigento.it":true,"ag.it":true,"alessandria.it":true,"al.it":true,"ancona.it":true,"an.it":true,"aosta.it":true,"aoste.it":true,"ao.it":true,"arezzo.it":true,"ar.it":true,"ascoli-piceno.it":true,"ascolipiceno.it":true,"ap.it":true,"asti.it":true,"at.it":true,"avellino.it":true,"av.it":true,"bari.it":true,"ba.it":true,"andria-barletta-trani.it":true,"andriabarlettatrani.it":true,"trani-barletta-andria.it":true,"tranibarlettaandria.it":true,"barletta-trani-andria.it":true,"barlettatraniandria.it":true,"andria-trani-barletta.it":true,"andriatranibarletta.it":true,"trani-andria-barletta.it":true,"traniandriabarletta.it":true,"bt.it":true,"belluno.it":true,"bl.it":true,"benevento.it":true,"bn.it":true,"bergamo.it":true,"bg.it":true,"biella.it":true,"bi.it":true,"bologna.it":true,"bo.it":true,"bolzano.it":true,"bozen.it":true,"balsan.it":true,"alto-adige.it":true,"altoadige.it":true,"suedtirol.it":true,"bz.it":true,"brescia.it":true,"bs.it":true,"brindisi.it":true,"br.it":true,"cagliari.it":true,"ca.it":true,"caltanissetta.it":true,"cl.it":true,"campobasso.it":true,"cb.it":true,"carboniaiglesias.it":true,"carbonia-iglesias.it":true,"iglesias-carbonia.it":true,"iglesiascarbonia.it":true,"ci.it":true,"caserta.it":true,"ce.it":true,"catania.it":true,"ct.it":true,"catanzaro.it":true,"cz.it":true,"chieti.it":true,"ch.it":true,"como.it":true,"co.it":true,"cosenza.it":true,"cs.it":true,"cremona.it":true,"cr.it":true,"crotone.it":true,"kr.it":true,"cuneo.it":true,"cn.it":true,"dell-ogliastra.it":true,"dellogliastra.it":true,"ogliastra.it":true,"og.it":true,"enna.it":true,"en.it":true,"ferrara.it":true,"fe.it":true,"fermo.it":true,"fm.it"
:true,"firenze.it":true,"florence.it":true,"fi.it":true,"foggia.it":true,"fg.it":true,"forli-cesena.it":true,"forlicesena.it":true,"cesena-forli.it":true,"cesenaforli.it":true,"fc.it":true,"frosinone.it":true,"fr.it":true,"genova.it":true,"genoa.it":true,"ge.it":true,"gorizia.it":true,"go.it":true,"grosseto.it":true,"gr.it":true,"imperia.it":true,"im.it":true,"isernia.it":true,"is.it":true,"laquila.it":true,"aquila.it":true,"aq.it":true,"la-spezia.it":true,"laspezia.it":true,"sp.it":true,"latina.it":true,"lt.it":true,"lecce.it":true,"le.it":true,"lecco.it":true,"lc.it":true,"livorno.it":true,"li.it":true,"lodi.it":true,"lo.it":true,"lucca.it":true,"lu.it":true,"macerata.it":true,"mc.it":true,"mantova.it":true,"mn.it":true,"massa-carrara.it":true,"massacarrara.it":true,"carrara-massa.it":true,"carraramassa.it":true,"ms.it":true,"matera.it":true,"mt.it":true,"medio-campidano.it":true,"mediocampidano.it":true,"campidano-medio.it":true,"campidanomedio.it":true,"vs.it":true,"messina.it":true,"me.it":true,"milano.it":true,"milan.it":true,"mi.it":true,"modena.it":true,"mo.it":true,"monza.it":true,"monza-brianza.it":true,"monzabrianza.it":true,"monzaebrianza.it":true,"monzaedellabrianza.it":true,"monza-e-della-brianza.it":true,"mb.it":true,"napoli.it":true,"naples.it":true,"na.it":true,"novara.it":true,"no.it":true,"nuoro.it":true,"nu.it":true,"oristano.it":true,"or.it":true,"padova.it":true,"padua.it":true,"pd.it":true,"palermo.it":true,"pa.it":true,"parma.it":true,"pr.it":true,"pavia.it":true,"pv.it":true,"perugia.it":true,"pg.it":true,"pescara.it":true,"pe.it":true,"pesaro-urbino.it":true,"pesarourbino.it":true,"urbino-pesaro.it":true,"urbinopesaro.it":true,"pu.it":true,"piacenza.it":true,"pc.it":true,"pisa.it":true,"pi.it":true,"pistoia.it":true,"pt.it":true,"pordenone.it":true,"pn.it":true,"potenza.it":true,"pz.it":true,"prato.it":true,"po.it":true,"ragusa.it":true,"rg.it":true,"ravenna.it":true,"ra.it":true,"reggio-calabria.it":true,"reggiocalabria.it":true,"rc.it":true,"reggio-emilia.it":true,"reggioemilia.it":true,"re.it":true,"rieti.it":true,"ri.it":true,"rimini.it":true,"rn.it":true,"roma.it":true,"rome.it":true,"rm.it":true,"rovigo.it":true,"ro.it":true,"salerno.it":true,"sa.it":true,"sassari.it":true,"ss.it":true,"savona.it":true,"sv.it":true,"siena.it":true,"si.it":true,"siracusa.it":true,"sr.it":true,"sondrio.it":true,"so.it":true,"taranto.it":true,"ta.it":true,"tempio-olbia.it":true,"tempioolbia.it":true,"olbia-tempio.it":true,"olbiatempio.it":true,"ot.it":true,"teramo.it":true,"te.it":true,"terni.it":true,"tr.it":true,"torino.it":true,"turin.it":true,"to.it":true,"trapani.it":true,"tp.it":true,"trento.it":true,"trentino.it":true,"tn.it":true,"treviso.it":true,"tv.it":true,"trieste.it":true,"ts.it":true,"udine.it":true,"ud.it":true,"varese.it":true,"va.it":true,"venezia.it":true,"venice.it":true,"ve.it":true,"verbania.it":true,"vb.it":true,"vercelli.it":true,"vc.it":true,"verona.it":true,"vr.it":true,"vibo-valentia.it":true,"vibovalentia.it":true,"vv.it":true,"vicenza.it":true,"vi.it":true,"viterbo.it":true,"vt.it":true,"je":true,"co.je":true,"org.je":true,"net.je":true,"sch.je":true,"gov.je":true,"*.jm":true,"jo":true,"com.jo":true,"org.jo":true,"net.jo":true,"edu.jo":true,"sch.jo":true,"gov.jo":true,"mil.jo":true,"name.jo":true,"jobs":true,"jp":true,"ac.jp":true,"ad.jp":true,"co.jp":true,"ed.jp":true,"go.jp":true,"gr.jp":true,"lg.jp":true,"ne.jp":true,"or.jp":true,"*.aichi.jp":true,"*.akita.jp":true,"*.aomori.jp":true,"*.chiba.jp":true,"*.ehime.jp":true,"*.fukui.jp":true,"*.fukuoka
.jp":true,"*.fukushima.jp":true,"*.gifu.jp":true,"*.gunma.jp":true,"*.hiroshima.jp":true,"*.hokkaido.jp":true,"*.hyogo.jp":true,"*.ibaraki.jp":true,"*.ishikawa.jp":true,"*.iwate.jp":true,"*.kagawa.jp":true,"*.kagoshima.jp":true,"*.kanagawa.jp":true,"*.kawasaki.jp":true,"*.kitakyushu.jp":true,"*.kobe.jp":true,"*.kochi.jp":true,"*.kumamoto.jp":true,"*.kyoto.jp":true,"*.mie.jp":true,"*.miyagi.jp":true,"*.miyazaki.jp":true,"*.nagano.jp":true,"*.nagasaki.jp":true,"*.nagoya.jp":true,"*.nara.jp":true,"*.niigata.jp":true,"*.oita.jp":true,"*.okayama.jp":true,"*.okinawa.jp":true,"*.osaka.jp":true,"*.saga.jp":true,"*.saitama.jp":true,"*.sapporo.jp":true,"*.sendai.jp":true,"*.shiga.jp":true,"*.shimane.jp":true,"*.shizuoka.jp":true,"*.tochigi.jp":true,"*.tokushima.jp":true,"*.tokyo.jp":true,"*.tottori.jp":true,"*.toyama.jp":true,"*.wakayama.jp":true,"*.yamagata.jp":true,"*.yamaguchi.jp":true,"*.yamanashi.jp":true,"*.yokohama.jp":true,"metro.tokyo.jp":false,"pref.aichi.jp":false,"pref.akita.jp":false,"pref.aomori.jp":false,"pref.chiba.jp":false,"pref.ehime.jp":false,"pref.fukui.jp":false,"pref.fukuoka.jp":false,"pref.fukushima.jp":false,"pref.gifu.jp":false,"pref.gunma.jp":false,"pref.hiroshima.jp":false,"pref.hokkaido.jp":false,"pref.hyogo.jp":false,"pref.ibaraki.jp":false,"pref.ishikawa.jp":false,"pref.iwate.jp":false,"pref.kagawa.jp":false,"pref.kagoshima.jp":false,"pref.kanagawa.jp":false,"pref.kochi.jp":false,"pref.kumamoto.jp":false,"pref.kyoto.jp":false,"pref.mie.jp":false,"pref.miyagi.jp":false,"pref.miyazaki.jp":false,"pref.nagano.jp":false,"pref.nagasaki.jp":false,"pref.nara.jp":false,"pref.niigata.jp":false,"pref.oita.jp":false,"pref.okayama.jp":false,"pref.okinawa.jp":false,"pref.osaka.jp":false,"pref.saga.jp":false,"pref.saitama.jp":false,"pref.shiga.jp":false,"pref.shimane.jp":false,"pref.shizuoka.jp":false,"pref.tochigi.jp":false,"pref.tokushima.jp":false,"pref.tottori.jp":false,"pref.toyama.jp":false,"pref.wakayama.jp":false,"pref.yamagata.jp":false,"pref.yamaguchi.jp":false,"pref.yamanashi.jp":false,"city.chiba.jp":false,"city.fukuoka.jp":false,"city.hiroshima.jp":false,"city.kawasaki.jp":false,"city.kitakyushu.jp":false,"city.kobe.jp":false,"city.kyoto.jp":false,"city.nagoya.jp":false,"city.niigata.jp":false,"city.okayama.jp":false,"city.osaka.jp":false,"city.saitama.jp":false,"city.sapporo.jp":false,"city.sendai.jp":false,"city.shizuoka.jp":false,"city.yokohama.jp":false,"*.ke":true,"kg":true,"org.kg":true,"net.kg":true,"com.kg":true,"edu.kg":true,"gov.kg":true,"mil.kg":true,"*.kh":true,"ki":true,"edu.ki":true,"biz.ki":true,"net.ki":true,"org.ki":true,"gov.ki":true,"info.ki":true,"com.ki":true,"km":true,"org.km":true,"nom.km":true,"gov.km":true,"prd.km":true,"tm.km":true,"edu.km":true,"mil.km":true,"ass.km":true,"com.km":true,"coop.km":true,"asso.km":true,"presse.km":true,"medecin.km":true,"notaires.km":true,"pharmaciens.km":true,"veterinaire.km":true,"gouv.km":true,"kn":true,"net.kn":true,"org.kn":true,"edu.kn":true,"gov.kn":true,"com.kp":true,"edu.kp":true,"gov.kp":true,"org.kp":true,"rep.kp":true,"tra.kp":true,"kr":true,"ac.kr":true,"co.kr":true,"es.kr":true,"go.kr":true,"hs.kr":true,"kg.kr":true,"mil.kr":true,"ms.kr":true,"ne.kr":true,"or.kr":true,"pe.kr":true,"re.kr":true,"sc.kr":true,"busan.kr":true,"chungbuk.kr":true,"chungnam.kr":true,"daegu.kr":true,"daejeon.kr":true,"gangwon.kr":true,"gwangju.kr":true,"gyeongbuk.kr":true,"gyeonggi.kr":true,"gyeongnam.kr":true,"incheon.kr":true,"jeju.kr":true,"jeonbuk.kr":true,"jeonnam.kr":true,"seoul.kr":true,"ulsan.kr":true,"*.kw":true,"ky":
true,"edu.ky":true,"gov.ky":true,"com.ky":true,"org.ky":true,"net.ky":true,"kz":true,"org.kz":true,"edu.kz":true,"net.kz":true,"gov.kz":true,"mil.kz":true,"com.kz":true,"la":true,"int.la":true,"net.la":true,"info.la":true,"edu.la":true,"gov.la":true,"per.la":true,"com.la":true,"org.la":true,"com.lb":true,"edu.lb":true,"gov.lb":true,"net.lb":true,"org.lb":true,"lc":true,"com.lc":true,"net.lc":true,"co.lc":true,"org.lc":true,"edu.lc":true,"gov.lc":true,"li":true,"lk":true,"gov.lk":true,"sch.lk":true,"net.lk":true,"int.lk":true,"com.lk":true,"org.lk":true,"edu.lk":true,"ngo.lk":true,"soc.lk":true,"web.lk":true,"ltd.lk":true,"assn.lk":true,"grp.lk":true,"hotel.lk":true,"com.lr":true,"edu.lr":true,"gov.lr":true,"org.lr":true,"net.lr":true,"ls":true,"co.ls":true,"org.ls":true,"lt":true,"gov.lt":true,"lu":true,"lv":true,"com.lv":true,"edu.lv":true,"gov.lv":true,"org.lv":true,"mil.lv":true,"id.lv":true,"net.lv":true,"asn.lv":true,"conf.lv":true,"ly":true,"com.ly":true,"net.ly":true,"gov.ly":true,"plc.ly":true,"edu.ly":true,"sch.ly":true,"med.ly":true,"org.ly":true,"id.ly":true,"ma":true,"co.ma":true,"net.ma":true,"gov.ma":true,"org.ma":true,"ac.ma":true,"press.ma":true,"mc":true,"tm.mc":true,"asso.mc":true,"md":true,"me":true,"co.me":true,"net.me":true,"org.me":true,"edu.me":true,"ac.me":true,"gov.me":true,"its.me":true,"priv.me":true,"mg":true,"org.mg":true,"nom.mg":true,"gov.mg":true,"prd.mg":true,"tm.mg":true,"edu.mg":true,"mil.mg":true,"com.mg":true,"mh":true,"mil":true,"mk":true,"com.mk":true,"org.mk":true,"net.mk":true,"edu.mk":true,"gov.mk":true,"inf.mk":true,"name.mk":true,"ml":true,"com.ml":true,"edu.ml":true,"gouv.ml":true,"gov.ml":true,"net.ml":true,"org.ml":true,"presse.ml":true,"*.mm":true,"mn":true,"gov.mn":true,"edu.mn":true,"org.mn":true,"mo":true,"com.mo":true,"net.mo":true,"org.mo":true,"edu.mo":true,"gov.mo":true,"mobi":true,"mp":true,"mq":true,"mr":true,"gov.mr":true,"ms":true,"*.mt":true,"mu":true,"com.mu":true,"net.mu":true,"org.mu":true,"gov.mu":true,"ac.mu":true,"co.mu":true,"or.mu":true,"museum":true,"academy.museum":true,"agriculture.museum":true,"air.museum":true,"airguard.museum":true,"alabama.museum":true,"alaska.museum":true,"amber.museum":true,"ambulance.museum":true,"american.museum":true,"americana.museum":true,"americanantiques.museum":true,"americanart.museum":true,"amsterdam.museum":true,"and.museum":true,"annefrank.museum":true,"anthro.museum":true,"anthropology.museum":true,"antiques.museum":true,"aquarium.museum":true,"arboretum.museum":true,"archaeological.museum":true,"archaeology.museum":true,"architecture.museum":true,"art.museum":true,"artanddesign.museum":true,"artcenter.museum":true,"artdeco.museum":true,"arteducation.museum":true,"artgallery.museum":true,"arts.museum":true,"artsandcrafts.museum":true,"asmatart.museum":true,"assassination.museum":true,"assisi.museum":true,"association.museum":true,"astronomy.museum":true,"atlanta.museum":true,"austin.museum":true,"australia.museum":true,"automotive.museum":true,"aviation.museum":true,"axis.museum":true,"badajoz.museum":true,"baghdad.museum":true,"bahn.museum":true,"bale.museum":true,"baltimore.museum":true,"barcelona.museum":true,"baseball.museum":true,"basel.museum":true,"baths.museum":true,"bauern.museum":true,"beauxarts.museum":true,"beeldengeluid.museum":true,"bellevue.museum":true,"bergbau.museum":true,"berkeley.museum":true,"berlin.museum":true,"bern.museum":true,"bible.museum":true,"bilbao.museum":true,"bill.museum":true,"birdart.museum":true,"birthplace.museum":true,"bonn.museum":true,"boston.mu
seum":true,"botanical.museum":true,"botanicalgarden.museum":true,"botanicgarden.museum":true,"botany.museum":true,"brandywinevalley.museum":true,"brasil.museum":true,"bristol.museum":true,"british.museum":true,"britishcolumbia.museum":true,"broadcast.museum":true,"brunel.museum":true,"brussel.museum":true,"brussels.museum":true,"bruxelles.museum":true,"building.museum":true,"burghof.museum":true,"bus.museum":true,"bushey.museum":true,"cadaques.museum":true,"california.museum":true,"cambridge.museum":true,"can.museum":true,"canada.museum":true,"capebreton.museum":true,"carrier.museum":true,"cartoonart.museum":true,"casadelamoneda.museum":true,"castle.museum":true,"castres.museum":true,"celtic.museum":true,"center.museum":true,"chattanooga.museum":true,"cheltenham.museum":true,"chesapeakebay.museum":true,"chicago.museum":true,"children.museum":true,"childrens.museum":true,"childrensgarden.museum":true,"chiropractic.museum":true,"chocolate.museum":true,"christiansburg.museum":true,"cincinnati.museum":true,"cinema.museum":true,"circus.museum":true,"civilisation.museum":true,"civilization.museum":true,"civilwar.museum":true,"clinton.museum":true,"clock.museum":true,"coal.museum":true,"coastaldefence.museum":true,"cody.museum":true,"coldwar.museum":true,"collection.museum":true,"colonialwilliamsburg.museum":true,"coloradoplateau.museum":true,"columbia.museum":true,"columbus.museum":true,"communication.museum":true,"communications.museum":true,"community.museum":true,"computer.museum":true,"computerhistory.museum":true,"xn--comunicaes-v6a2o.museum":true,"contemporary.museum":true,"contemporaryart.museum":true,"convent.museum":true,"copenhagen.museum":true,"corporation.museum":true,"xn--correios-e-telecomunicaes-ghc29a.museum":true,"corvette.museum":true,"costume.museum":true,"countryestate.museum":true,"county.museum":true,"crafts.museum":true,"cranbrook.museum":true,"creation.museum":true,"cultural.museum":true,"culturalcenter.museum":true,"culture.museum":true,"cyber.museum":true,"cymru.museum":true,"dali.museum":true,"dallas.museum":true,"database.museum":true,"ddr.museum":true,"decorativearts.museum":true,"delaware.museum":true,"delmenhorst.museum":true,"denmark.museum":true,"depot.museum":true,"design.museum":true,"detroit.museum":true,"dinosaur.museum":true,"discovery.museum":true,"dolls.museum":true,"donostia.museum":true,"durham.museum":true,"eastafrica.museum":true,"eastcoast.museum":true,"education.museum":true,"educational.museum":true,"egyptian.museum":true,"eisenbahn.museum":true,"elburg.museum":true,"elvendrell.museum":true,"embroidery.museum":true,"encyclopedic.museum":true,"england.museum":true,"entomology.museum":true,"environment.museum":true,"environmentalconservation.museum":true,"epilepsy.museum":true,"essex.museum":true,"estate.museum":true,"ethnology.museum":true,"exeter.museum":true,"exhibition.museum":true,"family.museum":true,"farm.museum":true,"farmequipment.museum":true,"farmers.museum":true,"farmstead.museum":true,"field.museum":true,"figueres.museum":true,"filatelia.museum":true,"film.museum":true,"fineart.museum":true,"finearts.museum":true,"finland.museum":true,"flanders.museum":true,"florida.museum":true,"force.museum":true,"fortmissoula.museum":true,"fortworth.museum":true,"foundation.museum":true,"francaise.museum":true,"frankfurt.museum":true,"franziskaner.museum":true,"freemasonry.museum":true,"freiburg.museum":true,"fribourg.museum":true,"frog.museum":true,"fundacio.museum":true,"furniture.museum":true,"gallery.museum":true,"garden.museum":true,"gateway.museum
":true,"geelvinck.museum":true,"gemological.museum":true,"geology.museum":true,"georgia.museum":true,"giessen.museum":true,"glas.museum":true,"glass.museum":true,"gorge.museum":true,"grandrapids.museum":true,"graz.museum":true,"guernsey.museum":true,"halloffame.museum":true,"hamburg.museum":true,"handson.museum":true,"harvestcelebration.museum":true,"hawaii.museum":true,"health.museum":true,"heimatunduhren.museum":true,"hellas.museum":true,"helsinki.museum":true,"hembygdsforbund.museum":true,"heritage.museum":true,"histoire.museum":true,"historical.museum":true,"historicalsociety.museum":true,"historichouses.museum":true,"historisch.museum":true,"historisches.museum":true,"history.museum":true,"historyofscience.museum":true,"horology.museum":true,"house.museum":true,"humanities.museum":true,"illustration.museum":true,"imageandsound.museum":true,"indian.museum":true,"indiana.museum":true,"indianapolis.museum":true,"indianmarket.museum":true,"intelligence.museum":true,"interactive.museum":true,"iraq.museum":true,"iron.museum":true,"isleofman.museum":true,"jamison.museum":true,"jefferson.museum":true,"jerusalem.museum":true,"jewelry.museum":true,"jewish.museum":true,"jewishart.museum":true,"jfk.museum":true,"journalism.museum":true,"judaica.museum":true,"judygarland.museum":true,"juedisches.museum":true,"juif.museum":true,"karate.museum":true,"karikatur.museum":true,"kids.museum":true,"koebenhavn.museum":true,"koeln.museum":true,"kunst.museum":true,"kunstsammlung.museum":true,"kunstunddesign.museum":true,"labor.museum":true,"labour.museum":true,"lajolla.museum":true,"lancashire.museum":true,"landes.museum":true,"lans.museum":true,"xn--lns-qla.museum":true,"larsson.museum":true,"lewismiller.museum":true,"lincoln.museum":true,"linz.museum":true,"living.museum":true,"livinghistory.museum":true,"localhistory.museum":true,"london.museum":true,"losangeles.museum":true,"louvre.museum":true,"loyalist.museum":true,"lucerne.museum":true,"luxembourg.museum":true,"luzern.museum":true,"mad.museum":true,"madrid.museum":true,"mallorca.museum":true,"manchester.museum":true,"mansion.museum":true,"mansions.museum":true,"manx.museum":true,"marburg.museum":true,"maritime.museum":true,"maritimo.museum":true,"maryland.museum":true,"marylhurst.museum":true,"media.museum":true,"medical.museum":true,"medizinhistorisches.museum":true,"meeres.museum":true,"memorial.museum":true,"mesaverde.museum":true,"michigan.museum":true,"midatlantic.museum":true,"military.museum":true,"mill.museum":true,"miners.museum":true,"mining.museum":true,"minnesota.museum":true,"missile.museum":true,"missoula.museum":true,"modern.museum":true,"moma.museum":true,"money.museum":true,"monmouth.museum":true,"monticello.museum":true,"montreal.museum":true,"moscow.museum":true,"motorcycle.museum":true,"muenchen.museum":true,"muenster.museum":true,"mulhouse.museum":true,"muncie.museum":true,"museet.museum":true,"museumcenter.museum":true,"museumvereniging.museum":true,"music.museum":true,"national.museum":true,"nationalfirearms.museum":true,"nationalheritage.museum":true,"nativeamerican.museum":true,"naturalhistory.museum":true,"naturalhistorymuseum.museum":true,"naturalsciences.museum":true,"nature.museum":true,"naturhistorisches.museum":true,"natuurwetenschappen.museum":true,"naumburg.museum":true,"naval.museum":true,"nebraska.museum":true,"neues.museum":true,"newhampshire.museum":true,"newjersey.museum":true,"newmexico.museum":true,"newport.museum":true,"newspaper.museum":true,"newyork.museum":true,"niepce.museum":true,"norfolk.museum":true,"nort
h.museum":true,"nrw.museum":true,"nuernberg.museum":true,"nuremberg.museum":true,"nyc.museum":true,"nyny.museum":true,"oceanographic.museum":true,"oceanographique.museum":true,"omaha.museum":true,"online.museum":true,"ontario.museum":true,"openair.museum":true,"oregon.museum":true,"oregontrail.museum":true,"otago.museum":true,"oxford.museum":true,"pacific.museum":true,"paderborn.museum":true,"palace.museum":true,"paleo.museum":true,"palmsprings.museum":true,"panama.museum":true,"paris.museum":true,"pasadena.museum":true,"pharmacy.museum":true,"philadelphia.museum":true,"philadelphiaarea.museum":true,"philately.museum":true,"phoenix.museum":true,"photography.museum":true,"pilots.museum":true,"pittsburgh.museum":true,"planetarium.museum":true,"plantation.museum":true,"plants.museum":true,"plaza.museum":true,"portal.museum":true,"portland.museum":true,"portlligat.museum":true,"posts-and-telecommunications.museum":true,"preservation.museum":true,"presidio.museum":true,"press.museum":true,"project.museum":true,"public.museum":true,"pubol.museum":true,"quebec.museum":true,"railroad.museum":true,"railway.museum":true,"research.museum":true,"resistance.museum":true,"riodejaneiro.museum":true,"rochester.museum":true,"rockart.museum":true,"roma.museum":true,"russia.museum":true,"saintlouis.museum":true,"salem.museum":true,"salvadordali.museum":true,"salzburg.museum":true,"sandiego.museum":true,"sanfrancisco.museum":true,"santabarbara.museum":true,"santacruz.museum":true,"santafe.museum":true,"saskatchewan.museum":true,"satx.museum":true,"savannahga.museum":true,"schlesisches.museum":true,"schoenbrunn.museum":true,"schokoladen.museum":true,"school.museum":true,"schweiz.museum":true,"science.museum":true,"scienceandhistory.museum":true,"scienceandindustry.museum":true,"sciencecenter.museum":true,"sciencecenters.museum":true,"science-fiction.museum":true,"sciencehistory.museum":true,"sciences.museum":true,"sciencesnaturelles.museum":true,"scotland.museum":true,"seaport.museum":true,"settlement.museum":true,"settlers.museum":true,"shell.museum":true,"sherbrooke.museum":true,"sibenik.museum":true,"silk.museum":true,"ski.museum":true,"skole.museum":true,"society.museum":true,"sologne.museum":true,"soundandvision.museum":true,"southcarolina.museum":true,"southwest.museum":true,"space.museum":true,"spy.museum":true,"square.museum":true,"stadt.museum":true,"stalbans.museum":true,"starnberg.museum":true,"state.museum":true,"stateofdelaware.museum":true,"station.museum":true,"steam.museum":true,"steiermark.museum":true,"stjohn.museum":true,"stockholm.museum":true,"stpetersburg.museum":true,"stuttgart.museum":true,"suisse.museum":true,"surgeonshall.museum":true,"surrey.museum":true,"svizzera.museum":true,"sweden.museum":true,"sydney.museum":true,"tank.museum":true,"tcm.museum":true,"technology.museum":true,"telekommunikation.museum":true,"television.museum":true,"texas.museum":true,"textile.museum":true,"theater.museum":true,"time.museum":true,"timekeeping.museum":true,"topology.museum":true,"torino.museum":true,"touch.museum":true,"town.museum":true,"transport.museum":true,"tree.museum":true,"trolley.museum":true,"trust.museum":true,"trustee.museum":true,"uhren.museum":true,"ulm.museum":true,"undersea.museum":true,"university.museum":true,"usa.museum":true,"usantiques.museum":true,"usarts.museum":true,"uscountryestate.museum":true,"usculture.museum":true,"usdecorativearts.museum":true,"usgarden.museum":true,"ushistory.museum":true,"ushuaia.museum":true,"uslivinghistory.museum":true,"utah.museum":true,"uvic.muse
um":true,"valley.museum":true,"vantaa.museum":true,"versailles.museum":true,"viking.museum":true,"village.museum":true,"virginia.museum":true,"virtual.museum":true,"virtuel.museum":true,"vlaanderen.museum":true,"volkenkunde.museum":true,"wales.museum":true,"wallonie.museum":true,"war.museum":true,"washingtondc.museum":true,"watchandclock.museum":true,"watch-and-clock.museum":true,"western.museum":true,"westfalen.museum":true,"whaling.museum":true,"wildlife.museum":true,"williamsburg.museum":true,"windmill.museum":true,"workshop.museum":true,"york.museum":true,"yorkshire.museum":true,"yosemite.museum":true,"youth.museum":true,"zoological.museum":true,"zoology.museum":true,"xn--9dbhblg6di.museum":true,"xn--h1aegh.museum":true,"mv":true,"aero.mv":true,"biz.mv":true,"com.mv":true,"coop.mv":true,"edu.mv":true,"gov.mv":true,"info.mv":true,"int.mv":true,"mil.mv":true,"museum.mv":true,"name.mv":true,"net.mv":true,"org.mv":true,"pro.mv":true,"mw":true,"ac.mw":true,"biz.mw":true,"co.mw":true,"com.mw":true,"coop.mw":true,"edu.mw":true,"gov.mw":true,"int.mw":true,"museum.mw":true,"net.mw":true,"org.mw":true,"mx":true,"com.mx":true,"org.mx":true,"gob.mx":true,"edu.mx":true,"net.mx":true,"my":true,"com.my":true,"net.my":true,"org.my":true,"gov.my":true,"edu.my":true,"mil.my":true,"name.my":true,"*.mz":true,"na":true,"info.na":true,"pro.na":true,"name.na":true,"school.na":true,"or.na":true,"dr.na":true,"us.na":true,"mx.na":true,"ca.na":true,"in.na":true,"cc.na":true,"tv.na":true,"ws.na":true,"mobi.na":true,"co.na":true,"com.na":true,"org.na":true,"name":true,"nc":true,"asso.nc":true,"ne":true,"net":true,"nf":true,"com.nf":true,"net.nf":true,"per.nf":true,"rec.nf":true,"web.nf":true,"arts.nf":true,"firm.nf":true,"info.nf":true,"other.nf":true,"store.nf":true,"ac.ng":true,"com.ng":true,"edu.ng":true,"gov.ng":true,"net.ng":true,"org.ng":true,"*.ni":true,"nl":true,"bv.nl":true,"no":true,"fhs.no":true,"vgs.no":true,"fylkesbibl.no":true,"folkebibl.no":true,"museum.no":true,"idrett.no":true,"priv.no":true,"mil.no":true,"stat.no":true,"dep.no":true,"kommune.no":true,"herad.no":true,"aa.no":true,"ah.no":true,"bu.no":true,"fm.no":true,"hl.no":true,"hm.no":true,"jan-mayen.no":true,"mr.no":true,"nl.no":true,"nt.no":true,"of.no":true,"ol.no":true,"oslo.no":true,"rl.no":true,"sf.no":true,"st.no":true,"svalbard.no":true,"tm.no":true,"tr.no":true,"va.no":true,"vf.no":true,"gs.aa.no":true,"gs.ah.no":true,"gs.bu.no":true,"gs.fm.no":true,"gs.hl.no":true,"gs.hm.no":true,"gs.jan-mayen.no":true,"gs.mr.no":true,"gs.nl.no":true,"gs.nt.no":true,"gs.of.no":true,"gs.ol.no":true,"gs.oslo.no":true,"gs.rl.no":true,"gs.sf.no":true,"gs.st.no":true,"gs.svalbard.no":true,"gs.tm.no":true,"gs.tr.no":true,"gs.va.no":true,"gs.vf.no":true,"akrehamn.no":true,"xn--krehamn-dxa.no":true,"algard.no":true,"xn--lgrd-poac.no":true,"arna.no":true,"brumunddal.no":true,"bryne.no":true,"bronnoysund.no":true,"xn--brnnysund-m8ac.no":true,"drobak.no":true,"xn--drbak-wua.no":true,"egersund.no":true,"fetsund.no":true,"floro.no":true,"xn--flor-jra.no":true,"fredrikstad.no":true,"hokksund.no":true,"honefoss.no":true,"xn--hnefoss-q1a.no":true,"jessheim.no":true,"jorpeland.no":true,"xn--jrpeland-54a.no":true,"kirkenes.no":true,"kopervik.no":true,"krokstadelva.no":true,"langevag.no":true,"xn--langevg-jxa.no":true,"leirvik.no":true,"mjondalen.no":true,"xn--mjndalen-64a.no":true,"mo-i-rana.no":true,"mosjoen.no":true,"xn--mosjen-eya.no":true,"nesoddtangen.no":true,"orkanger.no":true,"osoyro.no":true,"xn--osyro-wua.no":true,"raholt.no":true,"xn--rholt-mra.no":true,"san
dnessjoen.no":true,"xn--sandnessjen-ogb.no":true,"skedsmokorset.no":true,"slattum.no":true,"spjelkavik.no":true,"stathelle.no":true,"stavern.no":true,"stjordalshalsen.no":true,"xn--stjrdalshalsen-sqb.no":true,"tananger.no":true,"tranby.no":true,"vossevangen.no":true,"afjord.no":true,"xn--fjord-lra.no":true,"agdenes.no":true,"al.no":true,"xn--l-1fa.no":true,"alesund.no":true,"xn--lesund-hua.no":true,"alstahaug.no":true,"alta.no":true,"xn--lt-liac.no":true,"alaheadju.no":true,"xn--laheadju-7ya.no":true,"alvdal.no":true,"amli.no":true,"xn--mli-tla.no":true,"amot.no":true,"xn--mot-tla.no":true,"andebu.no":true,"andoy.no":true,"xn--andy-ira.no":true,"andasuolo.no":true,"ardal.no":true,"xn--rdal-poa.no":true,"aremark.no":true,"arendal.no":true,"xn--s-1fa.no":true,"aseral.no":true,"xn--seral-lra.no":true,"asker.no":true,"askim.no":true,"askvoll.no":true,"askoy.no":true,"xn--asky-ira.no":true,"asnes.no":true,"xn--snes-poa.no":true,"audnedaln.no":true,"aukra.no":true,"aure.no":true,"aurland.no":true,"aurskog-holand.no":true,"xn--aurskog-hland-jnb.no":true,"austevoll.no":true,"austrheim.no":true,"averoy.no":true,"xn--avery-yua.no":true,"balestrand.no":true,"ballangen.no":true,"balat.no":true,"xn--blt-elab.no":true,"balsfjord.no":true,"bahccavuotna.no":true,"xn--bhccavuotna-k7a.no":true,"bamble.no":true,"bardu.no":true,"beardu.no":true,"beiarn.no":true,"bajddar.no":true,"xn--bjddar-pta.no":true,"baidar.no":true,"xn--bidr-5nac.no":true,"berg.no":true,"bergen.no":true,"berlevag.no":true,"xn--berlevg-jxa.no":true,"bearalvahki.no":true,"xn--bearalvhki-y4a.no":true,"bindal.no":true,"birkenes.no":true,"bjarkoy.no":true,"xn--bjarky-fya.no":true,"bjerkreim.no":true,"bjugn.no":true,"bodo.no":true,"xn--bod-2na.no":true,"badaddja.no":true,"xn--bdddj-mrabd.no":true,"budejju.no":true,"bokn.no":true,"bremanger.no":true,"bronnoy.no":true,"xn--brnny-wuac.no":true,"bygland.no":true,"bykle.no":true,"barum.no":true,"xn--brum-voa.no":true,"bo.telemark.no":true,"xn--b-5ga.telemark.no":true,"bo.nordland.no":true,"xn--b-5ga.nordland.no":true,"bievat.no":true,"xn--bievt-0qa.no":true,"bomlo.no":true,"xn--bmlo-gra.no":true,"batsfjord.no":true,"xn--btsfjord-9za.no":true,"bahcavuotna.no":true,"xn--bhcavuotna-s4a.no":true,"dovre.no":true,"drammen.no":true,"drangedal.no":true,"dyroy.no":true,"xn--dyry-ira.no":true,"donna.no":true,"xn--dnna-gra.no":true,"eid.no":true,"eidfjord.no":true,"eidsberg.no":true,"eidskog.no":true,"eidsvoll.no":true,"eigersund.no":true,"elverum.no":true,"enebakk.no":true,"engerdal.no":true,"etne.no":true,"etnedal.no":true,"evenes.no":true,"evenassi.no":true,"xn--eveni-0qa01ga.no":true,"evje-og-hornnes.no":true,"farsund.no":true,"fauske.no":true,"fuossko.no":true,"fuoisku.no":true,"fedje.no":true,"fet.no":true,"finnoy.no":true,"xn--finny-yua.no":true,"fitjar.no":true,"fjaler.no":true,"fjell.no":true,"flakstad.no":true,"flatanger.no":true,"flekkefjord.no":true,"flesberg.no":true,"flora.no":true,"fla.no":true,"xn--fl-zia.no":true,"folldal.no":true,"forsand.no":true,"fosnes.no":true,"frei.no":true,"frogn.no":true,"froland.no":true,"frosta.no":true,"frana.no":true,"xn--frna-woa.no":true,"froya.no":true,"xn--frya-hra.no":true,"fusa.no":true,"fyresdal.no":true,"forde.no":true,"xn--frde-gra.no":true,"gamvik.no":true,"gangaviika.no":true,"xn--ggaviika-8ya47h.no":true,"gaular.no":true,"gausdal.no":true,"gildeskal.no":true,"xn--gildeskl-g0a.no":true,"giske.no":true,"gjemnes.no":true,"gjerdrum.no":true,"gjerstad.no":true,"gjesdal.no":true,"gjovik.no":true,"xn--gjvik-wua.no":true,"gloppen.no":true,"gol.no":true,"gran.no
":true,"grane.no":true,"granvin.no":true,"gratangen.no":true,"grimstad.no":true,"grong.no":true,"kraanghke.no":true,"xn--kranghke-b0a.no":true,"grue.no":true,"gulen.no":true,"hadsel.no":true,"halden.no":true,"halsa.no":true,"hamar.no":true,"hamaroy.no":true,"habmer.no":true,"xn--hbmer-xqa.no":true,"hapmir.no":true,"xn--hpmir-xqa.no":true,"hammerfest.no":true,"hammarfeasta.no":true,"xn--hmmrfeasta-s4ac.no":true,"haram.no":true,"hareid.no":true,"harstad.no":true,"hasvik.no":true,"aknoluokta.no":true,"xn--koluokta-7ya57h.no":true,"hattfjelldal.no":true,"aarborte.no":true,"haugesund.no":true,"hemne.no":true,"hemnes.no":true,"hemsedal.no":true,"heroy.more-og-romsdal.no":true,"xn--hery-ira.xn--mre-og-romsdal-qqb.no":true,"heroy.nordland.no":true,"xn--hery-ira.nordland.no":true,"hitra.no":true,"hjartdal.no":true,"hjelmeland.no":true,"hobol.no":true,"xn--hobl-ira.no":true,"hof.no":true,"hol.no":true,"hole.no":true,"holmestrand.no":true,"holtalen.no":true,"xn--holtlen-hxa.no":true,"hornindal.no":true,"horten.no":true,"hurdal.no":true,"hurum.no":true,"hvaler.no":true,"hyllestad.no":true,"hagebostad.no":true,"xn--hgebostad-g3a.no":true,"hoyanger.no":true,"xn--hyanger-q1a.no":true,"hoylandet.no":true,"xn--hylandet-54a.no":true,"ha.no":true,"xn--h-2fa.no":true,"ibestad.no":true,"inderoy.no":true,"xn--indery-fya.no":true,"iveland.no":true,"jevnaker.no":true,"jondal.no":true,"jolster.no":true,"xn--jlster-bya.no":true,"karasjok.no":true,"karasjohka.no":true,"xn--krjohka-hwab49j.no":true,"karlsoy.no":true,"galsa.no":true,"xn--gls-elac.no":true,"karmoy.no":true,"xn--karmy-yua.no":true,"kautokeino.no":true,"guovdageaidnu.no":true,"klepp.no":true,"klabu.no":true,"xn--klbu-woa.no":true,"kongsberg.no":true,"kongsvinger.no":true,"kragero.no":true,"xn--krager-gya.no":true,"kristiansand.no":true,"kristiansund.no":true,"krodsherad.no":true,"xn--krdsherad-m8a.no":true,"kvalsund.no":true,"rahkkeravju.no":true,"xn--rhkkervju-01af.no":true,"kvam.no":true,"kvinesdal.no":true,"kvinnherad.no":true,"kviteseid.no":true,"kvitsoy.no":true,"xn--kvitsy-fya.no":true,"kvafjord.no":true,"xn--kvfjord-nxa.no":true,"giehtavuoatna.no":true,"kvanangen.no":true,"xn--kvnangen-k0a.no":true,"navuotna.no":true,"xn--nvuotna-hwa.no":true,"kafjord.no":true,"xn--kfjord-iua.no":true,"gaivuotna.no":true,"xn--givuotna-8ya.no":true,"larvik.no":true,"lavangen.no":true,"lavagis.no":true,"loabat.no":true,"xn--loabt-0qa.no":true,"lebesby.no":true,"davvesiida.no":true,"leikanger.no":true,"leirfjord.no":true,"leka.no":true,"leksvik.no":true,"lenvik.no":true,"leangaviika.no":true,"xn--leagaviika-52b.no":true,"lesja.no":true,"levanger.no":true,"lier.no":true,"lierne.no":true,"lillehammer.no":true,"lillesand.no":true,"lindesnes.no":true,"lindas.no":true,"xn--linds-pra.no":true,"lom.no":true,"loppa.no":true,"lahppi.no":true,"xn--lhppi-xqa.no":true,"lund.no":true,"lunner.no":true,"luroy.no":true,"xn--lury-ira.no":true,"luster.no":true,"lyngdal.no":true,"lyngen.no":true,"ivgu.no":true,"lardal.no":true,"lerdal.no":true,"xn--lrdal-sra.no":true,"lodingen.no":true,"xn--ldingen-q1a.no":true,"lorenskog.no":true,"xn--lrenskog-54a.no":true,"loten.no":true,"xn--lten-gra.no":true,"malvik.no":true,"masoy.no":true,"xn--msy-ula0h.no":true,"muosat.no":true,"xn--muost-0qa.no":true,"mandal.no":true,"marker.no":true,"marnardal.no":true,"masfjorden.no":true,"meland.no":true,"meldal.no":true,"melhus.no":true,"meloy.no":true,"xn--mely-ira.no":true,"meraker.no":true,"xn--merker-kua.no":true,"moareke.no":true,"xn--moreke-jua.no":true,"midsund.no":true,"midtre-gauldal.no":true,"modal
en.no":true,"modum.no":true,"molde.no":true,"moskenes.no":true,"moss.no":true,"mosvik.no":true,"malselv.no":true,"xn--mlselv-iua.no":true,"malatvuopmi.no":true,"xn--mlatvuopmi-s4a.no":true,"namdalseid.no":true,"aejrie.no":true,"namsos.no":true,"namsskogan.no":true,"naamesjevuemie.no":true,"xn--nmesjevuemie-tcba.no":true,"laakesvuemie.no":true,"nannestad.no":true,"narvik.no":true,"narviika.no":true,"naustdal.no":true,"nedre-eiker.no":true,"nes.akershus.no":true,"nes.buskerud.no":true,"nesna.no":true,"nesodden.no":true,"nesseby.no":true,"unjarga.no":true,"xn--unjrga-rta.no":true,"nesset.no":true,"nissedal.no":true,"nittedal.no":true,"nord-aurdal.no":true,"nord-fron.no":true,"nord-odal.no":true,"norddal.no":true,"nordkapp.no":true,"davvenjarga.no":true,"xn--davvenjrga-y4a.no":true,"nordre-land.no":true,"nordreisa.no":true,"raisa.no":true,"xn--risa-5na.no":true,"nore-og-uvdal.no":true,"notodden.no":true,"naroy.no":true,"xn--nry-yla5g.no":true,"notteroy.no":true,"xn--nttery-byae.no":true,"odda.no":true,"oksnes.no":true,"xn--ksnes-uua.no":true,"oppdal.no":true,"oppegard.no":true,"xn--oppegrd-ixa.no":true,"orkdal.no":true,"orland.no":true,"xn--rland-uua.no":true,"orskog.no":true,"xn--rskog-uua.no":true,"orsta.no":true,"xn--rsta-fra.no":true,"os.hedmark.no":true,"os.hordaland.no":true,"osen.no":true,"osteroy.no":true,"xn--ostery-fya.no":true,"ostre-toten.no":true,"xn--stre-toten-zcb.no":true,"overhalla.no":true,"ovre-eiker.no":true,"xn--vre-eiker-k8a.no":true,"oyer.no":true,"xn--yer-zna.no":true,"oygarden.no":true,"xn--ygarden-p1a.no":true,"oystre-slidre.no":true,"xn--ystre-slidre-ujb.no":true,"porsanger.no":true,"porsangu.no":true,"xn--porsgu-sta26f.no":true,"porsgrunn.no":true,"radoy.no":true,"xn--rady-ira.no":true,"rakkestad.no":true,"rana.no":true,"ruovat.no":true,"randaberg.no":true,"rauma.no":true,"rendalen.no":true,"rennebu.no":true,"rennesoy.no":true,"xn--rennesy-v1a.no":true,"rindal.no":true,"ringebu.no":true,"ringerike.no":true,"ringsaker.no":true,"rissa.no":true,"risor.no":true,"xn--risr-ira.no":true,"roan.no":true,"rollag.no":true,"rygge.no":true,"ralingen.no":true,"xn--rlingen-mxa.no":true,"rodoy.no":true,"xn--rdy-0nab.no":true,"romskog.no":true,"xn--rmskog-bya.no":true,"roros.no":true,"xn--rros-gra.no":true,"rost.no":true,"xn--rst-0na.no":true,"royken.no":true,"xn--ryken-vua.no":true,"royrvik.no":true,"xn--ryrvik-bya.no":true,"rade.no":true,"xn--rde-ula.no":true,"salangen.no":true,"siellak.no":true,"saltdal.no":true,"salat.no":true,"xn--slt-elab.no":true,"xn--slat-5na.no":true,"samnanger.no":true,"sande.more-og-romsdal.no":true,"sande.xn--mre-og-romsdal-qqb.no":true,"sande.vestfold.no":true,"sandefjord.no":true,"sandnes.no":true,"sandoy.no":true,"xn--sandy-yua.no":true,"sarpsborg.no":true,"sauda.no":true,"sauherad.no":true,"sel.no":true,"selbu.no":true,"selje.no":true,"seljord.no":true,"sigdal.no":true,"siljan.no":true,"sirdal.no":true,"skaun.no":true,"skedsmo.no":true,"ski.no":true,"skien.no":true,"skiptvet.no":true,"skjervoy.no":true,"xn--skjervy-v1a.no":true,"skierva.no":true,"xn--skierv-uta.no":true,"skjak.no":true,"xn--skjk-soa.no":true,"skodje.no":true,"skanland.no":true,"xn--sknland-fxa.no":true,"skanit.no":true,"xn--sknit-yqa.no":true,"smola.no":true,"xn--smla-hra.no":true,"snillfjord.no":true,"snasa.no":true,"xn--snsa-roa.no":true,"snoasa.no":true,"snaase.no":true,"xn--snase-nra.no":true,"sogndal.no":true,"sokndal.no":true,"sola.no":true,"solund.no":true,"songdalen.no":true,"sortland.no":true,"spydeberg.no":true,"stange.no":true,"stavanger.no":true,"steigen.no":true,"steinkje
r.no":true,"stjordal.no":true,"xn--stjrdal-s1a.no":true,"stokke.no":true,"stor-elvdal.no":true,"stord.no":true,"stordal.no":true,"storfjord.no":true,"omasvuotna.no":true,"strand.no":true,"stranda.no":true,"stryn.no":true,"sula.no":true,"suldal.no":true,"sund.no":true,"sunndal.no":true,"surnadal.no":true,"sveio.no":true,"svelvik.no":true,"sykkylven.no":true,"sogne.no":true,"xn--sgne-gra.no":true,"somna.no":true,"xn--smna-gra.no":true,"sondre-land.no":true,"xn--sndre-land-0cb.no":true,"sor-aurdal.no":true,"xn--sr-aurdal-l8a.no":true,"sor-fron.no":true,"xn--sr-fron-q1a.no":true,"sor-odal.no":true,"xn--sr-odal-q1a.no":true,"sor-varanger.no":true,"xn--sr-varanger-ggb.no":true,"matta-varjjat.no":true,"xn--mtta-vrjjat-k7af.no":true,"sorfold.no":true,"xn--srfold-bya.no":true,"sorreisa.no":true,"xn--srreisa-q1a.no":true,"sorum.no":true,"xn--srum-gra.no":true,"tana.no":true,"deatnu.no":true,"time.no":true,"tingvoll.no":true,"tinn.no":true,"tjeldsund.no":true,"dielddanuorri.no":true,"tjome.no":true,"xn--tjme-hra.no":true,"tokke.no":true,"tolga.no":true,"torsken.no":true,"tranoy.no":true,"xn--trany-yua.no":true,"tromso.no":true,"xn--troms-zua.no":true,"tromsa.no":true,"romsa.no":true,"trondheim.no":true,"troandin.no":true,"trysil.no":true,"trana.no":true,"xn--trna-woa.no":true,"trogstad.no":true,"xn--trgstad-r1a.no":true,"tvedestrand.no":true,"tydal.no":true,"tynset.no":true,"tysfjord.no":true,"divtasvuodna.no":true,"divttasvuotna.no":true,"tysnes.no":true,"tysvar.no":true,"xn--tysvr-vra.no":true,"tonsberg.no":true,"xn--tnsberg-q1a.no":true,"ullensaker.no":true,"ullensvang.no":true,"ulvik.no":true,"utsira.no":true,"vadso.no":true,"xn--vads-jra.no":true,"cahcesuolo.no":true,"xn--hcesuolo-7ya35b.no":true,"vaksdal.no":true,"valle.no":true,"vang.no":true,"vanylven.no":true,"vardo.no":true,"xn--vard-jra.no":true,"varggat.no":true,"xn--vrggt-xqad.no":true,"vefsn.no":true,"vaapste.no":true,"vega.no":true,"vegarshei.no":true,"xn--vegrshei-c0a.no":true,"vennesla.no":true,"verdal.no":true,"verran.no":true,"vestby.no":true,"vestnes.no":true,"vestre-slidre.no":true,"vestre-toten.no":true,"vestvagoy.no":true,"xn--vestvgy-ixa6o.no":true,"vevelstad.no":true,"vik.no":true,"vikna.no":true,"vindafjord.no":true,"volda.no":true,"voss.no":true,"varoy.no":true,"xn--vry-yla5g.no":true,"vagan.no":true,"xn--vgan-qoa.no":true,"voagat.no":true,"vagsoy.no":true,"xn--vgsy-qoa0j.no":true,"vaga.no":true,"xn--vg-yiab.no":true,"valer.ostfold.no":true,"xn--vler-qoa.xn--stfold-9xa.no":true,"valer.hedmark.no":true,"xn--vler-qoa.hedmark.no":true,"*.np":true,"nr":true,"biz.nr":true,"info.nr":true,"gov.nr":true,"edu.nr":true,"org.nr":true,"net.nr":true,"com.nr":true,"nu":true,"*.nz":true,"*.om":true,"mediaphone.om":false,"nawrastelecom.om":false,"nawras.om":false,"omanmobile.om":false,"omanpost.om":false,"omantel.om":false,"rakpetroleum.om":false,"siemens.om":false,"songfest.om":false,"statecouncil.om":false,"org":true,"pa":true,"ac.pa":true,"gob.pa":true,"com.pa":true,"org.pa":true,"sld.pa":true,"edu.pa":true,"net.pa":true,"ing.pa":true,"abo.pa":true,"med.pa":true,"nom.pa":true,"pe":true,"edu.pe":true,"gob.pe":true,"nom.pe":true,"mil.pe":true,"org.pe":true,"com.pe":true,"net.pe":true,"pf":true,"com.pf":true,"org.pf":true,"edu.pf":true,"*.pg":true,"ph":true,"com.ph":true,"net.ph":true,"org.ph":true,"gov.ph":true,"edu.ph":true,"ngo.ph":true,"mil.ph":true,"i.ph":true,"pk":true,"com.pk":true,"net.pk":true,"edu.pk":true,"org.pk":true,"fam.pk":true,"biz.pk":true,"web.pk":true,"gov.pk":true,"gob.pk":true,"gok.pk":true,"gon.pk":true,"gop.pk":true,
"gos.pk":true,"info.pk":true,"pl":true,"aid.pl":true,"agro.pl":true,"atm.pl":true,"auto.pl":true,"biz.pl":true,"com.pl":true,"edu.pl":true,"gmina.pl":true,"gsm.pl":true,"info.pl":true,"mail.pl":true,"miasta.pl":true,"media.pl":true,"mil.pl":true,"net.pl":true,"nieruchomosci.pl":true,"nom.pl":true,"org.pl":true,"pc.pl":true,"powiat.pl":true,"priv.pl":true,"realestate.pl":true,"rel.pl":true,"sex.pl":true,"shop.pl":true,"sklep.pl":true,"sos.pl":true,"szkola.pl":true,"targi.pl":true,"tm.pl":true,"tourism.pl":true,"travel.pl":true,"turystyka.pl":true,"6bone.pl":true,"art.pl":true,"mbone.pl":true,"gov.pl":true,"uw.gov.pl":true,"um.gov.pl":true,"ug.gov.pl":true,"upow.gov.pl":true,"starostwo.gov.pl":true,"so.gov.pl":true,"sr.gov.pl":true,"po.gov.pl":true,"pa.gov.pl":true,"ngo.pl":true,"irc.pl":true,"usenet.pl":true,"augustow.pl":true,"babia-gora.pl":true,"bedzin.pl":true,"beskidy.pl":true,"bialowieza.pl":true,"bialystok.pl":true,"bielawa.pl":true,"bieszczady.pl":true,"boleslawiec.pl":true,"bydgoszcz.pl":true,"bytom.pl":true,"cieszyn.pl":true,"czeladz.pl":true,"czest.pl":true,"dlugoleka.pl":true,"elblag.pl":true,"elk.pl":true,"glogow.pl":true,"gniezno.pl":true,"gorlice.pl":true,"grajewo.pl":true,"ilawa.pl":true,"jaworzno.pl":true,"jelenia-gora.pl":true,"jgora.pl":true,"kalisz.pl":true,"kazimierz-dolny.pl":true,"karpacz.pl":true,"kartuzy.pl":true,"kaszuby.pl":true,"katowice.pl":true,"kepno.pl":true,"ketrzyn.pl":true,"klodzko.pl":true,"kobierzyce.pl":true,"kolobrzeg.pl":true,"konin.pl":true,"konskowola.pl":true,"kutno.pl":true,"lapy.pl":true,"lebork.pl":true,"legnica.pl":true,"lezajsk.pl":true,"limanowa.pl":true,"lomza.pl":true,"lowicz.pl":true,"lubin.pl":true,"lukow.pl":true,"malbork.pl":true,"malopolska.pl":true,"mazowsze.pl":true,"mazury.pl":true,"mielec.pl":true,"mielno.pl":true,"mragowo.pl":true,"naklo.pl":true,"nowaruda.pl":true,"nysa.pl":true,"olawa.pl":true,"olecko.pl":true,"olkusz.pl":true,"olsztyn.pl":true,"opoczno.pl":true,"opole.pl":true,"ostroda.pl":true,"ostroleka.pl":true,"ostrowiec.pl":true,"ostrowwlkp.pl":true,"pila.pl":true,"pisz.pl":true,"podhale.pl":true,"podlasie.pl":true,"polkowice.pl":true,"pomorze.pl":true,"pomorskie.pl":true,"prochowice.pl":true,"pruszkow.pl":true,"przeworsk.pl":true,"pulawy.pl":true,"radom.pl":true,"rawa-maz.pl":true,"rybnik.pl":true,"rzeszow.pl":true,"sanok.pl":true,"sejny.pl":true,"siedlce.pl":true,"slask.pl":true,"slupsk.pl":true,"sosnowiec.pl":true,"stalowa-wola.pl":true,"skoczow.pl":true,"starachowice.pl":true,"stargard.pl":true,"suwalki.pl":true,"swidnica.pl":true,"swiebodzin.pl":true,"swinoujscie.pl":true,"szczecin.pl":true,"szczytno.pl":true,"tarnobrzeg.pl":true,"tgory.pl":true,"turek.pl":true,"tychy.pl":true,"ustka.pl":true,"walbrzych.pl":true,"warmia.pl":true,"warszawa.pl":true,"waw.pl":true,"wegrow.pl":true,"wielun.pl":true,"wlocl.pl":true,"wloclawek.pl":true,"wodzislaw.pl":true,"wolomin.pl":true,"wroclaw.pl":true,"zachpomor.pl":true,"zagan.pl":true,"zarow.pl":true,"zgora.pl":true,"zgorzelec.pl":true,"gda.pl":true,"gdansk.pl":true,"gdynia.pl":true,"med.pl":true,"sopot.pl":true,"gliwice.pl":true,"krakow.pl":true,"poznan.pl":true,"wroc.pl":true,"zakopane.pl":true,"pm":true,"pn":true,"gov.pn":true,"co.pn":true,"org.pn":true,"edu.pn":true,"net.pn":true,"pr":true,"com.pr":true,"net.pr":true,"org.pr":true,"gov.pr":true,"edu.pr":true,"isla.pr":true,"pro.pr":true,"biz.pr":true,"info.pr":true,"name.pr":true,"est.pr":true,"prof.pr":true,"ac.pr":true,"pro":true,"aca.pro":true,"bar.pro":true,"cpa.pro":true,"jur.pro":true,"law.pro":true,"med.pro":true,"eng.pro"
:true,"ps":true,"edu.ps":true,"gov.ps":true,"sec.ps":true,"plo.ps":true,"com.ps":true,"org.ps":true,"net.ps":true,"pt":true,"net.pt":true,"gov.pt":true,"org.pt":true,"edu.pt":true,"int.pt":true,"publ.pt":true,"com.pt":true,"nome.pt":true,"pw":true,"co.pw":true,"ne.pw":true,"or.pw":true,"ed.pw":true,"go.pw":true,"belau.pw":true,"*.py":true,"qa":true,"com.qa":true,"edu.qa":true,"gov.qa":true,"mil.qa":true,"name.qa":true,"net.qa":true,"org.qa":true,"sch.qa":true,"re":true,"com.re":true,"asso.re":true,"nom.re":true,"ro":true,"com.ro":true,"org.ro":true,"tm.ro":true,"nt.ro":true,"nom.ro":true,"info.ro":true,"rec.ro":true,"arts.ro":true,"firm.ro":true,"store.ro":true,"www.ro":true,"rs":true,"co.rs":true,"org.rs":true,"edu.rs":true,"ac.rs":true,"gov.rs":true,"in.rs":true,"ru":true,"ac.ru":true,"com.ru":true,"edu.ru":true,"int.ru":true,"net.ru":true,"org.ru":true,"pp.ru":true,"adygeya.ru":true,"altai.ru":true,"amur.ru":true,"arkhangelsk.ru":true,"astrakhan.ru":true,"bashkiria.ru":true,"belgorod.ru":true,"bir.ru":true,"bryansk.ru":true,"buryatia.ru":true,"cbg.ru":true,"chel.ru":true,"chelyabinsk.ru":true,"chita.ru":true,"chukotka.ru":true,"chuvashia.ru":true,"dagestan.ru":true,"dudinka.ru":true,"e-burg.ru":true,"grozny.ru":true,"irkutsk.ru":true,"ivanovo.ru":true,"izhevsk.ru":true,"jar.ru":true,"joshkar-ola.ru":true,"kalmykia.ru":true,"kaluga.ru":true,"kamchatka.ru":true,"karelia.ru":true,"kazan.ru":true,"kchr.ru":true,"kemerovo.ru":true,"khabarovsk.ru":true,"khakassia.ru":true,"khv.ru":true,"kirov.ru":true,"koenig.ru":true,"komi.ru":true,"kostroma.ru":true,"krasnoyarsk.ru":true,"kuban.ru":true,"kurgan.ru":true,"kursk.ru":true,"lipetsk.ru":true,"magadan.ru":true,"mari.ru":true,"mari-el.ru":true,"marine.ru":true,"mordovia.ru":true,"mosreg.ru":true,"msk.ru":true,"murmansk.ru":true,"nalchik.ru":true,"nnov.ru":true,"nov.ru":true,"novosibirsk.ru":true,"nsk.ru":true,"omsk.ru":true,"orenburg.ru":true,"oryol.ru":true,"palana.ru":true,"penza.ru":true,"perm.ru":true,"pskov.ru":true,"ptz.ru":true,"rnd.ru":true,"ryazan.ru":true,"sakhalin.ru":true,"samara.ru":true,"saratov.ru":true,"simbirsk.ru":true,"smolensk.ru":true,"spb.ru":true,"stavropol.ru":true,"stv.ru":true,"surgut.ru":true,"tambov.ru":true,"tatarstan.ru":true,"tom.ru":true,"tomsk.ru":true,"tsaritsyn.ru":true,"tsk.ru":true,"tula.ru":true,"tuva.ru":true,"tver.ru":true,"tyumen.ru":true,"udm.ru":true,"udmurtia.ru":true,"ulan-ude.ru":true,"vladikavkaz.ru":true,"vladimir.ru":true,"vladivostok.ru":true,"volgograd.ru":true,"vologda.ru":true,"voronezh.ru":true,"vrn.ru":true,"vyatka.ru":true,"yakutia.ru":true,"yamal.ru":true,"yaroslavl.ru":true,"yekaterinburg.ru":true,"yuzhno-sakhalinsk.ru":true,"amursk.ru":true,"baikal.ru":true,"cmw.ru":true,"fareast.ru":true,"jamal.ru":true,"kms.ru":true,"k-uralsk.ru":true,"kustanai.ru":true,"kuzbass.ru":true,"magnitka.ru":true,"mytis.ru":true,"nakhodka.ru":true,"nkz.ru":true,"norilsk.ru":true,"oskol.ru":true,"pyatigorsk.ru":true,"rubtsovsk.ru":true,"snz.ru":true,"syzran.ru":true,"vdonsk.ru":true,"zgrad.ru":true,"gov.ru":true,"mil.ru":true,"test.ru":true,"rw":true,"gov.rw":true,"net.rw":true,"edu.rw":true,"ac.rw":true,"com.rw":true,"co.rw":true,"int.rw":true,"mil.rw":true,"gouv.rw":true,"sa":true,"com.sa":true,"net.sa":true,"org.sa":true,"gov.sa":true,"med.sa":true,"pub.sa":true,"edu.sa":true,"sch.sa":true,"sb":true,"com.sb":true,"edu.sb":true,"gov.sb":true,"net.sb":true,"org.sb":true,"sc":true,"com.sc":true,"gov.sc":true,"net.sc":true,"org.sc":true,"edu.sc":true,"sd":true,"com.sd":true,"net.sd":true,"org.sd":true,"edu.sd":tru
e,"med.sd":true,"gov.sd":true,"info.sd":true,"se":true,"a.se":true,"ac.se":true,"b.se":true,"bd.se":true,"brand.se":true,"c.se":true,"d.se":true,"e.se":true,"f.se":true,"fh.se":true,"fhsk.se":true,"fhv.se":true,"g.se":true,"h.se":true,"i.se":true,"k.se":true,"komforb.se":true,"kommunalforbund.se":true,"komvux.se":true,"l.se":true,"lanbib.se":true,"m.se":true,"n.se":true,"naturbruksgymn.se":true,"o.se":true,"org.se":true,"p.se":true,"parti.se":true,"pp.se":true,"press.se":true,"r.se":true,"s.se":true,"sshn.se":true,"t.se":true,"tm.se":true,"u.se":true,"w.se":true,"x.se":true,"y.se":true,"z.se":true,"sg":true,"com.sg":true,"net.sg":true,"org.sg":true,"gov.sg":true,"edu.sg":true,"per.sg":true,"sh":true,"si":true,"sk":true,"sl":true,"com.sl":true,"net.sl":true,"edu.sl":true,"gov.sl":true,"org.sl":true,"sm":true,"sn":true,"art.sn":true,"com.sn":true,"edu.sn":true,"gouv.sn":true,"org.sn":true,"perso.sn":true,"univ.sn":true,"so":true,"com.so":true,"net.so":true,"org.so":true,"sr":true,"st":true,"co.st":true,"com.st":true,"consulado.st":true,"edu.st":true,"embaixada.st":true,"gov.st":true,"mil.st":true,"net.st":true,"org.st":true,"principe.st":true,"saotome.st":true,"store.st":true,"su":true,"*.sv":true,"sy":true,"edu.sy":true,"gov.sy":true,"net.sy":true,"mil.sy":true,"com.sy":true,"org.sy":true,"sz":true,"co.sz":true,"ac.sz":true,"org.sz":true,"tc":true,"td":true,"tel":true,"tf":true,"tg":true,"th":true,"ac.th":true,"co.th":true,"go.th":true,"in.th":true,"mi.th":true,"net.th":true,"or.th":true,"tj":true,"ac.tj":true,"biz.tj":true,"co.tj":true,"com.tj":true,"edu.tj":true,"go.tj":true,"gov.tj":true,"int.tj":true,"mil.tj":true,"name.tj":true,"net.tj":true,"nic.tj":true,"org.tj":true,"test.tj":true,"web.tj":true,"tk":true,"tl":true,"gov.tl":true,"tm":true,"tn":true,"com.tn":true,"ens.tn":true,"fin.tn":true,"gov.tn":true,"ind.tn":true,"intl.tn":true,"nat.tn":true,"net.tn":true,"org.tn":true,"info.tn":true,"perso.tn":true,"tourism.tn":true,"edunet.tn":true,"rnrt.tn":true,"rns.tn":true,"rnu.tn":true,"mincom.tn":true,"agrinet.tn":true,"defense.tn":true,"turen.tn":true,"to":true,"com.to":true,"gov.to":true,"net.to":true,"org.to":true,"edu.to":true,"mil.to":true,"*.tr":true,"nic.tr":false,"gov.nc.tr":true,"travel":true,"tt":true,"co.tt":true,"com.tt":true,"org.tt":true,"net.tt":true,"biz.tt":true,"info.tt":true,"pro.tt":true,"int.tt":true,"coop.tt":true,"jobs.tt":true,"mobi.tt":true,"travel.tt":true,"museum.tt":true,"aero.tt":true,"name.tt":true,"gov.tt":true,"edu.tt":true,"tv":true,"tw":true,"edu.tw":true,"gov.tw":true,"mil.tw":true,"com.tw":true,"net.tw":true,"org.tw":true,"idv.tw":true,"game.tw":true,"ebiz.tw":true,"club.tw":true,"xn--zf0ao64a.tw":true,"xn--uc0atv.tw":true,"xn--czrw28b.tw":true,"ac.tz":true,"co.tz":true,"go.tz":true,"mil.tz":true,"ne.tz":true,"or.tz":true,"sc.tz":true,"ua":true,"com.ua":true,"edu.ua":true,"gov.ua":true,"in.ua":true,"net.ua":true,"org.ua":true,"cherkassy.ua":true,"chernigov.ua":true,"chernovtsy.ua":true,"ck.ua":true,"cn.ua":true,"crimea.ua":true,"cv.ua":true,"dn.ua":true,"dnepropetrovsk.ua":true,"donetsk.ua":true,"dp.ua":true,"if.ua":true,"ivano-frankivsk.ua":true,"kh.ua":true,"kharkov.ua":true,"kherson.ua":true,"khmelnitskiy.ua":true,"kiev.ua":true,"kirovograd.ua":true,"km.ua":true,"kr.ua":true,"ks.ua":true,"kv.ua":true,"lg.ua":true,"lugansk.ua":true,"lutsk.ua":true,"lviv.ua":true,"mk.ua":true,"nikolaev.ua":true,"od.ua":true,"odessa.ua":true,"pl.ua":true,"poltava.ua":true,"rovno.ua":true,"rv.ua":true,"sebastopol.ua":true,"sumy.ua":true,"te.ua":true,"ternopil.ua":true,"uz
hgorod.ua":true,"vinnica.ua":true,"vn.ua":true,"zaporizhzhe.ua":true,"zp.ua":true,"zhitomir.ua":true,"zt.ua":true,"co.ua":true,"pp.ua":true,"ug":true,"co.ug":true,"ac.ug":true,"sc.ug":true,"go.ug":true,"ne.ug":true,"or.ug":true,"*.uk":true,"*.sch.uk":true,"bl.uk":false,"british-library.uk":false,"icnet.uk":false,"jet.uk":false,"mod.uk":false,"nel.uk":false,"nhs.uk":false,"nic.uk":false,"nls.uk":false,"national-library-scotland.uk":false,"parliament.uk":false,"police.uk":false,"us":true,"dni.us":true,"fed.us":true,"isa.us":true,"kids.us":true,"nsn.us":true,"ak.us":true,"al.us":true,"ar.us":true,"as.us":true,"az.us":true,"ca.us":true,"co.us":true,"ct.us":true,"dc.us":true,"de.us":true,"fl.us":true,"ga.us":true,"gu.us":true,"hi.us":true,"ia.us":true,"id.us":true,"il.us":true,"in.us":true,"ks.us":true,"ky.us":true,"la.us":true,"ma.us":true,"md.us":true,"me.us":true,"mi.us":true,"mn.us":true,"mo.us":true,"ms.us":true,"mt.us":true,"nc.us":true,"nd.us":true,"ne.us":true,"nh.us":true,"nj.us":true,"nm.us":true,"nv.us":true,"ny.us":true,"oh.us":true,"ok.us":true,"or.us":true,"pa.us":true,"pr.us":true,"ri.us":true,"sc.us":true,"sd.us":true,"tn.us":true,"tx.us":true,"ut.us":true,"vi.us":true,"vt.us":true,"va.us":true,"wa.us":true,"wi.us":true,"wv.us":true,"wy.us":true,"k12.ak.us":true,"k12.al.us":true,"k12.ar.us":true,"k12.as.us":true,"k12.az.us":true,"k12.ca.us":true,"k12.co.us":true,"k12.ct.us":true,"k12.dc.us":true,"k12.de.us":true,"k12.fl.us":true,"k12.ga.us":true,"k12.gu.us":true,"k12.ia.us":true,"k12.id.us":true,"k12.il.us":true,"k12.in.us":true,"k12.ks.us":true,"k12.ky.us":true,"k12.la.us":true,"k12.ma.us":true,"k12.md.us":true,"k12.me.us":true,"k12.mi.us":true,"k12.mn.us":true,"k12.mo.us":true,"k12.ms.us":true,"k12.mt.us":true,"k12.nc.us":true,"k12.nd.us":true,"k12.ne.us":true,"k12.nh.us":true,"k12.nj.us":true,"k12.nm.us":true,"k12.nv.us":true,"k12.ny.us":true,"k12.oh.us":true,"k12.ok.us":true,"k12.or.us":true,"k12.pa.us":true,"k12.pr.us":true,"k12.ri.us":true,"k12.sc.us":true,"k12.sd.us":true,"k12.tn.us":true,"k12.tx.us":true,"k12.ut.us":true,"k12.vi.us":true,"k12.vt.us":true,"k12.va.us":true,"k12.wa.us":true,"k12.wi.us":true,"k12.wv.us":true,"k12.wy.us":true,"cc.ak.us":true,"cc.al.us":true,"cc.ar.us":true,"cc.as.us":true,"cc.az.us":true,"cc.ca.us":true,"cc.co.us":true,"cc.ct.us":true,"cc.dc.us":true,"cc.de.us":true,"cc.fl.us":true,"cc.ga.us":true,"cc.gu.us":true,"cc.hi.us":true,"cc.ia.us":true,"cc.id.us":true,"cc.il.us":true,"cc.in.us":true,"cc.ks.us":true,"cc.ky.us":true,"cc.la.us":true,"cc.ma.us":true,"cc.md.us":true,"cc.me.us":true,"cc.mi.us":true,"cc.mn.us":true,"cc.mo.us":true,"cc.ms.us":true,"cc.mt.us":true,"cc.nc.us":true,"cc.nd.us":true,"cc.ne.us":true,"cc.nh.us":true,"cc.nj.us":true,"cc.nm.us":true,"cc.nv.us":true,"cc.ny.us":true,"cc.oh.us":true,"cc.ok.us":true,"cc.or.us":true,"cc.pa.us":true,"cc.pr.us":true,"cc.ri.us":true,"cc.sc.us":true,"cc.sd.us":true,"cc.tn.us":true,"cc.tx.us":true,"cc.ut.us":true,"cc.vi.us":true,"cc.vt.us":true,"cc.va.us":true,"cc.wa.us":true,"cc.wi.us":true,"cc.wv.us":true,"cc.wy.us":true,"lib.ak.us":true,"lib.al.us":true,"lib.ar.us":true,"lib.as.us":true,"lib.az.us":true,"lib.ca.us":true,"lib.co.us":true,"lib.ct.us":true,"lib.dc.us":true,"lib.de.us":true,"lib.fl.us":true,"lib.ga.us":true,"lib.gu.us":true,"lib.hi.us":true,"lib.ia.us":true,"lib.id.us":true,"lib.il.us":true,"lib.in.us":true,"lib.ks.us":true,"lib.ky.us":true,"lib.la.us":true,"lib.ma.us":true,"lib.md.us":true,"lib.me.us":true,"lib.mi.us":true,"lib.mn.us":true,"lib.mo.us":true,"lib.ms.us":true,"lib
.mt.us":true,"lib.nc.us":true,"lib.nd.us":true,"lib.ne.us":true,"lib.nh.us":true,"lib.nj.us":true,"lib.nm.us":true,"lib.nv.us":true,"lib.ny.us":true,"lib.oh.us":true,"lib.ok.us":true,"lib.or.us":true,"lib.pa.us":true,"lib.pr.us":true,"lib.ri.us":true,"lib.sc.us":true,"lib.sd.us":true,"lib.tn.us":true,"lib.tx.us":true,"lib.ut.us":true,"lib.vi.us":true,"lib.vt.us":true,"lib.va.us":true,"lib.wa.us":true,"lib.wi.us":true,"lib.wv.us":true,"lib.wy.us":true,"pvt.k12.ma.us":true,"chtr.k12.ma.us":true,"paroch.k12.ma.us":true,"*.uy":true,"uz":true,"com.uz":true,"co.uz":true,"va":true,"vc":true,"com.vc":true,"net.vc":true,"org.vc":true,"gov.vc":true,"mil.vc":true,"edu.vc":true,"*.ve":true,"vg":true,"vi":true,"co.vi":true,"com.vi":true,"k12.vi":true,"net.vi":true,"org.vi":true,"vn":true,"com.vn":true,"net.vn":true,"org.vn":true,"edu.vn":true,"gov.vn":true,"int.vn":true,"ac.vn":true,"biz.vn":true,"info.vn":true,"name.vn":true,"pro.vn":true,"health.vn":true,"vu":true,"wf":true,"ws":true,"com.ws":true,"net.ws":true,"org.ws":true,"gov.ws":true,"edu.ws":true,"yt":true,"xn--mgbaam7a8h":true,"xn--54b7fta0cc":true,"xn--fiqs8s":true,"xn--fiqz9s":true,"xn--lgbbat1ad8j":true,"xn--wgbh1c":true,"xn--node":true,"xn--j6w193g":true,"xn--h2brj9c":true,"xn--mgbbh1a71e":true,"xn--fpcrj9c3d":true,"xn--gecrj9c":true,"xn--s9brj9c":true,"xn--45brj9c":true,"xn--xkc2dl3a5ee0h":true,"xn--mgba3a4f16a":true,"xn--mgba3a4fra":true,"xn--mgbayh7gpa":true,"xn--3e0b707e":true,"xn--fzc2c9e2c":true,"xn--xkc2al3hye2a":true,"xn--mgbc0a9azcg":true,"xn--mgb9awbf":true,"xn--ygbi2ammx":true,"xn--90a3ac":true,"xn--p1ai":true,"xn--wgbl6a":true,"xn--mgberp4a5d4ar":true,"xn--mgberp4a5d4a87g":true,"xn--mgbqly7c0a67fbc":true,"xn--mgbqly7cvafr":true,"xn--ogbpf8fl":true,"xn--mgbtf8fl":true,"xn--yfro4i67o":true,"xn--clchc0ea0b2g2a9gcd":true,"xn--o3cw4h":true,"xn--pgbs0dh":true,"xn--kpry57d":true,"xn--kprw13d":true,"xn--nnx388a":true,"xn--j1amh":true,"xn--mgb2ddes":true,"xxx":true,"*.ye":true,"*.za":true,"*.zm":true,"*.zw":true,"biz.at":true,"info.at":true,"priv.at":true,"co.ca":true,"ar.com":true,"br.com":true,"cn.com":true,"de.com":true,"eu.com":true,"gb.com":true,"gr.com":true,"hu.com":true,"jpn.com":true,"kr.com":true,"no.com":true,"qc.com":true,"ru.com":true,"sa.com":true,"se.com":true,"uk.com":true,"us.com":true,"uy.com":true,"za.com":true,"gb.net":true,"jp.net":true,"se.net":true,"uk.net":true,"ae.org":true,"us.org":true,"com.de":true,"operaunite.com":true,"appspot.com":true,"iki.fi":true,"c.la":true,"za.net":true,"za.org":true,"co.nl":true,"co.no":true,"co.pl":true,"dyndns-at-home.com":true,"dyndns-at-work.com":true,"dyndns-blog.com":true,"dyndns-free.com":true,"dyndns-home.com":true,"dyndns-ip.com":true,"dyndns-mail.com":true,"dyndns-office.com":true,"dyndns-pics.com":true,"dyndns-remote.com":true,"dyndns-server.com":true,"dyndns-web.com":true,"dyndns-wiki.com":true,"dyndns-work.com":true,"dyndns.biz":true,"dyndns.info":true,"dyndns.org":true,"dyndns.tv":true,"at-band-camp.net":true,"ath.cx":true,"barrel-of-knowledge.info":true,"barrell-of-knowledge.info":true,"better-than.tv":true,"blogdns.com":true,"blogdns.net":true,"blogdns.org":true,"blogsite.org":true,"boldlygoingnowhere.org":true,"broke-it.net":true,"buyshouses.net":true,"cechire.com":true,"dnsalias.com":true,"dnsalias.net":true,"dnsalias.org":true,"dnsdojo.com":true,"dnsdojo.net":true,"dnsdojo.org":true,"does-it.net":true,"doesntexist.com":true,"doesntexist.org":true,"dontexist.com":true,"dontexist.net":true,"dontexist.org":true,"doomdns.com":true,"doomdns.org":true,"dvrdns.org":true,"d
yn-o-saur.com":true,"dynalias.com":true,"dynalias.net":true,"dynalias.org":true,"dynathome.net":true,"dyndns.ws":true,"endofinternet.net":true,"endofinternet.org":true,"endoftheinternet.org":true,"est-a-la-maison.com":true,"est-a-la-masion.com":true,"est-le-patron.com":true,"est-mon-blogueur.com":true,"for-better.biz":true,"for-more.biz":true,"for-our.info":true,"for-some.biz":true,"for-the.biz":true,"forgot.her.name":true,"forgot.his.name":true,"from-ak.com":true,"from-al.com":true,"from-ar.com":true,"from-az.net":true,"from-ca.com":true,"from-co.net":true,"from-ct.com":true,"from-dc.com":true,"from-de.com":true,"from-fl.com":true,"from-ga.com":true,"from-hi.com":true,"from-ia.com":true,"from-id.com":true,"from-il.com":true,"from-in.com":true,"from-ks.com":true,"from-ky.com":true,"from-la.net":true,"from-ma.com":true,"from-md.com":true,"from-me.org":true,"from-mi.com":true,"from-mn.com":true,"from-mo.com":true,"from-ms.com":true,"from-mt.com":true,"from-nc.com":true,"from-nd.com":true,"from-ne.com":true,"from-nh.com":true,"from-nj.com":true,"from-nm.com":true,"from-nv.com":true,"from-ny.net":true,"from-oh.com":true,"from-ok.com":true,"from-or.com":true,"from-pa.com":true,"from-pr.com":true,"from-ri.com":true,"from-sc.com":true,"from-sd.com":true,"from-tn.com":true,"from-tx.com":true,"from-ut.com":true,"from-va.com":true,"from-vt.com":true,"from-wa.com":true,"from-wi.com":true,"from-wv.com":true,"from-wy.com":true,"ftpaccess.cc":true,"fuettertdasnetz.de":true,"game-host.org":true,"game-server.cc":true,"getmyip.com":true,"gets-it.net":true,"go.dyndns.org":true,"gotdns.com":true,"gotdns.org":true,"groks-the.info":true,"groks-this.info":true,"ham-radio-op.net":true,"here-for-more.info":true,"hobby-site.com":true,"hobby-site.org":true,"home.dyndns.org":true,"homedns.org":true,"homeftp.net":true,"homeftp.org":true,"homeip.net":true,"homelinux.com":true,"homelinux.net":true,"homelinux.org":true,"homeunix.com":true,"homeunix.net":true,"homeunix.org":true,"iamallama.com":true,"in-the-band.net":true,"is-a-anarchist.com":true,"is-a-blogger.com":true,"is-a-bookkeeper.com":true,"is-a-bruinsfan.org":true,"is-a-bulls-fan.com":true,"is-a-candidate.org":true,"is-a-caterer.com":true,"is-a-celticsfan.org":true,"is-a-chef.com":true,"is-a-chef.net":true,"is-a-chef.org":true,"is-a-conservative.com":true,"is-a-cpa.com":true,"is-a-cubicle-slave.com":true,"is-a-democrat.com":true,"is-a-designer.com":true,"is-a-doctor.com":true,"is-a-financialadvisor.com":true,"is-a-geek.com":true,"is-a-geek.net":true,"is-a-geek.org":true,"is-a-green.com":true,"is-a-guru.com":true,"is-a-hard-worker.com":true,"is-a-hunter.com":true,"is-a-knight.org":true,"is-a-landscaper.com":true,"is-a-lawyer.com":true,"is-a-liberal.com":true,"is-a-libertarian.com":true,"is-a-linux-user.org":true,"is-a-llama.com":true,"is-a-musician.com":true,"is-a-nascarfan.com":true,"is-a-nurse.com":true,"is-a-painter.com":true,"is-a-patsfan.org":true,"is-a-personaltrainer.com":true,"is-a-photographer.com":true,"is-a-player.com":true,"is-a-republican.com":true,"is-a-rockstar.com":true,"is-a-socialist.com":true,"is-a-soxfan.org":true,"is-a-student.com":true,"is-a-teacher.com":true,"is-a-techie.com":true,"is-a-therapist.com":true,"is-an-accountant.com":true,"is-an-actor.com":true,"is-an-actress.com":true,"is-an-anarchist.com":true,"is-an-artist.com":true,"is-an-engineer.com":true,"is-an-entertainer.com":true,"is-by.us":true,"is-certified.com":true,"is-found.org":true,"is-gone.com":true,"is-into-anime.com":true,"is-into-cars.com":true,"is-into-cartoons.com":true,"is
-into-games.com":true,"is-leet.com":true,"is-lost.org":true,"is-not-certified.com":true,"is-saved.org":true,"is-slick.com":true,"is-uberleet.com":true,"is-very-bad.org":true,"is-very-evil.org":true,"is-very-good.org":true,"is-very-nice.org":true,"is-very-sweet.org":true,"is-with-theband.com":true,"isa-geek.com":true,"isa-geek.net":true,"isa-geek.org":true,"isa-hockeynut.com":true,"issmarterthanyou.com":true,"isteingeek.de":true,"istmein.de":true,"kicks-ass.net":true,"kicks-ass.org":true,"knowsitall.info":true,"land-4-sale.us":true,"lebtimnetz.de":true,"leitungsen.de":true,"likes-pie.com":true,"likescandy.com":true,"merseine.nu":true,"mine.nu":true,"misconfused.org":true,"mypets.ws":true,"myphotos.cc":true,"neat-url.com":true,"office-on-the.net":true,"on-the-web.tv":true,"podzone.net":true,"podzone.org":true,"readmyblog.org":true,"saves-the-whales.com":true,"scrapper-site.net":true,"scrapping.cc":true,"selfip.biz":true,"selfip.com":true,"selfip.info":true,"selfip.net":true,"selfip.org":true,"sells-for-less.com":true,"sells-for-u.com":true,"sells-it.net":true,"sellsyourhome.org":true,"servebbs.com":true,"servebbs.net":true,"servebbs.org":true,"serveftp.net":true,"serveftp.org":true,"servegame.org":true,"shacknet.nu":true,"simple-url.com":true,"space-to-rent.com":true,"stuff-4-sale.org":true,"stuff-4-sale.us":true,"teaches-yoga.com":true,"thruhere.net":true,"traeumtgerade.de":true,"webhop.biz":true,"webhop.info":true,"webhop.net":true,"webhop.org":true,"worse-than.tv":true,"writesthisblog.com":true}); // END of automatically generated file �����������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/tough-cookie/lib/store.js��������������������000644 �000766 �000024 �00000002164 12455173731 034305� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������'use strict'; /*jshint unused:false */ function Store() { } exports.Store = Store; // Stores may be synchronous, but are still required to use a // Continuation-Passing Style API. The CookieJar itself will expose a "*Sync" // API that converts from synchronous-callbacks to imperative style. 
Store.prototype.synchronous = false; Store.prototype.findCookie = function(domain, path, key, cb) { throw new Error('findCookie is not implemented'); }; Store.prototype.findCookies = function(domain, path, cb) { throw new Error('findCookies is not implemented'); }; Store.prototype.putCookie = function(cookie, cb) { throw new Error('putCookie is not implemented'); }; Store.prototype.updateCookie = function(oldCookie, newCookie, cb) { // recommended default implementation: // return this.putCookie(newCookie, cb); throw new Error('updateCookie is not implemented'); }; Store.prototype.removeCookie = function(domain, path, key, cb) { throw new Error('removeCookie is not implemented'); }; Store.prototype.removeCookies = function removeCookies(domain, path, cb) { throw new Error('removeCookies is not implemented'); }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/.npmignore����������������������000644 �000766 �000024 �00000000140 12455173731 034161� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib-cov *.seed *.log *.csv *.dat *.out *.pid *.gz pids logs results node_modules npm-debug.log��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/.travis.yml���������������������000644 �000766 �000024 �00000000053 12455173731 034276� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.4 - 0.6 �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/example.js����������������������000644 �000766 �000024 �00000001456 12455173731 034166� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var zlib = require('zlib') var strs = require('stringstream') var utf8Stream = fs.createReadStream('massiveLogFile.gz') .pipe(zlib.createGunzip()) .pipe(strs('utf8')) utf8Stream.pipe(process.stdout) // Stream from utf8 to hex to base64... Why not, ay. var hex64Stream = fs.createReadStream('myFile') .pipe(strs('utf8', 'hex')) .pipe(strs('hex', 'base64')) hex64Stream.pipe(process.stdout) // Deals with base64 correctly by aligning chunks var stream = fs.createReadStream('myFile').pipe(strs('base64')) var base64Str = '' stream.on('data', function(data) { base64Str += data }) stream.on('end', function() { console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding() console.log('Original file is: ' + new Buffer(base64Str, 'base64')) }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/LICENSE.txt���������������������000644 �000766 �000024 �00000000252 12455173731 034011� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2012 Michael Hart (michael.hart.au@gmail.com) This project is free software released under the MIT license: http://www.opensource.org/licenses/mit-license.php ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/package.json��������������������000644 �000766 �000024 �00000004504 12455173731 034460� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "stringstream", "version": "0.0.4", "description": "Encode and decode streams into string streams", "author": { "name": "Michael Hart", "email": "michael.hart.au@gmail.com", "url": "http://github.com/mhart" }, "main": "stringstream.js", "keywords": [ "string", "stream", "base64", "gzip" ], "repository": { "type": "git", "url": "https://github.com/mhart/StringStream.git" }, "license": "MIT", "readme": "# Decode streams into strings The Right Way(tm)\n\n```javascript\nvar fs = require('fs')\nvar zlib = require('zlib')\nvar strs = require('stringstream')\n\nvar utf8Stream = fs.createReadStream('massiveLogFile.gz')\n .pipe(zlib.createGunzip())\n 
.pipe(strs('utf8'))\n```\n\nNo need to deal with `setEncoding()` weirdness, just compose streams\nlike they were supposed to be!\n\nHandles input and output encoding:\n\n```javascript\n// Stream from utf8 to hex to base64... Why not, ay.\nvar hex64Stream = fs.createReadStream('myFile')\n .pipe(strs('utf8', 'hex'))\n .pipe(strs('hex', 'base64'))\n```\n\nAlso deals with `base64` output correctly by aligning each emitted data\nchunk so that there are no dangling `=` characters:\n\n```javascript\nvar stream = fs.createReadStream('myFile').pipe(strs('base64'))\n\nvar base64Str = ''\n\nstream.on('data', function(data) { base64Str += data })\nstream.on('end', function() {\n console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding()\n console.log('Original file is: ' + new Buffer(base64Str, 'base64'))\n})\n```\n", "readmeFilename": "README.md", "_id": "stringstream@0.0.4", "dist": { "shasum": "0f0e3423f942960b5692ac324a57dd093bc41a92", "tarball": "http://registry.npmjs.org/stringstream/-/stringstream-0.0.4.tgz" }, "_npmVersion": "1.2.0", "_npmUser": { "name": "hichaelmart", "email": "michael.hart.au@gmail.com" }, "maintainers": [ { "name": "hichaelmart", "email": "michael.hart.au@gmail.com" } ], "directories": {}, "_shasum": "0f0e3423f942960b5692ac324a57dd093bc41a92", "_resolved": "https://registry.npmjs.org/stringstream/-/stringstream-0.0.4.tgz", "_from": "stringstream@>=0.0.4 <0.1.0", "bugs": { "url": "https://github.com/mhart/StringStream/issues" }, "homepage": "https://github.com/mhart/StringStream", "scripts": {} } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/stringstream/README.md000644 �000766 �000024 �00000002046 12455173731 033527� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Decode streams into strings The Right Way(tm) ```javascript var fs = require('fs') var zlib = require('zlib') var strs = require('stringstream') var utf8Stream = fs.createReadStream('massiveLogFile.gz') .pipe(zlib.createGunzip()) .pipe(strs('utf8')) ``` No need to deal with `setEncoding()` weirdness, just compose streams like they were supposed to be! Handles input and output encoding: ```javascript // Stream from utf8 to hex to base64... Why not, ay. 
var hex64Stream = fs.createReadStream('myFile') .pipe(strs('utf8', 'hex')) .pipe(strs('hex', 'base64')) ``` Also deals with `base64` output correctly by aligning each emitted data chunk so that there are no dangling `=` characters: ```javascript var stream = fs.createReadStream('myFile').pipe(strs('base64')) var base64Str = '' stream.on('data', function(data) { base64Str += data }) stream.on('end', function() { console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding() console.log('Original file is: ' + new Buffer(base64Str, 'base64')) }) ``` ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/stringstream/stringstream.js�����������������000644 �000766 �000024 �00000005350 12455173731 035252� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var util = require('util') var Stream = require('stream') var StringDecoder = require('string_decoder').StringDecoder module.exports = StringStream module.exports.AlignedStringDecoder = AlignedStringDecoder function StringStream(from, to) { if (!(this instanceof StringStream)) return new StringStream(from, to) Stream.call(this) if (from == null) from = 'utf8' this.readable = this.writable = true this.paused = false this.toEncoding = (to == null ? from : to) this.fromEncoding = (to == null ? 
'' : from) this.decoder = new AlignedStringDecoder(this.toEncoding) } util.inherits(StringStream, Stream) StringStream.prototype.write = function(data) { if (!this.writable) { var err = new Error('stream not writable') err.code = 'EPIPE' this.emit('error', err) return false } if (this.fromEncoding) { if (Buffer.isBuffer(data)) data = data.toString() data = new Buffer(data, this.fromEncoding) } var string = this.decoder.write(data) if (string.length) this.emit('data', string) return !this.paused } StringStream.prototype.flush = function() { if (this.decoder.flush) { var string = this.decoder.flush() if (string.length) this.emit('data', string) } } StringStream.prototype.end = function() { if (!this.writable && !this.readable) return this.flush() this.emit('end') this.writable = this.readable = false this.destroy() } StringStream.prototype.destroy = function() { this.decoder = null this.writable = this.readable = false this.emit('close') } StringStream.prototype.pause = function() { this.paused = true } StringStream.prototype.resume = function () { if (this.paused) this.emit('drain') this.paused = false } function AlignedStringDecoder(encoding) { StringDecoder.call(this, encoding) switch (this.encoding) { case 'base64': this.write = alignedWrite this.alignedBuffer = new Buffer(3) this.alignedBytes = 0 break } } util.inherits(AlignedStringDecoder, StringDecoder) AlignedStringDecoder.prototype.flush = function() { if (!this.alignedBuffer || !this.alignedBytes) return '' var leftover = this.alignedBuffer.toString(this.encoding, 0, this.alignedBytes) this.alignedBytes = 0 return leftover } function alignedWrite(buffer) { var rem = (this.alignedBytes + buffer.length) % this.alignedBuffer.length if (!rem && !this.alignedBytes) return buffer.toString(this.encoding) var returnBuffer = new Buffer(this.alignedBytes + buffer.length - rem) this.alignedBuffer.copy(returnBuffer, 0, 0, this.alignedBytes) buffer.copy(returnBuffer, this.alignedBytes, 0, buffer.length - rem) buffer.copy(this.alignedBuffer, 0, buffer.length - rem, buffer.length) this.alignedBytes = rem return returnBuffer.toString(this.encoding) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/.jshintignore������000644 �000766 �000024 �00000000015 12455173731 032647� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/.jshintrc����������000644 �000766 
�000024 �00000000203 12455173731 031767� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "node": true, "curly": true, "latedef": true, "quotmark": true, "undef": true, "unused": true, "trailing": true } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/.npmignore���������000644 �000766 �000024 �00000000277 12455173731 032154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.idea *.iml npm-debug.log dump.rdb node_modules results.tap results.xml npm-shrinkwrap.json config.json .DS_Store */.DS_Store */*/.DS_Store ._* */._* */*/._* coverage.* lib-cov complexity.md ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/.travis.yml��������000644 �000766 �000024 �00000000044 12455173731 032256� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.10��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/CHANGELOG.md�������000644 �000766 �000024 �00000010510 12455173731 031755� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ ## [**2.3.3**](https://github.com/hapijs/qs/issues?milestone=18&state=open) - 
[**#59**](https://github.com/hapijs/qs/issues/59) make sure array indexes are >= 0, closes #57 - [**#58**](https://github.com/hapijs/qs/issues/58) make qs usable for browser loader ## [**2.3.2**](https://github.com/hapijs/qs/issues?milestone=17&state=closed) - [**#55**](https://github.com/hapijs/qs/issues/55) allow merging a string into an object ## [**2.3.1**](https://github.com/hapijs/qs/issues?milestone=16&state=closed) - [**#52**](https://github.com/hapijs/qs/issues/52) Return "undefined" and "false" instead of throwing "TypeError". ## [**2.3.0**](https://github.com/hapijs/qs/issues?milestone=15&state=closed) - [**#50**](https://github.com/hapijs/qs/issues/50) add option to omit array indices, closes #46 ## [**2.2.5**](https://github.com/hapijs/qs/issues?milestone=14&state=closed) - [**#39**](https://github.com/hapijs/qs/issues/39) Is there an alternative to Buffer.isBuffer? - [**#49**](https://github.com/hapijs/qs/issues/49) refactor utils.merge, fixes #45 - [**#41**](https://github.com/hapijs/qs/issues/41) avoid browserifying Buffer, for #39 ## [**2.2.4**](https://github.com/hapijs/qs/issues?milestone=13&state=closed) - [**#38**](https://github.com/hapijs/qs/issues/38) how to handle object keys beginning with a number ## [**2.2.3**](https://github.com/hapijs/qs/issues?milestone=12&state=closed) - [**#37**](https://github.com/hapijs/qs/issues/37) parser discards first empty value in array - [**#36**](https://github.com/hapijs/qs/issues/36) Update to lab 4.x ## [**2.2.2**](https://github.com/hapijs/qs/issues?milestone=11&state=closed) - [**#33**](https://github.com/hapijs/qs/issues/33) Error when plain object in a value - [**#34**](https://github.com/hapijs/qs/issues/34) use Object.prototype.hasOwnProperty.call instead of obj.hasOwnProperty - [**#24**](https://github.com/hapijs/qs/issues/24) Changelog? Semver? ## [**2.2.1**](https://github.com/hapijs/qs/issues?milestone=10&state=closed) - [**#32**](https://github.com/hapijs/qs/issues/32) account for circular references properly, closes #31 - [**#31**](https://github.com/hapijs/qs/issues/31) qs.parse stackoverflow on circular objects ## [**2.2.0**](https://github.com/hapijs/qs/issues?milestone=9&state=closed) - [**#26**](https://github.com/hapijs/qs/issues/26) Don't use Buffer global if it's not present - [**#30**](https://github.com/hapijs/qs/issues/30) Bug when merging non-object values into arrays - [**#29**](https://github.com/hapijs/qs/issues/29) Don't call Utils.clone at the top of Utils.merge - [**#23**](https://github.com/hapijs/qs/issues/23) Ability to not limit parameters? ## [**2.1.0**](https://github.com/hapijs/qs/issues?milestone=8&state=closed) - [**#22**](https://github.com/hapijs/qs/issues/22) Enable using a RegExp as delimiter ## [**2.0.0**](https://github.com/hapijs/qs/issues?milestone=7&state=closed) - [**#18**](https://github.com/hapijs/qs/issues/18) Why is there arrayLimit? 
- [**#20**](https://github.com/hapijs/qs/issues/20) Configurable parametersLimit - [**#21**](https://github.com/hapijs/qs/issues/21) make all limits optional, for #18, for #20 ## [**1.2.2**](https://github.com/hapijs/qs/issues?milestone=6&state=closed) - [**#19**](https://github.com/hapijs/qs/issues/19) Don't overwrite null values ## [**1.2.1**](https://github.com/hapijs/qs/issues?milestone=5&state=closed) - [**#16**](https://github.com/hapijs/qs/issues/16) ignore non-string delimiters - [**#15**](https://github.com/hapijs/qs/issues/15) Close code block ## [**1.2.0**](https://github.com/hapijs/qs/issues?milestone=4&state=closed) - [**#12**](https://github.com/hapijs/qs/issues/12) Add optional delim argument - [**#13**](https://github.com/hapijs/qs/issues/13) fix #11: flattened keys in array are now correctly parsed ## [**1.1.0**](https://github.com/hapijs/qs/issues?milestone=3&state=closed) - [**#7**](https://github.com/hapijs/qs/issues/7) Empty values of a POST array disappear after being submitted - [**#9**](https://github.com/hapijs/qs/issues/9) Should not omit equals signs (=) when value is null - [**#6**](https://github.com/hapijs/qs/issues/6) Minor grammar fix in README ## [**1.0.2**](https://github.com/hapijs/qs/issues?milestone=2&state=closed) - [**#5**](https://github.com/hapijs/qs/issues/5) array holes incorrectly copied into object on large index ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/CONTRIBUTING.md����000644 �000766 �000024 �00000000151 12455173731 032375� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Please view our [hapijs contributing guide](https://github.com/hapijs/hapi/blob/master/CONTRIBUTING.md). 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/index.js�����������000644 �000766 �000024 �00000000044 12455173731 031612� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require('./lib/'); ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/lib/���������������000755 �000766 �000024 �00000000000 12456115120 030702� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/LICENSE������������000755 �000766 �000024 �00000003166 12455173731 031165� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2014 Nathan LaFreniere and other contributors. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. * * * The complete list of contributors can be found at: https://github.com/hapijs/qs/graphs/contributors ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/Makefile�����������000644 �000766 �000024 �00000000345 12455173731 031611� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������test: @node node_modules/lab/bin/lab -a code -L test-cov: @node node_modules/lab/bin/lab -a code -t 100 -L test-cov-html: @node node_modules/lab/bin/lab -a code -L -r html -o coverage.html .PHONY: test test-cov test-cov-html �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/package.json�������000755 �000766 �000024 �00000002561 12455173731 032444� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "qs", "version": "2.3.3", "description": "A querystring parser that supports nesting and arrays, with a depth limit", "homepage": "https://github.com/hapijs/qs", "main": "index.js", "dependencies": {}, "devDependencies": { "code": "1.x.x", "lab": "5.x.x" }, "scripts": { "test": "make test-cov" }, "repository": { "type": "git", "url": "https://github.com/hapijs/qs.git" }, "keywords": [ "querystring", "qs" ], "licenses": [ { "type": "BSD", "url": "http://github.com/hapijs/qs/raw/master/LICENSE" } ], "gitHead": "9250c4cda5102fcf72441445816e6d311fc6813d", "bugs": { "url": "https://github.com/hapijs/qs/issues" }, "_id": "qs@2.3.3", "_shasum": "e9e85adbe75da0bbe4c8e0476a086290f863b404", "_from": "qs@>=2.3.1 <2.4.0", "_npmVersion": "2.1.6", "_nodeVersion": "0.10.32", "_npmUser": { "name": "nlf", "email": "quitlahok@gmail.com" }, "maintainers": [ { "name": "nlf", "email": "quitlahok@gmail.com" }, { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "dist": { "shasum": 
"e9e85adbe75da0bbe4c8e0476a086290f863b404", "tarball": "http://registry.npmjs.org/qs/-/qs-2.3.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/qs/-/qs-2.3.3.tgz", "readme": "ERROR: No README data found!" } �����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/Readme.md����������000755 �000766 �000024 �00000011562 12455173731 031676� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# qs A querystring parsing and stringifying library with some added security. [![Build Status](https://secure.travis-ci.org/hapijs/qs.svg)](http://travis-ci.org/hapijs/qs) Lead Maintainer: [Nathan LaFreniere](https://github.com/nlf) The **qs** module was originally created and maintained by [TJ Holowaychuk](https://github.com/visionmedia/node-querystring). ## Usage ```javascript var Qs = require('qs'); var obj = Qs.parse('a=c'); // { a: 'c' } var str = Qs.stringify(obj); // 'a=c' ``` ### Parsing Objects ```javascript Qs.parse(string, [options]); ``` **qs** allows you to create nested objects within your query strings, by surrounding the name of sub-keys with square brackets `[]`. For example, the string `'foo[bar]=baz'` converts to: ```javascript { foo: { bar: 'baz' } } ``` URI encoded strings work too: ```javascript Qs.parse('a%5Bb%5D=c'); // { a: { b: 'c' } } ``` You can also nest your objects, like `'foo[bar][baz]=foobarbaz'`: ```javascript { foo: { bar: { baz: 'foobarbaz' } } } ``` By default, when nesting objects **qs** will only parse up to 5 children deep. This means if you attempt to parse a string like `'a[b][c][d][e][f][g][h][i]=j'` your resulting object will be: ```javascript { a: { b: { c: { d: { e: { f: { '[g][h][i]': 'j' } } } } } } } ``` This depth can be overridden by passing a `depth` option to `Qs.parse(string, [options])`: ```javascript Qs.parse('a[b][c][d][e][f][g][h][i]=j', { depth: 1 }); // { a: { b: { '[c][d][e][f][g][h][i]': 'j' } } } ``` The depth limit helps mitigate abuse when **qs** is used to parse user input, and it is recommended to keep it a reasonably small number. For similar reasons, by default **qs** will only parse up to 1000 parameters. This can be overridden by passing a `parameterLimit` option: ```javascript Qs.parse('a=b&c=d', { parameterLimit: 1 }); // { a: 'b' } ``` An optional delimiter can also be passed: ```javascript Qs.parse('a=b;c=d', { delimiter: ';' }); // { a: 'b', c: 'd' } ``` Delimiters can be a regular expression too: ```javascript Qs.parse('a=b;c=d,e=f', { delimiter: /[;,]/ }); // { a: 'b', c: 'd', e: 'f' } ``` ### Parsing Arrays **qs** can also parse arrays using a similar `[]` notation: ```javascript Qs.parse('a[]=b&a[]=c'); // { a: ['b', 'c'] } ``` You may specify an index as well: ```javascript Qs.parse('a[1]=c&a[0]=b'); // { a: ['b', 'c'] } ``` Note that the only difference between an index in an array and a key in an object is that the value between the brackets must be a number to create an array. 
When creating arrays with specific indices, **qs** will compact a sparse array to only the existing values preserving their order: ```javascript Qs.parse('a[1]=b&a[15]=c'); // { a: ['b', 'c'] } ``` Note that an empty string is also a value, and will be preserved: ```javascript Qs.parse('a[]=&a[]=b'); // { a: ['', 'b'] } Qs.parse('a[0]=b&a[1]=&a[2]=c'); // { a: ['b', '', 'c'] } ``` **qs** will also limit specifying indices in an array to a maximum index of `20`. Any array members with an index of greater than `20` will instead be converted to an object with the index as the key: ```javascript Qs.parse('a[100]=b'); // { a: { '100': 'b' } } ``` This limit can be overridden by passing an `arrayLimit` option: ```javascript Qs.parse('a[1]=b', { arrayLimit: 0 }); // { a: { '1': 'b' } } ``` To disable array parsing entirely, set `arrayLimit` to `-1`. If you mix notations, **qs** will merge the two items into an object: ```javascript Qs.parse('a[0]=b&a[b]=c'); // { a: { '0': 'b', b: 'c' } } ``` You can also create arrays of objects: ```javascript Qs.parse('a[][b]=c'); // { a: [{ b: 'c' }] } ``` ### Stringifying ```javascript Qs.stringify(object, [options]); ``` When stringifying, **qs** always URI encodes output. Objects are stringified as you would expect: ```javascript Qs.stringify({ a: 'b' }); // 'a=b' Qs.stringify({ a: { b: 'c' } }); // 'a%5Bb%5D=c' ``` Examples beyond this point will be shown as though the output is not URI encoded for clarity. Please note that the return values in these cases *will* be URI encoded during real usage. When arrays are stringified, by default they are given explicit indices: ```javascript Qs.stringify({ a: ['b', 'c', 'd'] }); // 'a[0]=b&a[1]=c&a[2]=d' ``` You may override this by setting the `indices` option to `false`: ```javascript Qs.stringify({ a: ['b', 'c', 'd'] }, { indices: false }); // 'a=b&a=c&a=d' ``` Empty strings and null values will omit the value, but the equals sign (=) remains in place: ```javascript Qs.stringify({ a: '' }); // 'a=' ``` Properties that are set to `undefined` will be omitted entirely: ```javascript Qs.stringify({ a: null, b: undefined }); // 'a=' ``` The delimiter may be overridden with stringify as well: ```javascript Qs.stringify({ a: 'b', c: 'd' }, { delimiter: ';' }); // 'a=b;c=d' ``` ����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/lib/index.js�������000755 �000766 �000024 �00000000310 12455173731 032357� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Stringify = require('./stringify'); var Parse = require('./parse'); // Declare internals var internals = {}; module.exports = { stringify: Stringify, parse: Parse }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/lib/parse.js�������000755 �000766 
�000024 �00000007553 12455173731 032402� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Utils = require('./utils'); // Declare internals var internals = { delimiter: '&', depth: 5, arrayLimit: 20, parameterLimit: 1000 }; internals.parseValues = function (str, options) { var obj = {}; var parts = str.split(options.delimiter, options.parameterLimit === Infinity ? undefined : options.parameterLimit); for (var i = 0, il = parts.length; i < il; ++i) { var part = parts[i]; var pos = part.indexOf(']=') === -1 ? part.indexOf('=') : part.indexOf(']=') + 1; if (pos === -1) { obj[Utils.decode(part)] = ''; } else { var key = Utils.decode(part.slice(0, pos)); var val = Utils.decode(part.slice(pos + 1)); if (!obj.hasOwnProperty(key)) { obj[key] = val; } else { obj[key] = [].concat(obj[key]).concat(val); } } } return obj; }; internals.parseObject = function (chain, val, options) { if (!chain.length) { return val; } var root = chain.shift(); var obj = {}; if (root === '[]') { obj = []; obj = obj.concat(internals.parseObject(chain, val, options)); } else { var cleanRoot = root[0] === '[' && root[root.length - 1] === ']' ? root.slice(1, root.length - 1) : root; var index = parseInt(cleanRoot, 10); var indexString = '' + index; if (!isNaN(index) && root !== cleanRoot && indexString === cleanRoot && index >= 0 && index <= options.arrayLimit) { obj = []; obj[index] = internals.parseObject(chain, val, options); } else { obj[cleanRoot] = internals.parseObject(chain, val, options); } } return obj; }; internals.parseKeys = function (key, val, options) { if (!key) { return; } // The regex chunks var parent = /^([^\[\]]*)/; var child = /(\[[^\[\]]*\])/g; // Get the parent var segment = parent.exec(key); // Don't allow them to overwrite object prototype properties if (Object.prototype.hasOwnProperty(segment[1])) { return; } // Stash the parent if it exists var keys = []; if (segment[1]) { keys.push(segment[1]); } // Loop through children appending to the array until we hit depth var i = 0; while ((segment = child.exec(key)) !== null && i < options.depth) { ++i; if (!Object.prototype.hasOwnProperty(segment[1].replace(/\[|\]/g, ''))) { keys.push(segment[1]); } } // If there's a remainder, just add whatever is left if (segment) { keys.push('[' + key.slice(segment.index) + ']'); } return internals.parseObject(keys, val, options); }; module.exports = function (str, options) { if (str === '' || str === null || typeof str === 'undefined') { return {}; } options = options || {}; options.delimiter = typeof options.delimiter === 'string' || Utils.isRegExp(options.delimiter) ? options.delimiter : internals.delimiter; options.depth = typeof options.depth === 'number' ? options.depth : internals.depth; options.arrayLimit = typeof options.arrayLimit === 'number' ? options.arrayLimit : internals.arrayLimit; options.parameterLimit = typeof options.parameterLimit === 'number' ? options.parameterLimit : internals.parameterLimit; var tempObj = typeof str === 'string' ? 
internals.parseValues(str, options) : str; var obj = {}; // Iterate over the keys and setup the new object var keys = Object.keys(tempObj); for (var i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; var newObj = internals.parseKeys(key, tempObj[key], options); obj = Utils.merge(obj, newObj); } return Utils.compact(obj); }; �����������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/lib/stringify.js���000755 �000766 �000024 �00000003301 12455173731 033271� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Utils = require('./utils'); // Declare internals var internals = { delimiter: '&', indices: true }; internals.stringify = function (obj, prefix, options) { if (Utils.isBuffer(obj)) { obj = obj.toString(); } else if (obj instanceof Date) { obj = obj.toISOString(); } else if (obj === null) { obj = ''; } if (typeof obj === 'string' || typeof obj === 'number' || typeof obj === 'boolean') { return [encodeURIComponent(prefix) + '=' + encodeURIComponent(obj)]; } var values = []; if (typeof obj === 'undefined') { return values; } var objKeys = Object.keys(obj); for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; if (!options.indices && Array.isArray(obj)) { values = values.concat(internals.stringify(obj[key], prefix, options)); } else { values = values.concat(internals.stringify(obj[key], prefix + '[' + key + ']', options)); } } return values; }; module.exports = function (obj, options) { options = options || {}; var delimiter = typeof options.delimiter === 'undefined' ? internals.delimiter : options.delimiter; options.indices = typeof options.indices === 'boolean' ? 
options.indices : internals.indices; var keys = []; if (typeof obj !== 'object' || obj === null) { return ''; } var objKeys = Object.keys(obj); for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; keys = keys.concat(internals.stringify(obj[key], key, options)); } return keys.join(delimiter); }; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/qs/lib/utils.js�������000755 �000766 �000024 �00000004540 12455173731 032421� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules // Declare internals var internals = {}; exports.arrayToObject = function (source) { var obj = {}; for (var i = 0, il = source.length; i < il; ++i) { if (typeof source[i] !== 'undefined') { obj[i] = source[i]; } } return obj; }; exports.merge = function (target, source) { if (!source) { return target; } if (typeof source !== 'object') { if (Array.isArray(target)) { target.push(source); } else { target[source] = true; } return target; } if (typeof target !== 'object') { target = [target].concat(source); return target; } if (Array.isArray(target) && !Array.isArray(source)) { target = exports.arrayToObject(target); } var keys = Object.keys(source); for (var k = 0, kl = keys.length; k < kl; ++k) { var key = keys[k]; var value = source[key]; if (!target[key]) { target[key] = value; } else { target[key] = exports.merge(target[key], value); } } return target; }; exports.decode = function (str) { try { return decodeURIComponent(str.replace(/\+/g, ' ')); } catch (e) { return str; } }; exports.compact = function (obj, refs) { if (typeof obj !== 'object' || obj === null) { return obj; } refs = refs || []; var lookup = refs.indexOf(obj); if (lookup !== -1) { return refs[lookup]; } refs.push(obj); if (Array.isArray(obj)) { var compacted = []; for (var i = 0, il = obj.length; i < il; ++i) { if (typeof obj[i] !== 'undefined') { compacted.push(obj[i]); } } return compacted; } var keys = Object.keys(obj); for (i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; obj[key] = exports.compact(obj[key], refs); } return obj; }; exports.isRegExp = function (obj) { return Object.prototype.toString.call(obj) === '[object RegExp]'; }; exports.isBuffer = function (obj) { if (obj === null || typeof obj === 'undefined') { return false; } return !!(obj.constructor && obj.constructor.isBuffer && obj.constructor.isBuffer(obj)); }; ����������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/index.js���000644 �000766 �000024 �00000006074 12455173731 033256� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������var crypto = require('crypto') , qs = require('querystring') ; function sha1 (key, body) { return crypto.createHmac('sha1', key).update(body).digest('base64') } function rsa (key, body) { return crypto.createSign("RSA-SHA1").update(body).sign(key, 'base64'); } function rfc3986 (str) { return encodeURIComponent(str) .replace(/!/g,'%21') .replace(/\*/g,'%2A') .replace(/\(/g,'%28') .replace(/\)/g,'%29') .replace(/'/g,'%27') ; } // Maps object to bi-dimensional array // Converts { foo: 'A', bar: [ 'b', 'B' ]} to // [ ['foo', 'A'], ['bar', 'b'], ['bar', 'B'] ] function map (obj) { var key, val, arr = [] for (key in obj) { val = obj[key] if (Array.isArray(val)) for (var i = 0; i < val.length; i++) arr.push([key, val[i]]) else arr.push([key, val]) } return arr } // Compare function for sort function compare (a, b) { return a > b ? 1 : a < b ? -1 : 0 } function generateBase (httpMethod, base_uri, params) { // adapted from https://dev.twitter.com/docs/auth/oauth and // https://dev.twitter.com/docs/auth/creating-signature // Parameter normalization // http://tools.ietf.org/html/rfc5849#section-3.4.1.3.2 var normalized = map(params) // 1. First, the name and value of each parameter are encoded .map(function (p) { return [ rfc3986(p[0]), rfc3986(p[1] || '') ] }) // 2. The parameters are sorted by name, using ascending byte value // ordering. If two or more parameters share the same name, they // are sorted by their value. .sort(function (a, b) { return compare(a[0], b[0]) || compare(a[1], b[1]) }) // 3. The name of each parameter is concatenated to its corresponding // value using an "=" character (ASCII code 61) as a separator, even // if the value is empty. .map(function (p) { return p.join('=') }) // 4. The sorted name/value pairs are concatenated together into a // single string by using an "&" character (ASCII code 38) as // separator. .join('&') var base = [ rfc3986(httpMethod ? 
httpMethod.toUpperCase() : 'GET'), rfc3986(base_uri), rfc3986(normalized) ].join('&') return base } function hmacsign (httpMethod, base_uri, params, consumer_secret, token_secret) { var base = generateBase(httpMethod, base_uri, params) var key = [ consumer_secret || '', token_secret || '' ].map(rfc3986).join('&') return sha1(key, base) } function rsasign (httpMethod, base_uri, params, private_key, token_secret) { var base = generateBase(httpMethod, base_uri, params) var key = private_key || '' return rsa(key, base) } function sign (signMethod, httpMethod, base_uri, params, consumer_secret, token_secret) { var method switch (signMethod) { case 'RSA-SHA1': method = rsasign break case 'HMAC-SHA1': method = hmacsign break default: throw new Error("Signature method not supported: " + signMethod) } return method.apply(null, [].slice.call(arguments, 1)) } exports.hmacsign = hmacsign exports.rsasign = rsasign exports.sign = sign exports.rfc3986 = rfc3986 ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/LICENSE����000644 �000766 �000024 �00000021664 12455173731 032620� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). 
"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS����������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/package.json����������������������000644 �000766 �000024 �00000002507 12455173731 034015� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "oauth-sign", "description": "OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module.", "version": "0.5.0", "repository": { "url": "https://github.com/mikeal/oauth-sign" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "scripts": { "test": "node test.js" }, "gitHead": "6fea86c2d4a38e1b3780ad0cc56f00196e5213c1", "bugs": { "url": "https://github.com/mikeal/oauth-sign/issues" }, "homepage": "https://github.com/mikeal/oauth-sign", "_id": "oauth-sign@0.5.0", "_shasum": "d767f5169325620eab2e087ef0c472e773db6461", "_from": "oauth-sign@>=0.5.0 <0.6.0", "_npmVersion": "2.0.0", "_npmUser": { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" } ], "dist": { "shasum": "d767f5169325620eab2e087ef0c472e773db6461", "tarball": "http://registry.npmjs.org/oauth-sign/-/oauth-sign-0.5.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/oauth-sign/-/oauth-sign-0.5.0.tgz", "readme": "ERROR: No README data found!" 
} �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/README.md��000644 �000766 �000024 �00000000153 12455173731 033060� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������oauth-sign ========== OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/oauth-sign/test.js����000644 �000766 �000024 �00000004767 12455173731 033135� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var hmacsign = require('./index').hmacsign , assert = require('assert') , qs = require('querystring') ; // Tests from Twitter documentation https://dev.twitter.com/docs/auth/oauth var reqsign = hmacsign('POST', 'https://api.twitter.com/oauth/request_token', { oauth_callback: 'http://localhost:3005/the_dance/process_callback?service_provider_id=11' , oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g' , oauth_nonce: 'QP70eNmVz8jvdPevU3oJD2AfF7R7odC2XJcn4XlZJqk' , oauth_signature_method: 'HMAC-SHA1' , oauth_timestamp: '1272323042' , oauth_version: '1.0' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98") console.log(reqsign) console.log('8wUi7m5HFQy76nowoCThusfgB+Q=') assert.equal(reqsign, '8wUi7m5HFQy76nowoCThusfgB+Q=') var accsign = hmacsign('POST', 'https://api.twitter.com/oauth/access_token', { oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g' , oauth_nonce: '9zWH6qe0qG7Lc1telCn7FhUbLyVdjEaL3MO5uHxn8' , oauth_signature_method: 'HMAC-SHA1' , oauth_token: '8ldIZyxQeVrFZXFOZH5tAwj6vzJYuLQpl0WUEYtWc' , oauth_timestamp: '1272323047' , oauth_verifier: 'pDNg57prOHapMbhv25RNf75lVRd6JDsni1AJJIDYoTY' , oauth_version: '1.0' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "x6qpRnlEmW9JbQn4PQVVeVG8ZLPEx6A0TOebgwcuA") console.log(accsign) console.log('PUw/dHA4fnlJYM6RhXk5IU/0fCc=') assert.equal(accsign, 'PUw/dHA4fnlJYM6RhXk5IU/0fCc=') var upsign = hmacsign('POST', 'http://api.twitter.com/1/statuses/update.json', { oauth_consumer_key: "GDdmIQH6jhtmLUypg82g" , oauth_nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y" , oauth_signature_method: "HMAC-SHA1" , oauth_token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw" , oauth_timestamp: "1272325550" , oauth_version: "1.0" , status: 'setting up my twitter 私のさえずりを設定する' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA") console.log(upsign) 
console.log('yOahq5m0YjDDjfjxHaXEsW9D+X0=') assert.equal(upsign, 'yOahq5m0YjDDjfjxHaXEsW9D+X0=') // example in rfc5849 var params = qs.parse('b5=%3D%253D&a3=a&c%40=&a2=r%20b' + '&' + 'c2&a3=2+q') params.oauth_consumer_key = '9djdj82h48djs9d2' params.oauth_token = 'kkk9d7dh3k39sjv7' params.oauth_nonce = '7d8f3e4a' params.oauth_signature_method = 'HMAC-SHA1' params.oauth_timestamp = '137131201' var rfc5849sign = hmacsign('POST', 'http://example.com/request', params, "j49sk3j29djd", "dh893hdasih9") console.log(rfc5849sign) console.log('r6/TJjbCOr97/+UU0NsvSne7s5g=') assert.equal(rfc5849sign, 'r6/TJjbCOr97/+UU0NsvSne7s5g=') ���������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/.npmignore��000644 �000766 �000024 �00000000027 12455173731 033413� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules .DS_Store ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/��000755 �000766 �000024 �00000000000 12456115120 033334� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/bin/��������000755 �000766 �000024 �00000000000 12456115120 032152� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/component.json���������������������000644 �000766 �000024 �00000000731 12455173731 034234� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "node-uuid", "repo": "broofa/node-uuid", "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.", "version": "1.4.0", "author": "Robert Kieffer <robert@broofa.com>", "contributors": [ {"name": "Christoph Tavan <dev@tavan.de>", "github": "https://github.com/ctavan"} ], "keywords": ["uuid", "guid", "rfc4122"], "dependencies": {}, 
"development": {}, "main": "uuid.js", "scripts": [ "uuid.js" ], "license": "MIT" }���������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/LICENSE.md��000644 �000766 �000024 �00000002077 12455173731 033027� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The MIT License (MIT) Copyright (c) 2010-2012 Robert Kieffer Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/package.json000644 �000766 �000024 �00000003027 12455173731 033705� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "node-uuid", "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.", "url": "http://github.com/broofa/node-uuid", "keywords": [ "uuid", "guid", "rfc4122" ], "author": { "name": "Robert Kieffer", "email": "robert@broofa.com" }, "contributors": [ { "name": "Christoph Tavan", "email": "dev@tavan.de" } ], "bin": { "uuid": "./bin/uuid" }, "scripts": { "test": "node test/test.js" }, "lib": ".", "main": "./uuid.js", "repository": { "type": "git", "url": "https://github.com/broofa/node-uuid.git" }, "version": "1.4.2", "licenses": [ { "type": "MIT", "url": "https://raw.github.com/broofa/node-uuid/master/LICENSE.md" } ], "gitHead": "14c42d2568977f7ddfc02399bd2a6b09e2cfbe5f", "bugs": { "url": "https://github.com/broofa/node-uuid/issues" }, "homepage": "https://github.com/broofa/node-uuid", "_id": "node-uuid@1.4.2", "_shasum": 
"907db3d11b7b6a2cf4f905fb7199f14ae7379ba0", "_from": "node-uuid@>=1.4.0 <1.5.0", "_npmVersion": "1.4.28", "_npmUser": { "name": "broofa", "email": "robert@broofa.com" }, "maintainers": [ { "name": "broofa", "email": "robert@broofa.com" } ], "dist": { "shasum": "907db3d11b7b6a2cf4f905fb7199f14ae7379ba0", "tarball": "http://registry.npmjs.org/node-uuid/-/node-uuid-1.4.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/node-uuid/-/node-uuid-1.4.2.tgz", "readme": "ERROR: No README data found!" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/README.md���000644 �000766 �000024 �00000016051 12455173731 032677� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# node-uuid Simple, fast generation of [RFC4122](http://www.ietf.org/rfc/rfc4122.txt) UUIDS. Features: * Generate RFC4122 version 1 or version 4 UUIDs * Runs in node.js and all browsers. * Registered as a [ComponentJS](https://github.com/component/component) [component](https://github.com/component/component/wiki/Components) ('broofa/node-uuid'). * Cryptographically strong random # generation on supporting platforms * 1.1K minified and gzip'ed (Want something smaller? Check this [crazy shit](https://gist.github.com/982883) out! ) * [Annotated source code](http://broofa.github.com/node-uuid/docs/uuid.html) * Comes with a Command Line Interface for generating uuids on the command line ## Getting Started Install it in your browser: ```html <script src="uuid.js"></script> ``` Or in node.js: ``` npm install node-uuid ``` ```javascript var uuid = require('node-uuid'); ``` Then create some ids ... ```javascript // Generate a v1 (time-based) id uuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a' // Generate a v4 (random) id uuid.v4(); // -> '110ec58a-a0f2-4ac4-8393-c866d813b8d1' ``` ## API ### uuid.v1([`options` [, `buffer` [, `offset`]]]) Generate and return a RFC4122 v1 (timestamp-based) UUID. * `options` - (Object) Optional uuid state to apply. Properties may include: * `node` - (Array) Node id as Array of 6 bytes (per 4.1.6). Default: Randomly generated ID. See note 1. * `clockseq` - (Number between 0 - 0x3fff) RFC clock sequence. Default: An internally maintained clockseq is used. * `msecs` - (Number | Date) Time in milliseconds since unix Epoch. Default: The current time is used. * `nsecs` - (Number between 0-9999) additional time, in 100-nanosecond units. Ignored if `msecs` is unspecified. Default: internal uuid counter is used, as per 4.2.1.2. * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. * `offset` - (Number) Starting index in `buffer` at which to begin writing. Returns `buffer`, if specified, otherwise the string form of the UUID Notes: 1. 
The randomly generated node id is only guaranteed to stay constant for the lifetime of the current JS runtime. (Future versions of this module may use persistent storage mechanisms to extend this guarantee.)

Example: Generate string UUID with fully-specified options

```javascript
uuid.v1({
  node: [0x01, 0x23, 0x45, 0x67, 0x89, 0xab],
  clockseq: 0x1234,
  msecs: new Date('2011-11-01').getTime(),
  nsecs: 5678
}); // -> "710b962e-041c-11e1-9234-0123456789ab"
```

Example: In-place generation of two binary IDs

```javascript
// Generate two ids in an array
var arr = new Array(32); // -> []
uuid.v1(null, arr, 0);   // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15]
uuid.v1(null, arr, 16);  // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15 02 a3 1c b0 14 32 11 e1 85 58 0b 48 8e 4f c1 15]

// Optionally use uuid.unparse() to stringify the ids
uuid.unparse(arr);     // -> '02a2ce90-1432-11e1-8558-0b488e4fc115'
uuid.unparse(arr, 16); // -> '02a31cb0-1432-11e1-8558-0b488e4fc115'
```

### uuid.v4([`options` [, `buffer` [, `offset`]]])

Generate and return a RFC4122 v4 UUID.

* `options` - (Object) Optional uuid state to apply. Properties may include:
  * `random` - (Number[16]) Array of 16 numbers (0-255) to use in place of randomly generated values
  * `rng` - (Function) Random # generator to use. Set to one of the built-in generators - `uuid.mathRNG` (all platforms), `uuid.nodeRNG` (node.js only), `uuid.whatwgRNG` (WebKit only) - or a custom function that returns an array[16] of byte values.
* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.
* `offset` - (Number) Starting index in `buffer` at which to begin writing.

Returns `buffer`, if specified, otherwise the string form of the UUID.

Example: Generate string UUID with fully-specified options

```javascript
uuid.v4({
  random: [
    0x10, 0x91, 0x56, 0xbe, 0xc4, 0xfb, 0xc1, 0xea,
    0x71, 0xb4, 0xef, 0xe1, 0x67, 0x1c, 0x58, 0x36
  ]
});
// -> "109156be-c4fb-41ea-b1b4-efe1671c5836"
```

Example: Generate two IDs in a single buffer

```javascript
var buffer = new Array(32); // (or 'new Buffer' in node.js)
uuid.v4(null, buffer, 0);
uuid.v4(null, buffer, 16);
```

### uuid.parse(id[, buffer[, offset]])
### uuid.unparse(buffer[, offset])

Parse and unparse UUIDs

* `id` - (String) UUID(-like) string
* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. Default: A new Array or Buffer is used
* `offset` - (Number) Starting index in `buffer` at which to begin writing. Default: 0

Example parsing and unparsing a UUID string

```javascript
var bytes = uuid.parse('797ff043-11eb-11e1-80d6-510998755d10'); // -> <Buffer 79 7f f0 43 11 eb 11 e1 80 d6 51 09 98 75 5d 10>
var string = uuid.unparse(bytes); // -> '797ff043-11eb-11e1-80d6-510998755d10'
```

### uuid.noConflict()

(Browsers only) Set the `uuid` property back to its previous value. Returns the node-uuid object.

Example:

```javascript
var myUuid = uuid.noConflict();
myUuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'
```

## Deprecated APIs

Support for the following v1.2 APIs is available in v1.3, but is deprecated and will be removed in the next major version.

### uuid([format [, buffer [, offset]]])

uuid() has become uuid.v4(), and the `format` argument is now implicit in the `buffer` argument. (i.e. if you specify a buffer, the format is assumed to be binary).

### uuid.BufferClass

The class of container created when generating binary uuid data if no buffer argument is specified. This is expected to go away, with no replacement API.
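As a rough sketch of what that migration looks like in practice (assuming the module is loaded as `uuid`, as in the examples above), the deprecated call and a v1.3+ equivalent are:

```javascript
var uuid = require('node-uuid');

// Deprecated v1.2 style: the 'binary' format string selects byte output
var oldId = uuid('binary');        // 16-byte Buffer (or Array in browsers)

// v1.3+ style: passing a buffer implies binary output
var buf = new Array(16);
var newId = uuid.v4(null, buf, 0); // fills and returns `buf`

// Either result can be turned into the canonical string form
console.log(uuid.unparse(oldId), uuid.unparse(newId));
```

Passing a pre-allocated buffer skips the string formatting step, which is largely why the `'binary', buffer` variants come out fastest in the benchmark results further down.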
## Command Line Interface To use the executable, it's probably best to install this library globally. `npm install -g node-uuid` Usage: ``` USAGE: uuid [version] [options] options: --help Display this message and exit ``` `version` must be an RFC4122 version that is supported by this library, which is currently version 1 and version 4 (denoted by "v1" and "v4", respectively). `version` defaults to version 4 when not supplied. ### Examples ``` > uuid 3a91f950-dec8-4688-ba14-5b7bbfc7a563 ``` ``` > uuid v1 9d0b43e0-7696-11e3-964b-250efa37a98e ``` ``` > uuid v4 6790ac7c-24ac-4f98-8464-42f6d98a53ae ``` ## Testing In node.js ``` npm test ``` In Browser ``` open test/test.html ``` ### Benchmarking Requires node.js ``` npm install uuid uuid-js node benchmark/benchmark.js ``` For a more complete discussion of node-uuid performance, please see the `benchmark/README.md` file, and the [benchmark wiki](https://github.com/broofa/node-uuid/wiki/Benchmark) For browser performance [checkout the JSPerf tests](http://jsperf.com/node-uuid-performance). ## Release notes ### 1.4.0 * Improved module context detection * Removed public RNG functions ### 1.3.2 * Improve tests and handling of v1() options (Issue #24) * Expose RNG option to allow for perf testing with different generators ### 1.3.0 * Support for version 1 ids, thanks to [@ctavan](https://github.com/ctavan)! * Support for node.js crypto API * De-emphasizing performance in favor of a) cryptographic quality PRNGs where available and b) more manageable code ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/uuid.js�����000644 �000766 �000024 �00000016107 12455173731 032726� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// uuid.js // // Copyright (c) 2010-2012 Robert Kieffer // MIT License - http://opensource.org/licenses/mit-license.php (function() { var _global = this; // Unique ID creation requires a high quality random # generator. We feature // detect to determine the best RNG source, normalizing to a function that // returns 128-bits of randomness, since that's what's usually required var _rng; // Node.js crypto-based RNG - http://nodejs.org/docs/v0.6.2/api/crypto.html // // Moderately fast, high quality if (typeof(_global.require) == 'function') { try { var _rb = _global.require('crypto').randomBytes; _rng = _rb && function() {return _rb(16);}; } catch(e) {} } if (!_rng && _global.crypto && crypto.getRandomValues) { // WHATWG crypto-based RNG - http://wiki.whatwg.org/wiki/Crypto // // Moderately fast, high quality var _rnds8 = new Uint8Array(16); _rng = function whatwgRNG() { crypto.getRandomValues(_rnds8); return _rnds8; }; } if (!_rng) { // Math.random()-based (RNG) // // If all else fails, use Math.random(). 
It's fast, but is of unspecified // quality. var _rnds = new Array(16); _rng = function() { for (var i = 0, r; i < 16; i++) { if ((i & 0x03) === 0) r = Math.random() * 0x100000000; _rnds[i] = r >>> ((i & 0x03) << 3) & 0xff; } return _rnds; }; } // Buffer class to use var BufferClass = typeof(_global.Buffer) == 'function' ? _global.Buffer : Array; // Maps for number <-> hex string conversion var _byteToHex = []; var _hexToByte = {}; for (var i = 0; i < 256; i++) { _byteToHex[i] = (i + 0x100).toString(16).substr(1); _hexToByte[_byteToHex[i]] = i; } // **`parse()` - Parse a UUID into it's component bytes** function parse(s, buf, offset) { var i = (buf && offset) || 0, ii = 0; buf = buf || []; s.toLowerCase().replace(/[0-9a-f]{2}/g, function(oct) { if (ii < 16) { // Don't overflow! buf[i + ii++] = _hexToByte[oct]; } }); // Zero out remaining bytes if string was short while (ii < 16) { buf[i + ii++] = 0; } return buf; } // **`unparse()` - Convert UUID byte array (ala parse()) into a string** function unparse(buf, offset) { var i = offset || 0, bth = _byteToHex; return bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]]; } // **`v1()` - Generate time-based UUID** // // Inspired by https://github.com/LiosK/UUID.js // and http://docs.python.org/library/uuid.html // random #'s we need to init node and clockseq var _seedBytes = _rng(); // Per 4.5, create and 48-bit node id, (47 random bits + multicast bit = 1) var _nodeId = [ _seedBytes[0] | 0x01, _seedBytes[1], _seedBytes[2], _seedBytes[3], _seedBytes[4], _seedBytes[5] ]; // Per 4.2.2, randomize (14 bit) clockseq var _clockseq = (_seedBytes[6] << 8 | _seedBytes[7]) & 0x3fff; // Previous uuid creation time var _lastMSecs = 0, _lastNSecs = 0; // See https://github.com/broofa/node-uuid for API details function v1(options, buf, offset) { var i = buf && offset || 0; var b = buf || []; options = options || {}; var clockseq = options.clockseq != null ? options.clockseq : _clockseq; // UUID timestamps are 100 nano-second units since the Gregorian epoch, // (1582-10-15 00:00). JSNumbers aren't precise enough for this, so // time is handled internally as 'msecs' (integer milliseconds) and 'nsecs' // (100-nanoseconds offset from msecs) since unix epoch, 1970-01-01 00:00. var msecs = options.msecs != null ? options.msecs : new Date().getTime(); // Per 4.2.1.2, use count of uuid's generated during the current clock // cycle to simulate higher resolution clock var nsecs = options.nsecs != null ? 
options.nsecs : _lastNSecs + 1; // Time since last uuid creation (in msecs) var dt = (msecs - _lastMSecs) + (nsecs - _lastNSecs)/10000; // Per 4.2.1.2, Bump clockseq on clock regression if (dt < 0 && options.clockseq == null) { clockseq = clockseq + 1 & 0x3fff; } // Reset nsecs if clock regresses (new clockseq) or we've moved onto a new // time interval if ((dt < 0 || msecs > _lastMSecs) && options.nsecs == null) { nsecs = 0; } // Per 4.2.1.2 Throw error if too many uuids are requested if (nsecs >= 10000) { throw new Error('uuid.v1(): Can\'t create more than 10M uuids/sec'); } _lastMSecs = msecs; _lastNSecs = nsecs; _clockseq = clockseq; // Per 4.1.4 - Convert from unix epoch to Gregorian epoch msecs += 12219292800000; // `time_low` var tl = ((msecs & 0xfffffff) * 10000 + nsecs) % 0x100000000; b[i++] = tl >>> 24 & 0xff; b[i++] = tl >>> 16 & 0xff; b[i++] = tl >>> 8 & 0xff; b[i++] = tl & 0xff; // `time_mid` var tmh = (msecs / 0x100000000 * 10000) & 0xfffffff; b[i++] = tmh >>> 8 & 0xff; b[i++] = tmh & 0xff; // `time_high_and_version` b[i++] = tmh >>> 24 & 0xf | 0x10; // include version b[i++] = tmh >>> 16 & 0xff; // `clock_seq_hi_and_reserved` (Per 4.2.2 - include variant) b[i++] = clockseq >>> 8 | 0x80; // `clock_seq_low` b[i++] = clockseq & 0xff; // `node` var node = options.node || _nodeId; for (var n = 0; n < 6; n++) { b[i + n] = node[n]; } return buf ? buf : unparse(b); } // **`v4()` - Generate random UUID** // See https://github.com/broofa/node-uuid for API details function v4(options, buf, offset) { // Deprecated - 'format' argument, as supported in v1.2 var i = buf && offset || 0; if (typeof(options) == 'string') { buf = options == 'binary' ? new BufferClass(16) : null; options = null; } options = options || {}; var rnds = options.random || (options.rng || _rng)(); // Per 4.4, set bits for version and `clock_seq_hi_and_reserved` rnds[6] = (rnds[6] & 0x0f) | 0x40; rnds[8] = (rnds[8] & 0x3f) | 0x80; // Copy bytes to buffer, if provided if (buf) { for (var ii = 0; ii < 16; ii++) { buf[i + ii] = rnds[ii]; } } return buf || unparse(rnds); } // Export public API var uuid = v4; uuid.v1 = v1; uuid.v4 = v4; uuid.parse = parse; uuid.unparse = unparse; uuid.BufferClass = BufferClass; if (typeof define === 'function' && define.amd) { // Publish as AMD module define(function() {return uuid;}); } else if (typeof(module) != 'undefined' && module.exports) { // Publish as node.js module module.exports = uuid; } else { // Publish as global (in browsers) var _previousRoot = _global.uuid; // **`noConflict()` - (browser only) to reset global 'uuid' var** uuid.noConflict = function() { _global.uuid = _previousRoot; return uuid; }; _global.uuid = uuid; } }).call(this); ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/node-uuid/bin/uuid����000644 �000766 �000024 �00000001125 12455173731 033055� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var path = require('path'); var uuid = require(path.join(__dirname, '..')); var arg = process.argv[2]; if ('--help' === arg) { console.log('\n USAGE: uuid [version] [options]\n\n'); console.log(' options:\n'); console.log(' --help Display this message and exit\n'); process.exit(0); } if (null == arg) { console.log(uuid()); process.exit(0); } if ('v1' !== arg && 'v4' !== arg) { console.error('Version must be RFC4122 version 1 or version 4, denoted as "v1" or "v4"'); process.exit(1); } console.log(uuid[arg]()); process.exit(0); �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.gnu����������������000644 �000766 �000024 �00000013276 12455173731 035073� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������#!/opt/local/bin/gnuplot -persist # # # G N U P L O T # Version 4.4 patchlevel 3 # last modified March 2011 # System: Darwin 10.8.0 # # Copyright (C) 1986-1993, 1998, 2004, 2007-2010 # Thomas Williams, Colin Kelley and many others # # gnuplot home: http://www.gnuplot.info # faq, bugs, etc: type "help seeking-assistance" # immediate help: type "help" # plot window: hit 'h' set terminal postscript eps noenhanced defaultplex \ leveldefault color colortext \ solid linewidth 1.2 butt noclip \ palfuncparam 2000,0.003 \ "Helvetica" 14 set output 'bench.eps' unset clip points set clip one unset clip two set bar 1.000000 front set border 31 front linetype -1 linewidth 1.000 set xdata set ydata set zdata set x2data set y2data set timefmt x "%d/%m/%y,%H:%M" set timefmt y "%d/%m/%y,%H:%M" set timefmt z "%d/%m/%y,%H:%M" set timefmt x2 "%d/%m/%y,%H:%M" set timefmt y2 "%d/%m/%y,%H:%M" set timefmt cb "%d/%m/%y,%H:%M" set boxwidth set style fill empty border set style rectangle back fc lt -3 fillstyle solid 1.00 border lt -1 set style circle radius graph 0.02, first 0, 0 set dummy x,y set format x "% g" set format y "% g" set format x2 "% g" set format y2 "% g" set format z "% g" set format cb "% g" set angles radians unset grid set key title "" set key outside left top horizontal Right noreverse enhanced autotitles columnhead nobox set key noinvert samplen 4 spacing 1 width 0 height 0 set key maxcolumns 2 maxrows 0 unset label unset arrow set style increment default unset style line set style line 1 linetype 1 linewidth 2.000 pointtype 1 pointsize default pointinterval 0 unset style arrow set style histogram clustered gap 2 title offset character 0, 0, 0 unset logscale set offsets graph 0.05, 0.15, 0, 0 set pointsize 1.5 set pointintervalbox 1 set encoding default unset polar unset parametric unset decimalsign set view 60, 30, 1, 1 set samples 100, 100 set isosamples 10, 10 set 
surface unset contour set clabel '%8.3g' set mapping cartesian set datafile separator whitespace unset hidden3d set cntrparam order 4 set cntrparam linear set cntrparam levels auto 5 set cntrparam points 5 set size ratio 0 1,1 set origin 0,0 set style data points set style function lines set xzeroaxis linetype -2 linewidth 1.000 set yzeroaxis linetype -2 linewidth 1.000 set zzeroaxis linetype -2 linewidth 1.000 set x2zeroaxis linetype -2 linewidth 1.000 set y2zeroaxis linetype -2 linewidth 1.000 set ticslevel 0.5 set mxtics default set mytics default set mztics default set mx2tics default set my2tics default set mcbtics default set xtics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set xtics norangelimit set xtics () set ytics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set ytics autofreq norangelimit set ztics border in scale 1,0.5 nomirror norotate offset character 0, 0, 0 set ztics autofreq norangelimit set nox2tics set noy2tics set cbtics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set cbtics autofreq norangelimit set title "" set title offset character 0, 0, 0 font "" norotate set timestamp bottom set timestamp "" set timestamp offset character 0, 0, 0 font "" norotate set rrange [ * : * ] noreverse nowriteback # (currently [8.98847e+307:-8.98847e+307] ) set autoscale rfixmin set autoscale rfixmax set trange [ * : * ] noreverse nowriteback # (currently [-5.00000:5.00000] ) set autoscale tfixmin set autoscale tfixmax set urange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale ufixmin set autoscale ufixmax set vrange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale vfixmin set autoscale vfixmax set xlabel "" set xlabel offset character 0, 0, 0 font "" textcolor lt -1 norotate set x2label "" set x2label offset character 0, 0, 0 font "" textcolor lt -1 norotate set xrange [ * : * ] noreverse nowriteback # (currently [-0.150000:3.15000] ) set autoscale xfixmin set autoscale xfixmax set x2range [ * : * ] noreverse nowriteback # (currently [0.00000:3.00000] ) set autoscale x2fixmin set autoscale x2fixmax set ylabel "" set ylabel offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set y2label "" set y2label offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set yrange [ 0.00000 : 1.90000e+06 ] noreverse nowriteback # (currently [:] ) set autoscale yfixmin set autoscale yfixmax set y2range [ * : * ] noreverse nowriteback # (currently [0.00000:1.90000e+06] ) set autoscale y2fixmin set autoscale y2fixmax set zlabel "" set zlabel offset character 0, 0, 0 font "" textcolor lt -1 norotate set zrange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale zfixmin set autoscale zfixmax set cblabel "" set cblabel offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set cbrange [ * : * ] noreverse nowriteback # (currently [8.98847e+307:-8.98847e+307] ) set autoscale cbfixmin set autoscale cbfixmax set zero 1e-08 set lmargin -1 set bmargin -1 set rmargin -1 set tmargin -1 set pm3d explicit at s set pm3d scansautomatic set pm3d interpolate 1,1 flush begin noftriangles nohidden3d corners2color mean set palette positive nops_allcF maxcolors 0 gamma 1.5 color model RGB set palette rgbformulae 7, 5, 15 set colorbox default set colorbox vertical origin screen 0.9, 0.2, 0 size screen 0.05, 0.6, 0 front bdefault set loadpath set fontpath set fit noerrorvariables GNUTERM = "aqua" plot 'bench_results.txt' using 2:xticlabel(1) w lp lw 2, 
'' using 3:xticlabel(1) w lp lw 2, '' using 4:xticlabel(1) w lp lw 2, '' using 5:xticlabel(1) w lp lw 2, '' using 6:xticlabel(1) w lp lw 2, '' using 7:xticlabel(1) w lp lw 2, '' using 8:xticlabel(1) w lp lw 2, '' using 9:xticlabel(1) w lp lw 2 # EOF ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/bench.sh�����������������000755 �000766 �000024 �00000002115 12455173731 034705� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash # for a given node version run: # for i in {0..9}; do node benchmark.js >> bench_0.6.2.log; done; PATTERNS=('nodeuuid.v1()' "nodeuuid.v1('binary'," 'nodeuuid.v4()' "nodeuuid.v4('binary'," "uuid()" "uuid('binary')" 'uuidjs.create(1)' 'uuidjs.create(4)' '140byte') FILES=(node_uuid_v1_string node_uuid_v1_buf node_uuid_v4_string node_uuid_v4_buf libuuid_v4_string libuuid_v4_binary uuidjs_v1_string uuidjs_v4_string 140byte_es) INDICES=(2 3 2 3 2 2 2 2 2) VERSIONS=$( ls bench_*.log | sed -e 's/^bench_\([0-9\.]*\)\.log/\1/' | tr "\\n" " " ) TMPJOIN="tmp_join" OUTPUT="bench_results.txt" for I in ${!FILES[*]}; do F=${FILES[$I]} P=${PATTERNS[$I]} INDEX=${INDICES[$I]} echo "version $F" > $F for V in $VERSIONS; do (VAL=$( grep "$P" bench_$V.log | LC_ALL=en_US awk '{ sum += $'$INDEX' } END { print sum/NR }' ); echo $V $VAL) >> $F done if [ $I == 0 ]; then cat $F > $TMPJOIN else join $TMPJOIN $F > $OUTPUT cp $OUTPUT $TMPJOIN fi rm $F done rm $TMPJOIN gnuplot bench.gnu convert -density 200 -resize 800x560 -flatten bench.eps bench.png rm bench.eps ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark-native.c�������000644 �000766 �000024 �00000001145 12455173731 036653� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* Test performance of native C UUID generation To Compile: cc -luuid benchmark-native.c -o benchmark-native */ #include <stdio.h> #include <unistd.h> #include <sys/time.h> #include <uuid/uuid.h> int main() { uuid_t myid; char buf[36+1]; int i; struct timeval t; double start, finish; gettimeofday(&t, NULL); start = t.tv_sec + t.tv_usec/1e6; int n = 2e5; for (i = 0; i < n; i++) { uuid_generate(myid); uuid_unparse(myid, buf); } gettimeofday(&t, NULL); finish = 
t.tv_sec + t.tv_usec/1e6; double dur = finish - start; printf("%d uuids/sec", (int)(n/dur)); return 0; } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/benchmark.js�������������000644 �000766 �000024 �00000004275 12455173731 035570� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������try { var nodeuuid = require('../uuid'); } catch (e) { console.error('node-uuid require failed - skipping tests'); } try { var uuid = require('uuid'); } catch (e) { console.error('uuid require failed - skipping tests'); } try { var uuidjs = require('uuid-js'); } catch (e) { console.error('uuid-js require failed - skipping tests'); } var N = 5e5; function rate(msg, t) { console.log(msg + ': ' + (N / (Date.now() - t) * 1e3 | 0) + ' uuids/second'); } console.log('# v4'); // node-uuid - string form if (nodeuuid) { for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4(); rate('nodeuuid.v4() - using node.js crypto RNG', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4({rng: nodeuuid.mathRNG}); rate('nodeuuid.v4() - using Math.random() RNG', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary'); rate('nodeuuid.v4(\'binary\')', t); var buffer = new nodeuuid.BufferClass(16); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary', buffer); rate('nodeuuid.v4(\'binary\', buffer)', t); } // libuuid - string form if (uuid) { for (var i = 0, t = Date.now(); i < N; i++) uuid(); rate('uuid()', t); for (var i = 0, t = Date.now(); i < N; i++) uuid('binary'); rate('uuid(\'binary\')', t); } // uuid-js - string form if (uuidjs) { for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(4); rate('uuidjs.create(4)', t); } // 140byte.es for (var i = 0, t = Date.now(); i < N; i++) 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g,function(s,r){r=Math.random()*16|0;return (s=='x'?r:r&0x3|0x8).toString(16)}); rate('140byte.es_v4', t); console.log(''); console.log('# v1'); // node-uuid - v1 string form if (nodeuuid) { for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1(); rate('nodeuuid.v1()', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary'); rate('nodeuuid.v1(\'binary\')', t); var buffer = new nodeuuid.BufferClass(16); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary', buffer); rate('nodeuuid.v1(\'binary\', buffer)', t); } // uuid-js - v1 string form if (uuidjs) { for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(1); rate('uuidjs.create(1)', t); } 
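// For orientation, a minimal standalone sketch (not part of the benchmark
// suite above) of how the reported figure is derived: iterations divided by
// elapsed wall-clock time, scaled to uuids/second and truncated to an integer.
// Assumes only node-uuid itself is present, required the same way as above.
var nodeuuid2 = require('../uuid');
var COUNT = 5e5;
var start = Date.now();
for (var j = 0; j < COUNT; j++) nodeuuid2.v4();
console.log('nodeuuid.v4(): ' + (COUNT / (Date.now() - start) * 1e3 | 0) + ' uuids/second');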
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/node-uuid/benchmark/README.md����������������000644 �000766 �000024 �00000003757 12455173731 034563� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# node-uuid Benchmarks ### Results To see the results of our benchmarks visit https://github.com/broofa/node-uuid/wiki/Benchmark ### Run them yourself node-uuid comes with some benchmarks to measure performance of generating UUIDs. These can be run using node.js. node-uuid is being benchmarked against some other uuid modules, that are available through npm namely `uuid` and `uuid-js`. To prepare and run the benchmark issue; ``` npm install uuid uuid-js node benchmark/benchmark.js ``` You'll see an output like this one: ``` # v4 nodeuuid.v4(): 854700 uuids/second nodeuuid.v4('binary'): 788643 uuids/second nodeuuid.v4('binary', buffer): 1336898 uuids/second uuid(): 479386 uuids/second uuid('binary'): 582072 uuids/second uuidjs.create(4): 312304 uuids/second # v1 nodeuuid.v1(): 938086 uuids/second nodeuuid.v1('binary'): 683060 uuids/second nodeuuid.v1('binary', buffer): 1644736 uuids/second uuidjs.create(1): 190621 uuids/second ``` * The `uuid()` entries are for Nikhil Marathe's [uuid module](https://bitbucket.org/nikhilm/uuidjs) which is a wrapper around the native libuuid library. * The `uuidjs()` entries are for Patrick Negri's [uuid-js module](https://github.com/pnegri/uuid-js) which is a pure javascript implementation based on [UUID.js](https://github.com/LiosK/UUID.js) by LiosK. If you want to get more reliable results you can run the benchmark multiple times and write the output into a log file: ``` for i in {0..9}; do node benchmark/benchmark.js >> benchmark/bench_0.4.12.log; done; ``` If you're interested in how performance varies between different node versions, you can issue the above command multiple times. You can then use the shell script `bench.sh` provided in this directory to calculate the averages over all benchmark runs and draw a nice plot: ``` (cd benchmark/ && ./bench.sh) ``` This assumes you have [gnuplot](http://www.gnuplot.info/) and [ImageMagick](http://www.imagemagick.org/) installed. You'll find a nice `bench.png` graph in the `benchmark/` directory then. �����������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/.npmignore�000644 �000766 �000024 �00000000232 12455173731 033611� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������test build.js # OS generated files # ###################### .DS_Store* # Icon? 
ehthumbs.db Thumbs.db # Node.js # ########### node_modules npm-debug.log ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/.travis.yml000644 �000766 �000024 �00000000546 12455173731 033733� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.8" - "0.10" - "0.11" matrix: allow_failures: - node_js: "0.11" fast_finish: true before_install: # remove build script deps before install - node -pe 'f="./package.json";p=require(f);d=p.devDependencies;for(k in d){if("co"===k.substr(0,2))delete d[k]}require("fs").writeFileSync(f,JSON.stringify(p,null,2))' ����������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/mime-types/component.json��������������������000644 �000766 �000024 �00000000705 12455173731 034435� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "mime-types", "description": "The ultimate javascript content-type utility.", "version": "0.1.0", "author": { "name": "Jonathan Ong", "email": "me@jongleberry.com", "url": "http://jongleberry.com", "twitter": "https://twitter.com/jongleberry" }, "repository": "expressjs/mime-types", "license": "MIT", "main": "lib/index.js", "scripts": ["lib/index.js"], "json": ["mime.json", "node.json", "custom.json"] } �����������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/�������000755 �000766 �000024 �00000000000 12456115120 032350� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/LICENSE����000644 �000766 �000024 �00000002113 12455173731 032617� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ The MIT License (MIT) Copyright (c) 2014 Jonathan Ong me@jongleberry.com Permission is hereby granted, 
free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/Makefile���000644 �000766 �000024 �00000000217 12455173731 033255� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ build: node --harmony-generators build.js test: node test/mime.js mocha --require should --reporter spec test/test.js .PHONY: build test ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/mime-types/package.json����������������������000644 �000766 �000024 �00000003257 12455173731 034033� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "mime-types", "description": "The ultimate javascript content-type utility.", "version": "1.0.2", "author": { "name": "Jonathan Ong", "email": "me@jongleberry.com", "url": "http://jongleberry.com" }, "contributors": [ { "name": "Jeremiah Senkpiel", "email": "fishrock123@rocketmail.com", "url": "https://searchbeam.jit.su" } ], "repository": { "type": "git", "url": "https://github.com/expressjs/mime-types" }, "license": "MIT", "main": "lib", "devDependencies": { "co": "3", "cogent": "0", "mocha": "1", "should": "3" }, "engines": { "node": ">= 0.8.0" }, "scripts": { "test": "make 
test" }, "gitHead": "e82b23836eb42003b8346fb31769da2fb7eb54e8", "bugs": { "url": "https://github.com/expressjs/mime-types/issues" }, "homepage": "https://github.com/expressjs/mime-types", "_id": "mime-types@1.0.2", "_shasum": "995ae1392ab8affcbfcb2641dd054e943c0d5dce", "_from": "mime-types@>=1.0.1 <1.1.0", "_npmVersion": "1.4.21", "_npmUser": { "name": "dougwilson", "email": "doug@somethingdoug.com" }, "maintainers": [ { "name": "jongleberry", "email": "jonathanrichardong@gmail.com" }, { "name": "fishrock123", "email": "fishrock123@rocketmail.com" }, { "name": "dougwilson", "email": "doug@somethingdoug.com" } ], "dist": { "shasum": "995ae1392ab8affcbfcb2641dd054e943c0d5dce", "tarball": "http://registry.npmjs.org/mime-types/-/mime-types-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/mime-types/-/mime-types-1.0.2.tgz", "readme": "ERROR: No README data found!" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/README.md��000644 �000766 �000024 �00000004717 12455173731 033105� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# mime-types [![NPM version](https://badge.fury.io/js/mime-types.svg)](https://badge.fury.io/js/mime-types) [![Build Status](https://travis-ci.org/expressjs/mime-types.svg?branch=master)](https://travis-ci.org/expressjs/mime-types) The ultimate javascript content-type utility. ### Install ```sh $ npm install mime-types ``` #### Similar to [node-mime](https://github.com/broofa/node-mime), except: - __No fallbacks.__ Instead of naively returning the first available type, `mime-types` simply returns `false`, so do `var type = mime.lookup('unrecognized') || 'application/octet-stream'`. - No `new Mime()` business, so you could do `var lookup = require('mime-types').lookup`. - Additional mime types are added such as jade and stylus. Feel free to add more! - Browser support via Browserify and Component by converting lists to JSON files. Otherwise, the API is compatible. ### Adding Types If you'd like to add additional types, simply create a PR adding the type to `custom.json` and a reference link to the [sources](SOURCES.md). Do __NOT__ edit `mime.json` or `node.json`. Those are pulled using `build.js`. You should only touch `custom.json`. ## API ```js var mime = require('mime-types') ``` All functions return `false` if input is invalid or not found. ### mime.lookup(path) Lookup the content-type associated with a file. ```js mime.lookup('json') // 'application/json' mime.lookup('.md') // 'text/x-markdown' mime.lookup('file.html') // 'text/html' mime.lookup('folder/file.js') // 'application/javascript' mime.lookup('cats') // false ``` ### mime.contentType(type) Create a full content-type header given a content-type or extension. 
```js mime.contentType('markdown') // 'text/x-markdown; charset=utf-8' mime.contentType('file.json') // 'application/json; charset=utf-8' ``` ### mime.extension(type) Get the default extension for a content-type. ```js mime.extension('application/octet-stream') // 'bin' ``` ### mime.charset(type) Lookup the implied default charset of a content-type. ```js mime.charset('text/x-markdown') // 'UTF-8' ``` ### mime.types[extension] = type A map of content-types by extension. ### mime.extensions[type] = [extensions] A map of extensions by content-type. ### mime.define(types) Globally add definitions. `types` must be an object of the form: ```js { "<content-type>": [extensions...], "<content-type>": [extensions...] } ``` See the `.json` files in `lib/` for examples. ## License [MIT](LICENSE) �������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/mime-types/SOURCES.md�000644 �000766 �000024 �00000002177 12455173731 033271� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ ### Sources for custom types This is a list of sources for any custom mime types. When adding custom mime types, please link to where you found the mime type, even if it's from an unofficial source. - `text/coffeescript` - http://coffeescript.org/#scripts - `text/x-handlebars-template` - https://handlebarsjs.com/#getting-started - `text/x-sass` & `text/x-scss` - https://github.com/janlelis/rubybuntu-mime/blob/master/sass.xml - `text.jsx` - http://facebook.github.io/react/docs/getting-started.html [[2]](https://github.com/facebook/react/blob/f230e0a03154e6f8a616e0da1fb3d97ffa1a6472/vendor/browser-transforms.js#L210) [Sources for node.json types](https://github.com/broofa/node-mime/blob/master/types/node.types) ### Notes on weird types - `font/opentype` - This type is technically invalid according to the spec. No valid types begin with `font/`. No-one uses the official type of `application/vnd.ms-opentype` as the community standardized `application/x-font-otf`. However, chrome logs nonsense warnings unless opentype fonts are served with `font/opentype`. 
[[1]](http://stackoverflow.com/questions/2871655/proper-mime-type-for-fonts) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/custom.json�������������������000644 �000766 �000024 �00000000467 12455173731 034520� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "text/jade": [ "jade" ], "text/stylus": [ "stylus", "styl" ], "text/less": [ "less" ], "text/x-sass": [ "sass" ], "text/x-scss": [ "scss" ], "text/coffeescript": [ "coffee" ], "text/x-handlebars-template": [ "hbs" ], "text/jsx": [ "jsx" ] } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/index.js����������������������000644 �000766 �000024 �00000003652 12455173731 033757� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� // types[extension] = type exports.types = Object.create(null) // extensions[type] = [extensions] exports.extensions = Object.create(null) // define more mime types exports.define = define // store the json files exports.json = { mime: require('./mime.json'), node: require('./node.json'), custom: require('./custom.json'), } exports.lookup = function (string) { if (!string || typeof string !== "string") return false string = string.replace(/.*[\.\/\\]/, '').toLowerCase() if (!string) return false return exports.types[string] || false } exports.extension = function (type) { if (!type || typeof type !== "string") return false type = type.match(/^\s*([^;\s]*)(?:;|\s|$)/) if (!type) return false var exts = exports.extensions[type[1].toLowerCase()] if (!exts || !exts.length) return false return exts[0] } // type has to be an exact mime type exports.charset = function (type) { // special cases switch (type) { case 'application/json': return 'UTF-8' case 'application/javascript': return 'UTF-8' } // default text/* to utf-8 if (/^text\//.test(type)) return 'UTF-8' return false } // backwards compatibility exports.charsets = { lookup: exports.charset } exports.contentType = function (type) { if (!type || typeof type !== "string") return false if (!~type.indexOf('/')) type = exports.lookup(type) if (!type) return false if (!~type.indexOf('charset')) { var charset = exports.charset(type) if (charset) type += '; charset=' + charset.toLowerCase() } return type } define(exports.json.mime) define(exports.json.node) define(exports.json.custom) function define(json) { Object.keys(json).forEach(function (type) { 
var exts = json[type] || [] exports.extensions[type] = exports.extensions[type] || [] exts.forEach(function (ext) { if (!~exports.extensions[type].indexOf(ext)) exports.extensions[type].push(ext) exports.types[ext] = type }) }) } ��������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/mime.json���������������������000644 �000766 �000024 �00000212775 12455173731 034144� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "application/1d-interleaved-parityfec": [], "application/3gpp-ims+xml": [], "application/activemessage": [], "application/andrew-inset": [ "ez" ], "application/applefile": [], "application/applixware": [ "aw" ], "application/atom+xml": [ "atom" ], "application/atomcat+xml": [ "atomcat" ], "application/atomicmail": [], "application/atomsvc+xml": [ "atomsvc" ], "application/auth-policy+xml": [], "application/batch-smtp": [], "application/beep+xml": [], "application/calendar+xml": [], "application/cals-1840": [], "application/ccmp+xml": [], "application/ccxml+xml": [ "ccxml" ], "application/cdmi-capability": [ "cdmia" ], "application/cdmi-container": [ "cdmic" ], "application/cdmi-domain": [ "cdmid" ], "application/cdmi-object": [ "cdmio" ], "application/cdmi-queue": [ "cdmiq" ], "application/cea-2018+xml": [], "application/cellml+xml": [], "application/cfw": [], "application/cnrp+xml": [], "application/commonground": [], "application/conference-info+xml": [], "application/cpl+xml": [], "application/csta+xml": [], "application/cstadata+xml": [], "application/cu-seeme": [ "cu" ], "application/cybercash": [], "application/davmount+xml": [ "davmount" ], "application/dca-rft": [], "application/dec-dx": [], "application/dialog-info+xml": [], "application/dicom": [], "application/dns": [], "application/docbook+xml": [ "dbk" ], "application/dskpp+xml": [], "application/dssc+der": [ "dssc" ], "application/dssc+xml": [ "xdssc" ], "application/dvcs": [], "application/ecmascript": [ "ecma" ], "application/edi-consent": [], "application/edi-x12": [], "application/edifact": [], "application/emma+xml": [ "emma" ], "application/epp+xml": [], "application/epub+zip": [ "epub" ], "application/eshop": [], "application/example": [], "application/exi": [ "exi" ], "application/fastinfoset": [], "application/fastsoap": [], "application/fits": [], "application/font-tdpfr": [ "pfr" ], "application/framework-attributes+xml": [], "application/gml+xml": [ "gml" ], "application/gpx+xml": [ "gpx" ], "application/gxf": [ "gxf" ], "application/h224": [], "application/held+xml": [], "application/http": [], "application/hyperstudio": [ "stk" ], "application/ibe-key-request+xml": [], "application/ibe-pkg-reply+xml": [], "application/ibe-pp-data": [], "application/iges": [], "application/im-iscomposing+xml": [], "application/index": [], "application/index.cmd": [], "application/index.obj": [], "application/index.response": [], "application/index.vnd": [], "application/inkml+xml": [ "ink", "inkml" ], "application/iotp": [], "application/ipfix": [ "ipfix" ], "application/ipp": [], "application/isup": [], "application/java-archive": [ "jar" ], "application/java-serialized-object": [ "ser" ], "application/java-vm": [ 
"class" ], "application/javascript": [ "js" ], "application/json": [ "json" ], "application/jsonml+json": [ "jsonml" ], "application/kpml-request+xml": [], "application/kpml-response+xml": [], "application/lost+xml": [ "lostxml" ], "application/mac-binhex40": [ "hqx" ], "application/mac-compactpro": [ "cpt" ], "application/macwriteii": [], "application/mads+xml": [ "mads" ], "application/marc": [ "mrc" ], "application/marcxml+xml": [ "mrcx" ], "application/mathematica": [ "ma", "nb", "mb" ], "application/mathml-content+xml": [], "application/mathml-presentation+xml": [], "application/mathml+xml": [ "mathml" ], "application/mbms-associated-procedure-description+xml": [], "application/mbms-deregister+xml": [], "application/mbms-envelope+xml": [], "application/mbms-msk+xml": [], "application/mbms-msk-response+xml": [], "application/mbms-protection-description+xml": [], "application/mbms-reception-report+xml": [], "application/mbms-register+xml": [], "application/mbms-register-response+xml": [], "application/mbms-user-service-description+xml": [], "application/mbox": [ "mbox" ], "application/media_control+xml": [], "application/mediaservercontrol+xml": [ "mscml" ], "application/metalink+xml": [ "metalink" ], "application/metalink4+xml": [ "meta4" ], "application/mets+xml": [ "mets" ], "application/mikey": [], "application/mods+xml": [ "mods" ], "application/moss-keys": [], "application/moss-signature": [], "application/mosskey-data": [], "application/mosskey-request": [], "application/mp21": [ "m21", "mp21" ], "application/mp4": [ "mp4s" ], "application/mpeg4-generic": [], "application/mpeg4-iod": [], "application/mpeg4-iod-xmt": [], "application/msc-ivr+xml": [], "application/msc-mixer+xml": [], "application/msword": [ "doc", "dot" ], "application/mxf": [ "mxf" ], "application/nasdata": [], "application/news-checkgroups": [], "application/news-groupinfo": [], "application/news-transmission": [], "application/nss": [], "application/ocsp-request": [], "application/ocsp-response": [], "application/octet-stream": [ "bin", "dms", "lrf", "mar", "so", "dist", "distz", "pkg", "bpk", "dump", "elc", "deploy" ], "application/oda": [ "oda" ], "application/oebps-package+xml": [ "opf" ], "application/ogg": [ "ogx" ], "application/omdoc+xml": [ "omdoc" ], "application/onenote": [ "onetoc", "onetoc2", "onetmp", "onepkg" ], "application/oxps": [ "oxps" ], "application/parityfec": [], "application/patch-ops-error+xml": [ "xer" ], "application/pdf": [ "pdf" ], "application/pgp-encrypted": [ "pgp" ], "application/pgp-keys": [], "application/pgp-signature": [ "asc", "sig" ], "application/pics-rules": [ "prf" ], "application/pidf+xml": [], "application/pidf-diff+xml": [], "application/pkcs10": [ "p10" ], "application/pkcs7-mime": [ "p7m", "p7c" ], "application/pkcs7-signature": [ "p7s" ], "application/pkcs8": [ "p8" ], "application/pkix-attr-cert": [ "ac" ], "application/pkix-cert": [ "cer" ], "application/pkix-crl": [ "crl" ], "application/pkix-pkipath": [ "pkipath" ], "application/pkixcmp": [ "pki" ], "application/pls+xml": [ "pls" ], "application/poc-settings+xml": [], "application/postscript": [ "ai", "eps", "ps" ], "application/prs.alvestrand.titrax-sheet": [], "application/prs.cww": [ "cww" ], "application/prs.nprend": [], "application/prs.plucker": [], "application/prs.rdf-xml-crypt": [], "application/prs.xsf+xml": [], "application/pskc+xml": [ "pskcxml" ], "application/qsig": [], "application/rdf+xml": [ "rdf" ], "application/reginfo+xml": [ "rif" ], "application/relax-ng-compact-syntax": [ "rnc" ], 
"application/remote-printing": [], "application/resource-lists+xml": [ "rl" ], "application/resource-lists-diff+xml": [ "rld" ], "application/riscos": [], "application/rlmi+xml": [], "application/rls-services+xml": [ "rs" ], "application/rpki-ghostbusters": [ "gbr" ], "application/rpki-manifest": [ "mft" ], "application/rpki-roa": [ "roa" ], "application/rpki-updown": [], "application/rsd+xml": [ "rsd" ], "application/rss+xml": [ "rss" ], "application/rtf": [ "rtf" ], "application/rtx": [], "application/samlassertion+xml": [], "application/samlmetadata+xml": [], "application/sbml+xml": [ "sbml" ], "application/scvp-cv-request": [ "scq" ], "application/scvp-cv-response": [ "scs" ], "application/scvp-vp-request": [ "spq" ], "application/scvp-vp-response": [ "spp" ], "application/sdp": [ "sdp" ], "application/set-payment": [], "application/set-payment-initiation": [ "setpay" ], "application/set-registration": [], "application/set-registration-initiation": [ "setreg" ], "application/sgml": [], "application/sgml-open-catalog": [], "application/shf+xml": [ "shf" ], "application/sieve": [], "application/simple-filter+xml": [], "application/simple-message-summary": [], "application/simplesymbolcontainer": [], "application/slate": [], "application/smil": [], "application/smil+xml": [ "smi", "smil" ], "application/soap+fastinfoset": [], "application/soap+xml": [], "application/sparql-query": [ "rq" ], "application/sparql-results+xml": [ "srx" ], "application/spirits-event+xml": [], "application/srgs": [ "gram" ], "application/srgs+xml": [ "grxml" ], "application/sru+xml": [ "sru" ], "application/ssdl+xml": [ "ssdl" ], "application/ssml+xml": [ "ssml" ], "application/tamp-apex-update": [], "application/tamp-apex-update-confirm": [], "application/tamp-community-update": [], "application/tamp-community-update-confirm": [], "application/tamp-error": [], "application/tamp-sequence-adjust": [], "application/tamp-sequence-adjust-confirm": [], "application/tamp-status-query": [], "application/tamp-status-response": [], "application/tamp-update": [], "application/tamp-update-confirm": [], "application/tei+xml": [ "tei", "teicorpus" ], "application/thraud+xml": [ "tfi" ], "application/timestamp-query": [], "application/timestamp-reply": [], "application/timestamped-data": [ "tsd" ], "application/tve-trigger": [], "application/ulpfec": [], "application/vcard+xml": [], "application/vemmi": [], "application/vividence.scriptfile": [], "application/vnd.3gpp.bsf+xml": [], "application/vnd.3gpp.pic-bw-large": [ "plb" ], "application/vnd.3gpp.pic-bw-small": [ "psb" ], "application/vnd.3gpp.pic-bw-var": [ "pvb" ], "application/vnd.3gpp.sms": [], "application/vnd.3gpp2.bcmcsinfo+xml": [], "application/vnd.3gpp2.sms": [], "application/vnd.3gpp2.tcap": [ "tcap" ], "application/vnd.3m.post-it-notes": [ "pwn" ], "application/vnd.accpac.simply.aso": [ "aso" ], "application/vnd.accpac.simply.imp": [ "imp" ], "application/vnd.acucobol": [ "acu" ], "application/vnd.acucorp": [ "atc", "acutc" ], "application/vnd.adobe.air-application-installer-package+zip": [ "air" ], "application/vnd.adobe.formscentral.fcdt": [ "fcdt" ], "application/vnd.adobe.fxp": [ "fxp", "fxpl" ], "application/vnd.adobe.partial-upload": [], "application/vnd.adobe.xdp+xml": [ "xdp" ], "application/vnd.adobe.xfdf": [ "xfdf" ], "application/vnd.aether.imp": [], "application/vnd.ah-barcode": [], "application/vnd.ahead.space": [ "ahead" ], "application/vnd.airzip.filesecure.azf": [ "azf" ], "application/vnd.airzip.filesecure.azs": [ "azs" ], 
"application/vnd.amazon.ebook": [ "azw" ], "application/vnd.americandynamics.acc": [ "acc" ], "application/vnd.amiga.ami": [ "ami" ], "application/vnd.amundsen.maze+xml": [], "application/vnd.android.package-archive": [ "apk" ], "application/vnd.anser-web-certificate-issue-initiation": [ "cii" ], "application/vnd.anser-web-funds-transfer-initiation": [ "fti" ], "application/vnd.antix.game-component": [ "atx" ], "application/vnd.apple.installer+xml": [ "mpkg" ], "application/vnd.apple.mpegurl": [ "m3u8" ], "application/vnd.arastra.swi": [], "application/vnd.aristanetworks.swi": [ "swi" ], "application/vnd.astraea-software.iota": [ "iota" ], "application/vnd.audiograph": [ "aep" ], "application/vnd.autopackage": [], "application/vnd.avistar+xml": [], "application/vnd.blueice.multipass": [ "mpm" ], "application/vnd.bluetooth.ep.oob": [], "application/vnd.bmi": [ "bmi" ], "application/vnd.businessobjects": [ "rep" ], "application/vnd.cab-jscript": [], "application/vnd.canon-cpdl": [], "application/vnd.canon-lips": [], "application/vnd.cendio.thinlinc.clientconf": [], "application/vnd.chemdraw+xml": [ "cdxml" ], "application/vnd.chipnuts.karaoke-mmd": [ "mmd" ], "application/vnd.cinderella": [ "cdy" ], "application/vnd.cirpack.isdn-ext": [], "application/vnd.claymore": [ "cla" ], "application/vnd.cloanto.rp9": [ "rp9" ], "application/vnd.clonk.c4group": [ "c4g", "c4d", "c4f", "c4p", "c4u" ], "application/vnd.cluetrust.cartomobile-config": [ "c11amc" ], "application/vnd.cluetrust.cartomobile-config-pkg": [ "c11amz" ], "application/vnd.collection+json": [], "application/vnd.commerce-battelle": [], "application/vnd.commonspace": [ "csp" ], "application/vnd.contact.cmsg": [ "cdbcmsg" ], "application/vnd.cosmocaller": [ "cmc" ], "application/vnd.crick.clicker": [ "clkx" ], "application/vnd.crick.clicker.keyboard": [ "clkk" ], "application/vnd.crick.clicker.palette": [ "clkp" ], "application/vnd.crick.clicker.template": [ "clkt" ], "application/vnd.crick.clicker.wordbank": [ "clkw" ], "application/vnd.criticaltools.wbs+xml": [ "wbs" ], "application/vnd.ctc-posml": [ "pml" ], "application/vnd.ctct.ws+xml": [], "application/vnd.cups-pdf": [], "application/vnd.cups-postscript": [], "application/vnd.cups-ppd": [ "ppd" ], "application/vnd.cups-raster": [], "application/vnd.cups-raw": [], "application/vnd.curl": [], "application/vnd.curl.car": [ "car" ], "application/vnd.curl.pcurl": [ "pcurl" ], "application/vnd.cybank": [], "application/vnd.dart": [ "dart" ], "application/vnd.data-vision.rdz": [ "rdz" ], "application/vnd.dece.data": [ "uvf", "uvvf", "uvd", "uvvd" ], "application/vnd.dece.ttml+xml": [ "uvt", "uvvt" ], "application/vnd.dece.unspecified": [ "uvx", "uvvx" ], "application/vnd.dece.zip": [ "uvz", "uvvz" ], "application/vnd.denovo.fcselayout-link": [ "fe_launch" ], "application/vnd.dir-bi.plate-dl-nosuffix": [], "application/vnd.dna": [ "dna" ], "application/vnd.dolby.mlp": [ "mlp" ], "application/vnd.dolby.mobile.1": [], "application/vnd.dolby.mobile.2": [], "application/vnd.dpgraph": [ "dpg" ], "application/vnd.dreamfactory": [ "dfac" ], "application/vnd.ds-keypoint": [ "kpxx" ], "application/vnd.dvb.ait": [ "ait" ], "application/vnd.dvb.dvbj": [], "application/vnd.dvb.esgcontainer": [], "application/vnd.dvb.ipdcdftnotifaccess": [], "application/vnd.dvb.ipdcesgaccess": [], "application/vnd.dvb.ipdcesgaccess2": [], "application/vnd.dvb.ipdcesgpdd": [], "application/vnd.dvb.ipdcroaming": [], "application/vnd.dvb.iptv.alfec-base": [], "application/vnd.dvb.iptv.alfec-enhancement": [], 
"application/vnd.dvb.notif-aggregate-root+xml": [], "application/vnd.dvb.notif-container+xml": [], "application/vnd.dvb.notif-generic+xml": [], "application/vnd.dvb.notif-ia-msglist+xml": [], "application/vnd.dvb.notif-ia-registration-request+xml": [], "application/vnd.dvb.notif-ia-registration-response+xml": [], "application/vnd.dvb.notif-init+xml": [], "application/vnd.dvb.pfr": [], "application/vnd.dvb.service": [ "svc" ], "application/vnd.dxr": [], "application/vnd.dynageo": [ "geo" ], "application/vnd.easykaraoke.cdgdownload": [], "application/vnd.ecdis-update": [], "application/vnd.ecowin.chart": [ "mag" ], "application/vnd.ecowin.filerequest": [], "application/vnd.ecowin.fileupdate": [], "application/vnd.ecowin.series": [], "application/vnd.ecowin.seriesrequest": [], "application/vnd.ecowin.seriesupdate": [], "application/vnd.emclient.accessrequest+xml": [], "application/vnd.enliven": [ "nml" ], "application/vnd.eprints.data+xml": [], "application/vnd.epson.esf": [ "esf" ], "application/vnd.epson.msf": [ "msf" ], "application/vnd.epson.quickanime": [ "qam" ], "application/vnd.epson.salt": [ "slt" ], "application/vnd.epson.ssf": [ "ssf" ], "application/vnd.ericsson.quickcall": [], "application/vnd.eszigno3+xml": [ "es3", "et3" ], "application/vnd.etsi.aoc+xml": [], "application/vnd.etsi.cug+xml": [], "application/vnd.etsi.iptvcommand+xml": [], "application/vnd.etsi.iptvdiscovery+xml": [], "application/vnd.etsi.iptvprofile+xml": [], "application/vnd.etsi.iptvsad-bc+xml": [], "application/vnd.etsi.iptvsad-cod+xml": [], "application/vnd.etsi.iptvsad-npvr+xml": [], "application/vnd.etsi.iptvservice+xml": [], "application/vnd.etsi.iptvsync+xml": [], "application/vnd.etsi.iptvueprofile+xml": [], "application/vnd.etsi.mcid+xml": [], "application/vnd.etsi.overload-control-policy-dataset+xml": [], "application/vnd.etsi.sci+xml": [], "application/vnd.etsi.simservs+xml": [], "application/vnd.etsi.tsl+xml": [], "application/vnd.etsi.tsl.der": [], "application/vnd.eudora.data": [], "application/vnd.ezpix-album": [ "ez2" ], "application/vnd.ezpix-package": [ "ez3" ], "application/vnd.f-secure.mobile": [], "application/vnd.fdf": [ "fdf" ], "application/vnd.fdsn.mseed": [ "mseed" ], "application/vnd.fdsn.seed": [ "seed", "dataless" ], "application/vnd.ffsns": [], "application/vnd.fints": [], "application/vnd.flographit": [ "gph" ], "application/vnd.fluxtime.clip": [ "ftc" ], "application/vnd.font-fontforge-sfd": [], "application/vnd.framemaker": [ "fm", "frame", "maker", "book" ], "application/vnd.frogans.fnc": [ "fnc" ], "application/vnd.frogans.ltf": [ "ltf" ], "application/vnd.fsc.weblaunch": [ "fsc" ], "application/vnd.fujitsu.oasys": [ "oas" ], "application/vnd.fujitsu.oasys2": [ "oa2" ], "application/vnd.fujitsu.oasys3": [ "oa3" ], "application/vnd.fujitsu.oasysgp": [ "fg5" ], "application/vnd.fujitsu.oasysprs": [ "bh2" ], "application/vnd.fujixerox.art-ex": [], "application/vnd.fujixerox.art4": [], "application/vnd.fujixerox.hbpl": [], "application/vnd.fujixerox.ddd": [ "ddd" ], "application/vnd.fujixerox.docuworks": [ "xdw" ], "application/vnd.fujixerox.docuworks.binder": [ "xbd" ], "application/vnd.fut-misnet": [], "application/vnd.fuzzysheet": [ "fzs" ], "application/vnd.genomatix.tuxedo": [ "txd" ], "application/vnd.geocube+xml": [], "application/vnd.geogebra.file": [ "ggb" ], "application/vnd.geogebra.tool": [ "ggt" ], "application/vnd.geometry-explorer": [ "gex", "gre" ], "application/vnd.geonext": [ "gxt" ], "application/vnd.geoplan": [ "g2w" ], "application/vnd.geospace": [ "g3w" ], 
"application/vnd.globalplatform.card-content-mgt": [], "application/vnd.globalplatform.card-content-mgt-response": [], "application/vnd.gmx": [ "gmx" ], "application/vnd.google-earth.kml+xml": [ "kml" ], "application/vnd.google-earth.kmz": [ "kmz" ], "application/vnd.grafeq": [ "gqf", "gqs" ], "application/vnd.gridmp": [], "application/vnd.groove-account": [ "gac" ], "application/vnd.groove-help": [ "ghf" ], "application/vnd.groove-identity-message": [ "gim" ], "application/vnd.groove-injector": [ "grv" ], "application/vnd.groove-tool-message": [ "gtm" ], "application/vnd.groove-tool-template": [ "tpl" ], "application/vnd.groove-vcard": [ "vcg" ], "application/vnd.hal+json": [], "application/vnd.hal+xml": [ "hal" ], "application/vnd.handheld-entertainment+xml": [ "zmm" ], "application/vnd.hbci": [ "hbci" ], "application/vnd.hcl-bireports": [], "application/vnd.hhe.lesson-player": [ "les" ], "application/vnd.hp-hpgl": [ "hpgl" ], "application/vnd.hp-hpid": [ "hpid" ], "application/vnd.hp-hps": [ "hps" ], "application/vnd.hp-jlyt": [ "jlt" ], "application/vnd.hp-pcl": [ "pcl" ], "application/vnd.hp-pclxl": [ "pclxl" ], "application/vnd.httphone": [], "application/vnd.hzn-3d-crossword": [], "application/vnd.ibm.afplinedata": [], "application/vnd.ibm.electronic-media": [], "application/vnd.ibm.minipay": [ "mpy" ], "application/vnd.ibm.modcap": [ "afp", "listafp", "list3820" ], "application/vnd.ibm.rights-management": [ "irm" ], "application/vnd.ibm.secure-container": [ "sc" ], "application/vnd.iccprofile": [ "icc", "icm" ], "application/vnd.igloader": [ "igl" ], "application/vnd.immervision-ivp": [ "ivp" ], "application/vnd.immervision-ivu": [ "ivu" ], "application/vnd.informedcontrol.rms+xml": [], "application/vnd.informix-visionary": [], "application/vnd.infotech.project": [], "application/vnd.infotech.project+xml": [], "application/vnd.innopath.wamp.notification": [], "application/vnd.insors.igm": [ "igm" ], "application/vnd.intercon.formnet": [ "xpw", "xpx" ], "application/vnd.intergeo": [ "i2g" ], "application/vnd.intertrust.digibox": [], "application/vnd.intertrust.nncp": [], "application/vnd.intu.qbo": [ "qbo" ], "application/vnd.intu.qfx": [ "qfx" ], "application/vnd.iptc.g2.conceptitem+xml": [], "application/vnd.iptc.g2.knowledgeitem+xml": [], "application/vnd.iptc.g2.newsitem+xml": [], "application/vnd.iptc.g2.newsmessage+xml": [], "application/vnd.iptc.g2.packageitem+xml": [], "application/vnd.iptc.g2.planningitem+xml": [], "application/vnd.ipunplugged.rcprofile": [ "rcprofile" ], "application/vnd.irepository.package+xml": [ "irp" ], "application/vnd.is-xpr": [ "xpr" ], "application/vnd.isac.fcs": [ "fcs" ], "application/vnd.jam": [ "jam" ], "application/vnd.japannet-directory-service": [], "application/vnd.japannet-jpnstore-wakeup": [], "application/vnd.japannet-payment-wakeup": [], "application/vnd.japannet-registration": [], "application/vnd.japannet-registration-wakeup": [], "application/vnd.japannet-setstore-wakeup": [], "application/vnd.japannet-verification": [], "application/vnd.japannet-verification-wakeup": [], "application/vnd.jcp.javame.midlet-rms": [ "rms" ], "application/vnd.jisp": [ "jisp" ], "application/vnd.joost.joda-archive": [ "joda" ], "application/vnd.kahootz": [ "ktz", "ktr" ], "application/vnd.kde.karbon": [ "karbon" ], "application/vnd.kde.kchart": [ "chrt" ], "application/vnd.kde.kformula": [ "kfo" ], "application/vnd.kde.kivio": [ "flw" ], "application/vnd.kde.kontour": [ "kon" ], "application/vnd.kde.kpresenter": [ "kpr", "kpt" ], 
"application/vnd.kde.kspread": [ "ksp" ], "application/vnd.kde.kword": [ "kwd", "kwt" ], "application/vnd.kenameaapp": [ "htke" ], "application/vnd.kidspiration": [ "kia" ], "application/vnd.kinar": [ "kne", "knp" ], "application/vnd.koan": [ "skp", "skd", "skt", "skm" ], "application/vnd.kodak-descriptor": [ "sse" ], "application/vnd.las.las+xml": [ "lasxml" ], "application/vnd.liberty-request+xml": [], "application/vnd.llamagraphics.life-balance.desktop": [ "lbd" ], "application/vnd.llamagraphics.life-balance.exchange+xml": [ "lbe" ], "application/vnd.lotus-1-2-3": [ "123" ], "application/vnd.lotus-approach": [ "apr" ], "application/vnd.lotus-freelance": [ "pre" ], "application/vnd.lotus-notes": [ "nsf" ], "application/vnd.lotus-organizer": [ "org" ], "application/vnd.lotus-screencam": [ "scm" ], "application/vnd.lotus-wordpro": [ "lwp" ], "application/vnd.macports.portpkg": [ "portpkg" ], "application/vnd.marlin.drm.actiontoken+xml": [], "application/vnd.marlin.drm.conftoken+xml": [], "application/vnd.marlin.drm.license+xml": [], "application/vnd.marlin.drm.mdcf": [], "application/vnd.mcd": [ "mcd" ], "application/vnd.medcalcdata": [ "mc1" ], "application/vnd.mediastation.cdkey": [ "cdkey" ], "application/vnd.meridian-slingshot": [], "application/vnd.mfer": [ "mwf" ], "application/vnd.mfmp": [ "mfm" ], "application/vnd.micrografx.flo": [ "flo" ], "application/vnd.micrografx.igx": [ "igx" ], "application/vnd.mif": [ "mif" ], "application/vnd.minisoft-hp3000-save": [], "application/vnd.mitsubishi.misty-guard.trustweb": [], "application/vnd.mobius.daf": [ "daf" ], "application/vnd.mobius.dis": [ "dis" ], "application/vnd.mobius.mbk": [ "mbk" ], "application/vnd.mobius.mqy": [ "mqy" ], "application/vnd.mobius.msl": [ "msl" ], "application/vnd.mobius.plc": [ "plc" ], "application/vnd.mobius.txf": [ "txf" ], "application/vnd.mophun.application": [ "mpn" ], "application/vnd.mophun.certificate": [ "mpc" ], "application/vnd.motorola.flexsuite": [], "application/vnd.motorola.flexsuite.adsi": [], "application/vnd.motorola.flexsuite.fis": [], "application/vnd.motorola.flexsuite.gotap": [], "application/vnd.motorola.flexsuite.kmr": [], "application/vnd.motorola.flexsuite.ttc": [], "application/vnd.motorola.flexsuite.wem": [], "application/vnd.motorola.iprm": [], "application/vnd.mozilla.xul+xml": [ "xul" ], "application/vnd.ms-artgalry": [ "cil" ], "application/vnd.ms-asf": [], "application/vnd.ms-cab-compressed": [ "cab" ], "application/vnd.ms-color.iccprofile": [], "application/vnd.ms-excel": [ "xls", "xlm", "xla", "xlc", "xlt", "xlw" ], "application/vnd.ms-excel.addin.macroenabled.12": [ "xlam" ], "application/vnd.ms-excel.sheet.binary.macroenabled.12": [ "xlsb" ], "application/vnd.ms-excel.sheet.macroenabled.12": [ "xlsm" ], "application/vnd.ms-excel.template.macroenabled.12": [ "xltm" ], "application/vnd.ms-fontobject": [ "eot" ], "application/vnd.ms-htmlhelp": [ "chm" ], "application/vnd.ms-ims": [ "ims" ], "application/vnd.ms-lrm": [ "lrm" ], "application/vnd.ms-office.activex+xml": [], "application/vnd.ms-officetheme": [ "thmx" ], "application/vnd.ms-opentype": [], "application/vnd.ms-package.obfuscated-opentype": [], "application/vnd.ms-pki.seccat": [ "cat" ], "application/vnd.ms-pki.stl": [ "stl" ], "application/vnd.ms-playready.initiator+xml": [], "application/vnd.ms-powerpoint": [ "ppt", "pps", "pot" ], "application/vnd.ms-powerpoint.addin.macroenabled.12": [ "ppam" ], "application/vnd.ms-powerpoint.presentation.macroenabled.12": [ "pptm" ], 
"application/vnd.ms-powerpoint.slide.macroenabled.12": [ "sldm" ], "application/vnd.ms-powerpoint.slideshow.macroenabled.12": [ "ppsm" ], "application/vnd.ms-powerpoint.template.macroenabled.12": [ "potm" ], "application/vnd.ms-printing.printticket+xml": [], "application/vnd.ms-project": [ "mpp", "mpt" ], "application/vnd.ms-tnef": [], "application/vnd.ms-wmdrm.lic-chlg-req": [], "application/vnd.ms-wmdrm.lic-resp": [], "application/vnd.ms-wmdrm.meter-chlg-req": [], "application/vnd.ms-wmdrm.meter-resp": [], "application/vnd.ms-word.document.macroenabled.12": [ "docm" ], "application/vnd.ms-word.template.macroenabled.12": [ "dotm" ], "application/vnd.ms-works": [ "wps", "wks", "wcm", "wdb" ], "application/vnd.ms-wpl": [ "wpl" ], "application/vnd.ms-xpsdocument": [ "xps" ], "application/vnd.mseq": [ "mseq" ], "application/vnd.msign": [], "application/vnd.multiad.creator": [], "application/vnd.multiad.creator.cif": [], "application/vnd.music-niff": [], "application/vnd.musician": [ "mus" ], "application/vnd.muvee.style": [ "msty" ], "application/vnd.mynfc": [ "taglet" ], "application/vnd.ncd.control": [], "application/vnd.ncd.reference": [], "application/vnd.nervana": [], "application/vnd.netfpx": [], "application/vnd.neurolanguage.nlu": [ "nlu" ], "application/vnd.nitf": [ "ntf", "nitf" ], "application/vnd.noblenet-directory": [ "nnd" ], "application/vnd.noblenet-sealer": [ "nns" ], "application/vnd.noblenet-web": [ "nnw" ], "application/vnd.nokia.catalogs": [], "application/vnd.nokia.conml+wbxml": [], "application/vnd.nokia.conml+xml": [], "application/vnd.nokia.isds-radio-presets": [], "application/vnd.nokia.iptv.config+xml": [], "application/vnd.nokia.landmark+wbxml": [], "application/vnd.nokia.landmark+xml": [], "application/vnd.nokia.landmarkcollection+xml": [], "application/vnd.nokia.n-gage.ac+xml": [], "application/vnd.nokia.n-gage.data": [ "ngdat" ], "application/vnd.nokia.ncd": [], "application/vnd.nokia.pcd+wbxml": [], "application/vnd.nokia.pcd+xml": [], "application/vnd.nokia.radio-preset": [ "rpst" ], "application/vnd.nokia.radio-presets": [ "rpss" ], "application/vnd.novadigm.edm": [ "edm" ], "application/vnd.novadigm.edx": [ "edx" ], "application/vnd.novadigm.ext": [ "ext" ], "application/vnd.ntt-local.file-transfer": [], "application/vnd.ntt-local.sip-ta_remote": [], "application/vnd.ntt-local.sip-ta_tcp_stream": [], "application/vnd.oasis.opendocument.chart": [ "odc" ], "application/vnd.oasis.opendocument.chart-template": [ "otc" ], "application/vnd.oasis.opendocument.database": [ "odb" ], "application/vnd.oasis.opendocument.formula": [ "odf" ], "application/vnd.oasis.opendocument.formula-template": [ "odft" ], "application/vnd.oasis.opendocument.graphics": [ "odg" ], "application/vnd.oasis.opendocument.graphics-template": [ "otg" ], "application/vnd.oasis.opendocument.image": [ "odi" ], "application/vnd.oasis.opendocument.image-template": [ "oti" ], "application/vnd.oasis.opendocument.presentation": [ "odp" ], "application/vnd.oasis.opendocument.presentation-template": [ "otp" ], "application/vnd.oasis.opendocument.spreadsheet": [ "ods" ], "application/vnd.oasis.opendocument.spreadsheet-template": [ "ots" ], "application/vnd.oasis.opendocument.text": [ "odt" ], "application/vnd.oasis.opendocument.text-master": [ "odm" ], "application/vnd.oasis.opendocument.text-template": [ "ott" ], "application/vnd.oasis.opendocument.text-web": [ "oth" ], "application/vnd.obn": [], "application/vnd.oftn.l10n+json": [], "application/vnd.oipf.contentaccessdownload+xml": [], 
"application/vnd.oipf.contentaccessstreaming+xml": [], "application/vnd.oipf.cspg-hexbinary": [], "application/vnd.oipf.dae.svg+xml": [], "application/vnd.oipf.dae.xhtml+xml": [], "application/vnd.oipf.mippvcontrolmessage+xml": [], "application/vnd.oipf.pae.gem": [], "application/vnd.oipf.spdiscovery+xml": [], "application/vnd.oipf.spdlist+xml": [], "application/vnd.oipf.ueprofile+xml": [], "application/vnd.oipf.userprofile+xml": [], "application/vnd.olpc-sugar": [ "xo" ], "application/vnd.oma-scws-config": [], "application/vnd.oma-scws-http-request": [], "application/vnd.oma-scws-http-response": [], "application/vnd.oma.bcast.associated-procedure-parameter+xml": [], "application/vnd.oma.bcast.drm-trigger+xml": [], "application/vnd.oma.bcast.imd+xml": [], "application/vnd.oma.bcast.ltkm": [], "application/vnd.oma.bcast.notification+xml": [], "application/vnd.oma.bcast.provisioningtrigger": [], "application/vnd.oma.bcast.sgboot": [], "application/vnd.oma.bcast.sgdd+xml": [], "application/vnd.oma.bcast.sgdu": [], "application/vnd.oma.bcast.simple-symbol-container": [], "application/vnd.oma.bcast.smartcard-trigger+xml": [], "application/vnd.oma.bcast.sprov+xml": [], "application/vnd.oma.bcast.stkm": [], "application/vnd.oma.cab-address-book+xml": [], "application/vnd.oma.cab-feature-handler+xml": [], "application/vnd.oma.cab-pcc+xml": [], "application/vnd.oma.cab-user-prefs+xml": [], "application/vnd.oma.dcd": [], "application/vnd.oma.dcdc": [], "application/vnd.oma.dd2+xml": [ "dd2" ], "application/vnd.oma.drm.risd+xml": [], "application/vnd.oma.group-usage-list+xml": [], "application/vnd.oma.pal+xml": [], "application/vnd.oma.poc.detailed-progress-report+xml": [], "application/vnd.oma.poc.final-report+xml": [], "application/vnd.oma.poc.groups+xml": [], "application/vnd.oma.poc.invocation-descriptor+xml": [], "application/vnd.oma.poc.optimized-progress-report+xml": [], "application/vnd.oma.push": [], "application/vnd.oma.scidm.messages+xml": [], "application/vnd.oma.xcap-directory+xml": [], "application/vnd.omads-email+xml": [], "application/vnd.omads-file+xml": [], "application/vnd.omads-folder+xml": [], "application/vnd.omaloc-supl-init": [], "application/vnd.openofficeorg.extension": [ "oxt" ], "application/vnd.openxmlformats-officedocument.custom-properties+xml": [], "application/vnd.openxmlformats-officedocument.customxmlproperties+xml": [], "application/vnd.openxmlformats-officedocument.drawing+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.chart+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.chartshapes+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.diagramcolors+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.diagramdata+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.diagramlayout+xml": [], "application/vnd.openxmlformats-officedocument.drawingml.diagramstyle+xml": [], "application/vnd.openxmlformats-officedocument.extended-properties+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.commentauthors+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.comments+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.handoutmaster+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.notesmaster+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.notesslide+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.presentation": [ "pptx" ], 
"application/vnd.openxmlformats-officedocument.presentationml.presentation.main+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.presprops+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.slide": [ "sldx" ], "application/vnd.openxmlformats-officedocument.presentationml.slide+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.slidelayout+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.slidemaster+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.slideshow": [ "ppsx" ], "application/vnd.openxmlformats-officedocument.presentationml.slideshow.main+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.slideupdateinfo+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.tablestyles+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.tags+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.template": [ "potx" ], "application/vnd.openxmlformats-officedocument.presentationml.template.main+xml": [], "application/vnd.openxmlformats-officedocument.presentationml.viewprops+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.calcchain+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.chartsheet+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.comments+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.connections+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.dialogsheet+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.externallink+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcachedefinition+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcacherecords+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.pivottable+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.querytable+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionheaders+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionlog+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.sharedstrings+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet": [ "xlsx" ], "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.sheetmetadata+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.styles+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.table+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.tablesinglecells+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.template": [ "xltx" ], "application/vnd.openxmlformats-officedocument.spreadsheetml.template.main+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.usernames+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.volatiledependencies+xml": [], "application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml": [], "application/vnd.openxmlformats-officedocument.theme+xml": [], "application/vnd.openxmlformats-officedocument.themeoverride+xml": [], "application/vnd.openxmlformats-officedocument.vmldrawing": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.comments+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.document": [ "docx" ], 
"application/vnd.openxmlformats-officedocument.wordprocessingml.document.glossary+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.endnotes+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.fonttable+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.footnotes+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.numbering+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.settings+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.styles+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.template": [ "dotx" ], "application/vnd.openxmlformats-officedocument.wordprocessingml.template.main+xml": [], "application/vnd.openxmlformats-officedocument.wordprocessingml.websettings+xml": [], "application/vnd.openxmlformats-package.core-properties+xml": [], "application/vnd.openxmlformats-package.digital-signature-xmlsignature+xml": [], "application/vnd.openxmlformats-package.relationships+xml": [], "application/vnd.quobject-quoxdocument": [], "application/vnd.osa.netdeploy": [], "application/vnd.osgeo.mapguide.package": [ "mgp" ], "application/vnd.osgi.bundle": [], "application/vnd.osgi.dp": [ "dp" ], "application/vnd.osgi.subsystem": [ "esa" ], "application/vnd.otps.ct-kip+xml": [], "application/vnd.palm": [ "pdb", "pqa", "oprc" ], "application/vnd.paos.xml": [], "application/vnd.pawaafile": [ "paw" ], "application/vnd.pg.format": [ "str" ], "application/vnd.pg.osasli": [ "ei6" ], "application/vnd.piaccess.application-licence": [], "application/vnd.picsel": [ "efif" ], "application/vnd.pmi.widget": [ "wg" ], "application/vnd.poc.group-advertisement+xml": [], "application/vnd.pocketlearn": [ "plf" ], "application/vnd.powerbuilder6": [ "pbd" ], "application/vnd.powerbuilder6-s": [], "application/vnd.powerbuilder7": [], "application/vnd.powerbuilder7-s": [], "application/vnd.powerbuilder75": [], "application/vnd.powerbuilder75-s": [], "application/vnd.preminet": [], "application/vnd.previewsystems.box": [ "box" ], "application/vnd.proteus.magazine": [ "mgz" ], "application/vnd.publishare-delta-tree": [ "qps" ], "application/vnd.pvi.ptid1": [ "ptid" ], "application/vnd.pwg-multiplexed": [], "application/vnd.pwg-xhtml-print+xml": [], "application/vnd.qualcomm.brew-app-res": [], "application/vnd.quark.quarkxpress": [ "qxd", "qxt", "qwd", "qwt", "qxl", "qxb" ], "application/vnd.radisys.moml+xml": [], "application/vnd.radisys.msml+xml": [], "application/vnd.radisys.msml-audit+xml": [], "application/vnd.radisys.msml-audit-conf+xml": [], "application/vnd.radisys.msml-audit-conn+xml": [], "application/vnd.radisys.msml-audit-dialog+xml": [], "application/vnd.radisys.msml-audit-stream+xml": [], "application/vnd.radisys.msml-conf+xml": [], "application/vnd.radisys.msml-dialog+xml": [], "application/vnd.radisys.msml-dialog-base+xml": [], "application/vnd.radisys.msml-dialog-fax-detect+xml": [], "application/vnd.radisys.msml-dialog-fax-sendrecv+xml": [], "application/vnd.radisys.msml-dialog-group+xml": [], "application/vnd.radisys.msml-dialog-speech+xml": [], "application/vnd.radisys.msml-dialog-transform+xml": [], "application/vnd.rainstor.data": [], "application/vnd.rapid": [], "application/vnd.realvnc.bed": [ "bed" ], "application/vnd.recordare.musicxml": [ "mxl" ], 
"application/vnd.recordare.musicxml+xml": [ "musicxml" ], "application/vnd.renlearn.rlprint": [], "application/vnd.rig.cryptonote": [ "cryptonote" ], "application/vnd.rim.cod": [ "cod" ], "application/vnd.rn-realmedia": [ "rm" ], "application/vnd.rn-realmedia-vbr": [ "rmvb" ], "application/vnd.route66.link66+xml": [ "link66" ], "application/vnd.rs-274x": [], "application/vnd.ruckus.download": [], "application/vnd.s3sms": [], "application/vnd.sailingtracker.track": [ "st" ], "application/vnd.sbm.cid": [], "application/vnd.sbm.mid2": [], "application/vnd.scribus": [], "application/vnd.sealed.3df": [], "application/vnd.sealed.csf": [], "application/vnd.sealed.doc": [], "application/vnd.sealed.eml": [], "application/vnd.sealed.mht": [], "application/vnd.sealed.net": [], "application/vnd.sealed.ppt": [], "application/vnd.sealed.tiff": [], "application/vnd.sealed.xls": [], "application/vnd.sealedmedia.softseal.html": [], "application/vnd.sealedmedia.softseal.pdf": [], "application/vnd.seemail": [ "see" ], "application/vnd.sema": [ "sema" ], "application/vnd.semd": [ "semd" ], "application/vnd.semf": [ "semf" ], "application/vnd.shana.informed.formdata": [ "ifm" ], "application/vnd.shana.informed.formtemplate": [ "itp" ], "application/vnd.shana.informed.interchange": [ "iif" ], "application/vnd.shana.informed.package": [ "ipk" ], "application/vnd.simtech-mindmapper": [ "twd", "twds" ], "application/vnd.smaf": [ "mmf" ], "application/vnd.smart.notebook": [], "application/vnd.smart.teacher": [ "teacher" ], "application/vnd.software602.filler.form+xml": [], "application/vnd.software602.filler.form-xml-zip": [], "application/vnd.solent.sdkm+xml": [ "sdkm", "sdkd" ], "application/vnd.spotfire.dxp": [ "dxp" ], "application/vnd.spotfire.sfs": [ "sfs" ], "application/vnd.sss-cod": [], "application/vnd.sss-dtf": [], "application/vnd.sss-ntf": [], "application/vnd.stardivision.calc": [ "sdc" ], "application/vnd.stardivision.draw": [ "sda" ], "application/vnd.stardivision.impress": [ "sdd" ], "application/vnd.stardivision.math": [ "smf" ], "application/vnd.stardivision.writer": [ "sdw", "vor" ], "application/vnd.stardivision.writer-global": [ "sgl" ], "application/vnd.stepmania.package": [ "smzip" ], "application/vnd.stepmania.stepchart": [ "sm" ], "application/vnd.street-stream": [], "application/vnd.sun.xml.calc": [ "sxc" ], "application/vnd.sun.xml.calc.template": [ "stc" ], "application/vnd.sun.xml.draw": [ "sxd" ], "application/vnd.sun.xml.draw.template": [ "std" ], "application/vnd.sun.xml.impress": [ "sxi" ], "application/vnd.sun.xml.impress.template": [ "sti" ], "application/vnd.sun.xml.math": [ "sxm" ], "application/vnd.sun.xml.writer": [ "sxw" ], "application/vnd.sun.xml.writer.global": [ "sxg" ], "application/vnd.sun.xml.writer.template": [ "stw" ], "application/vnd.sun.wadl+xml": [], "application/vnd.sus-calendar": [ "sus", "susp" ], "application/vnd.svd": [ "svd" ], "application/vnd.swiftview-ics": [], "application/vnd.symbian.install": [ "sis", "sisx" ], "application/vnd.syncml+xml": [ "xsm" ], "application/vnd.syncml.dm+wbxml": [ "bdm" ], "application/vnd.syncml.dm+xml": [ "xdm" ], "application/vnd.syncml.dm.notification": [], "application/vnd.syncml.ds.notification": [], "application/vnd.tao.intent-module-archive": [ "tao" ], "application/vnd.tcpdump.pcap": [ "pcap", "cap", "dmp" ], "application/vnd.tmobile-livetv": [ "tmo" ], "application/vnd.trid.tpt": [ "tpt" ], "application/vnd.triscape.mxs": [ "mxs" ], "application/vnd.trueapp": [ "tra" ], "application/vnd.truedoc": [], 
"application/vnd.ubisoft.webplayer": [], "application/vnd.ufdl": [ "ufd", "ufdl" ], "application/vnd.uiq.theme": [ "utz" ], "application/vnd.umajin": [ "umj" ], "application/vnd.unity": [ "unityweb" ], "application/vnd.uoml+xml": [ "uoml" ], "application/vnd.uplanet.alert": [], "application/vnd.uplanet.alert-wbxml": [], "application/vnd.uplanet.bearer-choice": [], "application/vnd.uplanet.bearer-choice-wbxml": [], "application/vnd.uplanet.cacheop": [], "application/vnd.uplanet.cacheop-wbxml": [], "application/vnd.uplanet.channel": [], "application/vnd.uplanet.channel-wbxml": [], "application/vnd.uplanet.list": [], "application/vnd.uplanet.list-wbxml": [], "application/vnd.uplanet.listcmd": [], "application/vnd.uplanet.listcmd-wbxml": [], "application/vnd.uplanet.signal": [], "application/vnd.vcx": [ "vcx" ], "application/vnd.vd-study": [], "application/vnd.vectorworks": [], "application/vnd.verimatrix.vcas": [], "application/vnd.vidsoft.vidconference": [], "application/vnd.visio": [ "vsd", "vst", "vss", "vsw" ], "application/vnd.visionary": [ "vis" ], "application/vnd.vividence.scriptfile": [], "application/vnd.vsf": [ "vsf" ], "application/vnd.wap.sic": [], "application/vnd.wap.slc": [], "application/vnd.wap.wbxml": [ "wbxml" ], "application/vnd.wap.wmlc": [ "wmlc" ], "application/vnd.wap.wmlscriptc": [ "wmlsc" ], "application/vnd.webturbo": [ "wtb" ], "application/vnd.wfa.wsc": [], "application/vnd.wmc": [], "application/vnd.wmf.bootstrap": [], "application/vnd.wolfram.mathematica": [], "application/vnd.wolfram.mathematica.package": [], "application/vnd.wolfram.player": [ "nbp" ], "application/vnd.wordperfect": [ "wpd" ], "application/vnd.wqd": [ "wqd" ], "application/vnd.wrq-hp3000-labelled": [], "application/vnd.wt.stf": [ "stf" ], "application/vnd.wv.csp+wbxml": [], "application/vnd.wv.csp+xml": [], "application/vnd.wv.ssp+xml": [], "application/vnd.xara": [ "xar" ], "application/vnd.xfdl": [ "xfdl" ], "application/vnd.xfdl.webform": [], "application/vnd.xmi+xml": [], "application/vnd.xmpie.cpkg": [], "application/vnd.xmpie.dpkg": [], "application/vnd.xmpie.plan": [], "application/vnd.xmpie.ppkg": [], "application/vnd.xmpie.xlim": [], "application/vnd.yamaha.hv-dic": [ "hvd" ], "application/vnd.yamaha.hv-script": [ "hvs" ], "application/vnd.yamaha.hv-voice": [ "hvp" ], "application/vnd.yamaha.openscoreformat": [ "osf" ], "application/vnd.yamaha.openscoreformat.osfpvg+xml": [ "osfpvg" ], "application/vnd.yamaha.remote-setup": [], "application/vnd.yamaha.smaf-audio": [ "saf" ], "application/vnd.yamaha.smaf-phrase": [ "spf" ], "application/vnd.yamaha.through-ngn": [], "application/vnd.yamaha.tunnel-udpencap": [], "application/vnd.yellowriver-custom-menu": [ "cmp" ], "application/vnd.zul": [ "zir", "zirz" ], "application/vnd.zzazz.deck+xml": [ "zaz" ], "application/voicexml+xml": [ "vxml" ], "application/vq-rtcpxr": [], "application/watcherinfo+xml": [], "application/whoispp-query": [], "application/whoispp-response": [], "application/widget": [ "wgt" ], "application/winhlp": [ "hlp" ], "application/wita": [], "application/wordperfect5.1": [], "application/wsdl+xml": [ "wsdl" ], "application/wspolicy+xml": [ "wspolicy" ], "application/x-7z-compressed": [ "7z" ], "application/x-abiword": [ "abw" ], "application/x-ace-compressed": [ "ace" ], "application/x-amf": [], "application/x-apple-diskimage": [ "dmg" ], "application/x-authorware-bin": [ "aab", "x32", "u32", "vox" ], "application/x-authorware-map": [ "aam" ], "application/x-authorware-seg": [ "aas" ], "application/x-bcpio": [ "bcpio" 
], "application/x-bittorrent": [ "torrent" ], "application/x-blorb": [ "blb", "blorb" ], "application/x-bzip": [ "bz" ], "application/x-bzip2": [ "bz2", "boz" ], "application/x-cbr": [ "cbr", "cba", "cbt", "cbz", "cb7" ], "application/x-cdlink": [ "vcd" ], "application/x-cfs-compressed": [ "cfs" ], "application/x-chat": [ "chat" ], "application/x-chess-pgn": [ "pgn" ], "application/x-conference": [ "nsc" ], "application/x-compress": [], "application/x-cpio": [ "cpio" ], "application/x-csh": [ "csh" ], "application/x-debian-package": [ "deb", "udeb" ], "application/x-dgc-compressed": [ "dgc" ], "application/x-director": [ "dir", "dcr", "dxr", "cst", "cct", "cxt", "w3d", "fgd", "swa" ], "application/x-doom": [ "wad" ], "application/x-dtbncx+xml": [ "ncx" ], "application/x-dtbook+xml": [ "dtb" ], "application/x-dtbresource+xml": [ "res" ], "application/x-dvi": [ "dvi" ], "application/x-envoy": [ "evy" ], "application/x-eva": [ "eva" ], "application/x-font-bdf": [ "bdf" ], "application/x-font-dos": [], "application/x-font-framemaker": [], "application/x-font-ghostscript": [ "gsf" ], "application/x-font-libgrx": [], "application/x-font-linux-psf": [ "psf" ], "application/x-font-otf": [ "otf" ], "application/x-font-pcf": [ "pcf" ], "application/x-font-snf": [ "snf" ], "application/x-font-speedo": [], "application/x-font-sunos-news": [], "application/x-font-ttf": [ "ttf", "ttc" ], "application/x-font-type1": [ "pfa", "pfb", "pfm", "afm" ], "application/font-woff": [ "woff" ], "application/x-font-vfont": [], "application/x-freearc": [ "arc" ], "application/x-futuresplash": [ "spl" ], "application/x-gca-compressed": [ "gca" ], "application/x-glulx": [ "ulx" ], "application/x-gnumeric": [ "gnumeric" ], "application/x-gramps-xml": [ "gramps" ], "application/x-gtar": [ "gtar" ], "application/x-gzip": [], "application/x-hdf": [ "hdf" ], "application/x-install-instructions": [ "install" ], "application/x-iso9660-image": [ "iso" ], "application/x-java-jnlp-file": [ "jnlp" ], "application/x-latex": [ "latex" ], "application/x-lzh-compressed": [ "lzh", "lha" ], "application/x-mie": [ "mie" ], "application/x-mobipocket-ebook": [ "prc", "mobi" ], "application/x-ms-application": [ "application" ], "application/x-ms-shortcut": [ "lnk" ], "application/x-ms-wmd": [ "wmd" ], "application/x-ms-wmz": [ "wmz" ], "application/x-ms-xbap": [ "xbap" ], "application/x-msaccess": [ "mdb" ], "application/x-msbinder": [ "obd" ], "application/x-mscardfile": [ "crd" ], "application/x-msclip": [ "clp" ], "application/x-msdownload": [ "exe", "dll", "com", "bat", "msi" ], "application/x-msmediaview": [ "mvb", "m13", "m14" ], "application/x-msmetafile": [ "wmf", "wmz", "emf", "emz" ], "application/x-msmoney": [ "mny" ], "application/x-mspublisher": [ "pub" ], "application/x-msschedule": [ "scd" ], "application/x-msterminal": [ "trm" ], "application/x-mswrite": [ "wri" ], "application/x-netcdf": [ "nc", "cdf" ], "application/x-nzb": [ "nzb" ], "application/x-pkcs12": [ "p12", "pfx" ], "application/x-pkcs7-certificates": [ "p7b", "spc" ], "application/x-pkcs7-certreqresp": [ "p7r" ], "application/x-rar-compressed": [ "rar" ], "application/x-research-info-systems": [ "ris" ], "application/x-sh": [ "sh" ], "application/x-shar": [ "shar" ], "application/x-shockwave-flash": [ "swf" ], "application/x-silverlight-app": [ "xap" ], "application/x-sql": [ "sql" ], "application/x-stuffit": [ "sit" ], "application/x-stuffitx": [ "sitx" ], "application/x-subrip": [ "srt" ], "application/x-sv4cpio": [ "sv4cpio" ], "application/x-sv4crc": [ 
"sv4crc" ], "application/x-t3vm-image": [ "t3" ], "application/x-tads": [ "gam" ], "application/x-tar": [ "tar" ], "application/x-tcl": [ "tcl" ], "application/x-tex": [ "tex" ], "application/x-tex-tfm": [ "tfm" ], "application/x-texinfo": [ "texinfo", "texi" ], "application/x-tgif": [ "obj" ], "application/x-ustar": [ "ustar" ], "application/x-wais-source": [ "src" ], "application/x-x509-ca-cert": [ "der", "crt" ], "application/x-xfig": [ "fig" ], "application/x-xliff+xml": [ "xlf" ], "application/x-xpinstall": [ "xpi" ], "application/x-xz": [ "xz" ], "application/x-zmachine": [ "z1", "z2", "z3", "z4", "z5", "z6", "z7", "z8" ], "application/x400-bp": [], "application/xaml+xml": [ "xaml" ], "application/xcap-att+xml": [], "application/xcap-caps+xml": [], "application/xcap-diff+xml": [ "xdf" ], "application/xcap-el+xml": [], "application/xcap-error+xml": [], "application/xcap-ns+xml": [], "application/xcon-conference-info-diff+xml": [], "application/xcon-conference-info+xml": [], "application/xenc+xml": [ "xenc" ], "application/xhtml+xml": [ "xhtml", "xht" ], "application/xhtml-voice+xml": [], "application/xml": [ "xml", "xsl" ], "application/xml-dtd": [ "dtd" ], "application/xml-external-parsed-entity": [], "application/xmpp+xml": [], "application/xop+xml": [ "xop" ], "application/xproc+xml": [ "xpl" ], "application/xslt+xml": [ "xslt" ], "application/xspf+xml": [ "xspf" ], "application/xv+xml": [ "mxml", "xhvml", "xvml", "xvm" ], "application/yang": [ "yang" ], "application/yin+xml": [ "yin" ], "application/zip": [ "zip" ], "audio/1d-interleaved-parityfec": [], "audio/32kadpcm": [], "audio/3gpp": [], "audio/3gpp2": [], "audio/ac3": [], "audio/adpcm": [ "adp" ], "audio/amr": [], "audio/amr-wb": [], "audio/amr-wb+": [], "audio/asc": [], "audio/atrac-advanced-lossless": [], "audio/atrac-x": [], "audio/atrac3": [], "audio/basic": [ "au", "snd" ], "audio/bv16": [], "audio/bv32": [], "audio/clearmode": [], "audio/cn": [], "audio/dat12": [], "audio/dls": [], "audio/dsr-es201108": [], "audio/dsr-es202050": [], "audio/dsr-es202211": [], "audio/dsr-es202212": [], "audio/dv": [], "audio/dvi4": [], "audio/eac3": [], "audio/evrc": [], "audio/evrc-qcp": [], "audio/evrc0": [], "audio/evrc1": [], "audio/evrcb": [], "audio/evrcb0": [], "audio/evrcb1": [], "audio/evrcwb": [], "audio/evrcwb0": [], "audio/evrcwb1": [], "audio/example": [], "audio/fwdred": [], "audio/g719": [], "audio/g722": [], "audio/g7221": [], "audio/g723": [], "audio/g726-16": [], "audio/g726-24": [], "audio/g726-32": [], "audio/g726-40": [], "audio/g728": [], "audio/g729": [], "audio/g7291": [], "audio/g729d": [], "audio/g729e": [], "audio/gsm": [], "audio/gsm-efr": [], "audio/gsm-hr-08": [], "audio/ilbc": [], "audio/ip-mr_v2.5": [], "audio/isac": [], "audio/l16": [], "audio/l20": [], "audio/l24": [], "audio/l8": [], "audio/lpc": [], "audio/midi": [ "mid", "midi", "kar", "rmi" ], "audio/mobile-xmf": [], "audio/mp4": [ "mp4a" ], "audio/mp4a-latm": [], "audio/mpa": [], "audio/mpa-robust": [], "audio/mpeg": [ "mpga", "mp2", "mp2a", "mp3", "m2a", "m3a" ], "audio/mpeg4-generic": [], "audio/musepack": [], "audio/ogg": [ "oga", "ogg", "spx" ], "audio/opus": [], "audio/parityfec": [], "audio/pcma": [], "audio/pcma-wb": [], "audio/pcmu-wb": [], "audio/pcmu": [], "audio/prs.sid": [], "audio/qcelp": [], "audio/red": [], "audio/rtp-enc-aescm128": [], "audio/rtp-midi": [], "audio/rtx": [], "audio/s3m": [ "s3m" ], "audio/silk": [ "sil" ], "audio/smv": [], "audio/smv0": [], "audio/smv-qcp": [], "audio/sp-midi": [], "audio/speex": [], "audio/t140c": 
[], "audio/t38": [], "audio/telephone-event": [], "audio/tone": [], "audio/uemclip": [], "audio/ulpfec": [], "audio/vdvi": [], "audio/vmr-wb": [], "audio/vnd.3gpp.iufp": [], "audio/vnd.4sb": [], "audio/vnd.audiokoz": [], "audio/vnd.celp": [], "audio/vnd.cisco.nse": [], "audio/vnd.cmles.radio-events": [], "audio/vnd.cns.anp1": [], "audio/vnd.cns.inf1": [], "audio/vnd.dece.audio": [ "uva", "uvva" ], "audio/vnd.digital-winds": [ "eol" ], "audio/vnd.dlna.adts": [], "audio/vnd.dolby.heaac.1": [], "audio/vnd.dolby.heaac.2": [], "audio/vnd.dolby.mlp": [], "audio/vnd.dolby.mps": [], "audio/vnd.dolby.pl2": [], "audio/vnd.dolby.pl2x": [], "audio/vnd.dolby.pl2z": [], "audio/vnd.dolby.pulse.1": [], "audio/vnd.dra": [ "dra" ], "audio/vnd.dts": [ "dts" ], "audio/vnd.dts.hd": [ "dtshd" ], "audio/vnd.dvb.file": [], "audio/vnd.everad.plj": [], "audio/vnd.hns.audio": [], "audio/vnd.lucent.voice": [ "lvp" ], "audio/vnd.ms-playready.media.pya": [ "pya" ], "audio/vnd.nokia.mobile-xmf": [], "audio/vnd.nortel.vbk": [], "audio/vnd.nuera.ecelp4800": [ "ecelp4800" ], "audio/vnd.nuera.ecelp7470": [ "ecelp7470" ], "audio/vnd.nuera.ecelp9600": [ "ecelp9600" ], "audio/vnd.octel.sbc": [], "audio/vnd.qcelp": [], "audio/vnd.rhetorex.32kadpcm": [], "audio/vnd.rip": [ "rip" ], "audio/vnd.sealedmedia.softseal.mpeg": [], "audio/vnd.vmx.cvsd": [], "audio/vorbis": [], "audio/vorbis-config": [], "audio/webm": [ "weba" ], "audio/x-aac": [ "aac" ], "audio/x-aiff": [ "aif", "aiff", "aifc" ], "audio/x-caf": [ "caf" ], "audio/x-flac": [ "flac" ], "audio/x-matroska": [ "mka" ], "audio/x-mpegurl": [ "m3u" ], "audio/x-ms-wax": [ "wax" ], "audio/x-ms-wma": [ "wma" ], "audio/x-pn-realaudio": [ "ram", "ra" ], "audio/x-pn-realaudio-plugin": [ "rmp" ], "audio/x-tta": [], "audio/x-wav": [ "wav" ], "audio/xm": [ "xm" ], "chemical/x-cdx": [ "cdx" ], "chemical/x-cif": [ "cif" ], "chemical/x-cmdf": [ "cmdf" ], "chemical/x-cml": [ "cml" ], "chemical/x-csml": [ "csml" ], "chemical/x-pdb": [], "chemical/x-xyz": [ "xyz" ], "image/bmp": [ "bmp" ], "image/cgm": [ "cgm" ], "image/example": [], "image/fits": [], "image/g3fax": [ "g3" ], "image/gif": [ "gif" ], "image/ief": [ "ief" ], "image/jp2": [], "image/jpeg": [ "jpeg", "jpg", "jpe" ], "image/jpm": [], "image/jpx": [], "image/ktx": [ "ktx" ], "image/naplps": [], "image/png": [ "png" ], "image/prs.btif": [ "btif" ], "image/prs.pti": [], "image/sgi": [ "sgi" ], "image/svg+xml": [ "svg", "svgz" ], "image/t38": [], "image/tiff": [ "tiff", "tif" ], "image/tiff-fx": [], "image/vnd.adobe.photoshop": [ "psd" ], "image/vnd.cns.inf2": [], "image/vnd.dece.graphic": [ "uvi", "uvvi", "uvg", "uvvg" ], "image/vnd.dvb.subtitle": [ "sub" ], "image/vnd.djvu": [ "djvu", "djv" ], "image/vnd.dwg": [ "dwg" ], "image/vnd.dxf": [ "dxf" ], "image/vnd.fastbidsheet": [ "fbs" ], "image/vnd.fpx": [ "fpx" ], "image/vnd.fst": [ "fst" ], "image/vnd.fujixerox.edmics-mmr": [ "mmr" ], "image/vnd.fujixerox.edmics-rlc": [ "rlc" ], "image/vnd.globalgraphics.pgb": [], "image/vnd.microsoft.icon": [], "image/vnd.mix": [], "image/vnd.ms-modi": [ "mdi" ], "image/vnd.ms-photo": [ "wdp" ], "image/vnd.net-fpx": [ "npx" ], "image/vnd.radiance": [], "image/vnd.sealed.png": [], "image/vnd.sealedmedia.softseal.gif": [], "image/vnd.sealedmedia.softseal.jpg": [], "image/vnd.svf": [], "image/vnd.wap.wbmp": [ "wbmp" ], "image/vnd.xiff": [ "xif" ], "image/webp": [ "webp" ], "image/x-3ds": [ "3ds" ], "image/x-cmu-raster": [ "ras" ], "image/x-cmx": [ "cmx" ], "image/x-freehand": [ "fh", "fhc", "fh4", "fh5", "fh7" ], "image/x-icon": [ "ico" ], 
"image/x-mrsid-image": [ "sid" ], "image/x-pcx": [ "pcx" ], "image/x-pict": [ "pic", "pct" ], "image/x-portable-anymap": [ "pnm" ], "image/x-portable-bitmap": [ "pbm" ], "image/x-portable-graymap": [ "pgm" ], "image/x-portable-pixmap": [ "ppm" ], "image/x-rgb": [ "rgb" ], "image/x-tga": [ "tga" ], "image/x-xbitmap": [ "xbm" ], "image/x-xpixmap": [ "xpm" ], "image/x-xwindowdump": [ "xwd" ], "message/cpim": [], "message/delivery-status": [], "message/disposition-notification": [], "message/example": [], "message/external-body": [], "message/feedback-report": [], "message/global": [], "message/global-delivery-status": [], "message/global-disposition-notification": [], "message/global-headers": [], "message/http": [], "message/imdn+xml": [], "message/news": [], "message/partial": [], "message/rfc822": [ "eml", "mime" ], "message/s-http": [], "message/sip": [], "message/sipfrag": [], "message/tracking-status": [], "message/vnd.si.simp": [], "model/example": [], "model/iges": [ "igs", "iges" ], "model/mesh": [ "msh", "mesh", "silo" ], "model/vnd.collada+xml": [ "dae" ], "model/vnd.dwf": [ "dwf" ], "model/vnd.flatland.3dml": [], "model/vnd.gdl": [ "gdl" ], "model/vnd.gs-gdl": [], "model/vnd.gs.gdl": [], "model/vnd.gtw": [ "gtw" ], "model/vnd.moml+xml": [], "model/vnd.mts": [ "mts" ], "model/vnd.parasolid.transmit.binary": [], "model/vnd.parasolid.transmit.text": [], "model/vnd.vtu": [ "vtu" ], "model/vrml": [ "wrl", "vrml" ], "model/x3d+binary": [ "x3db", "x3dbz" ], "model/x3d+vrml": [ "x3dv", "x3dvz" ], "model/x3d+xml": [ "x3d", "x3dz" ], "multipart/alternative": [], "multipart/appledouble": [], "multipart/byteranges": [], "multipart/digest": [], "multipart/encrypted": [], "multipart/example": [], "multipart/form-data": [], "multipart/header-set": [], "multipart/mixed": [], "multipart/parallel": [], "multipart/related": [], "multipart/report": [], "multipart/signed": [], "multipart/voice-message": [], "text/1d-interleaved-parityfec": [], "text/cache-manifest": [ "appcache" ], "text/calendar": [ "ics", "ifb" ], "text/css": [ "css" ], "text/csv": [ "csv" ], "text/directory": [], "text/dns": [], "text/ecmascript": [], "text/enriched": [], "text/example": [], "text/fwdred": [], "text/html": [ "html", "htm" ], "text/javascript": [], "text/n3": [ "n3" ], "text/parityfec": [], "text/plain": [ "txt", "text", "conf", "def", "list", "log", "in" ], "text/prs.fallenstein.rst": [], "text/prs.lines.tag": [ "dsc" ], "text/vnd.radisys.msml-basic-layout": [], "text/red": [], "text/rfc822-headers": [], "text/richtext": [ "rtx" ], "text/rtf": [], "text/rtp-enc-aescm128": [], "text/rtx": [], "text/sgml": [ "sgml", "sgm" ], "text/t140": [], "text/tab-separated-values": [ "tsv" ], "text/troff": [ "t", "tr", "roff", "man", "me", "ms" ], "text/turtle": [ "ttl" ], "text/ulpfec": [], "text/uri-list": [ "uri", "uris", "urls" ], "text/vcard": [ "vcard" ], "text/vnd.abc": [], "text/vnd.curl": [ "curl" ], "text/vnd.curl.dcurl": [ "dcurl" ], "text/vnd.curl.scurl": [ "scurl" ], "text/vnd.curl.mcurl": [ "mcurl" ], "text/vnd.dmclientscript": [], "text/vnd.dvb.subtitle": [ "sub" ], "text/vnd.esmertec.theme-descriptor": [], "text/vnd.fly": [ "fly" ], "text/vnd.fmi.flexstor": [ "flx" ], "text/vnd.graphviz": [ "gv" ], "text/vnd.in3d.3dml": [ "3dml" ], "text/vnd.in3d.spot": [ "spot" ], "text/vnd.iptc.newsml": [], "text/vnd.iptc.nitf": [], "text/vnd.latex-z": [], "text/vnd.motorola.reflex": [], "text/vnd.ms-mediapackage": [], "text/vnd.net2phone.commcenter.command": [], "text/vnd.si.uricatalogue": [], 
"text/vnd.sun.j2me.app-descriptor": [ "jad" ], "text/vnd.trolltech.linguist": [], "text/vnd.wap.si": [], "text/vnd.wap.sl": [], "text/vnd.wap.wml": [ "wml" ], "text/vnd.wap.wmlscript": [ "wmls" ], "text/x-asm": [ "s", "asm" ], "text/x-c": [ "c", "cc", "cxx", "cpp", "h", "hh", "dic" ], "text/x-fortran": [ "f", "for", "f77", "f90" ], "text/x-java-source": [ "java" ], "text/x-opml": [ "opml" ], "text/x-pascal": [ "p", "pas" ], "text/x-nfo": [ "nfo" ], "text/x-setext": [ "etx" ], "text/x-sfv": [ "sfv" ], "text/x-uuencode": [ "uu" ], "text/x-vcalendar": [ "vcs" ], "text/x-vcard": [ "vcf" ], "text/xml": [], "text/xml-external-parsed-entity": [], "video/1d-interleaved-parityfec": [], "video/3gpp": [ "3gp" ], "video/3gpp-tt": [], "video/3gpp2": [ "3g2" ], "video/bmpeg": [], "video/bt656": [], "video/celb": [], "video/dv": [], "video/example": [], "video/h261": [ "h261" ], "video/h263": [ "h263" ], "video/h263-1998": [], "video/h263-2000": [], "video/h264": [ "h264" ], "video/h264-rcdo": [], "video/h264-svc": [], "video/jpeg": [ "jpgv" ], "video/jpeg2000": [], "video/jpm": [ "jpm", "jpgm" ], "video/mj2": [ "mj2", "mjp2" ], "video/mp1s": [], "video/mp2p": [], "video/mp2t": [], "video/mp4": [ "mp4", "mp4v", "mpg4" ], "video/mp4v-es": [], "video/mpeg": [ "mpeg", "mpg", "mpe", "m1v", "m2v" ], "video/mpeg4-generic": [], "video/mpv": [], "video/nv": [], "video/ogg": [ "ogv" ], "video/parityfec": [], "video/pointer": [], "video/quicktime": [ "qt", "mov" ], "video/raw": [], "video/rtp-enc-aescm128": [], "video/rtx": [], "video/smpte292m": [], "video/ulpfec": [], "video/vc1": [], "video/vnd.cctv": [], "video/vnd.dece.hd": [ "uvh", "uvvh" ], "video/vnd.dece.mobile": [ "uvm", "uvvm" ], "video/vnd.dece.mp4": [], "video/vnd.dece.pd": [ "uvp", "uvvp" ], "video/vnd.dece.sd": [ "uvs", "uvvs" ], "video/vnd.dece.video": [ "uvv", "uvvv" ], "video/vnd.directv.mpeg": [], "video/vnd.directv.mpeg-tts": [], "video/vnd.dlna.mpeg-tts": [], "video/vnd.dvb.file": [ "dvb" ], "video/vnd.fvt": [ "fvt" ], "video/vnd.hns.video": [], "video/vnd.iptvforum.1dparityfec-1010": [], "video/vnd.iptvforum.1dparityfec-2005": [], "video/vnd.iptvforum.2dparityfec-1010": [], "video/vnd.iptvforum.2dparityfec-2005": [], "video/vnd.iptvforum.ttsavc": [], "video/vnd.iptvforum.ttsmpeg2": [], "video/vnd.motorola.video": [], "video/vnd.motorola.videop": [], "video/vnd.mpegurl": [ "mxu", "m4u" ], "video/vnd.ms-playready.media.pyv": [ "pyv" ], "video/vnd.nokia.interleaved-multimedia": [], "video/vnd.nokia.videovoip": [], "video/vnd.objectvideo": [], "video/vnd.sealed.mpeg1": [], "video/vnd.sealed.mpeg4": [], "video/vnd.sealed.swf": [], "video/vnd.sealedmedia.softseal.mov": [], "video/vnd.uvvu.mp4": [ "uvu", "uvvu" ], "video/vnd.vivo": [ "viv" ], "video/webm": [ "webm" ], "video/x-f4v": [ "f4v" ], "video/x-fli": [ "fli" ], "video/x-flv": [ "flv" ], "video/x-m4v": [ "m4v" ], "video/x-matroska": [ "mkv", "mk3d", "mks" ], "video/x-mng": [ "mng" ], "video/x-ms-asf": [ "asf", "asx" ], "video/x-ms-vob": [ "vob" ], "video/x-ms-wm": [ "wm" ], "video/x-ms-wmv": [ "wmv" ], "video/x-ms-wmx": [ "wmx" ], "video/x-ms-wvx": [ "wvx" ], "video/x-msvideo": [ "avi" ], "video/x-sgi-movie": [ "movie" ], "video/x-smv": [ "smv" ], "x-conference/x-cooltalk": [ "ice" ] } ���lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/node.json���������������������000644 �000766 �000024 �00000001322 12455173731 034122� 
lib/node_modules/npm/node_modules/request/node_modules/mime-types/lib/node.json

{
  "text/vtt": [ "vtt" ],
  "application/x-chrome-extension": [ "crx" ],
  "text/x-component": [ "htc" ],
  "text/cache-manifest": [ "manifest" ],
  "application/octet-stream": [ "buffer" ],
  "application/mp4": [ "m4p" ],
  "audio/mp4": [ "m4a" ],
  "video/MP2T": [ "ts" ],
  "application/x-web-app-manifest+json": [ "webapp" ],
  "text/x-lua": [ "lua" ],
  "application/x-lua-bytecode": [ "luac" ],
  "text/x-markdown": [ "markdown", "md", "mkd" ],
  "text/plain": [ "ini" ],
  "application/dash+xml": [ "mdp" ],
  "font/opentype": [ "otf" ],
  "application/json": [ "map" ],
  "application/xml": [ "xsd" ]
}

lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/LICENSE

Copyright (c) Isaac Z. Schlueter ("Author")
All rights reserved.

The BSD License

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/package.json

{
  "name": "json-stringify-safe",
  "version": "5.0.0",
  "description": "Like JSON.stringify, but doesn't blow up on circular refs",
  "main": "stringify.js",
  "scripts": {
    "test": "node test.js"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/json-stringify-safe"
  },
  "keywords": [
    "json",
    "stringify",
    "circular",
    "safe"
  ],
  "author": {
    "name": "Isaac Z. Schlueter",
    "email": "i@izs.me",
    "url": "http://blog.izs.me"
  },
  "license": "BSD",
  "bugs": {
    "url": "https://github.com/isaacs/json-stringify-safe/issues"
  },
  "_id": "json-stringify-safe@5.0.0",
  "dist": {
    "shasum": "4c1f228b5050837eba9d21f50c2e6e320624566e",
    "tarball": "http://registry.npmjs.org/json-stringify-safe/-/json-stringify-safe-5.0.0.tgz"
  },
  "_from": "json-stringify-safe@>=5.0.0 <5.1.0",
  "_npmVersion": "1.3.6",
  "_npmUser": {
    "name": "isaacs",
    "email": "i@izs.me"
  },
  "maintainers": [
    {
      "name": "isaacs",
      "email": "i@izs.me"
    }
  ],
  "directories": {},
  "_shasum": "4c1f228b5050837eba9d21f50c2e6e320624566e",
  "_resolved": "https://registry.npmjs.org/json-stringify-safe/-/json-stringify-safe-5.0.0.tgz",
  "readme": "ERROR: No README data found!",
  "homepage": "https://github.com/isaacs/json-stringify-safe"
}

lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/README.md

# json-stringify-safe

Like JSON.stringify, but doesn't throw on circular references.

## Usage

Takes the same arguments as `JSON.stringify`.

```javascript
var stringify = require('json-stringify-safe');

var circularObj = {};
circularObj.circularRef = circularObj;
circularObj.list = [ circularObj, circularObj ];
console.log(stringify(circularObj, null, 2));
```

Output:

```json
{
  "circularRef": "[Circular]",
  "list": [
    "[Circular]",
    "[Circular]"
  ]
}
```

## Details

```
stringify(obj, serializer, indent, decycler)
```

The first three arguments are the same as to `JSON.stringify`. The last is an argument that's only used when the object has been seen already. The default `decycler` function returns the string `'[Circular]'`. If, for example, you pass in `function(k,v){}` (return nothing), then it will prune cycles.
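For instance, a minimal sketch of that pruning behaviour (not taken from the package itself; `prune` and `obj` are illustrative names):

```javascript
var stringify = require('json-stringify-safe');

var obj = { a: 'b' };
obj.self = obj;          // introduce a cycle

// A decycler that returns nothing: repeat visits to an object are dropped.
function prune(key, value) {}

console.log(stringify(obj, null, 2, prune));
// Prints:
// {
//   "a": "b"
// }
// The cyclic "self" property is omitted instead of throwing.
```

Array slots that would cycle come out as `null` rather than being dropped, as the package's test.js further below also checks.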
If you pass in `function(k,v){ return {foo: 'bar'}}`, then cyclical objects will always be represented as `{"foo":"bar"}` in the result. ``` stringify.getSerialize(serializer, decycler) ``` Returns a serializer that can be used elsewhere. This is the actual function that's passed to JSON.stringify. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/stringify.js�������������000644 �000766 �000024 �00000001672 12455173731 035730� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = stringify; function getSerialize (fn, decycle) { var seen = [], keys = []; decycle = decycle || function(key, value) { return '[Circular ' + getPath(value, seen, keys) + ']' }; return function(key, value) { var ret = value; if (typeof value === 'object' && value) { if (seen.indexOf(value) !== -1) ret = decycle(key, value); else { seen.push(value); keys.push(key); } } if (fn) ret = fn(key, ret); return ret; } } function getPath (value, seen, keys) { var index = seen.indexOf(value); var path = [ keys[index] ]; for (index--; index >= 0; index--) { if (seen[index][ path[0] ] === value) { value = seen[index]; path.unshift(keys[index]); } } return '~' + path.join('.'); } function stringify(obj, fn, spaces, decycle) { return JSON.stringify(obj, getSerialize(fn, decycle), spaces); } stringify.getSerialize = getSerialize; ����������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/json-stringify-safe/test.js������������������000644 �000766 �000024 �00000004343 12455173731 034667� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var stringify = require('./stringify.js'); var circularObj = { a: 'b' }; circularObj.circularRef = circularObj; circularObj.list = [ circularObj, circularObj ]; ////////// // default var testObj = { "a": "b", "circularRef": "[Circular ~]", "list": [ "[Circular ~]", "[Circular ~]" ] }; var assert = require('assert'); assert.equal(JSON.stringify(testObj, null, 2), stringify(circularObj, null, 2)); assert.equal(JSON.stringify(testObj, null, 2), JSON.stringify(circularObj, stringify.getSerialize(), 2)); //////// // prune testObj = { "a": "b", "list": [ null, null ] }; function prune(k, v) {} assert.equal(JSON.stringify(testObj, null, 2), stringify(circularObj, null, 2, prune)); /////////// // re-cycle // (throws) function recycle(k, v) { return v; } assert.throws(function() { stringify(circularObj, null, 2, recycle); }); //////// // fancy testObj = { "a": "b", "circularRef": 
"circularRef{a:string,circularRef:Object,list:Array}", "list": [ "0{a:string,circularRef:Object,list:Array}", "1{a:string,circularRef:Object,list:Array}" ] }; function signer(key, value) { var ret = key + '{'; var f = false; for (var i in value) { if (f) ret += ','; f = true; ret += i + ':'; var v = value[i]; switch (typeof v) { case 'object': if (!v) ret += 'null'; else if (Array.isArray(v)) ret += 'Array' else ret += v.constructor && v.constructor.name || 'Object'; break; default: ret += typeof v; break; } } ret += '}'; return ret; } assert.equal(JSON.stringify(testObj, null, 2), stringify(circularObj, null, 2, signer)); /////// //multi var a = { x: 1 }; a.a = a; var b = { x: 2 }; b.a = a; var c = { a: a, b: b }; var d = { list: [ a, b, c ] }; d.d = d; var multi = { "list": [ { "x": 1, "a": "[Circular ~.list.0]" }, { "x": 2, "a": "[Circular ~.list.0]" }, { "a": "[Circular ~.list.0]", "b": "[Circular ~.list.1]" } ], "d": "[Circular ~]" }; assert.equal(JSON.stringify(multi, null, 2), stringify(d, null, 2)); //////// // pass! console.log('ok'); ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/.dir-locals.el����������������000644 �000766 �000024 �00000000262 12455173731 035054� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������((nil . ((indent-tabs-mode . nil) (tab-width . 8) (fill-column . 80))) (js-mode . ((js-indent-level . 2) (indent-tabs-mode . 
nil) )))
lib/node_modules/npm/node_modules/request/node_modules/http-signature/.npmignore
.gitmodules
deps
docs
Makefile
node_modules
test
tools
lib/node_modules/npm/node_modules/request/node_modules/http-signature/http_signing.md
# Abstract

This document describes a way to add origin authentication, message integrity,
and replay resistance to HTTP REST requests. It is intended to be used over the
HTTPS protocol.

# Copyright Notice

Copyright (c) 2011 Joyent, Inc. and the persons identified as document authors.
All rights reserved.

Code Components extracted from this document must include MIT License text.

# Introduction

This protocol is intended to provide a standard way for clients to sign HTTP
requests. RFC2617 (HTTP Authentication) defines Basic and Digest authentication
mechanisms, and RFC5246 (TLS 1.2) defines client-auth, both of which are widely
employed on the Internet today. However, it is commonplace for the burdens of
PKI to prevent web service operators from deploying that methodology, so many
fall back to Basic authentication, which has poor security characteristics.

Additionally, OAuth provides a fully-specified alternative for authorization of
web service requests, but is not (always) ideal for machine-to-machine
communication, as the key acquisition steps (generally) imply a fixed
infrastructure that may not make sense to a service provider (e.g., symmetric
keys).

Several web service providers have invented their own schemes for signing HTTP
requests, but to date none have been placed in the public domain as a standard.
This document serves that purpose. There are no techniques in this proposal
that are novel beyond prior art; rather, it aims to be a simple mechanism for
signing these requests.
# Signature Authentication Scheme

The "signature" authentication scheme is based on the model that the client
must authenticate itself with a digital signature produced by either a private
asymmetric key (e.g., RSA) or a shared symmetric key (e.g., HMAC). The scheme
is parameterized enough that it is not bound to any particular key type or
signing algorithm. However, it does explicitly assume that clients can send an
HTTP `Date` header.

## Authorization Header

The client is expected to send an Authorization header (as defined in RFC 2617)
with the following parameterization:

    credentials := "Signature" params
    params := 1#(keyId | algorithm | [headers] | [ext] | signature)
    digitalSignature := plain-string

    keyId := "keyId" "=" <"> plain-string <">
    algorithm := "algorithm" "=" <"> plain-string <">
    headers := "headers" "=" <"> 1#headers-value <">
    ext := "ext" "=" <"> plain-string <">
    signature := "signature" "=" <"> plain-string <">

    headers-value := plain-string
    plain-string = 1*( %x20-21 / %x23-5B / %x5D-7E )

### Signature Parameters

#### keyId

REQUIRED. The `keyId` field is an opaque string that the server can use to
look up the component they need to validate the signature. It could be an SSH
key fingerprint, an LDAP DN, etc. Management of keys and assignment of `keyId`
is out of scope for this document.

#### algorithm

REQUIRED. The `algorithm` parameter is used if the client and server agree on
a non-standard digital signature algorithm. The full list of supported
signature mechanisms is listed below.

#### headers

OPTIONAL. The `headers` parameter is used to specify the list of HTTP headers
used to sign the request. If specified, it should be a quoted list of HTTP
header names, separated by a single space character. By default, only one HTTP
header is signed, which is the `Date` header. Note that the list MUST be
specified in the order the values are concatenated together during signing.

To include the HTTP request line in the signature calculation, use the special
`request-line` value. While this overloads the definition of `headers` in HTTP
parlance, the request line is defined in RFC 2616 and is the one non-header
value that is useful in signature calculation, so it is simpler to reuse
`request-line` than to add a separate parameter for it.

#### extensions

OPTIONAL. The `extensions` parameter is used to include additional information
which is covered by the request. The content and format of the string is out
of scope for this document, and expected to be specified by implementors.

#### signature

REQUIRED. The `signature` parameter is a `Base64` encoded digital signature
generated by the client. The client uses the `algorithm` and `headers` request
parameters to form a canonicalized `signing string`. This `signing string` is
then signed with the key associated with `keyId` and the algorithm
corresponding to `algorithm`. The `signature` parameter is then set to the
`Base64` encoding of the signature.

### Signing String Composition

In order to generate the string that is signed with a key, the client MUST
take the values of each HTTP header specified by `headers` in the order they
appear.

1. If the header name is not `request-line` then append the lowercased header
   name followed by an ASCII colon `:` and an ASCII space ` `.
2. If the header name is `request-line` then append the HTTP request line,
   otherwise append the header value.
3. If the value is not the last value then append an ASCII newline `\n`.

The string MUST NOT include a trailing ASCII newline.
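As a non-normative illustration, the composition rules above can be implemented
along these lines (the shape of the `req` object is an assumption of this
sketch, not part of the scheme):

    // Build the signing string for a given `headers` list.
    // `req` is assumed to look like { method, path, httpVersion, headers }.
    function signingString(req, headerList) {
      return headerList.map(function (name) {
        if (name === 'request-line')
          return req.method + ' ' + req.path + ' HTTP/' + req.httpVersion;
        return name.toLowerCase() + ': ' + req.headers[name];
      }).join('\n');  // values separated by \n, no trailing newline
    }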
# Example Requests

All requests refer to the following request (body omitted):

    POST /foo HTTP/1.1
    Host: example.org
    Date: Tue, 07 Jun 2011 20:51:35 GMT
    Content-Type: application/json
    Content-MD5: h0auK8hnYJKmHTLhKtMTkQ==
    Content-Length: 123

The "rsa-key-1" keyId refers to a private key known to the client and a public
key known to the server. The "hmac-key-1" keyId refers to a key known to both
the client and the server.

## Default Parameterization

The authorization header and signature would be generated as:

    Authorization: Signature keyId="rsa-key-1",algorithm="rsa-sha256",signature="Base64(RSA-SHA256(signing string))"

The client would compose the signing string as:

    date: Tue, 07 Jun 2011 20:51:35 GMT

## Header List

The authorization header and signature would be generated as:

    Authorization: Signature keyId="rsa-key-1",algorithm="rsa-sha256",headers="request-line date content-type content-md5",signature="Base64(RSA-SHA256(signing string))"

The client would compose the signing string as (`+ "\n"` inserted for
readability):

    POST /foo HTTP/1.1 + "\n"
    date: Tue, 07 Jun 2011 20:51:35 GMT + "\n"
    content-type: application/json + "\n"
    content-md5: h0auK8hnYJKmHTLhKtMTkQ==

## Algorithm

The authorization header and signature would be generated as:

    Authorization: Signature keyId="hmac-key-1",algorithm="hmac-sha1",signature="Base64(HMAC-SHA1(signing string))"

The client would compose the signing string as:

    date: Tue, 07 Jun 2011 20:51:35 GMT

# Signing Algorithms

Currently supported algorithm names are:

* rsa-sha1
* rsa-sha256
* rsa-sha512
* dsa-sha1
* hmac-sha1
* hmac-sha256
* hmac-sha512

# Security Considerations

## Default Parameters

Note that the default parameterization of the `Signature` scheme is only safe
if all requests are carried over a secure transport (i.e., TLS). Sending the
default scheme over a non-secure transport will leave the request vulnerable
to spoofing, tampering, replay/repudiation, and integrity violations (if using
the STRIDE threat-modeling methodology).

## Insecure Transports

If sending the request over plain HTTP, service providers SHOULD require
clients to sign ALL HTTP headers, and the `request-line`. Additionally,
service providers SHOULD require `Content-MD5` calculations to be performed to
protect against tampering by clients.

## Nonces

Nonces are out of scope for this document simply because many service
providers fail to implement them correctly, or do not adopt security
specifications because of the infrastructure complexity. Given the `headers`
parameterization, a service provider is fully enabled to add nonce semantics
into this scheme by using something like an `x-request-nonce` header, and
ensuring it is signed with the `Date` header.

## Clock Skew

As the default scheme is to sign the `Date` header, service providers SHOULD
protect against logged replay attacks by enforcing a maximum allowed clock
skew. The server SHOULD be synchronized with NTP, and the recommendation in
this specification is to allow 300s of clock skew (in either direction).

## Required Headers to Sign

It is out of scope for this document to dictate what headers a service
provider will want to enforce, but service providers SHOULD at minimum include
the `Date` header.

# References

## Normative References

* [RFC2616] Hypertext Transfer Protocol -- HTTP/1.1
* [RFC2617] HTTP Authentication: Basic and Digest Access Authentication
* [RFC5246] The Transport Layer Security (TLS) Protocol Version 1.2

## Informative References

    Name: Mark Cavage (editor)
    Company: Joyent, Inc.
Email: mark.cavage@joyent.com URI: http://www.joyent.com # Appendix A - Test Values The following test data uses the RSA (2048b) keys, which we will refer to as `keyId=Test` in the following samples: -----BEGIN PUBLIC KEY----- MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQDCFENGw33yGihy92pDjZQhl0C3 6rPJj+CvfSC8+q28hxA161QFNUd13wuCTUcq0Qd2qsBe/2hFyc2DCJJg0h1L78+6 Z4UMR7EOcpfdUE9Hf3m/hs+FUR45uBJeDK1HSFHD8bHKD6kv8FPGfJTotc+2xjJw oYi+1hqp1fIekaxsyQIDAQAB -----END PUBLIC KEY----- -----BEGIN RSA PRIVATE KEY----- MIICXgIBAAKBgQDCFENGw33yGihy92pDjZQhl0C36rPJj+CvfSC8+q28hxA161QF NUd13wuCTUcq0Qd2qsBe/2hFyc2DCJJg0h1L78+6Z4UMR7EOcpfdUE9Hf3m/hs+F UR45uBJeDK1HSFHD8bHKD6kv8FPGfJTotc+2xjJwoYi+1hqp1fIekaxsyQIDAQAB AoGBAJR8ZkCUvx5kzv+utdl7T5MnordT1TvoXXJGXK7ZZ+UuvMNUCdN2QPc4sBiA QWvLw1cSKt5DsKZ8UETpYPy8pPYnnDEz2dDYiaew9+xEpubyeW2oH4Zx71wqBtOK kqwrXa/pzdpiucRRjk6vE6YY7EBBs/g7uanVpGibOVAEsqH1AkEA7DkjVH28WDUg f1nqvfn2Kj6CT7nIcE3jGJsZZ7zlZmBmHFDONMLUrXR/Zm3pR5m0tCmBqa5RK95u 412jt1dPIwJBANJT3v8pnkth48bQo/fKel6uEYyboRtA5/uHuHkZ6FQF7OUkGogc mSJluOdc5t6hI1VsLn0QZEjQZMEOWr+wKSMCQQCC4kXJEsHAve77oP6HtG/IiEn7 kpyUXRNvFsDE0czpJJBvL/aRFUJxuRK91jhjC68sA7NsKMGg5OXb5I5Jj36xAkEA gIT7aFOYBFwGgQAQkWNKLvySgKbAZRTeLBacpHMuQdl1DfdntvAyqpAZ0lY0RKmW G6aFKaqQfOXKCyWoUiVknQJAXrlgySFci/2ueKlIE1QqIiLSZ8V8OlpFLRnb1pzI 7U1yQXnTAEFYM560yJlzUpOb1V4cScGd365tiSMvxLOvTA== -----END RSA PRIVATE KEY----- And all examples use this request: POST /foo?param=value&pet=dog HTTP/1.1 Host: example.com Date: Thu, 05 Jan 2012 21:31:40 GMT Content-Type: application/json Content-MD5: Sd/dVLAcvNLSq16eXua5uQ== Content-Length: 18 {"hello": "world"} ### Default The string to sign would be: date: Thu, 05 Jan 2012 21:31:40 GMT The Authorization header would be: Authorization: Signature keyId="Test",algorithm="rsa-sha256",signature="JldXnt8W9t643M2Sce10gqCh/+E7QIYLiI+bSjnFBGCti7s+mPPvOjVb72sbd1FjeOUwPTDpKbrQQORrm+xBYfAwCxF3LBSSzORvyJ5nRFCFxfJ3nlQD6Kdxhw8wrVZX5nSem4A/W3C8qH5uhFTRwF4ruRjh+ENHWuovPgO/HGQ=" ### All Headers Parameterized to include all headers, the string to sign would be (`+ "\n"` inserted for readability): POST /foo?param=value&pet=dog HTTP/1.1 + "\n" host: example.com + "\n" date: Thu, 05 Jan 2012 21:31:40 GMT + "\n" content-type: application/json + "\n" content-md5: Sd/dVLAcvNLSq16eXua5uQ== + "\n" content-length: 18 The Authorization header would be: Authorization: Signature keyId="Test",algorithm="rsa-sha256",headers="request-line host date content-type content-md5 content-length",signature="Gm7W/r+e90REDpWytALMrft4MqZxCmslOTOvwJX17ViEBA5E65QqvWI0vIH3l/vSsGiaMVmuUgzYsJLYMLcm5dGrv1+a+0fCoUdVKPZWHyImQEqpLkopVwqEH67LVECFBqFTAKlQgBn676zrfXQbb+b/VebAsNUtvQMe6cTjnDY=" ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/���000755 �000766 �000024 �00000000000 12456115120 033235� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/LICENSE000644 �000766 �000024 �00000002053 12455173731 033507� 
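As a non-normative sketch, the `Default` signature above can be recomputed in
Node with the built-in `crypto` module (the `test-key.pem` filename is an
assumption of this sketch; it should contain the test RSA private key shown
above):

    var crypto = require('crypto');
    var fs = require('fs');

    // The signing string from the "Default" example.
    var signingString = 'date: Thu, 05 Jan 2012 21:31:40 GMT';

    // Assumption: the test RSA private key above, saved as test-key.pem.
    var privateKey = fs.readFileSync('test-key.pem', 'ascii');

    var signature = crypto.createSign('RSA-SHA256')
        .update(signingString)
        .sign(privateKey, 'base64');

    var authorization = 'Signature keyId="Test",algorithm="rsa-sha256",' +
        'signature="' + signature + '"';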
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/�����������������000755 �000766 �000024 �00000000000 12456115120 035065� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/package.json������������������000644 �000766 �000024 �00000006132 12455173731 034713� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Joyent, Inc" }, "name": "http-signature", "description": "Reference implementation of Joyent's HTTP Signature Scheme", "version": "0.10.0", "repository": { "type": "git", "url": "git://github.com/joyent/node-http-signature.git" }, "engines": { "node": ">=0.8" }, "main": "lib/index.js", "scripts": { "test": "tap tst/*.js" }, "dependencies": { "assert-plus": "0.1.2", "asn1": "0.1.11", "ctype": "0.5.2" }, "devDependencies": { "node-uuid": "1.4.0", "tap": "0.4.2" }, "readme": "# node-http-signature\n\nnode-http-signature is a node.js library that has client and server components\nfor Joyent's [HTTP Signature 
Scheme](http_signing.md).\n\n## Usage\n\nNote the example below signs a request with the same key/cert used to start an\nHTTP server. This is almost certainly not what you actaully want, but is just\nused to illustrate the API calls; you will need to provide your own key\nmanagement in addition to this library.\n\n### Client\n\n var fs = require('fs');\n var https = require('https');\n var httpSignature = require('http-signature');\n\n var key = fs.readFileSync('./key.pem', 'ascii');\n\n var options = {\n host: 'localhost',\n port: 8443,\n path: '/',\n method: 'GET',\n headers: {}\n };\n\n // Adds a 'Date' header in, signs it, and adds the\n // 'Authorization' header in.\n var req = https.request(options, function(res) {\n console.log(res.statusCode);\n });\n\n\n httpSignature.sign(req, {\n key: key,\n keyId: './cert.pem'\n });\n\n req.end();\n\n### Server\n\n var fs = require('fs');\n var https = require('https');\n var httpSignature = require('http-signature');\n\n var options = {\n key: fs.readFileSync('./key.pem'),\n cert: fs.readFileSync('./cert.pem')\n };\n\n https.createServer(options, function (req, res) {\n var rc = 200;\n var parsed = httpSignature.parseRequest(req);\n var pub = fs.readFileSync(parsed.keyId, 'ascii');\n if (!httpSignature.verifySignature(parsed, pub))\n rc = 401;\n\n res.writeHead(rc);\n res.end();\n }).listen(8443);\n\n## Installation\n\n npm install http-signature\n\n## License\n\nMIT.\n\n## Bugs\n\nSee <https://github.com/joyent/node-http-signature/issues>.\n", "readmeFilename": "README.md", "_id": "http-signature@0.10.0", "dist": { "shasum": "1494e4f5000a83c0f11bcc12d6007c530cb99582", "tarball": "http://registry.npmjs.org/http-signature/-/http-signature-0.10.0.tgz" }, "_from": "http-signature@>=0.10.0 <0.11.0", "_npmVersion": "1.2.18", "_npmUser": { "name": "mcavage", "email": "mcavage@gmail.com" }, "maintainers": [ { "name": "mcavage", "email": "mcavage@gmail.com" } ], "directories": {}, "_shasum": "1494e4f5000a83c0f11bcc12d6007c530cb99582", "_resolved": "https://registry.npmjs.org/http-signature/-/http-signature-0.10.0.tgz", "bugs": { "url": "https://github.com/joyent/node-http-signature/issues" }, "homepage": "https://github.com/joyent/node-http-signature" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/README.md���������������������000644 �000766 �000024 �00000003272 12455173731 033706� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# node-http-signature node-http-signature is a node.js library that has client and server components for Joyent's [HTTP Signature Scheme](http_signing.md). ## Usage Note the example below signs a request with the same key/cert used to start an HTTP server. 
This is almost certainly not what you actaully want, but is just used to illustrate the API calls; you will need to provide your own key management in addition to this library. ### Client var fs = require('fs'); var https = require('https'); var httpSignature = require('http-signature'); var key = fs.readFileSync('./key.pem', 'ascii'); var options = { host: 'localhost', port: 8443, path: '/', method: 'GET', headers: {} }; // Adds a 'Date' header in, signs it, and adds the // 'Authorization' header in. var req = https.request(options, function(res) { console.log(res.statusCode); }); httpSignature.sign(req, { key: key, keyId: './cert.pem' }); req.end(); ### Server var fs = require('fs'); var https = require('https'); var httpSignature = require('http-signature'); var options = { key: fs.readFileSync('./key.pem'), cert: fs.readFileSync('./cert.pem') }; https.createServer(options, function (req, res) { var rc = 200; var parsed = httpSignature.parseRequest(req); var pub = fs.readFileSync(parsed.keyId, 'ascii'); if (!httpSignature.verifySignature(parsed, pub)) rc = 401; res.writeHead(rc); res.end(); }).listen(8443); ## Installation npm install http-signature ## License MIT. ## Bugs See <https://github.com/joyent/node-http-signature/issues>. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/������������000755 �000766 �000024 �00000000000 12456115120 035727� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/�����000755 �000766 �000024 �00000000000 12456115120 037347� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/�����������000755 �000766 �000024 �00000000000 12456115120 036211� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/CHANGELOG��000644 �000766 �000024 �00000003551 12455173731 037442� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
�iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������This contains tickets fixed in each version release in reverse chronological order. There is one ticket per line. Each commits message has the tickets fixed in it. The commit message also has the corresponding github issue. i.e. CTYPE-42 would be issue 42. Each issue can be found at: https://github.com/rmustacc/node-ctype/issues/%d. CTYPE v0.5.2 CTYPE-46 Release 0.5.2 CTYPE-45 error in setEndian logic v0.5.1 CTYPE-44 Release 0.5.1 Contributed by Terin Stock: CTYPE-41 CTypeParser.writeStruct should return its offset Contributed by Terin Stock: CTYPE-42 int64_t returns wrong size v0.5.0 CTYPE-40 Release 0.5.0 CTYPE-39 want > 0.6 engine support v0.4.0 CTYPE-37 Release v0.4.0 CTYPE-6 want additional entry point for write CTYPE-20 Add 64-bit int support into core parser CTYPE-31 Fix bounds errors node/2129 CTYPE-33 Update copyright holders CTYPE-34 ctf.js confuses sign bit. CTYPE-35 Make the README more useful for getting started CTYPE-36 want manual page on ctio functions v0.3.1 CTYPE-29 Release 0.3.1 CTYPE-28 Want v0.6 npm support v0.3.0 CTYPE-27 Release v0.3.0 CTYPE-26 Want alternate default char behavior v0.2.1 CTYPE-25 Release v0.2.1 CTYPE-24 Writing structs is busted v0.2.0: CTYPE-23 Release v0.2.0 CTYPE-21 Add support for CTF JSON data CTYPE-22 Add Javascriptlint profile CTYPE-15 Pull in ctio updates from node/master v0.1.0: CTYPE-18 Bump version to v0.1.0 CTYPE-17 Fix nested structures CTYPE-16 Remove extraneous logging CTYPE-14 toAbs64 and toApprox64 are not exported v0.0.3: CTYPE-12 Bump version to v0.0.3 CTYPE-11 fix typo in wuint64 CTYPE-10 Integrate jsstyle v0.0.2: CTYPE-8 dump npm version to v0.0.2 CTYPE-9 want changelog CTYPE-7 fix typo in detypes. v0.0.1: CTYPE-5 Missing from NPM registry CTYPE-4 int16_t calls wrong read function CTYPE-3 API example types are missing quotes as strings CTYPE-2 doc missing 64-bit functions CTYPE-1 Need license �������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctf.js�����000644 �000766 �000024 �00000013312 12455173731 037336� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* * ctf.js * * Understand and parse all of the different JSON formats of CTF data and * translate that into a series of node-ctype friendly pieces. The reason for * the abstraction is to handle different changes in the file format. * * We have to be careful here that we don't end up using a name that is already * a built in type. 
*/ var mod_assert = require('assert'); var ASSERT = mod_assert.ok; var ctf_versions = [ '1.0' ]; var ctf_entries = [ 'integer', 'float', 'typedef', 'struct' ]; var ctf_deftypes = [ 'int8_t', 'uint8_t', 'int16_t', 'uint16_t', 'int32_t', 'uint32_t', 'float', 'double' ]; function ctfParseInteger(entry, ctype) { var name, sign, len, type; name = entry['name']; if (!('signed' in entry['integer'])) throw (new Error('Malformed CTF JSON: integer missing ' + 'signed value')); if (!('length' in entry['integer'])) throw (new Error('Malformed CTF JSON: integer missing ' + 'length value')); sign = entry['integer']['signed']; len = entry['integer']['length']; type = null; if (sign && len == 1) type = 'int8_t'; else if (len == 1) type = 'uint8_t'; else if (sign && len == 2) type = 'int16_t'; else if (len == 2) type = 'uint16_t'; else if (sign && len == 4) type = 'int32_t'; else if (len == 4) type = 'uint32_t'; else if (sign && len == 8) type = 'int64_t'; else if (len == 8) type = 'uint64_t'; if (type === null) throw (new Error('Malformed CTF JSON: integer has ' + 'unsupported length and sign - ' + len + '/' + sign)); /* * This means that this is the same as one of our built in types. If * that's the case defining it would be an error. So instead of trying * to typedef it, we'll return here. */ if (name == type) return; if (name == 'char') { ASSERT(type == 'int8_t'); return; } ctype.typedef(name, type); } function ctfParseFloat(entry, ctype) { var name, len; name = entry['name']; if (!('length' in entry['float'])) throw (new Error('Malformed CTF JSON: float missing ' + 'length value')); len = entry['float']['length']; if (len != 4 && len != 8) throw (new Error('Malformed CTF JSON: float has invalid ' + 'length value')); if (len == 4) { if (name == 'float') return; ctype.typedef(name, 'float'); } else if (len == 8) { if (name == 'double') return; ctype.typedef(name, 'double'); } } function ctfParseTypedef(entry, ctype) { var name, type, ii; name = entry['name']; if (typeof (entry['typedef']) != 'string') throw (new Error('Malformed CTF JSON: typedef value in not ' + 'a string')); type = entry['typedef']; /* * We need to ensure that we're not looking at type that's one of our * built in types. Traditionally in C a uint32_t would be a typedef to * some kind of integer. However, those size types are built ins. */ for (ii = 0; ii < ctf_deftypes.length; ii++) { if (name == ctf_deftypes[ii]) return; } ctype.typedef(name, type); } function ctfParseStruct(entry, ctype) { var name, type, ii, val, index, member, push; member = []; if (!Array.isArray(entry['struct'])) throw (new Error('Malformed CTF JSON: struct value is not ' + 'an array')); for (ii = 0; ii < entry['struct'].length; ii++) { val = entry['struct'][ii]; if (!('name' in val)) throw (new Error('Malformed CTF JSON: struct member ' + 'missing name')); if (!('type' in val)) throw (new Error('Malformed CTF JSON: struct member ' + 'missing type')); if (typeof (val['name']) != 'string') throw (new Error('Malformed CTF JSON: struct member ' + 'name isn\'t a string')); if (typeof (val['type']) != 'string') throw (new Error('Malformed CTF JSON: struct member ' + 'type isn\'t a string')); /* * CTF version 2 specifies array names as <type> [<num>] where * as node-ctype does this as <type>[<num>]. 
*/ name = val['name']; type = val['type']; index = type.indexOf(' ['); if (index != -1) { type = type.substring(0, index) + type.substring(index + 1, type.length); } push = {}; push[name] = { 'type': type }; member.push(push); } name = entry['name']; ctype.typedef(name, member); } function ctfParseEntry(entry, ctype) { var ii, found; if (!('name' in entry)) throw (new Error('Malformed CTF JSON: entry missing "name" ' + 'section')); for (ii = 0; ii < ctf_entries.length; ii++) { if (ctf_entries[ii] in entry) found++; } if (found === 0) throw (new Error('Malformed CTF JSON: found no entries')); if (found >= 2) throw (new Error('Malformed CTF JSON: found more than one ' + 'entry')); if ('integer' in entry) { ctfParseInteger(entry, ctype); return; } if ('float' in entry) { ctfParseFloat(entry, ctype); return; } if ('typedef' in entry) { ctfParseTypedef(entry, ctype); return; } if ('struct' in entry) { ctfParseStruct(entry, ctype); return; } ASSERT(false, 'shouldn\'t reach here'); } function ctfParseJson(json, ctype) { var version, ii; ASSERT(json); ASSERT(ctype); if (!('metadata' in json)) throw (new Error('Invalid CTF JSON: missing metadata section')); if (!('ctf2json_version' in json['metadata'])) throw (new Error('Invalid CTF JSON: missing ctf2json_version')); version = json['metadata']['ctf2json_version']; for (ii = 0; ii < ctf_versions.length; ii++) { if (ctf_versions[ii] == version) break; } if (ii == ctf_versions.length) throw (new Error('Unsuported ctf2json_version: ' + version)); if (!('data' in json)) throw (new Error('Invalid CTF JSON: missing data section')); if (!Array.isArray(json['data'])) throw (new Error('Malformed CTF JSON: data section is not ' + 'an array')); for (ii = 0; ii < json['data'].length; ii++) ctfParseEntry(json['data'][ii], ctype); } exports.ctfParseJson = ctfParseJson; ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctio.js����000644 �000766 �000024 �00000125037 12455173731 037530� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* * rm - Feb 2011 * ctio.js: * * A simple way to read and write simple ctypes. Of course, as you'll find the * code isn't as simple as it might appear. The following types are currently * supported in big and little endian formats: * * uint8_t int8_t * uint16_t int16_t * uint32_t int32_t * float (single precision IEEE 754) * double (double precision IEEE 754) * * This is designed to work in Node and v8. It may in fact work in other * Javascript interpreters (that'd be pretty neat), but it hasn't been tested. * If you find that it does in fact work, that's pretty cool. Try and pass word * back to the original author. * * Note to the reader: If you're tabstop isn't set to 8, parts of this may look * weird. */ /* * Numbers in Javascript have a secret: all numbers must be represented with an * IEEE-754 double. The double has a mantissa with a length of 52 bits with an * implicit one. 
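// Illustration (not from the original source): a minimal CTF JSON document
// that ctfParseJson() above accepts. `ctype` is assumed to be a node-ctype
// parser object exposing typedef(name, type).
var ctf = {
	metadata: { ctf2json_version: '1.0' },
	data: [
		{ name: 'int', integer: { signed: true, length: 4 } },  /* maps to int32_t */
		{ name: 'id_t', typedef: 'int' },
		{ name: 'point_t', struct: [
			{ name: 'x', type: 'int' },
			{ name: 'y', type: 'int' }
		] }
	]
};
// ctfParseJson(ctf, ctype);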
Thus the range of integers that can be represented is limited * to the size of the mantissa, this makes reading and writing 64-bit integers * difficult, but far from impossible. * * Another side effect of this representation is what happens when you use the * bitwise operators, i.e. shift left, shift right, and, or, etc. In Javascript, * each operand and the result is cast to a signed 32-bit number. However, in * the case of >>> the values are cast to an unsigned number. */ /* * A reminder on endian related issues: * * Big Endian: MSB -> First byte * Little Endian: MSB->Last byte */ var mod_assert = require('assert'); /* * An 8 bit unsigned integer involves doing no significant work. */ function ruint8(buffer, endian, offset) { if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset >= buffer.length) throw (new Error('Trying to read beyond buffer length')); return (buffer[offset]); } /* * For 16 bit unsigned numbers we can do all the casting that we want to do. */ function rgint16(buffer, endian, offset) { var val = 0; if (endian == 'big') { val = buffer[offset] << 8; val |= buffer[offset+1]; } else { val = buffer[offset]; val |= buffer[offset+1] << 8; } return (val); } function ruint16(buffer, endian, offset) { if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 1 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); return (rgint16(buffer, endian, offset)); } /* * Because most bitshifting is done using signed numbers, if we would go into * the realm where we use that 32nd bit, we'll end up going into the negative * range. i.e.: * > 200 << 24 * -939524096 * * Not the value you'd expect. To work around this, we end up having to do some * abuse of the JavaScript standard. in this case, we know that a >>> shift is * defined to cast our value to an *unsigned* 32-bit number. Because of that, we * use that instead to save us some additional math, though it does feel a * little weird and it isn't obvious as to why you woul dwant to do this at * first. */ function rgint32(buffer, endian, offset) { var val = 0; if (endian == 'big') { val = buffer[offset+1] << 16; val |= buffer[offset+2] << 8; val |= buffer[offset+3]; val = val + (buffer[offset] << 24 >>> 0); } else { val = buffer[offset+2] << 16; val |= buffer[offset+1] << 8; val |= buffer[offset]; val = val + (buffer[offset + 3] << 24 >>> 0); } return (val); } function ruint32(buffer, endian, offset) { if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); return (rgint32(buffer, endian, offset)); } /* * Reads a 64-bit unsigned number. The astue observer will note that this * doesn't quite work. Javascript has chosen to only have numbers that can be * represented by a double. A double only has 52 bits of mantissa with an * implicit 1, thus we have up to 53 bits to represent an integer. However, 2^53 * doesn't quite give us what we want. Isn't 53 bits enough for anyone? What * could you have possibly wanted to represent that was larger than that? Oh, * maybe a size? 
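// Illustration (not part of the original source): reading the unsigned
// helpers above against a fixed byte pattern; the `<< 24 >>> 0` trick is
// what keeps the big-endian 32-bit read from going negative.
var b = new Buffer([0xca, 0xfe, 0xba, 0xbe]);
ruint16(b, 'big', 0);     /* 0xcafe     -> 51966 */
ruint32(b, 'big', 0);     /* 0xcafebabe -> 3405691582 */
ruint32(b, 'little', 0);  /* 0xbebafeca -> 3199925962 */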
You mean we bypassed the 4 GB limit on file sizes, when did * that happen? * * To get around this egregious language issue, we're going to instead construct * an array of two 32 bit unsigned integers. Where arr[0] << 32 + arr[1] would * give the actual number. However, note that the above code probably won't * produce the desired results because of the way Javascript numbers are * doubles. */ function rgint64(buffer, endian, offset) { var val = new Array(2); if (endian == 'big') { val[0] = ruint32(buffer, endian, offset); val[1] = ruint32(buffer, endian, offset+4); } else { val[0] = ruint32(buffer, endian, offset+4); val[1] = ruint32(buffer, endian, offset); } return (val); } function ruint64(buffer, endian, offset) { if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 7 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); return (rgint64(buffer, endian, offset)); } /* * Signed integer types, yay team! A reminder on how two's complement actually * works. The first bit is the signed bit, i.e. tells us whether or not the * number should be positive or negative. If the two's complement value is * positive, then we're done, as it's equivalent to the unsigned representation. * * Now if the number is positive, you're pretty much done, you can just leverage * the unsigned translations and return those. Unfortunately, negative numbers * aren't quite that straightforward. * * At first glance, one might be inclined to use the traditional formula to * translate binary numbers between the positive and negative values in two's * complement. (Though it doesn't quite work for the most negative value) * Mainly: * - invert all the bits * - add one to the result * * Of course, this doesn't quite work in Javascript. Take for example the value * of -128. This could be represented in 16 bits (big-endian) as 0xff80. But of * course, Javascript will do the following: * * > ~0xff80 * -65409 * * Whoh there, Javascript, that's not quite right. But wait, according to * Javascript that's perfectly correct. When Javascript ends up seeing the * constant 0xff80, it has no notion that it is actually a signed number. It * assumes that we've input the unsigned value 0xff80. Thus, when it does the * binary negation, it casts it into a signed value, (positive 0xff80). Then * when you perform binary negation on that, it turns it into a negative number. * * Instead, we're going to have to use the following general formula, that works * in a rather Javascript friendly way. I'm glad we don't support this kind of * weird numbering scheme in the kernel. * * (BIT-MAX - (unsigned)val + 1) * -1 * * The astute observer, may think that this doesn't make sense for 8-bit numbers * (really it isn't necessary for them). However, when you get 16-bit numbers, * you do. Let's go back to our prior example and see how this will look: * * (0xffff - 0xff80 + 1) * -1 * (0x007f + 1) * -1 * (0x0080) * -1 * * Doing it this way ends up allowing us to treat it appropriately in * Javascript. Sigh, that's really quite ugly for what should just be a few bit * shifts, ~ and &. */ /* * Endianness doesn't matter for 8-bit signed values. We could in fact optimize * this case because the more traditional methods work, but for consistency, * we'll keep doing this the same way. 
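// Illustration (not part of the original source): the general formula above
// applied to the 16-bit pattern 0xff80 discussed in the comment.
var raw = 0xff80;              // read as unsigned: 65408
var signed = (raw & 0x8000)    // sign bit set, so apply the formula
    ? (0xffff - raw + 1) * -1  // (0xffff - 0xff80 + 1) * -1 == -128
    : raw;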
*/ function rsint8(buffer, endian, offset) { var neg; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset >= buffer.length) throw (new Error('Trying to read beyond buffer length')); neg = buffer[offset] & 0x80; if (!neg) return (buffer[offset]); return ((0xff - buffer[offset] + 1) * -1); } /* * The 16-bit version requires a bit more effort. In this case, we can leverage * our unsigned code to generate the value we want to return. */ function rsint16(buffer, endian, offset) { var neg, val; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 1 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = rgint16(buffer, endian, offset); neg = val & 0x8000; if (!neg) return (val); return ((0xffff - val + 1) * -1); } /* * We really shouldn't leverage our 32-bit code here and instead utilize the * fact that we know that since these are signed numbers, we can do all the * shifting and binary anding to generate the 32-bit number. But, for * consistency we'll do the same. If we want to do otherwise, we should instead * make the 32 bit unsigned code do the optimization. But as long as there * aren't floats secretly under the hood for that, we /should/ be okay. */ function rsint32(buffer, endian, offset) { var neg, val; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = rgint32(buffer, endian, offset); neg = val & 0x80000000; if (!neg) return (val); return ((0xffffffff - val + 1) * -1); } /* * The signed version of this code suffers from all of the same problems of the * other 64 bit version. */ function rsint64(buffer, endian, offset) { var neg, val; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = rgint64(buffer, endian, offset); neg = val[0] & 0x80000000; if (!neg) return (val); val[0] = (0xffffffff - val[0]) * -1; val[1] = (0xffffffff - val[1] + 1) * -1; /* * If we had the key 0x8000000000000000, that would leave the lower 32 * bits as 0xffffffff, however, since we're goint to add one, that would * actually leave the lower 32-bits as 0x100000000, which would break * our ability to write back a value that we received. To work around * this, if we actually get that value, we're going to bump the upper * portion by 1 and set this to zero. */ mod_assert.ok(val[1] <= 0x100000000); if (val[1] == -0x100000000) { val[1] = 0; val[0]--; } return (val); } /* * We now move onto IEEE 754: The traditional form for floating point numbers * and what is secretly hiding at the heart of everything in this. I really hope * that someone is actually using this, as otherwise, this effort is probably * going to be more wasted. * * One might be tempted to use parseFloat here, but that wouldn't work at all * for several reasons. Mostly due to the way floats actually work, and * parseFloat only actually works in base 10. 
I don't see base 10 anywhere near * this file. * * In this case we'll implement the single and double precision versions. The * quadruple precision, while probably useful, wouldn't really be accepted by * Javascript, so let's not even waste our time. * * So let's review how this format looks like. A single precision value is 32 * bits and has three parts: * - Sign bit * - Exponent (Using bias notation) * - Mantissa * * |s|eeeeeeee|mmmmmmmmmmmmmmmmmmmmmmmmm| * 31| 30-23 | 22 - 0 | * * The exponent is stored in a biased input. The bias in this case 127. * Therefore, our exponent is equal to the 8-bit value - 127. * * By default, a number is normalized in IEEE, that means that the mantissa has * an implicit one that we don't see. So really the value stored is 1.m. * However, if the exponent is all zeros, then instead we have to shift * everything to the right one and there is no more implicit one. * * Special values: * - Positive Infinity: * Sign: 0 * Exponent: All 1s * Mantissa: 0 * - Negative Infinity: * Sign: 1 * Exponent: All 1s * Mantissa: 0 * - NaN: * Sign: * * Exponent: All 1s * Mantissa: non-zero * - Zero: * Sign: * * Exponent: All 0s * Mantissa: 0 * * In the case of zero, the sign bit determines whether we get a positive or * negative zero. However, since Javascript cannot determine the difference * between the two: i.e. -0 == 0, we just always return 0. * */ function rfloat(buffer, endian, offset) { var bytes = []; var sign, exponent, mantissa, val; var bias = 127; var maxexp = 0xff; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); /* Normalize the bytes to be in endian order */ if (endian == 'big') { bytes[0] = buffer[offset]; bytes[1] = buffer[offset+1]; bytes[2] = buffer[offset+2]; bytes[3] = buffer[offset+3]; } else { bytes[3] = buffer[offset]; bytes[2] = buffer[offset+1]; bytes[1] = buffer[offset+2]; bytes[0] = buffer[offset+3]; } sign = bytes[0] & 0x80; exponent = (bytes[0] & 0x7f) << 1; exponent |= (bytes[1] & 0x80) >>> 7; mantissa = (bytes[1] & 0x7f) << 16; mantissa |= bytes[2] << 8; mantissa |= bytes[3]; /* Check for special cases before we do general parsing */ if (!sign && exponent == maxexp && mantissa === 0) return (Number.POSITIVE_INFINITY); if (sign && exponent == maxexp && mantissa === 0) return (Number.NEGATIVE_INFINITY); if (exponent == maxexp && mantissa !== 0) return (Number.NaN); /* * Javascript really doesn't have support for positive or negative zero. * So we're not going to try and give it to you. That would be just * plain weird. Besides -0 == 0. */ if (exponent === 0 && mantissa === 0) return (0); /* * Now we can deal with the bias and the determine whether the mantissa * has the implicit one or not. */ exponent -= bias; if (exponent == -bias) { exponent++; val = 0; } else { val = 1; } val = (val + mantissa * Math.pow(2, -23)) * Math.pow(2, exponent); if (sign) val *= -1; return (val); } /* * Doubles in IEEE 754 are like their brothers except for a few changes and * increases in size: * - The exponent is now 11 bits * - The mantissa is now 52 bits * - The bias is now 1023 * * |s|eeeeeeeeeee|mmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm| * 63| 62 - 52 | 51 - 0 | * 63| 62 - 52 | 51 - 0 | * * While the size has increased a fair amount, we're going to end up keeping the * same general formula for calculating the final value. 
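// Illustration (not part of the original source): spot-checking rfloat()
// with bit patterns that are exactly representable.
rfloat(new Buffer([0x3f, 0x80, 0x00, 0x00]), 'big', 0);  /* 1 */
rfloat(new Buffer([0xc0, 0x00, 0x00, 0x00]), 'big', 0);  /* -2 */
rfloat(new Buffer([0x7f, 0x80, 0x00, 0x00]), 'big', 0);  /* Infinity */
rfloat(new Buffer([0x00, 0x00, 0x00, 0x00]), 'big', 0);  /* 0 */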
As a reminder, this * formula is: * * (-1)^s * (n + m) * 2^(e-b) * * Where: * s is the sign bit * n is (exponent > 0) ? 1 : 0 -- Determines whether we're normalized * or not * m is the mantissa * e is the exponent specified * b is the bias for the exponent * */ function rdouble(buffer, endian, offset) { var bytes = []; var sign, exponent, mantissa, val, lowmant; var bias = 1023; var maxexp = 0x7ff; if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 7 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); /* Normalize the bytes to be in endian order */ if (endian == 'big') { bytes[0] = buffer[offset]; bytes[1] = buffer[offset+1]; bytes[2] = buffer[offset+2]; bytes[3] = buffer[offset+3]; bytes[4] = buffer[offset+4]; bytes[5] = buffer[offset+5]; bytes[6] = buffer[offset+6]; bytes[7] = buffer[offset+7]; } else { bytes[7] = buffer[offset]; bytes[6] = buffer[offset+1]; bytes[5] = buffer[offset+2]; bytes[4] = buffer[offset+3]; bytes[3] = buffer[offset+4]; bytes[2] = buffer[offset+5]; bytes[1] = buffer[offset+6]; bytes[0] = buffer[offset+7]; } /* * We can construct the exponent and mantissa the same way as we did in * the case of a float, just increase the range of the exponent. */ sign = bytes[0] & 0x80; exponent = (bytes[0] & 0x7f) << 4; exponent |= (bytes[1] & 0xf0) >>> 4; /* * This is going to be ugly but then again, we're dealing with IEEE 754. * This could probably be done as a node add on in a few lines of C++, * but oh we'll, we've made it this far so let's be native the rest of * the way... * * What we're going to do is break the mantissa into two parts, the * lower 24 bits and the upper 28 bits. We'll multiply the upper 28 bits * by the appropriate power and then add in the lower 24-bits. Not * really that great. It's pretty much a giant kludge to deal with * Javascript eccentricities around numbers. */ lowmant = bytes[7]; lowmant |= bytes[6] << 8; lowmant |= bytes[5] << 16; mantissa = bytes[4]; mantissa |= bytes[3] << 8; mantissa |= bytes[2] << 16; mantissa |= (bytes[1] & 0x0f) << 24; mantissa *= Math.pow(2, 24); /* Equivalent to << 24, but JS compat */ mantissa += lowmant; /* Check for special cases before we do general parsing */ if (!sign && exponent == maxexp && mantissa === 0) return (Number.POSITIVE_INFINITY); if (sign && exponent == maxexp && mantissa === 0) return (Number.NEGATIVE_INFINITY); if (exponent == maxexp && mantissa !== 0) return (Number.NaN); /* * Javascript really doesn't have support for positive or negative zero. * So we're not going to try and give it to you. That would be just * plain weird. Besides -0 == 0. */ if (exponent === 0 && mantissa === 0) return (0); /* * Now we can deal with the bias and the determine whether the mantissa * has the implicit one or not. */ exponent -= bias; if (exponent == -bias) { exponent++; val = 0; } else { val = 1; } val = (val + mantissa * Math.pow(2, -52)) * Math.pow(2, exponent); if (sign) val *= -1; return (val); } /* * Now that we have gone through the pain of reading the individual types, we're * probably going to want some way to write these back. None of this is going to * be good. But since we have Javascript numbers this should certainly be more * interesting. Though we can constrain this end a little bit more in what is * valid. For now, let's go back to our friends the unsigned value. */ /* * Unsigned numbers seem deceptively easy. 
Here are the general steps and rules * that we are going to take: * - If the number is negative, throw an Error * - Truncate any floating point portion * - Take the modulus of the number in our base * - Write it out to the buffer in the endian format requested at the offset */ /* * We have to make sure that the value is a valid integer. This means that it is * non-negative. It has no fractional component and that it does not exceed the * maximum allowed value. * * value The number to check for validity * * max The maximum value */ function prepuint(value, max) { if (typeof (value) != 'number') throw (new (Error('cannot write a non-number as a number'))); if (value < 0) throw (new Error('specified a negative value for writing an ' + 'unsigned value')); if (value > max) throw (new Error('value is larger than maximum value for ' + 'type')); if (Math.floor(value) !== value) throw (new Error('value has a fractional component')); return (value); } /* * 8-bit version, classy. We can ignore endianness which is good. */ function wuint8(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepuint(value, 0xff); buffer[offset] = val; } /* * Pretty much the same as the 8-bit version, just this time we need to worry * about endian related issues. */ function wgint16(val, endian, buffer, offset) { if (endian == 'big') { buffer[offset] = (val & 0xff00) >>> 8; buffer[offset+1] = val & 0x00ff; } else { buffer[offset+1] = (val & 0xff00) >>> 8; buffer[offset] = val & 0x00ff; } } function wuint16(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 1 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepuint(value, 0xffff); wgint16(val, endian, buffer, offset); } /* * The 32-bit version is going to have to be a little different unfortunately. * We can't quite bitshift to get the largest byte, because that would end up * getting us caught by the signed values. * * And yes, we do want to subtract out the lower part by default. This means * that when we do the division, it will be treated as a bit shift and we won't * end up generating a floating point value. If we did generate a floating point * value we'd have to truncate it intelligently, this saves us that problem and * may even be somewhat faster under the hood. 
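 *
 * As a quick illustrative check, take the example value val = 0xdeadbeef:
 *
 *   val & 0x00ffffff              -> 0x00adbeef
 *   val - 0x00adbeef              -> 0xde000000
 *   0xde000000 / Math.pow(2, 24)  -> 0xde
 *
 * which is exactly the most significant byte, with no fractional remainder
 * and no sign extension getting in the way.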
*/ function wgint32(val, endian, buffer, offset) { if (endian == 'big') { buffer[offset] = (val - (val & 0x00ffffff)) / Math.pow(2, 24); buffer[offset+1] = (val >>> 16) & 0xff; buffer[offset+2] = (val >>> 8) & 0xff; buffer[offset+3] = val & 0xff; } else { buffer[offset+3] = (val - (val & 0x00ffffff)) / Math.pow(2, 24); buffer[offset+2] = (val >>> 16) & 0xff; buffer[offset+1] = (val >>> 8) & 0xff; buffer[offset] = val & 0xff; } } function wuint32(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepuint(value, 0xffffffff); wgint32(val, endian, buffer, offset); } /* * Unlike the other versions, we expect the value to be in the form of two * arrays where value[0] << 32 + value[1] would result in the value that we * want. */ function wgint64(value, endian, buffer, offset) { if (endian == 'big') { wgint32(value[0], endian, buffer, offset); wgint32(value[1], endian, buffer, offset+4); } else { wgint32(value[0], endian, buffer, offset+4); wgint32(value[1], endian, buffer, offset); } } function wuint64(value, endian, buffer, offset) { if (value === undefined) throw (new Error('missing value')); if (!(value instanceof Array)) throw (new Error('value must be an array')); if (value.length != 2) throw (new Error('value must be an array of length 2')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 7 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); prepuint(value[0], 0xffffffff); prepuint(value[1], 0xffffffff); wgint64(value, endian, buffer, offset); } /* * We now move onto our friends in the signed number category. Unlike unsigned * numbers, we're going to have to worry a bit more about how we put values into * arrays. Since we are only worrying about signed 32-bit values, we're in * slightly better shape. Unfortunately, we really can't do our favorite binary * & in this system. It really seems to do the wrong thing. For example: * * > -32 & 0xff * 224 * * What's happening above is really: 0xe0 & 0xff = 0xe0. However, the results of * this aren't treated as a signed number. Ultimately a bad thing. * * What we're going to want to do is basically create the unsigned equivalent of * our representation and pass that off to the wuint* functions. To do that * we're going to do the following: * * - if the value is positive * we can pass it directly off to the equivalent wuint * - if the value is negative * we do the following computation: * mb + val + 1, where * mb is the maximum unsigned value in that byte size * val is the Javascript negative integer * * * As a concrete value, take -128. In signed 16 bits this would be 0xff80. If * you do out the computations: * * 0xffff - 128 + 1 * 0xffff - 127 * 0xff80 * * You can then encode this value as the signed version. This is really rather * hacky, but it should work and get the job done which is our goal here. * * Thus the overall flow is: * - Truncate the floating point part of the number * - We don't have to take the modulus, because the unsigned versions will * take care of that for us. 
And we don't have to worry about that * potentially causing bad things to happen because of sign extension * - Pass it off to the appropriate unsigned version, potentially modifying * the negative portions as necessary. */ /* * A series of checks to make sure we actually have a signed 32-bit number */ function prepsint(value, max, min) { if (typeof (value) != 'number') throw (new (Error('cannot write a non-number as a number'))); if (value > max) throw (new Error('value larger than maximum allowed value')); if (value < min) throw (new Error('value smaller than minimum allowed value')); if (Math.floor(value) !== value) throw (new Error('value has a fractional component')); return (value); } /* * The 8-bit version of the signed value. Overall, fairly straightforward. */ function wsint8(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepsint(value, 0x7f, -0x80); if (val >= 0) wuint8(val, endian, buffer, offset); else wuint8(0xff + val + 1, endian, buffer, offset); } /* * The 16-bit version of the signed value. Also, fairly straightforward. */ function wsint16(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 1 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepsint(value, 0x7fff, -0x8000); if (val >= 0) wgint16(val, endian, buffer, offset); else wgint16(0xffff + val + 1, endian, buffer, offset); } /* * We can do this relatively easily by leveraging the code used for 32-bit * unsigned code. */ function wsint32(value, endian, buffer, offset) { var val; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); val = prepsint(value, 0x7fffffff, -0x80000000); if (val >= 0) wgint32(val, endian, buffer, offset); else wgint32(0xffffffff + val + 1, endian, buffer, offset); } /* * The signed 64 bit integer should by in the same format as when received. * Mainly it should ensure that the value is an array of two integers where * value[0] << 32 + value[1] is the desired number. Furthermore, the two values * need to be equal. */ function wsint64(value, endian, buffer, offset) { var vzpos, vopos; var vals = new Array(2); if (value === undefined) throw (new Error('missing value')); if (!(value instanceof Array)) throw (new Error('value must be an array')); if (value.length != 2) throw (new Error('value must be an array of length 2')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 7 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); /* * We need to make sure that we have the same sign on both values. 
The * hokiest way to to do this is to multiply the number by +inf. If we do * this, we'll get either +/-inf depending on the sign of the value. * Once we have this, we can compare it to +inf to see if the number is * positive or not. */ vzpos = (value[0] * Number.POSITIVE_INFINITY) == Number.POSITIVE_INFINITY; vopos = (value[1] * Number.POSITIVE_INFINITY) == Number.POSITIVE_INFINITY; /* * If either of these is zero, then we don't actually need this check. */ if (value[0] != 0 && value[1] != 0 && vzpos != vopos) throw (new Error('Both entries in the array must have ' + 'the same sign')); /* * Doing verification for a signed 64-bit integer is actually a big * trickier than it appears. We can't quite use our standard techniques * because we need to compare both sets of values. The first value is * pretty straightforward. If the first value is beond the extremes than * we error out. However, the valid range of the second value varies * based on the first one. If the first value is negative, and *not* the * largest negative value, than it can be any integer within the range [ * 0, 0xffffffff ]. If it is the largest negative number, it must be * zero. * * If the first number is positive, than it doesn't matter what the * value is. We just simply have to make sure we have a valid positive * integer. */ if (vzpos) { prepuint(value[0], 0x7fffffff); prepuint(value[1], 0xffffffff); } else { prepsint(value[0], 0, -0x80000000); prepsint(value[1], 0, -0xffffffff); if (value[0] == -0x80000000 && value[1] != 0) throw (new Error('value smaller than minimum ' + 'allowed value')); } /* Fix negative numbers */ if (value[0] < 0 || value[1] < 0) { vals[0] = 0xffffffff - Math.abs(value[0]); vals[1] = 0x100000000 - Math.abs(value[1]); if (vals[1] == 0x100000000) { vals[1] = 0; vals[0]++; } } else { vals[0] = value[0]; vals[1] = value[1]; } wgint64(vals, endian, buffer, offset); } /* * Now we are moving onto the weirder of these, the float and double. For this * we're going to just have to do something that's pretty weird. First off, we * have no way to get at the underlying float representation, at least not * easily. But that doesn't mean we can't figure it out, we just have to use our * heads. * * One might propose to use Number.toString(2). Of course, this is not really * that good, because the ECMAScript 262 v3 Standard says the following Section * 15.7.4.2-Number.prototype.toString (radix): * * If radix is an integer from 2 to 36, but not 10, the result is a string, the * choice of which is implementation-dependent. * * Well that doesn't really help us one bit now does it? We could use the * standard base 10 version of the string, but that's just going to create more * errors as we end up trying to convert it back to a binary value. So, really * this just means we have to be non-lazy and parse the structure intelligently. * * First off, we can do the basic checks: NaN, positive and negative infinity. * * Now that those are done we can work backwards to generate the mantissa and * exponent. * * The first thing we need to do is determine the sign bit, easy to do, check * whether the value is less than 0. And convert the number to its absolute * value representation. Next, we need to determine if the value is less than * one or greater than or equal to one and from there determine what power was * used to get there. What follows is now specific to floats, though the general * ideas behind this will hold for doubles as well, but the exact numbers * involved will change. 
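 *
 * For instance, taking the purely illustrative value -0.15625: the sign bit
 * is 1, the absolute value 0.15625 is less than one, and the power used to
 * reach it is floor(log2(0.15625)) = -3.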
* * Once we have that power we can determine the exponent and the mantissa. Call * the value that has the number of bits to reach the power ebits. In the * general case they have the following values: * * exponent 127 + ebits * mantissa value * 2^(23 - ebits) & 0x7fffff * * In the case where the value of ebits is <= -127 we are now in the case where * we no longer have normalized numbers. In this case the values take on the * following values: * * exponent 0 * mantissa value * 2^149 & 0x7fffff * * Once we have the values for the sign, mantissa, and exponent. We reconstruct * the four bytes as follows: * * byte0 sign bit and seven most significant bits from the exp * sign << 7 | (exponent & 0xfe) >>> 1 * * byte1 lsb from the exponent and 7 top bits from the mantissa * (exponent & 0x01) << 7 | (mantissa & 0x7f0000) >>> 16 * * byte2 bits 8-15 (zero indexing) from mantissa * mantissa & 0xff00 >> 8 * * byte3 bits 0-7 from mantissa * mantissa & 0xff * * Once we have this we have to assign them into the buffer in proper endian * order. */ /* * Compute the log base 2 of the value. Now, someone who remembers basic * properties of logarithms will point out that we could use the change of base * formula for logs, and in fact that would be astute, because that's what we'll * do for now. It feels cleaner, albeit it may be less efficient than just * iterating and dividing by 2. We may want to come back and revisit that some * day. */ function log2(value) { return (Math.log(value) / Math.log(2)); } /* * Helper to determine the exponent of the number we're looking at. */ function intexp(value) { return (Math.floor(log2(value))); } /* * Helper to determine the exponent of the fractional part of the value. */ function fracexp(value) { return (Math.floor(log2(value))); } function wfloat(value, endian, buffer, offset) { var sign, exponent, mantissa, ebits; var bytes = []; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 3 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); if (isNaN(value)) { sign = 0; exponent = 0xff; mantissa = 23; } else if (value == Number.POSITIVE_INFINITY) { sign = 0; exponent = 0xff; mantissa = 0; } else if (value == Number.NEGATIVE_INFINITY) { sign = 1; exponent = 0xff; mantissa = 0; } else { /* Well we have some work to do */ /* Thankfully the sign bit is trivial */ if (value < 0) { sign = 1; value = Math.abs(value); } else { sign = 0; } /* Use the correct function to determine number of bits */ if (value < 1) ebits = fracexp(value); else ebits = intexp(value); /* Time to deal with the issues surrounding normalization */ if (ebits <= -127) { exponent = 0; mantissa = (value * Math.pow(2, 149)) & 0x7fffff; } else { exponent = 127 + ebits; mantissa = value * Math.pow(2, 23 - ebits); mantissa &= 0x7fffff; } } bytes[0] = sign << 7 | (exponent & 0xfe) >>> 1; bytes[1] = (exponent & 0x01) << 7 | (mantissa & 0x7f0000) >>> 16; bytes[2] = (mantissa & 0x00ff00) >>> 8; bytes[3] = mantissa & 0x0000ff; if (endian == 'big') { buffer[offset] = bytes[0]; buffer[offset+1] = bytes[1]; buffer[offset+2] = bytes[2]; buffer[offset+3] = bytes[3]; } else { buffer[offset] = bytes[3]; buffer[offset+1] = bytes[2]; buffer[offset+2] = bytes[1]; buffer[offset+3] = bytes[0]; } } /* * Now we move onto doubles. 
Doubles are similar to floats in pretty much all * ways except that the processing isn't quite as straightforward because we * can't always use shifting, i.e. we have > 32 bit values. * * We're going to proceed in an identical fashion to floats and utilize the same * helper functions. All that really is changing are the specific values that we * use to do the calculations. Thus, to review we have to do the following. * * First get the sign bit and convert the value to its absolute value * representation. Next, we determine the number of bits that we used to get to * the value, branching whether the value is greater than or less than 1. Once * we have that value which we will again call ebits, we have to do the * following in the general case: * * exponent 1023 + ebits * mantissa [value * 2^(52 - ebits)] % 2^52 * * In the case where the value of ebits <= -1023 we no longer use normalized * numbers, thus like with floats we have to do slightly different processing: * * exponent 0 * mantissa [value * 2^1074] % 2^52 * * Once we have determined the sign, exponent and mantissa we can construct the * bytes as follows: * * byte0 sign bit and seven most significant bits form the exp * sign << 7 | (exponent & 0x7f0) >>> 4 * * byte1 Remaining 4 bits from the exponent and the four most * significant bits from the mantissa 48-51 * (exponent & 0x00f) << 4 | mantissa >>> 48 * * byte2 Bits 40-47 from the mantissa * (mantissa >>> 40) & 0xff * * byte3 Bits 32-39 from the mantissa * (mantissa >>> 32) & 0xff * * byte4 Bits 24-31 from the mantissa * (mantissa >>> 24) & 0xff * * byte5 Bits 16-23 from the Mantissa * (mantissa >>> 16) & 0xff * * byte6 Bits 8-15 from the mantissa * (mantissa >>> 8) & 0xff * * byte7 Bits 0-7 from the mantissa * mantissa & 0xff * * Now we can't quite do the right shifting that we want in bytes 1 - 3, because * we'll have extended too far and we'll lose those values when we try and do * the shift. Instead we have to use an alternate approach. To try and stay out * of floating point, what we'll do is say that mantissa -= bytes[4-7] and then * divide by 2^32. Once we've done that we can use binary arithmetic. Oof, * that's ugly, but it seems to avoid using floating point (just based on how v8 * seems to be optimizing for base 2 arithmetic). */ function wdouble(value, endian, buffer, offset) { var sign, exponent, mantissa, ebits; var bytes = []; if (value === undefined) throw (new Error('missing value')); if (endian === undefined) throw (new Error('missing endian')); if (buffer === undefined) throw (new Error('missing buffer')); if (offset === undefined) throw (new Error('missing offset')); if (offset + 7 >= buffer.length) throw (new Error('Trying to read beyond buffer length')); if (isNaN(value)) { sign = 0; exponent = 0x7ff; mantissa = 23; } else if (value == Number.POSITIVE_INFINITY) { sign = 0; exponent = 0x7ff; mantissa = 0; } else if (value == Number.NEGATIVE_INFINITY) { sign = 1; exponent = 0x7ff; mantissa = 0; } else { /* Well we have some work to do */ /* Thankfully the sign bit is trivial */ if (value < 0) { sign = 1; value = Math.abs(value); } else { sign = 0; } /* Use the correct function to determine number of bits */ if (value < 1) ebits = fracexp(value); else ebits = intexp(value); /* * This is a total hack to determine a denormalized value. * Unfortunately, we sometimes do not get a proper value for * ebits, i.e. we lose the values that would get rounded off. 
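 *
 * (For reference, 2.225073858507201e-308 sits just below 2^-1022, the
 * smallest normalized double, so anything at or under that threshold is
 * treated as denormalized here.)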
* * * The astute observer may wonder why we would be * multiplying by two Math.pows rather than just summing * them. Well, that's to get around a small bug in the * way v8 seems to implement the function. On occasion * doing: * * foo * Math.pow(2, 1023 + 51) * * Causes us to overflow to infinity, where as doing: * * foo * Math.pow(2, 1023) * Math.pow(2, 51) * * Does not cause us to overflow. Go figure. * */ if (value <= 2.225073858507201e-308 || ebits <= -1023) { exponent = 0; mantissa = value * Math.pow(2, 1023) * Math.pow(2, 51); mantissa %= Math.pow(2, 52); } else { /* * We might have gotten fucked by our floating point * logarithm magic. This is rather crappy, but that's * our luck. If we just had a log base 2 or access to * the stupid underlying representation this would have * been much easier and we wouldn't have such stupid * kludges or hacks. */ if (ebits > 1023) ebits = 1023; exponent = 1023 + ebits; mantissa = value * Math.pow(2, -ebits); mantissa *= Math.pow(2, 52); mantissa %= Math.pow(2, 52); } } /* Fill the bytes in backwards to deal with the size issues */ bytes[7] = mantissa & 0xff; bytes[6] = (mantissa >>> 8) & 0xff; bytes[5] = (mantissa >>> 16) & 0xff; mantissa = (mantissa - (mantissa & 0xffffff)) / Math.pow(2, 24); bytes[4] = mantissa & 0xff; bytes[3] = (mantissa >>> 8) & 0xff; bytes[2] = (mantissa >>> 16) & 0xff; bytes[1] = (exponent & 0x00f) << 4 | mantissa >>> 24; bytes[0] = (sign << 7) | (exponent & 0x7f0) >>> 4; if (endian == 'big') { buffer[offset] = bytes[0]; buffer[offset+1] = bytes[1]; buffer[offset+2] = bytes[2]; buffer[offset+3] = bytes[3]; buffer[offset+4] = bytes[4]; buffer[offset+5] = bytes[5]; buffer[offset+6] = bytes[6]; buffer[offset+7] = bytes[7]; } else { buffer[offset+7] = bytes[0]; buffer[offset+6] = bytes[1]; buffer[offset+5] = bytes[2]; buffer[offset+4] = bytes[3]; buffer[offset+3] = bytes[4]; buffer[offset+2] = bytes[5]; buffer[offset+1] = bytes[6]; buffer[offset] = bytes[7]; } } /* * Actually export our work above. One might argue that we shouldn't expose * these interfaces and just force people to use the higher level abstractions * around this work. However, unlike say other libraries we've come across, this * interface has several properties: it makes sense, it's simple, and it's * useful. 
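 *
 * A brief usage sketch of this lower level interface (the buffer contents
 * and values below are illustrative only):
 *
 *   var buf = new Buffer(8);
 *   wuint16(0xbeef, 'big', buf, 0);   -> buf[0] == 0xbe, buf[1] == 0xef
 *   ruint16(buf, 'big', 0);           -> 48879 (0xbeef)
 *   wfloat(0.15625, 'big', buf, 4);   -> buf[4..7] == 3e 20 00 00
 *   rfloat(buf, 'big', 4);            -> 0.15625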
*/ exports.ruint8 = ruint8; exports.ruint16 = ruint16; exports.ruint32 = ruint32; exports.ruint64 = ruint64; exports.wuint8 = wuint8; exports.wuint16 = wuint16; exports.wuint32 = wuint32; exports.wuint64 = wuint64; exports.rsint8 = rsint8; exports.rsint16 = rsint16; exports.rsint32 = rsint32; exports.rsint64 = rsint64; exports.wsint8 = wsint8; exports.wsint16 = wsint16; exports.wsint32 = wsint32; exports.wsint64 = wsint64; exports.rfloat = rfloat; exports.rdouble = rdouble; exports.wfloat = wfloat; exports.wdouble = wdouble; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/ctype.js���000644 �000766 �000024 �00000061251 12455173731 037713� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* * rm - Feb 2011 * ctype.js * * This module provides a simple abstraction towards reading and writing * different types of binary data. It is designed to use ctio.js and provide a * richer and more expressive API on top of it. * * By default we support the following as built in basic types: * int8_t * int16_t * int32_t * uint8_t * uint16_t * uint32_t * uint64_t * float * double * char * char[] * * Each type is returned as a Number, with the exception of char and char[] * which are returned as Node Buffers. A char is considered a uint8_t. * * Requests to read and write data are specified as an array of JSON objects. * This is also the same way that one declares structs. Even if just a single * value is requested, it must be done as a struct. The array order determines * the order that we try and read values. Each entry has the following format * with values marked with a * being optional. * * { key: { type: /type/, value*: /value/, offset*: /offset/ } * * If offset is defined, we lseek(offset, SEEK_SET) before reading the next * value. Value is defined when we're writing out data, otherwise it's ignored. * */ var mod_ctf = require('./ctf.js'); var mod_ctio = require('./ctio.js'); var mod_assert = require('assert'); /* * This is the set of basic types that we support. 
* * read The function to call to read in a value from a buffer * * write The function to call to write a value to a buffer * */ var deftypes = { 'uint8_t': { read: ctReadUint8, write: ctWriteUint8 }, 'uint16_t': { read: ctReadUint16, write: ctWriteUint16 }, 'uint32_t': { read: ctReadUint32, write: ctWriteUint32 }, 'uint64_t': { read: ctReadUint64, write: ctWriteUint64 }, 'int8_t': { read: ctReadSint8, write: ctWriteSint8 }, 'int16_t': { read: ctReadSint16, write: ctWriteSint16 }, 'int32_t': { read: ctReadSint32, write: ctWriteSint32 }, 'int64_t': { read: ctReadSint64, write: ctWriteSint64 }, 'float': { read: ctReadFloat, write: ctWriteFloat }, 'double': { read: ctReadDouble, write: ctWriteDouble }, 'char': { read: ctReadChar, write: ctWriteChar }, 'char[]': { read: ctReadCharArray, write: ctWriteCharArray } }; /* * The following are wrappers around the CType IO low level API. They encode * knowledge about the size and return something in the expected format. */ function ctReadUint8(endian, buffer, offset) { var val = mod_ctio.ruint8(buffer, endian, offset); return ({ value: val, size: 1 }); } function ctReadUint16(endian, buffer, offset) { var val = mod_ctio.ruint16(buffer, endian, offset); return ({ value: val, size: 2 }); } function ctReadUint32(endian, buffer, offset) { var val = mod_ctio.ruint32(buffer, endian, offset); return ({ value: val, size: 4 }); } function ctReadUint64(endian, buffer, offset) { var val = mod_ctio.ruint64(buffer, endian, offset); return ({ value: val, size: 8 }); } function ctReadSint8(endian, buffer, offset) { var val = mod_ctio.rsint8(buffer, endian, offset); return ({ value: val, size: 1 }); } function ctReadSint16(endian, buffer, offset) { var val = mod_ctio.rsint16(buffer, endian, offset); return ({ value: val, size: 2 }); } function ctReadSint32(endian, buffer, offset) { var val = mod_ctio.rsint32(buffer, endian, offset); return ({ value: val, size: 4 }); } function ctReadSint64(endian, buffer, offset) { var val = mod_ctio.rsint64(buffer, endian, offset); return ({ value: val, size: 8 }); } function ctReadFloat(endian, buffer, offset) { var val = mod_ctio.rfloat(buffer, endian, offset); return ({ value: val, size: 4 }); } function ctReadDouble(endian, buffer, offset) { var val = mod_ctio.rdouble(buffer, endian, offset); return ({ value: val, size: 8 }); } /* * Reads a single character into a node buffer */ function ctReadChar(endian, buffer, offset) { var res = new Buffer(1); res[0] = mod_ctio.ruint8(buffer, endian, offset); return ({ value: res, size: 1 }); } function ctReadCharArray(length, endian, buffer, offset) { var ii; var res = new Buffer(length); for (ii = 0; ii < length; ii++) res[ii] = mod_ctio.ruint8(buffer, endian, offset + ii); return ({ value: res, size: length }); } function ctWriteUint8(value, endian, buffer, offset) { mod_ctio.wuint8(value, endian, buffer, offset); return (1); } function ctWriteUint16(value, endian, buffer, offset) { mod_ctio.wuint16(value, endian, buffer, offset); return (2); } function ctWriteUint32(value, endian, buffer, offset) { mod_ctio.wuint32(value, endian, buffer, offset); return (4); } function ctWriteUint64(value, endian, buffer, offset) { mod_ctio.wuint64(value, endian, buffer, offset); return (8); } function ctWriteSint8(value, endian, buffer, offset) { mod_ctio.wsint8(value, endian, buffer, offset); return (1); } function ctWriteSint16(value, endian, buffer, offset) { mod_ctio.wsint16(value, endian, buffer, offset); return (2); } function ctWriteSint32(value, endian, buffer, offset) { mod_ctio.wsint32(value, 
endian, buffer, offset); return (4); } function ctWriteSint64(value, endian, buffer, offset) { mod_ctio.wsint64(value, endian, buffer, offset); return (8); } function ctWriteFloat(value, endian, buffer, offset) { mod_ctio.wfloat(value, endian, buffer, offset); return (4); } function ctWriteDouble(value, endian, buffer, offset) { mod_ctio.wdouble(value, endian, buffer, offset); return (8); } /* * Writes a single character into a node buffer */ function ctWriteChar(value, endian, buffer, offset) { if (!(value instanceof Buffer)) throw (new Error('Input must be a buffer')); mod_ctio.wuint8(value[0], endian, buffer, offset); return (1); } /* * We're going to write 0s into the buffer if the string is shorter than the * length of the array. */ function ctWriteCharArray(value, length, endian, buffer, offset) { var ii; if (!(value instanceof Buffer)) throw (new Error('Input must be a buffer')); if (value.length > length) throw (new Error('value length greater than array length')); for (ii = 0; ii < value.length && ii < length; ii++) mod_ctio.wuint8(value[ii], endian, buffer, offset + ii); for (; ii < length; ii++) mod_ctio.wuint8(0, endian, buffer, offset + ii); return (length); } /* * Each parser has its own set of types. We want to make sure that each one * gets its own copy as it may need to modify it. */ function ctGetBasicTypes() { var ret = {}; var key; for (key in deftypes) ret[key] = deftypes[key]; return (ret); } /* * Given a string in the form of type[length] we want to split this into an * object that extracts that information. We want to note that we could possibly * have nested arrays so this should only check the furthest one. It may also be * the case that we have no [] pieces, in which case we just return the current * type. */ function ctParseType(str) { var begInd, endInd; var type, len; if (typeof (str) != 'string') throw (new Error('type must be a Javascript string')); endInd = str.lastIndexOf(']'); if (endInd == -1) { if (str.lastIndexOf('[') != -1) throw (new Error('found invalid type with \'[\' but ' + 'no corresponding \']\'')); return ({ type: str }); } begInd = str.lastIndexOf('['); if (begInd == -1) throw (new Error('found invalid type with \']\' but ' + 'no corresponding \'[\'')); if (begInd >= endInd) throw (new Error('malformed type, \']\' appears before \'[\'')); type = str.substring(0, begInd); len = str.substring(begInd + 1, endInd); return ({ type: type, len: len }); } /* * Given a request validate that all of the fields for it are valid and make * sense. This includes verifying the following notions: * - Each type requested is present in types * - Only allow a name for a field to be specified once * - If an array is specified, validate that the requested field exists and * comes before it.
* - If fields is defined, check that each entry has the occurrence of field */ function ctCheckReq(def, types, fields) { var ii, jj; var req, keys, key; var found = {}; if (!(def instanceof Array)) throw (new Error('definition is not an array')); if (def.length === 0) throw (new Error('definition must have at least one element')); for (ii = 0; ii < def.length; ii++) { req = def[ii]; if (!(req instanceof Object)) throw (new Error('definition must be an array of' + 'objects')); keys = Object.keys(req); if (keys.length != 1) throw (new Error('definition entry must only have ' + 'one key')); if (keys[0] in found) throw (new Error('Specified name already ' + 'specified: ' + keys[0])); if (!('type' in req[keys[0]])) throw (new Error('missing required type definition')); key = ctParseType(req[keys[0]]['type']); /* * We may have nested arrays, we need to check the validity of * the types until the len field is undefined in key. However, * each time len is defined we need to verify it is either an * integer or corresponds to an already seen key. */ while (key['len'] !== undefined) { if (isNaN(parseInt(key['len'], 10))) { if (!(key['len'] in found)) throw (new Error('Given an array ' + 'length without a matching type')); } key = ctParseType(key['type']); } /* Now we can validate if the type is valid */ if (!(key['type'] in types)) throw (new Error('type not found or typdefed: ' + key['type'])); /* Check for any required fields */ if (fields !== undefined) { for (jj = 0; jj < fields.length; jj++) { if (!(fields[jj] in req[keys[0]])) throw (new Error('Missing required ' + 'field: ' + fields[jj])); } } found[keys[0]] = true; } } /* * Create a new instance of the parser. Each parser has its own store of * typedefs and endianness. Conf is an object with the following required * values: * * endian Either 'big' or 'little' do determine the endianness we * want to read from or write to. * * And the following optional values: * * char-type Valid options here are uint8 and int8. If uint8 is * specified this changes the default behavior of a single * char from being a buffer of a single character to being * a uint8_t. If int8, it becomes an int8_t instead. */ function CTypeParser(conf) { if (!conf) throw (new Error('missing required argument')); if (!('endian' in conf)) throw (new Error('missing required endian value')); if (conf['endian'] != 'big' && conf['endian'] != 'little') throw (new Error('Invalid endian type')); if ('char-type' in conf && (conf['char-type'] != 'uint8' && conf['char-type'] != 'int8')) throw (new Error('invalid option for char-type: ' + conf['char-type'])); this.endian = conf['endian']; this.types = ctGetBasicTypes(); /* * There may be a more graceful way to do this, but this will have to * serve. */ if ('char-type' in conf && conf['char-type'] == 'uint8') this.types['char'] = this.types['uint8_t']; if ('char-type' in conf && conf['char-type'] == 'int8') this.types['char'] = this.types['int8_t']; } /* * Sets the current endian value for the Parser. If the value is not valid, * throws an Error. * * endian Either 'big' or 'little' do determine the endianness we * want to read from or write to. * */ CTypeParser.prototype.setEndian = function (endian) { if (endian != 'big' && endian != 'little') throw (new Error('invalid endian type, must be big or ' + 'little')); this.endian = endian; }; /* * Returns the current value of the endian value for the parser. 
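 * For example (illustrative): a parser created with
 * new CTypeParser({ endian: 'little' }) returns 'little' here, and after a
 * call to setEndian('big') it returns 'big'.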
*/ CTypeParser.prototype.getEndian = function () { return (this.endian); }; /* * A user has requested to add a type, let us honor their request. Yet, if their * request doth spurn us, send them unto the Hells which Dante describes. * * name The string for the type definition we're adding * * value Either a string that is a type/array name or an object * that describes a struct. */ CTypeParser.prototype.typedef = function (name, value) { var type; if (name === undefined) throw (new (Error('missing required typedef argument: name'))); if (value === undefined) throw (new (Error('missing required typedef argument: value'))); if (typeof (name) != 'string') throw (new (Error('the name of a type must be a string'))); type = ctParseType(name); if (type['len'] !== undefined) throw (new Error('Cannot have an array in the typedef name')); if (name in this.types) throw (new Error('typedef name already present: ' + name)); if (typeof (value) != 'string' && !(value instanceof Array)) throw (new Error('typedef value must either be a string or ' + 'struct')); if (typeof (value) == 'string') { type = ctParseType(value); if (type['len'] !== undefined) { if (isNaN(parseInt(type['len'], 10))) throw (new (Error('typedef value must use ' + 'fixed size array when outside of a ' + 'struct'))); } this.types[name] = value; } else { /* We have a struct, validate it */ ctCheckReq(value, this.types); this.types[name] = value; } }; /* * Include all of the typedefs, but none of the built in types. This should be * treated as read-only. */ CTypeParser.prototype.lstypes = function () { var key; var ret = {}; for (key in this.types) { if (key in deftypes) continue; ret[key] = this.types[key]; } return (ret); }; /* * Given a type string that may have array types that aren't numbers, try and * fill them in from the values object. The object should be of the format where * indexing into it should return a number for that type. * * str The type string * * values An object that can be used to fulfill type information */ function ctResolveArray(str, values) { var ret = ''; var type = ctParseType(str); while (type['len'] !== undefined) { if (isNaN(parseInt(type['len'], 10))) { if (typeof (values[type['len']]) != 'number') throw (new Error('cannot sawp in non-number ' + 'for array value')); ret = '[' + values[type['len']] + ']' + ret; } else { ret = '[' + type['len'] + ']' + ret; } type = ctParseType(type['type']); } ret = type['type'] + ret; return (ret); } /* * [private] Either the typedef resolves to another type string or to a struct. * If it resolves to a struct, we just pass it off to read struct. If not, we * can just pass it off to read entry. */ CTypeParser.prototype.resolveTypedef = function (type, dispatch, buffer, offset, value) { var pt; mod_assert.ok(type in this.types); if (typeof (this.types[type]) == 'string') { pt = ctParseType(this.types[type]); if (dispatch == 'read') return (this.readEntry(pt, buffer, offset)); else if (dispatch == 'write') return (this.writeEntry(value, pt, buffer, offset)); else throw (new Error('invalid dispatch type to ' + 'resolveTypedef')); } else { if (dispatch == 'read') return (this.readStruct(this.types[type], buffer, offset)); else if (dispatch == 'write') return (this.writeStruct(value, this.types[type], buffer, offset)); else throw (new Error('invalid dispatch type to ' + 'resolveTypedef')); } }; /* * [private] Try and read in the specific entry. 
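 * The type argument is the parsed form produced by ctParseType, e.g.
 * (illustrative) { type: 'uint16_t' } for a scalar or
 * { type: 'uint8_t', len: '4' } for an array.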
*/ CTypeParser.prototype.readEntry = function (type, buffer, offset) { var parse, len; /* * Because we want to special case char[]s this is unfortunately * a bit uglier than it really should be. We want to special * case char[]s so that we return a node buffer, thus they are a * first class type where as all other arrays just call into a * generic array routine which calls their data-specific routine * the specified number of times. * * The valid dispatch options we have are: * - Array and char => char[] handler * - Generic array handler * - Generic typedef handler * - Basic type handler */ if (type['len'] !== undefined) { len = parseInt(type['len'], 10); if (isNaN(len)) throw (new Error('somehow got a non-numeric length')); if (type['type'] == 'char') parse = this.types['char[]']['read'](len, this.endian, buffer, offset); else parse = this.readArray(type['type'], len, buffer, offset); } else { if (type['type'] in deftypes) parse = this.types[type['type']]['read'](this.endian, buffer, offset); else parse = this.resolveTypedef(type['type'], 'read', buffer, offset); } return (parse); }; /* * [private] Read an array of data */ CTypeParser.prototype.readArray = function (type, length, buffer, offset) { var ii, ent, pt; var baseOffset = offset; var ret = new Array(length); pt = ctParseType(type); for (ii = 0; ii < length; ii++) { ent = this.readEntry(pt, buffer, offset); offset += ent['size']; ret[ii] = ent['value']; } return ({ value: ret, size: offset - baseOffset }); }; /* * [private] Read a single struct in. */ CTypeParser.prototype.readStruct = function (def, buffer, offset) { var parse, ii, type, entry, key; var baseOffset = offset; var ret = {}; /* Walk it and handle doing what's necessary */ for (ii = 0; ii < def.length; ii++) { key = Object.keys(def[ii])[0]; entry = def[ii][key]; /* Resolve all array values */ type = ctParseType(ctResolveArray(entry['type'], ret)); if ('offset' in entry) offset = baseOffset + entry['offset']; parse = this.readEntry(type, buffer, offset); offset += parse['size']; ret[key] = parse['value']; } return ({ value: ret, size: (offset-baseOffset)}); }; /* * This is what we were born to do. We read the data from a buffer and return it * in an object whose keys match the values from the object. * * def The array definition of the data to read in * * buffer The buffer to read data from * * offset The offset to start writing to * * Returns an object where each key corresponds to an entry in def and the value * is the read value. 
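 *
 * A small usage sketch (the buffer contents and field names are
 * illustrative only):
 *
 *   var p = new CTypeParser({ endian: 'big' });
 *   var buf = new Buffer([ 0x00, 0x2a, 0xff, 0xd9 ]);
 *   p.readData([ { x: { type: 'int16_t' } },
 *       { y: { type: 'int16_t' } } ], buf, 0);
 *   -> { x: 42, y: -39 }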
*/ CTypeParser.prototype.readData = function (def, buffer, offset) { /* Sanity check for arguments */ if (def === undefined) throw (new Error('missing definition for what we should be' + 'parsing')); if (buffer === undefined) throw (new Error('missing buffer for what we should be ' + 'parsing')); if (offset === undefined) throw (new Error('missing offset for what we should be ' + 'parsing')); /* Sanity check the object definition */ ctCheckReq(def, this.types); return (this.readStruct(def, buffer, offset)['value']); }; /* * [private] Write out an array of data */ CTypeParser.prototype.writeArray = function (value, type, length, buffer, offset) { var ii, pt; var baseOffset = offset; if (!(value instanceof Array)) throw (new Error('asked to write an array, but value is not ' + 'an array')); if (value.length != length) throw (new Error('asked to write array of length ' + length + ' but that does not match value length: ' + value.length)); pt = ctParseType(type); for (ii = 0; ii < length; ii++) offset += this.writeEntry(value[ii], pt, buffer, offset); return (offset - baseOffset); }; /* * [private] Write the specific entry */ CTypeParser.prototype.writeEntry = function (value, type, buffer, offset) { var len, ret; if (type['len'] !== undefined) { len = parseInt(type['len'], 10); if (isNaN(len)) throw (new Error('somehow got a non-numeric length')); if (type['type'] == 'char') ret = this.types['char[]']['write'](value, len, this.endian, buffer, offset); else ret = this.writeArray(value, type['type'], len, buffer, offset); } else { if (type['type'] in deftypes) ret = this.types[type['type']]['write'](value, this.endian, buffer, offset); else ret = this.resolveTypedef(type['type'], 'write', buffer, offset, value); } return (ret); }; /* * [private] Write a single struct out. */ CTypeParser.prototype.writeStruct = function (value, def, buffer, offset) { var ii, entry, type, key; var baseOffset = offset; var vals = {}; for (ii = 0; ii < def.length; ii++) { key = Object.keys(def[ii])[0]; entry = def[ii][key]; type = ctParseType(ctResolveArray(entry['type'], vals)); if ('offset' in entry) offset = baseOffset + entry['offset']; offset += this.writeEntry(value[ii], type, buffer, offset); /* Now that we've written it out, we can use it for arrays */ vals[key] = value[ii]; } return (offset); }; /* * Unfortunately, we're stuck with the sins of an initial poor design. Because * of that, we are going to have to support the old way of writing data via * writeData. There we insert the values that you want to write into the * definition. A little baroque. Internally, we use the new model. So we need to * just get those values out of there. But to maintain the principle of least * surprise, we're not going to modify the input data. */ function getValues(def) { var ii, out, key; out = []; for (ii = 0; ii < def.length; ii++) { key = Object.keys(def[ii])[0]; mod_assert.ok('value' in def[ii][key]); out.push(def[ii][key]['value']); } return (out); } /* * This is the second half of what we were born to do, write out the data * itself. Historically this function required you to put your values in the * definition section. This was not the smartest thing to do and a bit of an * oversight to be honest. As such, this function now takes a values argument. * If values is non-null and non-undefined, it will be used to determine the * values. This means that the old method is still supported, but is no longer * acceptable. 
* * def The array definition of the data to write out with * values * * buffer The buffer to write to * * offset The offset in the buffer to write to * * values An array of values to write. */ CTypeParser.prototype.writeData = function (def, buffer, offset, values) { var hv; if (def === undefined) throw (new Error('missing definition for what we should be' + 'parsing')); if (buffer === undefined) throw (new Error('missing buffer for what we should be ' + 'parsing')); if (offset === undefined) throw (new Error('missing offset for what we should be ' + 'parsing')); hv = (values != null && values != undefined); if (hv) { if (!Array.isArray(values)) throw (new Error('missing values for writing')); ctCheckReq(def, this.types); } else { ctCheckReq(def, this.types, [ 'value' ]); } this.writeStruct(hv ? values : getValues(def), def, buffer, offset); }; /* * Functions to go to and from 64 bit numbers in a way that is compatible with * Javascript limitations. There are two sets. One where the user is okay with * an approximation and one where they are definitely not okay with an * approximation. */ /* * Attempts to convert an array of two integers returned from rsint64 / ruint64 * into an absolute 64 bit number. If however the value would exceed 2^52 this * will instead throw an error. The mantissa in a double is a 52 bit number and * rather than potentially give you a value that is an approximation this will * error. If you would rather an approximation, please see toApprox64. * * val An array of two 32-bit integers */ function toAbs64(val) { if (val === undefined) throw (new Error('missing required arg: value')); if (!Array.isArray(val)) throw (new Error('value must be an array')); if (val.length != 2) throw (new Error('value must be an array of length 2')); /* We have 20 bits worth of precision in this range */ if (val[0] >= 0x100000) throw (new Error('value would become approximated')); return (val[0] * Math.pow(2, 32) + val[1]); } /* * Will return the 64 bit value as returned in an array from rsint64 / ruint64 * to a value as close as it can. Note that Javascript stores all numbers as a * double and the mantissa only has 52 bits. Thus this version may approximate * the value. * * val An array of two 32-bit integers */ function toApprox64(val) { if (val === undefined) throw (new Error('missing required arg: value')); if (!Array.isArray(val)) throw (new Error('value must be an array')); if (val.length != 2) throw (new Error('value must be an array of length 2')); return (Math.pow(2, 32) * val[0] + val[1]); } function parseCTF(json, conf) { var ctype = new CTypeParser(conf); mod_ctf.ctfParseJson(json, ctype); return (ctype); } /* * Export the few things we actually want to. Currently this is just the CType * Parser and ctio. 
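 *
 * A brief, illustrative round trip through the parser (field names and
 * values are examples only; CTypeParser is exported below as Parser):
 *
 *   var p = new CTypeParser({ endian: 'big' });
 *   var buf = new Buffer(4);
 *   p.writeData([ { x: { type: 'uint16_t' } },
 *       { y: { type: 'uint16_t' } } ], buf, 0, [ 23, 42 ]);
 *   p.readData([ { x: { type: 'uint16_t' } },
 *       { y: { type: 'uint16_t' } } ], buf, 0);
 *   -> { x: 23, y: 42 }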
*/ exports.Parser = CTypeParser; exports.toAbs64 = toAbs64; exports.toApprox64 = toApprox64; exports.parseCTF = parseCTF; exports.ruint8 = mod_ctio.ruint8; exports.ruint16 = mod_ctio.ruint16; exports.ruint32 = mod_ctio.ruint32; exports.ruint64 = mod_ctio.ruint64; exports.wuint8 = mod_ctio.wuint8; exports.wuint16 = mod_ctio.wuint16; exports.wuint32 = mod_ctio.wuint32; exports.wuint64 = mod_ctio.wuint64; exports.rsint8 = mod_ctio.rsint8; exports.rsint16 = mod_ctio.rsint16; exports.rsint32 = mod_ctio.rsint32; exports.rsint64 = mod_ctio.rsint64; exports.wsint8 = mod_ctio.wsint8; exports.wsint16 = mod_ctio.wsint16; exports.wsint32 = mod_ctio.wsint32; exports.wsint64 = mod_ctio.wsint64; exports.rfloat = mod_ctio.rfloat; exports.rdouble = mod_ctio.rdouble; exports.wfloat = mod_ctio.wfloat; exports.wdouble = mod_ctio.wdouble; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/LICENSE����000644 �000766 �000024 �00000002415 12455173731 037233� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������The following license applies to all files unless the file is specified below. Each file specified below has its license information embedded in it: tools/jsstyle Copyright 2011, Robert Mustacchi. All rights reserved. Copyright 2011, Joyent, Inc. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/man/�������000755 �000766 �000024 �00000000000 12456115120 036764� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/package.json���000644 �000766 �000024 �00000001606 12455173731 040515� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������{ "name": "ctype", "version": "0.5.2", "description": "read and write binary structures and data types", "homepage": "https://github.com/rmustacc/node-ctype", "author": { "name": "Robert Mustacchi", "email": "rm@fingolfin.org" }, "engines": { "node": ">= 0.4" }, "main": "ctype.js", "_id": "ctype@0.5.2", "dist": { "shasum": "fe8091d468a373a0b0c9ff8bbfb3425c00973a1d", "tarball": "http://registry.npmjs.org/ctype/-/ctype-0.5.2.tgz" }, "_npmVersion": "1.1.59", "_npmUser": { "name": "rm", "email": "rm@fingolfin.org" }, "maintainers": [ { "name": "rm", "email": "rm@fingolfin.org" } ], "directories": {}, "_shasum": "fe8091d468a373a0b0c9ff8bbfb3425c00973a1d", "_resolved": "https://registry.npmjs.org/ctype/-/ctype-0.5.2.tgz", "_from": "ctype@0.5.2", "readme": "ERROR: No README data found!", "scripts": {} } ��������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README�����000644 �000766 �000024 �00000005512 12455173731 037107� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Node-CType is a way to read and write binary data in structured and easy to use format. Its name comes from the C header file. To get started, simply clone the repository or use npm to install it. Once it is there, simply require it. git clone git://github.com/rmustacc/node-ctype npm install ctype var mod_ctype = require('ctype') There are two APIs that you can use, depending on what abstraction you'd like. The low level API let's you read and write individual integers and floats from buffers. The higher level API let's you read and write structures of these. To illustrate this, let's looks look at how we would read and write a binary encoded x,y point. 
In C we would define this structure as follows: typedef struct point { uint16_t p_x; uint16_t p_y; } point_t; To read a binary encoded point from a Buffer, we first need to create a CType parser (where we specify the endian and other options) and add the typedef. var parser = new mod_ctype.Parser({ endian: 'big' }); parser.typedef('point_t', [ { x: { type: 'uint16_t' } }, { y: { type: 'uint16_t' } } ]); From here, given a buffer buf and an offset into it, we can read a point. var out = parser.readData([ { point: { type: 'point_t' } } ], buffer, 0); console.log(out); { point: { x: 23, y: 42 } } Another way to get the same information would be to use the low level methods. Note that these require you to manually deal with the offset. Here's how we'd get the same values of x and y from the buffer. var x = mod_ctype.ruint16(buf, 'big', 0); var y = mod_ctype.ruint16(buf, 'big', 2); console.log(x + ', ' + y); 23, 42 The true power of this API comes from the ability to define and nest typedefs, just as you would in C. By default, the following types are defined by default. Note that they return a Number, unless indicated otherwise. * int8_t * int16_t * int32_t * int64_t (returns an array where val[0] << 32 + val[1] would be the value) * uint8_t * uint16_t * uint32_t * uint64_t (returns an array where val[0] << 32 + val[1] would be the value) * float * double * char (either returns a buffer with that character or a uint8_t) * char[] (returns an object with the buffer and the number of characters read which is either the total amount requested or until the first 0) ctf2json integration: Node-CType supports consuming the output of ctf2json. Once you read in a JSON file, all you have to do to add all the definitions it contains is: var data, parser; data = JSON.parse(parsedJSONData); parser = mod_ctype.parseCTF(data, { endian: 'big' }); For more documentation, see the file README.old. Full documentation is in the process of being rewritten as a series of manual pages which will be available in the repository and online for viewing. To read the ctio manual page simple run, from the root of the workspace: man -Mman -s 3ctype ctio ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/README.old�000644 �000766 �000024 �00000023636 12455173731 037673� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������This library provides a way to read and write binary data. Node CType is a way to read and write binary data in structured and easy to use formats. It's name comes from the header file, though it does not share as much with it as it perhaps should. There are two levels of the API. One is the raw API which everything is built on top of, while the other provides a much nicer abstraction and is built entirely by using the lower level API. The hope is that the low level API is both clear and useful. The low level API gets it's names from stdint.h (a rather appropriate source). The lower level API is presented at the end of this document. 
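As a quick illustration (the buffer and values here are only examples), reading
a big endian uint16_t at offset 0 looks like this at each of the two levels:

    var mod_ctype = require('ctype');

    /* Low level */
    var x = mod_ctype.ruint16(buf, 'big', 0);

    /* Higher level */
    var parser = new mod_ctype.Parser({ endian: 'big' });
    var out = parser.readData([ { x: { type: 'uint16_t' } } ], buf, 0);
    /* out.x now holds the same value as x */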
Standard CType API

The CType interface is presented as a parser object that controls the
endianness, combined with a series of methods to change that value, parse and
write out buffers, and a way to provide typedefs.

Standard Types

The CType parser supports the following basic types, which return Numbers
except as indicated:

* int8_t
* int16_t
* int32_t
* int64_t (returns an array where val[0] << 32 + val[1] would be the value)
* uint8_t
* uint16_t
* uint32_t
* uint64_t (returns an array where val[0] << 32 + val[1] would be the value)
* float
* double
* char (returns a buffer with just that single character)
* char[] (returns an object with the buffer and the number of characters read,
  which is either the total amount requested or until the first 0)

Specifying Structs

The CType parser also supports the notion of structs. A struct is an array of
JSON objects that defines an order of keys which have types and values. One
would build a struct to represent a point (x,y) as follows:

    [
        { x: { type: 'int16_t' } },
        { y: { type: 'int16_t' } }
    ]

When this is passed into the read routine, it would read the first two bytes
(as defined by int16_t) to determine the Number to use for x, and then it
would read the next two bytes to determine the value of y. When read, this
could return something like:

    { x: 42, y: -23 }

When someone wants to write values, we use the same format as above, but with
an additional value field:

    [
        { x: { type: 'int16_t', value: 42 } },
        { y: { type: 'int16_t', value: -23 } }
    ]

Now, the structure above may be optionally annotated with offsets. This tells
us that rather than reading continuously, we should read the given value at
the specified offset. If an offset is provided, it is effectively the
equivalent of lseek(offset, SEEK_SET). Thus, subsequent values will be read
from that offset and incremented by the appropriate value. As an example:

    [
        { x: { type: 'int16_t' } },
        { y: { type: 'int16_t', offset: 20 } },
        { z: { type: 'int16_t' } }
    ]

We would read x from the starting offset given to us; for the sake of example,
let's assume that's 0. After reading x, the next offset to read from would be
2; however, y specifies an offset, thus we jump directly to that offset and
read y from byte 20. We would then read z from byte 22. The same offsets may
be used when writing values.

Typedef

The basic set of types, while it covers the basics, is somewhat limiting. To
make this richer, there is functionality to typedef something like in C. One
can use typedef to add a new name to an existing type or to define a name to
refer to a struct. Thus the following are all examples of a typedef:

    typedef('size_t', 'uint32_t');
    typedef('ssize_t', 'int32_t');
    typedef('point_t', [
        { x: { type: 'int16_t' } },
        { y: { type: 'int16_t' } }
    ]);

Once something has been typedef'd, it can be used in any of the definitions
previously shown. One cannot remove a typedef once created; this is analogous
to C.

The set of defined types can be printed with lstypes. The format of this
output is subject to change, but likely will look something like:

    > lstypes();
    {
        size_t: 'uint32_t',
        ssize_t: 'int32_t',
        point_t: [
            { x: { type: 'int16_t' } },
            { y: { type: 'int16_t' } }
        ]
    }
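Putting the struct, value, and typedef pieces together, the following is a
minimal sketch of a round trip through a typedef'd struct. It only combines
calls already shown above (Parser, typedef, writeData with a value-annotated
definition, readData), so treat it as an illustration rather than additional
API:

    var mod_ctype = require('ctype');

    var parser = new mod_ctype.Parser({ endian: 'big' });
    parser.typedef('point_t', [
        { x: { type: 'int16_t' } },
        { y: { type: 'int16_t' } }
    ]);

    /* Write a point_t using the value-annotated form of the definition. */
    var buf = new Buffer(4);
    parser.writeData([ { point: { type: 'point_t', value: [ 42, -23 ] } } ],
        buf, 0);

    /* Read it back through the same typedef. */
    var out = parser.readData([ { point: { type: 'point_t' } } ], buf, 0);
    console.log(out);    /* { point: { x: 42, y: -23 } } */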
Specifying arrays

Arrays can be specified by appending []s to a type. Arrays must have their
size specified, and it can be done in one of two ways:

* An explicit non-zero integer size
* A name of a previously declared variable in the struct whose value is a
  number

Note that when using the name of a variable, it should be the string name for
the key. This is only valid inside structs, and the variable must be declared
before the value that uses it as an array size. The following are examples:

    [
        { ip_addr4: { type: 'uint8_t[4]' } },
        { len: { type: 'uint32_t' } },
        { data: { type: 'uint8_t[len]' } }
    ]

Arrays are permitted in typedefs; however, they must have a declared integer
size. The following are examples of valid and invalid arrays:

    typedef('path', 'char[1024]');    /* Good */
    typedef('path', 'char[len]');     /* Bad! */

64 bit values:

Unfortunately, JavaScript represents values with a double, so you lose
precision and the ability to represent integers roughly beyond 2^53. To
alleviate this, I propose the following for returning 64 bit integers when
read:

    value[2]: Each entry is a 32 bit number which can be reconstructed to the
    original by the following formula:

    value[0] << 32 + value[1] (Note this will not work in Javascript)
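Because that shift overflows JavaScript's Number type, reconstructing the pair
in plain JavaScript has to go through multiplication instead. A minimal sketch
follows (the toAbs64 and toApprox64 helpers described later wrap this same
idea, with range checking); the example value is borrowed from the bundled
ctio tests:

    /* A 64 bit read comes back as [ high, low ], two 32 bit numbers. */
    var val = [ 0x0007cda8, 0xe7f90a6d ];

    /* Exact only while the combined value fits within 2^53. */
    var num = (val[0] * Math.pow(2, 32)) + val[1];
    console.log(num === 0x0007cda8e7f90a6d);    /* true */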
CTF JSON data:

node-ctype can also handle JSON data that matches the format described in the
documentation of the tool ctf2json. Given the JSON data which specifies type
information, it will transform that into a parser that understands all of the
types defined inside of it. This is useful for more complicated structures
that have a lot of typedefs.

Interface overview

The following is the header-file like interface to the parser object:

    /*
     * Create a new instance of the parser. Each parser has its own store of
     * typedefs and endianness. Conf is an object with the following values:
     *
     *    endian    Either 'big' or 'little' to determine the endianness we
     *              want to read from or write to.
     */
    function CTypeParser(conf);

    /*
     * Parses the CTF JSON data and creates a parser that understands all of
     * those types.
     *
     *    data      Parsed JSON data that matches the CTF JSON specification.
     *
     *    conf      The configuration object to create a new CTypeParser from.
     */
    CTypeParser parseCTF(data, conf);

    /*
     * This is what we were born to do. We read the data from a buffer and
     * return it in an object whose keys match the entries in the definition.
     *
     *    def       The array definition of the data to read in
     *
     *    buffer    The buffer to read data from
     *
     *    offset    The offset to start reading from
     *
     * Returns an object where each key corresponds to an entry in def and the
     * value is the read value.
     */
    Object CTypeParser.readData(<Type Definition>, buffer, offset);

    /*
     * This is the second half of what we were born to do, write out the data
     * itself.
     *
     *    def       The array definition of the data to write out with values
     *
     *    buffer    The buffer to write to
     *
     *    offset    The offset in the buffer to write to
     */
    void CTypeParser.writeData(<Type Definition>, buffer, offset);

    /*
     * A user has requested to add a type, let us honor their request. Yet, if
     * their request doth spurn us, send them unto the Hells which Dante
     * describes.
     *
     *    name      The string for the type definition we're adding
     *
     *    value     Either a string that is a type/array name or an object
     *              that describes a struct.
     */
    void CTypeParser.prototype.typedef(name, value);

    Object CTypeParser.prototype.lstypes();

    /*
     * Get the endian value for the current parser.
     */
    String CTypeParser.prototype.getEndian();

    /*
     * Sets the current endian value for the Parser. If the value is not
     * valid, throws an Error.
     *
     *    endian    Either 'big' or 'little' to determine the endianness we
     *              want to read from or write to.
     */
    void CTypeParser.prototype.setEndian(String);

    /*
     * Attempts to convert an array of two integers returned from rsint64 /
     * ruint64 into an absolute 64 bit number. If, however, the value would
     * exceed 2^52, this will instead throw an error. The mantissa in a double
     * is a 52 bit number and, rather than potentially give you a value that
     * is an approximation, this will error. If you would rather have an
     * approximation, please see toApprox64.
     *
     *    val       An array of two 32-bit integers
     */
    Number function toAbs64(val)

    /*
     * Returns the 64 bit value, as returned in an array from rsint64 /
     * ruint64, as a Number that is as close to the value as possible. Note
     * that Javascript stores all numbers as a double and the mantissa only
     * has 52 bits. Thus this version may approximate the value.
     *
     *    val       An array of two 32-bit integers
     */
    Number function toApprox64(val)

Low Level API

The following functions are provided at the low level:

Read unsigned integers from a buffer:

    Number ruint8(buffer, endian, offset);
    Number ruint16(buffer, endian, offset);
    Number ruint32(buffer, endian, offset);
    Number[] ruint64(buffer, endian, offset);

Read signed integers from a buffer:

    Number rsint8(buffer, endian, offset);
    Number rsint16(buffer, endian, offset);
    Number rsint32(buffer, endian, offset);
    Number[] rsint64(buffer, endian, offset);

Read floating point numbers from a buffer:

    Number rfloat(buffer, endian, offset);     /* IEEE-754 Single precision */
    Number rdouble(buffer, endian, offset);    /* IEEE-754 Double precision */

Write unsigned integers to a buffer:

    void wuint8(Number, endian, buffer, offset);
    void wuint16(Number, endian, buffer, offset);
    void wuint32(Number, endian, buffer, offset);
    void wuint64(Number[], endian, buffer, offset);

Write signed integers to a buffer:

    void wsint8(Number, endian, buffer, offset);
    void wsint16(Number, endian, buffer, offset);
    void wsint32(Number, endian, buffer, offset);
    void wsint64(Number[], endian, buffer, offset);

Write floating point numbers to a buffer:

    void wfloat(Number, buffer, endian, offset);     /* IEEE-754 Single precision */
    void wdouble(Number, buffer, endian, offset);    /* IEEE-754 Double precision */
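As a quick illustration of the low level calls, here is a short sketch of a
write/read round trip. It uses the same require('ctype') entry point as the
examples in the README and the [ high, low ] pair convention described above,
so nothing here goes beyond what the list documents:

    var mod_ctype = require('ctype');

    var buf = new Buffer(8);

    /* Write a 16 bit value at offset 0 and read it back. */
    mod_ctype.wuint16(0x1234, 'big', buf, 0);
    console.log(mod_ctype.ruint16(buf, 'big', 0));    /* 4660, i.e. 0x1234 */

    /* 64 bit values travel as [ high, low ] pairs of 32 bit numbers. */
    mod_ctype.wuint64([ 0x12345678, 0x9abcdef0 ], 'big', buf, 0);
    console.log(mod_ctype.ruint64(buf, 'big', 0));    /* [ 305419896, 2596069104 ] */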
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicr.js

/*
 * Simple test to see if reading works at all.
 */
var mod_ctype = require('../../ctype');
var ASSERT = require('assert');
var mod_sys = require('sys');

function test()
{
    var ii, p, result, buffer;

    p = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer(4);
    buffer[0] = 23;
    buffer[3] = 42;
    result = p.readData([ { x: { type: 'uint8_t' }},
        { y: { type: 'uint8_t', offset: 3 }} ], buffer, 0);
    ASSERT.equal(23, result['x']);
    ASSERT.equal(42, result['y']);

    buffer = new Buffer(23);
    for (ii = 0; ii < 23; ii++)
        buffer[ii] = 0;
    buffer.write('Hello, world!');
    result = p.readData([ { x: { type: 'char[20]' }} ], buffer, 0);
    /*
     * This is currently broken behavior, need to redesign check
     * ASSERT.equal('Hello, world!', result['x'].toString('utf-8', 0,
     *     result['x'].length));
     */

    buffer = new Buffer(4);
    buffer[0] = 0x03;
    buffer[1] = 0x24;
    buffer[2] = 0x25;
    buffer[3] = 0x26;
    result = p.readData([ { y: { type: 'uint8_t' }},
        { x: { type: 'uint8_t[y]' }} ], buffer, 0);
    console.log(mod_sys.inspect(result, true));

    p.typedef('ssize_t', 'int32_t');
    ASSERT.deepEqual({ 'ssize_t': 'int32_t' }, p.lstypes());
    result = p.readData([ { x: { type: 'ssize_t' } } ], buffer, 0);
    ASSERT.equal(0x26252403, result['x']);
}

test();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.basicw.js
/*
 * Simple test to see if writing works at all.
 */
var mod_ctype = require('../../ctype');
var ASSERT = require('assert');
var mod_sys = require('sys');

function test()
{
    var ii, p, buffer, buf2;

    p = new mod_ctype.Parser({ endian: 'big' });
    buffer = new Buffer(4);
    p.writeData([ { x: { type: 'uint8_t', value: 23 }},
        { y: { type: 'uint8_t', offset: 3, value: 42 }} ], buffer, 0);
    ASSERT.equal(23, buffer[0]);
    ASSERT.equal(42, buffer[3]);

    buffer = new Buffer(20);
    for (ii = 0; ii < 20; ii++)
        buffer[ii] = 0;
    buffer.write('Hello, world!');
    buf2 = new Buffer(22);
    p.writeData([ { x: { type: 'char[20]', value: buffer }} ], buf2, 0);
    for (ii = 0; ii < 20; ii++)
        ASSERT.equal(buffer[ii], buf2[ii]);
    /*
     * This is currently broken behavior, need to redesign check
     * ASSERT.equal('Hello, world!', result['x'].toString('utf-8', 0,
     *     result['x'].length));
     */

    buffer = new Buffer(4);
    p.writeData([ { y: { type: 'uint8_t', value: 3 }},
        { x: { type: 'uint8_t[y]', value: [ 0x24, 0x25, 0x26 ] }} ], buffer, 0);
    console.log(mod_sys.inspect(buffer));

    p.typedef('ssize_t', 'int32_t');
    ASSERT.deepEqual({ 'ssize_t': 'int32_t' }, p.lstypes());
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.char.js

/*
 * Test the different forms of reading characters:
 *
 *   - the default, a single element buffer
 *   - uint8, values are uint8_ts
 *   - int8, values are int8_ts
 */
var mod_ctype = require('../../ctype');
var mod_assert = require('assert');

function test()
{
    var p, buf, res;

    buf = new Buffer(1);
    buf[0] = 255;

    p = new mod_ctype.Parser({ endian: 'little' });
    res = p.readData([ { c: { type: 'char' }} ], buf, 0);
    res = res['c'];
    mod_assert.ok(res instanceof Buffer);
    mod_assert.equal(255, res[0]);

    p = new mod_ctype.Parser({ endian: 'little', 'char-type': 'int8' });
    res = p.readData([ { c: { type: 'char' }} ], buf, 0);
    res = res['c'];
    mod_assert.ok(typeof (res) == 'number',
        'got typeof (res): ' + typeof (res));
    mod_assert.equal(-1, res);

    p = new mod_ctype.Parser({ endian: 'little', 'char-type': 'uint8' });
    res = p.readData([ { c: { type: 'char' }} ], buf, 0);
    res = res['c'];
    mod_assert.ok(typeof (res) == 'number',
        'got typeof (res): ' + typeof (res));
    mod_assert.equal(255, res);
}

test();
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.endian.js

/*
 * Simple test to make sure that the endian setting works.
 */
var mod_ctype = require('../../ctype.js');
var mod_assert = require('assert');

function test()
{
    var parser, buf;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buf = new Buffer(2);
    parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
    mod_assert.equal(buf[0], 0x34);
    mod_assert.equal(buf[1], 0x12);

    parser.setEndian('big');
    parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
    mod_assert.equal(buf[0], 0x12);
    mod_assert.equal(buf[1], 0x34);

    parser.setEndian('little');
    parser.writeData([ { key: { type: 'uint16_t' } } ], buf, 0, [ 0x1234 ]);
    mod_assert.equal(buf[0], 0x34);
    mod_assert.equal(buf[1], 0x12);
}

function fail()
{
    var parser;

    parser = new mod_ctype.Parser({ endian: 'little' });
    mod_assert.throws(function () {
        parser.setEndian('littlebigwrong');
    });
}

test();
fail();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.oldwrite.js

/*
 * A long overdue test to go through and verify that we can read and write
 * structures as well as nested structures.
 */
var mod_ctype = require('../../ctype.js');
var mod_assert = require('assert');

function test()
{
    var parser, buf, data;

    parser = new mod_ctype.Parser({ endian: 'little' });
    parser.typedef('point_t', [
        { x: { type: 'uint8_t' } },
        { y: { type: 'uint8_t' } }
    ]);
    buf = new Buffer(2);
    data = [ { point: { type: 'point_t', value: [ 23, 42 ] } } ];
    parser.writeData(data, buf, 0);
    mod_assert.ok(buf[0] == 23);
    mod_assert.ok(buf[1] == 42);
}

test();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.readSize.js

/*
 * Testing to ensure we're reading the expected number of bytes.
 */
var mod_ctype = require('../../ctype');
var ASSERT = require('assert');

function testUint8()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('80', 'hex');
    result = parser.readStruct([ { item: { type: 'uint8_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 1);
}

function testSint8()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('80', 'hex');
    result = parser.readStruct([ { item: { type: 'int8_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 1);
}

function testUint16()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('8000', 'hex');
    result = parser.readStruct([ { item: { type: 'uint16_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 2);
}

function testSint16()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('8000', 'hex');
    result = parser.readStruct([ { item: { type: 'int16_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 2);
}

function testUint32()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('80000000', 'hex');
    result = parser.readStruct([ { item: { type: 'uint32_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 4);
}

function testSint32()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('80000000', 'hex');
    result = parser.readStruct([ { item: { type: 'int32_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 4);
}

function testUint64()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('8000000000000000', 'hex');
    result = parser.readStruct([ { item: { type: 'uint64_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 8);
}

function testSint64()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('8000000000000000', 'hex');
    result = parser.readStruct([ { item: { type: 'int64_t' } } ], buffer, 0);
    ASSERT.equal(result['size'], 8);
}
function testFloat()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('ABAAAA3E', 'hex');
    result = parser.readStruct([ { item: { type: 'float' } } ], buffer, 0);
    ASSERT.equal(result['size'], 4);
}

function testDouble()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('000000000000F03F', 'hex');
    result = parser.readStruct([ { item: { type: 'double' } } ], buffer, 0);
    ASSERT.equal(result['size'], 8);
}

function testChar()
{
    var parser, result, buffer;

    parser = new mod_ctype.Parser({ endian: 'little' });
    buffer = new Buffer('t');
    result = parser.readStruct([ { item: { type: 'char' } } ], buffer, 0);
    ASSERT.equal(result['size'], 1);
}

function test()
{
    testSint8();
    testUint8();
    testSint16();
    testUint16();
    testSint32();
    testUint32();
    testSint64();
    testUint64();
    testFloat();
    testDouble();
    testChar();
}

test();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.structw.js

/*
 * A long overdue test to go through and verify that we can read and write
 * structures as well as nested structures.
 */
var mod_ctype = require('../../ctype.js');
var mod_assert = require('assert');

function test()
{
    var parser, buf, data;

    parser = new mod_ctype.Parser({ endian: 'little' });
    parser.typedef('point_t', [
        { x: { type: 'uint8_t' } },
        { y: { type: 'uint8_t' } }
    ]);
    buf = new Buffer(2);
    data = [ { point: { type: 'point_t' } } ];
    parser.writeData(data, buf, 0, [ [ 23, 42 ] ]);
    mod_assert.ok(buf[0] == 23);
    mod_assert.ok(buf[1] == 42);
}

test();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctype/tst.writeStruct.js

/*
 * Test to verify that the offset is incremented when structures are written to.
 * Hopefully we will not regress issue #41
 */
var mod_ctype = require('../../ctype.js');
var mod_assert = require('assert');

function test()
{
    var parser, buf, data;

    parser = new mod_ctype.Parser({ endian: 'little' });
    parser.typedef('point_t', [
        { x: { type: 'uint8_t' } },
        { y: { type: 'uint8_t' } }
    ]);
    buf = new Buffer(4);
    data = [ { point1: { type: 'point_t' } }, { point2: { type: 'point_t' } } ];
    parser.writeData(data, buf, 0, [ [ 23, 42 ], [ 91, 18 ] ]);
    mod_assert.ok(buf[0] == 23);
    mod_assert.ok(buf[1] == 42);
    mod_assert.ok(buf[2] == 91);
    mod_assert.ok(buf[3] == 18);
}

test();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.64.js

/*
 * Test our ability to read and write unsigned 64-bit integers.
*/ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); function testRead() { var res, data; data = new Buffer(10); data[0] = 0x32; data[1] = 0x65; data[2] = 0x42; data[3] = 0x56; data[4] = 0x23; data[5] = 0xff; data[6] = 0xff; data[7] = 0xff; data[8] = 0x89; data[9] = 0x11; res = mod_ctype.ruint64(data, 'big', 0); ASSERT.equal(0x32654256, res[0]); ASSERT.equal(0x23ffffff, res[1]); res = mod_ctype.ruint64(data, 'big', 1); ASSERT.equal(0x65425623, res[0]); ASSERT.equal(0xffffff89, res[1]); res = mod_ctype.ruint64(data, 'big', 2); ASSERT.equal(0x425623ff, res[0]); ASSERT.equal(0xffff8911, res[1]); res = mod_ctype.ruint64(data, 'little', 0); ASSERT.equal(0xffffff23, res[0]); ASSERT.equal(0x56426532, res[1]); res = mod_ctype.ruint64(data, 'little', 1); ASSERT.equal(0x89ffffff, res[0]); ASSERT.equal(0x23564265, res[1]); res = mod_ctype.ruint64(data, 'little', 2); ASSERT.equal(0x1189ffff, res[0]); ASSERT.equal(0xff235642, res[1]); } function testReadOver() { var res, data; data = new Buffer(10); data[0] = 0x80; data[1] = 0xff; data[2] = 0x80; data[3] = 0xff; data[4] = 0x80; data[5] = 0xff; data[6] = 0x80; data[7] = 0xff; data[8] = 0x80; data[9] = 0xff; res = mod_ctype.ruint64(data, 'big', 0); ASSERT.equal(0x80ff80ff, res[0]); ASSERT.equal(0x80ff80ff, res[1]); res = mod_ctype.ruint64(data, 'big', 1); ASSERT.equal(0xff80ff80, res[0]); ASSERT.equal(0xff80ff80, res[1]); res = mod_ctype.ruint64(data, 'big', 2); ASSERT.equal(0x80ff80ff, res[0]); ASSERT.equal(0x80ff80ff, res[1]); res = mod_ctype.ruint64(data, 'little', 0); ASSERT.equal(0xff80ff80, res[0]); ASSERT.equal(0xff80ff80, res[1]); res = mod_ctype.ruint64(data, 'little', 1); ASSERT.equal(0x80ff80ff, res[0]); ASSERT.equal(0x80ff80ff, res[1]); res = mod_ctype.ruint64(data, 'little', 2); ASSERT.equal(0xff80ff80, res[0]); ASSERT.equal(0xff80ff80, res[1]); } function testWriteZero() { var data, buf; buf = new Buffer(10); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'big', buf, 0); ASSERT.equal(0, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'little', buf, 0); ASSERT.equal(0, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0x66, 
buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wuint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0, buf[9]); } /* * Also include tests that are going to force us to go into a negative value and * insure that it's written correctly. */ function testWrite() { var data, buf; buf = new Buffer(10); data = [ 0x234456, 0x87 ]; buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 0); ASSERT.equal(0x00, buf[0]); ASSERT.equal(0x23, buf[1]); ASSERT.equal(0x44, buf[2]); ASSERT.equal(0x56, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x87, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x23, buf[2]); ASSERT.equal(0x44, buf[3]); ASSERT.equal(0x56, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x87, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x23, buf[3]); ASSERT.equal(0x44, buf[4]); ASSERT.equal(0x56, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x87, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 0); ASSERT.equal(0x87, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x56, buf[4]); ASSERT.equal(0x44, buf[5]); ASSERT.equal(0x23, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x87, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x56, buf[5]); ASSERT.equal(0x44, buf[6]); ASSERT.equal(0x23, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0x87, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x56, buf[6]); ASSERT.equal(0x44, buf[7]); ASSERT.equal(0x23, buf[8]); ASSERT.equal(0x00, buf[9]); data = [0xffff3421, 0x34abcdba]; buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 0); ASSERT.equal(0xff, buf[0]); ASSERT.equal(0xff, buf[1]); ASSERT.equal(0x34, buf[2]); ASSERT.equal(0x21, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0xab, buf[5]); ASSERT.equal(0xcd, buf[6]); ASSERT.equal(0xba, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0xff, buf[1]); ASSERT.equal(0xff, buf[2]); ASSERT.equal(0x34, buf[3]); ASSERT.equal(0x21, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0xab, buf[6]); ASSERT.equal(0xcd, buf[7]); ASSERT.equal(0xba, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0xff, buf[2]); ASSERT.equal(0xff, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0x21, buf[5]); ASSERT.equal(0x34, buf[6]); ASSERT.equal(0xab, 
buf[7]); ASSERT.equal(0xcd, buf[8]); ASSERT.equal(0xba, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 0); ASSERT.equal(0xba, buf[0]); ASSERT.equal(0xcd, buf[1]); ASSERT.equal(0xab, buf[2]); ASSERT.equal(0x34, buf[3]); ASSERT.equal(0x21, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0xff, buf[6]); ASSERT.equal(0xff, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0xba, buf[1]); ASSERT.equal(0xcd, buf[2]); ASSERT.equal(0xab, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0x21, buf[5]); ASSERT.equal(0x34, buf[6]); ASSERT.equal(0xff, buf[7]); ASSERT.equal(0xff, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wuint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0xba, buf[2]); ASSERT.equal(0xcd, buf[3]); ASSERT.equal(0xab, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0x21, buf[6]); ASSERT.equal(0x34, buf[7]); ASSERT.equal(0xff, buf[8]); ASSERT.equal(0xff, buf[9]); } /* * Make sure we catch invalid writes. */ function testWriteInvalid() { var data, buf; /* Buffer too small */ buf = new Buffer(4); data = [ 0, 0]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 0); }, Error, 'buffer too small'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 0); }, Error, 'buffer too small'); /* Beyond the end of the buffer */ buf = new Buffer(12); data = [ 0, 0]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 11); }, Error, 'write beyond end of buffer'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 11); }, Error, 'write beyond end of buffer'); /* Write negative values */ buf = new Buffer(12); data = [ -3, 0 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write negative number'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write negative number'); data = [ 0, -3 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write negative number'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write negative number'); data = [ -3, -3 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write negative number'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write negative number'); /* Write fractional values */ buf = new Buffer(12); data = [ 3.33, 0 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 0, 3.3 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 3.33, 2.42 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write fractions'); /* Write values that are too large */ buf = new Buffer(12); data = [ 0xffffffffff, 23 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0xffffffffff, 0xffffff238 ]; ASSERT.throws(function () { 
mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0x23, 0xffffff238 ]; ASSERT.throws(function () { mod_ctype.wuint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wuint64(data, 'little', buf, 1); }, Error, 'write too large'); } testRead(); testReadOver(); testWriteZero(); testWrite(); testWriteInvalid(); ���������������������������������������������������������������������������������node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.roundtrip.js��000644 �000766 �000024 �00000004100 12455173731 044143� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������/* * A battery of tests for sucessful round-trip between writes and reads */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); /* * What the heck, let's just test every value for 8-bits. */ function test8() { var data = new Buffer(1); var i; for (i = 0; i < 256; i++) { mod_ctype.wuint8(i, 'big', data, 0); ASSERT.equal(i, mod_ctype.ruint8(data, 'big', 0)); mod_ctype.wuint8(i, 'little', data, 0); ASSERT.equal(i, mod_ctype.ruint8(data, 'little', 0)); } ASSERT.ok(true); } /* * Test a random sample of 256 values in the 16-bit unsigned range */ function test16() { var data = new Buffer(2); var i = 0; for (i = 0; i < 256; i++) { var value = Math.round(Math.random() * Math.pow(2, 16)); mod_ctype.wuint16(value, 'big', data, 0); ASSERT.equal(value, mod_ctype.ruint16(data, 'big', 0)); mod_ctype.wuint16(value, 'little', data, 0); ASSERT.equal(value, mod_ctype.ruint16(data, 'little', 0)); } } /* * Test a random sample of 256 values in the 32-bit unsigned range */ function test32() { var data = new Buffer(4); var i = 0; for (i = 0; i < 256; i++) { var value = Math.round(Math.random() * Math.pow(2, 32)); mod_ctype.wuint32(value, 'big', data, 0); ASSERT.equal(value, mod_ctype.ruint32(data, 'big', 0)); mod_ctype.wuint32(value, 'little', data, 0); ASSERT.equal(value, mod_ctype.ruint32(data, 'little', 0)); } } /* * Test a random sample of 256 values in the 64-bit unsigned range */ function test64() { var data = new Buffer(8); var i = 0; for (i = 0; i < 256; i++) { var low = Math.round(Math.random() * Math.pow(2, 32)); var high = Math.round(Math.random() * Math.pow(2, 32)); mod_ctype.wuint64([high, low], 'big', data, 0); var result = mod_ctype.ruint64(data, 'big', 0); ASSERT.equal(high, result[0]); ASSERT.equal(low, result[1]); mod_ctype.wuint64([high, low], 'little', data, 0); result = mod_ctype.ruint64(data, 'little', 0); ASSERT.equal(high, result[0]); ASSERT.equal(low, result[1]); } } exports.test8 = test8; exports.test16 = test16; exports.test32 = test32; exports.test64 = test64; 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.ruint.js��000644 �000766 �000024 �00000005710 12455173731 043266� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������/* * A battery of tests to help us read a series of uints */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); /* * We need to check the following things: * - We are correctly resolving big endian (doesn't mean anything for 8 bit) * - Correctly resolving little endian (doesn't mean anything for 8 bit) * - Correctly using the offsets * - Correctly interpreting values that are beyond the signed range as unsigned */ function test8() { var data = new Buffer(4); data[0] = 23; data[1] = 23; data[2] = 23; data[3] = 23; ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 0)); ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 0)); ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 1)); ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 1)); ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 2)); ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 2)); ASSERT.equal(23, mod_ctype.ruint8(data, 'big', 3)); ASSERT.equal(23, mod_ctype.ruint8(data, 'little', 3)); data[0] = 255; /* If it became a signed int, would be -1 */ ASSERT.equal(255, mod_ctype.ruint8(data, 'big', 0)); ASSERT.equal(255, mod_ctype.ruint8(data, 'little', 0)); } /* * Test 16 bit unsigned integers. We need to verify the same set as 8 bit, only * now some of the issues actually matter: * - We are correctly resolving big endian * - Correctly resolving little endian * - Correctly using the offsets * - Correctly interpreting values that are beyond the signed range as unsigned */ function test16() { var data = new Buffer(4); /* Test signed values first */ data[0] = 0; data[1] = 0x23; data[2] = 0x42; data[3] = 0x3f; ASSERT.equal(0x23, mod_ctype.ruint16(data, 'big', 0)); ASSERT.equal(0x2342, mod_ctype.ruint16(data, 'big', 1)); ASSERT.equal(0x423f, mod_ctype.ruint16(data, 'big', 2)); ASSERT.equal(0x2300, mod_ctype.ruint16(data, 'little', 0)); ASSERT.equal(0x4223, mod_ctype.ruint16(data, 'little', 1)); ASSERT.equal(0x3f42, mod_ctype.ruint16(data, 'little', 2)); data[0] = 0xfe; data[1] = 0xfe; ASSERT.equal(0xfefe, mod_ctype.ruint16(data, 'big', 0)); ASSERT.equal(0xfefe, mod_ctype.ruint16(data, 'little', 0)); } /* * Test 32 bit unsigned integers. 
We need to verify the same set as 8 bit, only * now some of the issues actually matter: * - We are correctly resolving big endian * - Correctly using the offsets * - Correctly interpreting values that are beyond the signed range as unsigned */ function test32() { var data = new Buffer(8); data[0] = 0x32; data[1] = 0x65; data[2] = 0x42; data[3] = 0x56; data[4] = 0x23; data[5] = 0xff; ASSERT.equal(0x32654256, mod_ctype.ruint32(data, 'big', 0)); ASSERT.equal(0x65425623, mod_ctype.ruint32(data, 'big', 1)); ASSERT.equal(0x425623ff, mod_ctype.ruint32(data, 'big', 2)); ASSERT.equal(0x56426532, mod_ctype.ruint32(data, 'little', 0)); ASSERT.equal(0x23564265, mod_ctype.ruint32(data, 'little', 1)); ASSERT.equal(0xff235642, mod_ctype.ruint32(data, 'little', 2)); } test8(); test16(); test32(); ��������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/uint/tst.wuint.js��000644 �000766 �000024 �00000010075 12455173731 043273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������/* * A battery of tests to help us read a series of uints */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); /* * We need to check the following things: * - We are correctly resolving big endian (doesn't mean anything for 8 bit) * - Correctly resolving little endian (doesn't mean anything for 8 bit) * - Correctly using the offsets * - Correctly interpreting values that are beyond the signed range as unsigned */ function test8() { var data = new Buffer(4); mod_ctype.wuint8(23, 'big', data, 0); mod_ctype.wuint8(23, 'big', data, 1); mod_ctype.wuint8(23, 'big', data, 2); mod_ctype.wuint8(23, 'big', data, 3); ASSERT.equal(23, data[0]); ASSERT.equal(23, data[1]); ASSERT.equal(23, data[2]); ASSERT.equal(23, data[3]); mod_ctype.wuint8(23, 'little', data, 0); mod_ctype.wuint8(23, 'little', data, 1); mod_ctype.wuint8(23, 'little', data, 2); mod_ctype.wuint8(23, 'little', data, 3); ASSERT.equal(23, data[0]); ASSERT.equal(23, data[1]); ASSERT.equal(23, data[2]); ASSERT.equal(23, data[3]); mod_ctype.wuint8(255, 'big', data, 0); ASSERT.equal(255, data[0]); mod_ctype.wuint8(255, 'little', data, 0); ASSERT.equal(255, data[0]); } function test16() { var value = 0x2343; var data = new Buffer(4); mod_ctype.wuint16(value, 'big', data, 0); ASSERT.equal(0x23, data[0]); ASSERT.equal(0x43, data[1]); mod_ctype.wuint16(value, 'big', data, 1); ASSERT.equal(0x23, data[1]); ASSERT.equal(0x43, data[2]); mod_ctype.wuint16(value, 'big', data, 2); ASSERT.equal(0x23, data[2]); ASSERT.equal(0x43, data[3]); mod_ctype.wuint16(value, 'little', data, 0); ASSERT.equal(0x23, data[1]); ASSERT.equal(0x43, data[0]); mod_ctype.wuint16(value, 'little', data, 1); ASSERT.equal(0x23, data[2]); ASSERT.equal(0x43, data[1]); mod_ctype.wuint16(value, 'little', data, 2); ASSERT.equal(0x23, data[3]); ASSERT.equal(0x43, data[2]); value = 0xff80; mod_ctype.wuint16(value, 'little', data, 0); ASSERT.equal(0xff, data[1]); ASSERT.equal(0x80, data[0]); mod_ctype.wuint16(value, 'big', data, 0); ASSERT.equal(0xff, data[0]); ASSERT.equal(0x80, data[1]); } function test32() { var data = new Buffer(6); var value = 0xe7f90a6d; mod_ctype.wuint32(value, 'big', data, 0); ASSERT.equal(0xe7, 
data[0]); ASSERT.equal(0xf9, data[1]); ASSERT.equal(0x0a, data[2]); ASSERT.equal(0x6d, data[3]); mod_ctype.wuint32(value, 'big', data, 1); ASSERT.equal(0xe7, data[1]); ASSERT.equal(0xf9, data[2]); ASSERT.equal(0x0a, data[3]); ASSERT.equal(0x6d, data[4]); mod_ctype.wuint32(value, 'big', data, 2); ASSERT.equal(0xe7, data[2]); ASSERT.equal(0xf9, data[3]); ASSERT.equal(0x0a, data[4]); ASSERT.equal(0x6d, data[5]); mod_ctype.wuint32(value, 'little', data, 0); ASSERT.equal(0xe7, data[3]); ASSERT.equal(0xf9, data[2]); ASSERT.equal(0x0a, data[1]); ASSERT.equal(0x6d, data[0]); mod_ctype.wuint32(value, 'little', data, 1); ASSERT.equal(0xe7, data[4]); ASSERT.equal(0xf9, data[3]); ASSERT.equal(0x0a, data[2]); ASSERT.equal(0x6d, data[1]); mod_ctype.wuint32(value, 'little', data, 2); ASSERT.equal(0xe7, data[5]); ASSERT.equal(0xf9, data[4]); ASSERT.equal(0x0a, data[3]); ASSERT.equal(0x6d, data[2]); } function test64() { var data = new Buffer(10); var value = 0x0007cda8e7f90a6d; var high = Math.floor(value / Math.pow(2, 32)); var low = value - (high * Math.pow(2, 32)); ASSERT.equal(0x0007cda8, high); ASSERT.equal(0xe7f90a6d, low); mod_ctype.wuint64([high, low], 'big', data, 0); ASSERT.equal(0x00, data[0]); ASSERT.equal(0x07, data[1]); ASSERT.equal(0xcd, data[2]); ASSERT.equal(0xa8, data[3]); ASSERT.equal(0xe7, data[4]); ASSERT.equal(0xf9, data[5]); ASSERT.equal(0x0a, data[6]); ASSERT.equal(0x6d, data[7]); mod_ctype.wuint64([high, low], 'little', data, 0); ASSERT.equal(0x6d, data[0]); ASSERT.equal(0x0a, data[1]); ASSERT.equal(0xf9, data[2]); ASSERT.equal(0xe7, data[3]); ASSERT.equal(0xa8, data[4]); ASSERT.equal(0xcd, data[5]); ASSERT.equal(0x07, data[6]); ASSERT.equal(0x00, data[7]); } test8(); test16(); test32(); test64(); exports.test8 = test8; exports.test16 = test16; exports.test32 = test32; exports.test64 = test64; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.64.js������000644 �000766 �000024 �00000041105 12455173731 042167� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������/* * Test our ability to read and write signed 64-bit integers. 
*/ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); function testRead() { var res, data; data = new Buffer(10); data[0] = 0x32; data[1] = 0x65; data[2] = 0x42; data[3] = 0x56; data[4] = 0x23; data[5] = 0xff; data[6] = 0xff; data[7] = 0xff; data[8] = 0x89; data[9] = 0x11; res = mod_ctype.rsint64(data, 'big', 0); ASSERT.equal(0x32654256, res[0]); ASSERT.equal(0x23ffffff, res[1]); res = mod_ctype.rsint64(data, 'big', 1); ASSERT.equal(0x65425623, res[0]); ASSERT.equal(0xffffff89, res[1]); res = mod_ctype.rsint64(data, 'big', 2); ASSERT.equal(0x425623ff, res[0]); ASSERT.equal(0xffff8911, res[1]); res = mod_ctype.rsint64(data, 'little', 0); ASSERT.equal(-0x000000dc, res[0]); ASSERT.equal(-0xa9bd9ace, res[1]); res = mod_ctype.rsint64(data, 'little', 1); ASSERT.equal(-0x76000000, res[0]); ASSERT.equal(-0xdca9bd9b, res[1]); res = mod_ctype.rsint64(data, 'little', 2); ASSERT.equal(0x1189ffff, res[0]); ASSERT.equal(0xff235642, res[1]); data.fill(0x00); res = mod_ctype.rsint64(data, 'big', 0); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); res = mod_ctype.rsint64(data, 'big', 1); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); res = mod_ctype.rsint64(data, 'big', 2); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); res = mod_ctype.rsint64(data, 'little', 0); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); res = mod_ctype.rsint64(data, 'little', 1); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); res = mod_ctype.rsint64(data, 'little', 2); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(0x00000000, res[1]); data.fill(0xff); res = mod_ctype.rsint64(data, 'big', 0); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); res = mod_ctype.rsint64(data, 'big', 1); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); res = mod_ctype.rsint64(data, 'big', 2); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); res = mod_ctype.rsint64(data, 'little', 0); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); res = mod_ctype.rsint64(data, 'little', 1); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); res = mod_ctype.rsint64(data, 'little', 2); ASSERT.equal(0x00000000, res[0]); ASSERT.equal(-1, res[1]); data[0] = 0x80; data[1] = 0x00; data[2] = 0x00; data[3] = 0x00; data[4] = 0x00; data[5] = 0x00; data[6] = 0x00; data[7] = 0x00; res = mod_ctype.rsint64(data, 'big', 0); ASSERT.equal(-0x80000000, res[0]); ASSERT.equal(0, res[1]); data[7] = 0x80; data[6] = 0x00; data[5] = 0x00; data[4] = 0x00; data[3] = 0x00; data[2] = 0x00; data[1] = 0x00; data[0] = 0x00; res = mod_ctype.rsint64(data, 'little', 0); ASSERT.equal(-0x80000000, res[0]); ASSERT.equal(0, res[1]); data[0] = 0x80; data[1] = 0x00; data[2] = 0x00; data[3] = 0x00; data[4] = 0x00; data[5] = 0x00; data[6] = 0x00; data[7] = 0x01; res = mod_ctype.rsint64(data, 'big', 0); ASSERT.equal(-0x7fffffff, res[0]); ASSERT.equal(-0xffffffff, res[1]); } function testWriteZero() { var data, buf; buf = new Buffer(10); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); 
ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); data = [0, 0]; mod_ctype.wsint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0, buf[2]); ASSERT.equal(0, buf[3]); ASSERT.equal(0, buf[4]); ASSERT.equal(0, buf[5]); ASSERT.equal(0, buf[6]); ASSERT.equal(0, buf[7]); ASSERT.equal(0, buf[8]); ASSERT.equal(0, buf[9]); } /* * Also include tests that are going to force us to go into a negative value and * insure that it's written correctly. */ function testWrite() { var data, buf; buf = new Buffer(10); data = [ 0x234456, 0x87 ]; buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0x00, buf[0]); ASSERT.equal(0x23, buf[1]); ASSERT.equal(0x44, buf[2]); ASSERT.equal(0x56, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x87, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x23, buf[2]); ASSERT.equal(0x44, buf[3]); ASSERT.equal(0x56, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x87, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x23, buf[3]); ASSERT.equal(0x44, buf[4]); ASSERT.equal(0x56, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x87, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0x87, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x56, buf[4]); ASSERT.equal(0x44, buf[5]); ASSERT.equal(0x23, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x87, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x56, buf[5]); ASSERT.equal(0x44, buf[6]); ASSERT.equal(0x23, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0x87, buf[2]); ASSERT.equal(0x00, buf[3]); 
ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x56, buf[6]); ASSERT.equal(0x44, buf[7]); ASSERT.equal(0x23, buf[8]); ASSERT.equal(0x00, buf[9]); data = [0x3421, 0x34abcdba]; buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0x00, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x34, buf[2]); ASSERT.equal(0x21, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0xab, buf[5]); ASSERT.equal(0xcd, buf[6]); ASSERT.equal(0xba, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x34, buf[3]); ASSERT.equal(0x21, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0xab, buf[6]); ASSERT.equal(0xcd, buf[7]); ASSERT.equal(0xba, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0x21, buf[5]); ASSERT.equal(0x34, buf[6]); ASSERT.equal(0xab, buf[7]); ASSERT.equal(0xcd, buf[8]); ASSERT.equal(0xba, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0xba, buf[0]); ASSERT.equal(0xcd, buf[1]); ASSERT.equal(0xab, buf[2]); ASSERT.equal(0x34, buf[3]); ASSERT.equal(0x21, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 1); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0xba, buf[1]); ASSERT.equal(0xcd, buf[2]); ASSERT.equal(0xab, buf[3]); ASSERT.equal(0x34, buf[4]); ASSERT.equal(0x21, buf[5]); ASSERT.equal(0x34, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 2); ASSERT.equal(0x66, buf[0]); ASSERT.equal(0x66, buf[1]); ASSERT.equal(0xba, buf[2]); ASSERT.equal(0xcd, buf[3]); ASSERT.equal(0xab, buf[4]); ASSERT.equal(0x34, buf[5]); ASSERT.equal(0x21, buf[6]); ASSERT.equal(0x34, buf[7]); ASSERT.equal(0x00, buf[8]); ASSERT.equal(0x00, buf[9]); data = [ -0x80000000, 0 ]; buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0x80, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x00, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0x00, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x80, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); data = [ -0x7fffffff, -0xffffffff ]; buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0x80, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); ASSERT.equal(0x01, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0x01, buf[0]); ASSERT.equal(0x00, buf[1]); ASSERT.equal(0x00, buf[2]); ASSERT.equal(0x00, buf[3]); ASSERT.equal(0x00, buf[4]); ASSERT.equal(0x00, buf[5]); ASSERT.equal(0x00, buf[6]); 
ASSERT.equal(0x80, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); data = [ 0x0, -0x1]; buf.fill(0x66); mod_ctype.wsint64(data, 'big', buf, 0); ASSERT.equal(0xff, buf[0]); ASSERT.equal(0xff, buf[1]); ASSERT.equal(0xff, buf[2]); ASSERT.equal(0xff, buf[3]); ASSERT.equal(0xff, buf[4]); ASSERT.equal(0xff, buf[5]); ASSERT.equal(0xff, buf[6]); ASSERT.equal(0xff, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); buf.fill(0x66); mod_ctype.wsint64(data, 'little', buf, 0); ASSERT.equal(0xff, buf[0]); ASSERT.equal(0xff, buf[1]); ASSERT.equal(0xff, buf[2]); ASSERT.equal(0xff, buf[3]); ASSERT.equal(0xff, buf[4]); ASSERT.equal(0xff, buf[5]); ASSERT.equal(0xff, buf[6]); ASSERT.equal(0xff, buf[7]); ASSERT.equal(0x66, buf[8]); ASSERT.equal(0x66, buf[9]); } /* * Make sure we catch invalid writes. */ function testWriteInvalid() { var data, buf; /* Buffer too small */ buf = new Buffer(4); data = [ 0, 0]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 0); }, Error, 'buffer too small'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 0); }, Error, 'buffer too small'); /* Beyond the end of the buffer */ buf = new Buffer(12); data = [ 0, 0]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 11); }, Error, 'write beyond end of buffer'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 11); }, Error, 'write beyond end of buffer'); /* Write fractional values */ buf = new Buffer(12); data = [ 3.33, 0 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 0, 3.3 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ -3.33, 0 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 0, -3.3 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 3.33, 2.42 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ 3.33, -2.42 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ -3.33, -2.42 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); data = [ -3.33, 2.42 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write fractions'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write fractions'); /* Signs don't match */ buf = new Buffer(12); data = [ 0x800000, -0x32 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ -0x800000, 0x32 ]; 
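/* As with the fractional pairs above, the writer is expected to reject a [high, low] pair whose halves disagree in sign instead of guessing at the intended 64-bit value. */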
ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); /* Write values that are too large */ buf = new Buffer(12); data = [ 0x80000000, 0 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0x7fffffff, 0x100000000 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0x00, 0x800000000 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0xffffffffff, 0xffffff238 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ 0x23, 0xffffff238 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ -0x80000000, -0xfff238 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); data = [ -0x80000004, -0xfff238 ]; ASSERT.throws(function () { mod_ctype.wsint64(data, 'big', buf, 1); }, Error, 'write too large'); ASSERT.throws(function () { mod_ctype.wsint64(data, 'little', buf, 1); }, Error, 'write too large'); } testRead(); testWrite(); testWriteZero(); testWriteInvalid();
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.rint.js000644 000766 000024 00000006104 12455173731 042712 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/* * Tests to verify we're reading in signed integers correctly */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); /* * Test 8 bit signed integers */ function test8() { var data = new Buffer(4); data[0] = 0x23; ASSERT.equal(0x23, mod_ctype.rsint8(data, 'big', 0)); ASSERT.equal(0x23, mod_ctype.rsint8(data, 'little', 0)); data[0] = 0xff; ASSERT.equal(-1, mod_ctype.rsint8(data, 'big', 0)); ASSERT.equal(-1, mod_ctype.rsint8(data, 'little', 0)); data[0] = 0x87; data[1] = 0xab; data[2] = 0x7c; data[3] = 0xef; ASSERT.equal(-121, mod_ctype.rsint8(data, 'big', 0)); ASSERT.equal(-85, mod_ctype.rsint8(data, 'big', 1));
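/* For signed 8-bit reads, any byte >= 0x80 is interpreted via two's complement: 0x87 - 0x100 = -121 and 0xab - 0x100 = -85 above, while 0x7c stays positive (124) and 0xef maps to -17 in the checks that follow. */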
ASSERT.equal(124, mod_ctype.rsint8(data, 'big', 2)); ASSERT.equal(-17, mod_ctype.rsint8(data, 'big', 3)); ASSERT.equal(-121, mod_ctype.rsint8(data, 'little', 0)); ASSERT.equal(-85, mod_ctype.rsint8(data, 'little', 1)); ASSERT.equal(124, mod_ctype.rsint8(data, 'little', 2)); ASSERT.equal(-17, mod_ctype.rsint8(data, 'little', 3)); } function test16() { var buffer = new Buffer(6); buffer[0] = 0x16; buffer[1] = 0x79; ASSERT.equal(0x1679, mod_ctype.rsint16(buffer, 'big', 0)); ASSERT.equal(0x7916, mod_ctype.rsint16(buffer, 'little', 0)); buffer[0] = 0xff; buffer[1] = 0x80; ASSERT.equal(-128, mod_ctype.rsint16(buffer, 'big', 0)); ASSERT.equal(-32513, mod_ctype.rsint16(buffer, 'little', 0)); /* test offset with weenix */ buffer[0] = 0x77; buffer[1] = 0x65; buffer[2] = 0x65; buffer[3] = 0x6e; buffer[4] = 0x69; buffer[5] = 0x78; ASSERT.equal(0x7765, mod_ctype.rsint16(buffer, 'big', 0)); ASSERT.equal(0x6565, mod_ctype.rsint16(buffer, 'big', 1)); ASSERT.equal(0x656e, mod_ctype.rsint16(buffer, 'big', 2)); ASSERT.equal(0x6e69, mod_ctype.rsint16(buffer, 'big', 3)); ASSERT.equal(0x6978, mod_ctype.rsint16(buffer, 'big', 4)); ASSERT.equal(0x6577, mod_ctype.rsint16(buffer, 'little', 0)); ASSERT.equal(0x6565, mod_ctype.rsint16(buffer, 'little', 1)); ASSERT.equal(0x6e65, mod_ctype.rsint16(buffer, 'little', 2)); ASSERT.equal(0x696e, mod_ctype.rsint16(buffer, 'little', 3)); ASSERT.equal(0x7869, mod_ctype.rsint16(buffer, 'little', 4)); } function test32() { var buffer = new Buffer(6); buffer[0] = 0x43; buffer[1] = 0x53; buffer[2] = 0x16; buffer[3] = 0x79; ASSERT.equal(0x43531679, mod_ctype.rsint32(buffer, 'big', 0)); ASSERT.equal(0x79165343, mod_ctype.rsint32(buffer, 'little', 0)); buffer[0] = 0xff; buffer[1] = 0xfe; buffer[2] = 0xef; buffer[3] = 0xfa; ASSERT.equal(-69638, mod_ctype.rsint32(buffer, 'big', 0)); ASSERT.equal(-84934913, mod_ctype.rsint32(buffer, 'little', 0)); buffer[0] = 0x42; buffer[1] = 0xc3; buffer[2] = 0x95; buffer[3] = 0xa9; buffer[4] = 0x36; buffer[5] = 0x17; ASSERT.equal(0x42c395a9, mod_ctype.rsint32(buffer, 'big', 0)); ASSERT.equal(-1013601994, mod_ctype.rsint32(buffer, 'big', 1)); ASSERT.equal(-1784072681, mod_ctype.rsint32(buffer, 'big', 2)); ASSERT.equal(-1449802942, mod_ctype.rsint32(buffer, 'little', 0)); ASSERT.equal(917083587, mod_ctype.rsint32(buffer, 'little', 1)); ASSERT.equal(389458325, mod_ctype.rsint32(buffer, 'little', 2)); } test8(); test16(); test32();
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wbounds.js000644 000766 000024 00000002215 12455173731 043416 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/* * Test to make sure that we properly raise an error whenever we try to write * beyond the size of the integer.
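 * For example, a signed 8-bit write only accepts values in [-0x80, 0x7f], so both the 0x80 and -0x81 cases below are expected to throw; the 16-, 32- and 64-bit cases follow the same pattern at their respective bounds.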
*/ var mod_ctio = require('../../../ctio.js'); var mod_assert = require('assert'); var tb = new Buffer(16); /* Largest buffer we'll need */ var cases = [ { func: function () { mod_ctio.wsint8(0x80, 'big', tb, 0); }, test: '+int8_t' }, { func: function () { mod_ctio.wsint8(-0x81, 'big', tb, 0); }, test: '-int8_t' }, { func: function () { mod_ctio.wsint16(0x8000, 'big', tb, 0); }, test: '+int16_t' }, { func: function () { mod_ctio.wsint16(-0x8001, 'big', tb, 0); }, test: '-int16_t' }, { func: function () { mod_ctio.wsint32(0x80000000, 'big', tb, 0); }, test: '+int32_t' }, { func: function () { mod_ctio.wsint32(-0x80000001, 'big', tb, 0); }, test: '-int32_t' }, { func: function () { mod_ctio.wsint64([ 0x80000000, 0 ], 'big', tb, 0); }, test: '+int64_t' }, { func: function () { mod_ctio.wsint64([ -0x80000000, -1 ], 'big', tb, 0); }, test: '-int64_t' } ]; function test() { var ii; for (ii = 0; ii < cases.length; ii++) mod_assert.throws(cases[ii]['func'], Error, cases[ii]['test']); } test();
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/int/tst.wint.js000644 000766 000024 00000004777 12455173731 042725 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/* * Tests to verify we're writing signed integers correctly */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); function test8() { var buffer = new Buffer(4); mod_ctype.wsint8(0x23, 'big', buffer, 0); mod_ctype.wsint8(0x23, 'little', buffer, 1); mod_ctype.wsint8(-5, 'big', buffer, 2); mod_ctype.wsint8(-5, 'little', buffer, 3); ASSERT.equal(0x23, buffer[0]); ASSERT.equal(0x23, buffer[1]); ASSERT.equal(0xfb, buffer[2]); ASSERT.equal(0xfb, buffer[3]); /* Make sure we handle truncation correctly */ ASSERT.throws(function () { mod_ctype.wsint8(0xabc, 'big', buffer, 0); }); ASSERT.throws(function () { mod_ctype.wsint8(0xabc, 'little', buffer, 0); }); } function test16() { var buffer = new Buffer(6); mod_ctype.wsint16(0x0023, 'big', buffer, 0); mod_ctype.wsint16(0x0023, 'little', buffer, 2); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x23, buffer[1]); ASSERT.equal(0x23, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wsint16(-5, 'big', buffer, 0); mod_ctype.wsint16(-5, 'little', buffer, 2); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0xfb, buffer[1]); ASSERT.equal(0xfb, buffer[2]); ASSERT.equal(0xff, buffer[3]); mod_ctype.wsint16(-1679, 'big', buffer, 1); mod_ctype.wsint16(-1679, 'little', buffer, 3); ASSERT.equal(0xf9, buffer[1]); ASSERT.equal(0x71, buffer[2]); ASSERT.equal(0x71, buffer[3]); ASSERT.equal(0xf9, buffer[4]); } function test32() { var buffer = new Buffer(8); mod_ctype.wsint32(0x23, 'big', buffer, 0); mod_ctype.wsint32(0x23, 'little', buffer, 4); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x23, buffer[3]); ASSERT.equal(0x23, buffer[4]);
ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wsint32(-5, 'big', buffer, 0); mod_ctype.wsint32(-5, 'little', buffer, 4); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xfb, buffer[3]); ASSERT.equal(0xfb, buffer[4]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xff, buffer[7]); mod_ctype.wsint32(-805306713, 'big', buffer, 0); mod_ctype.wsint32(-805306713, 'little', buffer, 4); ASSERT.equal(0xcf, buffer[0]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xfe, buffer[2]); ASSERT.equal(0xa7, buffer[3]); ASSERT.equal(0xa7, buffer[4]); ASSERT.equal(0xfe, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xcf, buffer[7]); } test8(); test16(); test32();
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.rfloat.js000644 000766 000024 00000041230 12455173731 043537 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/* * Battery of tests to break our floating point implementation. Oh ho ho. * * There are a few useful ways to generate the expected output. The first is * just to write a C program that writes the raw bytes out and inspect them with xxd. * Remember to consider whether you're on a big endian or little endian machine. * Another useful site I found to help with some of this was: * * http://babbage.cs.qc.edu/IEEE-754/ */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); function testfloat() { var buffer = new Buffer(4); /* Start off with some of the easy ones: +zero */ buffer[0] = 0; buffer[1] = 0; buffer[2] = 0; buffer[3] = 0; ASSERT.equal(0, mod_ctype.rfloat(buffer, 'big', 0)); ASSERT.equal(0, mod_ctype.rfloat(buffer, 'little', 0)); /* Test -0 */ buffer[0] = 0x80; ASSERT.equal(0, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = buffer[0]; buffer[0] = 0; ASSERT.equal(0, mod_ctype.rfloat(buffer, 'little', 0)); /* Catch +infin */ buffer[0] = 0x7f; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x00; ASSERT.equal(Number.POSITIVE_INFINITY, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x7f; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(Number.POSITIVE_INFINITY, mod_ctype.rfloat(buffer, 'little', 0)); /* Catch -infin */ buffer[0] = 0xff; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x00; ASSERT.equal(Number.NEGATIVE_INFINITY, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0xff; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(Number.NEGATIVE_INFINITY, mod_ctype.rfloat(buffer, 'little', 0)); /* Catch NaN */ buffer[0] = 0x7f; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x23; ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'big', 0))); buffer[3] = 0x7f; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x23; ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'little', 0))); /* Catch NaN with the sign bit set */ buffer[0] = 0xff; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x23; ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'big', 0))); buffer[3] = 0xff; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x23; ASSERT.ok(isNaN(mod_ctype.rfloat(buffer, 'little', 0))); /* On to some basic tests */ /* 1.125 */ buffer[0] = 0x3f; buffer[1] = 0x90; buffer[2] = 0x00; buffer[3] = 0x00;
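/* 0x3f900000 decodes as sign 0, biased exponent 0x7f (i.e. 2^0) and fraction 0x100000/0x800000 = 0.125, giving (1 + 0.125) * 2^0 = 1.125. */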
ASSERT.equal(1.125, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x3f; buffer[2] = 0x90; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(1.125, mod_ctype.rfloat(buffer, 'little', 0)); /* ff34a2b0 -2.4010576103645774e+38 */ buffer[0] = 0xff; buffer[1] = 0x34; buffer[2] = 0xa2; buffer[3] = 0xb0; ASSERT.equal(-2.4010576103645774e+38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0xff; buffer[2] = 0x34; buffer[1] = 0xa2; buffer[0] = 0xb0; ASSERT.equal(-2.4010576103645774e+38, mod_ctype.rfloat(buffer, 'little', 0)); /* Denormalized tests */ /* 0003f89a +/- 3.6468792534053364e-40 */ buffer[0] = 0x00; buffer[1] = 0x03; buffer[2] = 0xf8; buffer[3] = 0x9a; ASSERT.equal(3.6468792534053364e-40, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x00; buffer[2] = 0x03; buffer[1] = 0xf8; buffer[0] = 0x9a; ASSERT.equal(3.6468792534053364e-40, mod_ctype.rfloat(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x03; buffer[2] = 0xf8; buffer[3] = 0x9a; ASSERT.equal(-3.6468792534053364e-40, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x80; buffer[2] = 0x03; buffer[1] = 0xf8; buffer[0] = 0x9a; ASSERT.equal(-3.6468792534053364e-40, mod_ctype.rfloat(buffer, 'little', 0)); /* Maximum and minimum normalized and denormalized values */ /* Largest normalized number +/- 3.4028234663852886e+38 */ buffer[0] = 0x7f; buffer[1] = 0x7f; buffer[2] = 0xff; buffer[3] = 0xff; ASSERT.equal(3.4028234663852886e+38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x7f; buffer[2] = 0x7f; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(3.4028234663852886e+38, mod_ctype.rfloat(buffer, 'little', 0)); buffer[0] = 0xff; buffer[1] = 0x7f; buffer[2] = 0xff; buffer[3] = 0xff; ASSERT.equal(-3.4028234663852886e+38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0xff; buffer[2] = 0x7f; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(-3.4028234663852886e+38, mod_ctype.rfloat(buffer, 'little', 0)); /* Smallest normalied number +/- 1.1754943508222875e-38 */ buffer[0] = 0x00; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x00; ASSERT.equal(1.1754943508222875e-38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x00; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(1.1754943508222875e-38, mod_ctype.rfloat(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x80; buffer[2] = 0x00; buffer[3] = 0x00; ASSERT.equal(-1.1754943508222875e-38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x80; buffer[2] = 0x80; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(-1.1754943508222875e-38, mod_ctype.rfloat(buffer, 'little', 0)); /* Smallest denormalized number 1.401298464324817e-45 */ buffer[0] = 0x00; buffer[1] = 0x00; buffer[2] = 0x00; buffer[3] = 0x01; ASSERT.equal(1.401298464324817e-45, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x00; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x01; ASSERT.equal(1.401298464324817e-45, mod_ctype.rfloat(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x00; buffer[2] = 0x00; buffer[3] = 0x01; ASSERT.equal(-1.401298464324817e-45, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x80; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x01; ASSERT.equal(-1.401298464324817e-45, mod_ctype.rfloat(buffer, 'little', 0)); /* Largest denormalized value +/- 1.1754942106924411e-38 */ buffer[0] = 0x00; buffer[1] = 0x7f; buffer[2] = 0xff; buffer[3] = 0xff; ASSERT.equal(1.1754942106924411e-38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x00; buffer[2] = 0x7f; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(1.1754942106924411e-38, mod_ctype.rfloat(buffer, 'little', 0)); 
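/* 0x007fffff: exponent bits all zero and fraction all ones, the largest denormal; it sits just below the smallest normalized value tested above. */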
buffer[0] = 0x80; buffer[1] = 0x7f; buffer[2] = 0xff; buffer[3] = 0xff; ASSERT.equal(-1.1754942106924411e-38, mod_ctype.rfloat(buffer, 'big', 0)); buffer[3] = 0x80; buffer[2] = 0x7f; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(-1.1754942106924411e-38, mod_ctype.rfloat(buffer, 'little', 0)); /* Do some quick offset testing */ buffer = new Buffer(6); buffer[0] = 0x7f; buffer[1] = 0x4e; buffer[2] = 0x8a; buffer[3] = 0x79; buffer[4] = 0xcd; buffer[5] = 0x3f; ASSERT.equal(2.745399582697325e+38, mod_ctype.rfloat(buffer, 'big', 0)); ASSERT.equal(1161619072, mod_ctype.rfloat(buffer, 'big', 1)); ASSERT.equal(-1.2027516403607578e-32, mod_ctype.rfloat(buffer, 'big', 2)); ASSERT.equal(8.97661320504413e+34, mod_ctype.rfloat(buffer, 'little', 0)); ASSERT.equal(-261661920, mod_ctype.rfloat(buffer, 'little', 1)); ASSERT.equal(1.605271577835083, mod_ctype.rfloat(buffer, 'little', 2)); } function testdouble() { var buffer = new Buffer(10); /* Check 0 */ buffer[0] = 0; buffer[1] = 0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(0, mod_ctype.rdouble(buffer, 'big', 0)); ASSERT.equal(0, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(0, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x80; buffer[6] = 0; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 0; ASSERT.equal(0, mod_ctype.rdouble(buffer, 'little', 0)); /* Check NaN */ buffer[0] = 0x7f; buffer[1] = 0xf0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 23; ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'big', 0))); buffer[7] = 0x7f; buffer[6] = 0xf0; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 23; ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'little', 0))); buffer[0] = 0xff; buffer[1] = 0xf0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 23; ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'big', 0))); buffer[7] = 0xff; buffer[6] = 0xf0; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 23; ASSERT.ok(isNaN(mod_ctype.rdouble(buffer, 'little', 0))); /* pos inf */ buffer[0] = 0x7f; buffer[1] = 0xf0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(Number.POSITIVE_INFINITY, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x7f; buffer[6] = 0xf0; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 0; ASSERT.equal(Number.POSITIVE_INFINITY, mod_ctype.rdouble(buffer, 'little', 0)); /* neg inf */ buffer[0] = 0xff; buffer[1] = 0xf0; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(Number.NEGATIVE_INFINITY, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0xff; buffer[6] = 0xf0; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 0; ASSERT.equal(Number.NEGATIVE_INFINITY, mod_ctype.rdouble(buffer, 'little', 0)); /* Simple normalized values */ /* +/- 1.125 */ buffer[0] = 0x3f; buffer[1] = 0xf2; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(1.125, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x3f; buffer[6] = 0xf2; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 0; ASSERT.equal(1.125, mod_ctype.rdouble(buffer, 'little', 
0)); buffer[0] = 0xbf; buffer[1] = 0xf2; buffer[2] = 0; buffer[3] = 0; buffer[4] = 0; buffer[5] = 0; buffer[6] = 0; buffer[7] = 0; ASSERT.equal(-1.125, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0xbf; buffer[6] = 0xf2; buffer[5] = 0; buffer[4] = 0; buffer[3] = 0; buffer[2] = 0; buffer[1] = 0; buffer[0] = 0; ASSERT.equal(-1.125, mod_ctype.rdouble(buffer, 'little', 0)); /* +/- 1.4397318913736026e+283 */ buffer[0] = 0x7a; buffer[1] = 0xb8; buffer[2] = 0xc9; buffer[3] = 0x34; buffer[4] = 0x72; buffer[5] = 0x16; buffer[6] = 0xf9; buffer[7] = 0x0e; ASSERT.equal(1.4397318913736026e+283, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x7a; buffer[6] = 0xb8; buffer[5] = 0xc9; buffer[4] = 0x34; buffer[3] = 0x72; buffer[2] = 0x16; buffer[1] = 0xf9; buffer[0] = 0x0e; ASSERT.equal(1.4397318913736026e+283, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0xfa; buffer[1] = 0xb8; buffer[2] = 0xc9; buffer[3] = 0x34; buffer[4] = 0x72; buffer[5] = 0x16; buffer[6] = 0xf9; buffer[7] = 0x0e; ASSERT.equal(-1.4397318913736026e+283, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0xfa; buffer[6] = 0xb8; buffer[5] = 0xc9; buffer[4] = 0x34; buffer[3] = 0x72; buffer[2] = 0x16; buffer[1] = 0xf9; buffer[0] = 0x0e; ASSERT.equal(-1.4397318913736026e+283, mod_ctype.rdouble(buffer, 'little', 0)); /* Denormalized values */ /* +/- 8.82521232268344e-309 */ buffer[0] = 0x00; buffer[1] = 0x06; buffer[2] = 0x58; buffer[3] = 0x94; buffer[4] = 0x13; buffer[5] = 0x27; buffer[6] = 0x8a; buffer[7] = 0xcd; ASSERT.equal(8.82521232268344e-309, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x00; buffer[6] = 0x06; buffer[5] = 0x58; buffer[4] = 0x94; buffer[3] = 0x13; buffer[2] = 0x27; buffer[1] = 0x8a; buffer[0] = 0xcd; ASSERT.equal(8.82521232268344e-309, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x06; buffer[2] = 0x58; buffer[3] = 0x94; buffer[4] = 0x13; buffer[5] = 0x27; buffer[6] = 0x8a; buffer[7] = 0xcd; ASSERT.equal(-8.82521232268344e-309, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x80; buffer[6] = 0x06; buffer[5] = 0x58; buffer[4] = 0x94; buffer[3] = 0x13; buffer[2] = 0x27; buffer[1] = 0x8a; buffer[0] = 0xcd; ASSERT.equal(-8.82521232268344e-309, mod_ctype.rdouble(buffer, 'little', 0)); /* Edge cases, maximum and minimum values */ /* Smallest denormalized value 5e-324 */ buffer[0] = 0x00; buffer[1] = 0x00; buffer[2] = 0x00; buffer[3] = 0x00; buffer[4] = 0x00; buffer[5] = 0x00; buffer[6] = 0x00; buffer[7] = 0x01; ASSERT.equal(5e-324, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x00; buffer[6] = 0x00; buffer[5] = 0x00; buffer[4] = 0x00; buffer[3] = 0x00; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x01; ASSERT.equal(5e-324, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x00; buffer[2] = 0x00; buffer[3] = 0x00; buffer[4] = 0x00; buffer[5] = 0x00; buffer[6] = 0x00; buffer[7] = 0x01; ASSERT.equal(-5e-324, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x80; buffer[6] = 0x00; buffer[5] = 0x00; buffer[4] = 0x00; buffer[3] = 0x00; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x01; ASSERT.equal(-5e-324, mod_ctype.rdouble(buffer, 'little', 0)); /* Largest denormalized value 2.225073858507201e-308 */ buffer[0] = 0x00; buffer[1] = 0x0f; buffer[2] = 0xff; buffer[3] = 0xff; buffer[4] = 0xff; buffer[5] = 0xff; buffer[6] = 0xff; buffer[7] = 0xff; ASSERT.equal(2.225073858507201e-308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x00; buffer[6] = 0x0f; buffer[5] = 0xff; buffer[4] = 0xff; buffer[3] = 0xff; buffer[2] = 0xff; buffer[1] = 0xff; buffer[0] 
= 0xff; ASSERT.equal(2.225073858507201e-308, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x0f; buffer[2] = 0xff; buffer[3] = 0xff; buffer[4] = 0xff; buffer[5] = 0xff; buffer[6] = 0xff; buffer[7] = 0xff; ASSERT.equal(-2.225073858507201e-308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x80; buffer[6] = 0x0f; buffer[5] = 0xff; buffer[4] = 0xff; buffer[3] = 0xff; buffer[2] = 0xff; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(-2.225073858507201e-308, mod_ctype.rdouble(buffer, 'little', 0)); /* Smallest normalized value 2.2250738585072014e-308 */ buffer[0] = 0x00; buffer[1] = 0x10; buffer[2] = 0x00; buffer[3] = 0x00; buffer[4] = 0x00; buffer[5] = 0x00; buffer[6] = 0x00; buffer[7] = 0x00; ASSERT.equal(2.2250738585072014e-308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x00; buffer[6] = 0x10; buffer[5] = 0x00; buffer[4] = 0x00; buffer[3] = 0x00; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(2.2250738585072014e-308, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0x80; buffer[1] = 0x10; buffer[2] = 0x00; buffer[3] = 0x00; buffer[4] = 0x00; buffer[5] = 0x00; buffer[6] = 0x00; buffer[7] = 0x00; ASSERT.equal(-2.2250738585072014e-308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x80; buffer[6] = 0x10; buffer[5] = 0x00; buffer[4] = 0x00; buffer[3] = 0x00; buffer[2] = 0x00; buffer[1] = 0x00; buffer[0] = 0x00; ASSERT.equal(-2.2250738585072014e-308, mod_ctype.rdouble(buffer, 'little', 0)); /* Largest normalized value 1.7976931348623157e+308 */ buffer[0] = 0x7f; buffer[1] = 0xef; buffer[2] = 0xff; buffer[3] = 0xff; buffer[4] = 0xff; buffer[5] = 0xff; buffer[6] = 0xff; buffer[7] = 0xff; ASSERT.equal(1.7976931348623157e+308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0x7f; buffer[6] = 0xef; buffer[5] = 0xff; buffer[4] = 0xff; buffer[3] = 0xff; buffer[2] = 0xff; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(1.7976931348623157e+308, mod_ctype.rdouble(buffer, 'little', 0)); buffer[0] = 0xff; buffer[1] = 0xef; buffer[2] = 0xff; buffer[3] = 0xff; buffer[4] = 0xff; buffer[5] = 0xff; buffer[6] = 0xff; buffer[7] = 0xff; ASSERT.equal(-1.7976931348623157e+308, mod_ctype.rdouble(buffer, 'big', 0)); buffer[7] = 0xff; buffer[6] = 0xef; buffer[5] = 0xff; buffer[4] = 0xff; buffer[3] = 0xff; buffer[2] = 0xff; buffer[1] = 0xff; buffer[0] = 0xff; ASSERT.equal(-1.7976931348623157e+308, mod_ctype.rdouble(buffer, 'little', 0)); /* Try offsets */ buffer[0] = 0xde; buffer[1] = 0xad; buffer[2] = 0xbe; buffer[3] = 0xef; buffer[4] = 0xba; buffer[5] = 0xdd; buffer[6] = 0xca; buffer[7] = 0xfe; buffer[8] = 0x16; buffer[9] = 0x79; ASSERT.equal(-1.1885958404126936e+148, mod_ctype.rdouble(buffer, 'big', 0)); ASSERT.equal(-2.4299184080448593e-88, mod_ctype.rdouble(buffer, 'big', 1)); ASSERT.equal(-0.000015130017658081283, mod_ctype.rdouble(buffer, 'big', 2)); ASSERT.equal(-5.757458694845505e+302, mod_ctype.rdouble(buffer, 'little', 0)); ASSERT.equal(6.436459604192476e-198, mod_ctype.rdouble(buffer, 'little', 1)); ASSERT.equal(1.9903745632417286e+275, mod_ctype.rdouble(buffer, 'little', 2)); } testfloat(); testdouble(); 
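As a cross-check on the expected constants above, independent of the C-program/xxd approach mentioned in the file header, io.js's own Buffer float readers can decode the same byte patterns. The following is a minimal, self-contained sketch and is not part of the bundled test suite:

var assert = require('assert');

/* 3f 90 00 00 is 1.125 as a big-endian single; 3f f2 00 00 00 00 00 00 is 1.125 as a big-endian double. */
var fbuf = new Buffer([0x3f, 0x90, 0x00, 0x00]);
assert.equal(1.125, fbuf.readFloatBE(0));

var dbuf = new Buffer([0x3f, 0xf2, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00]);
assert.equal(1.125, dbuf.readDoubleBE(0));

/* The little-endian expectations can be checked the same way with readFloatLE and readDoubleLE. */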
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctio/float/tst.wfloat.js000644 000766 000024 00000053516 12455173731 043546 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/* * Another place to find bugs that may yet plague us. This time with writing out * floats to arrays. We are lazy and basically just took the inverse of our * test code that reads in values. */ var mod_ctype = require('../../../ctio.js'); var ASSERT = require('assert'); /* * A useful thing to keep around for debugging * console.log('buffer[0]: ' + buffer[0].toString(16)); * console.log('buffer[1]: ' + buffer[1].toString(16)); * console.log('buffer[2]: ' + buffer[2].toString(16)); * console.log('buffer[3]: ' + buffer[3].toString(16)); * console.log('buffer[4]: ' + buffer[4].toString(16)); * console.log('buffer[5]: ' + buffer[5].toString(16)); * console.log('buffer[6]: ' + buffer[6].toString(16)); * console.log('buffer[7]: ' + buffer[7].toString(16)); */ function testfloat() { var buffer = new Buffer(4); mod_ctype.wfloat(0, 'big', buffer, 0); /* Start off with some of the easy ones: +zero */ ASSERT.equal(0, buffer[0]); ASSERT.equal(0, buffer[1]); ASSERT.equal(0, buffer[2]); ASSERT.equal(0, buffer[3]); mod_ctype.wfloat(0, 'little', buffer, 0); ASSERT.equal(0, buffer[0]); ASSERT.equal(0, buffer[1]); ASSERT.equal(0, buffer[2]); ASSERT.equal(0, buffer[3]); /* Catch +infin */ mod_ctype.wfloat(Number.POSITIVE_INFINITY, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wfloat(Number.POSITIVE_INFINITY, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Catch -infin */ mod_ctype.wfloat(Number.NEGATIVE_INFINITY, 'big', buffer, 0); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wfloat(Number.NEGATIVE_INFINITY, 'little', buffer, 0); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Catch NaN */ /* * NaN is a little weird in its requirements, so we're going to encode a * bit of how we actually implement it into this test. Probably not the * best, since technically the sign is a don't care and the mantissa * needs to just be non-zero.
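 * (For reference, an IEEE 754 single is a NaN whenever its exponent bits are all ones and its fraction is non-zero, so the 0x7f800017 pattern checked below is just one representative encoding.)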
*/ mod_ctype.wfloat(NaN, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x17, buffer[3]); mod_ctype.wfloat(NaN, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x17, buffer[0]); /* On to some basic tests */ /* 1.125 */ mod_ctype.wfloat(1.125, 'big', buffer, 0); ASSERT.equal(0x3f, buffer[0]); ASSERT.equal(0x90, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wfloat(1.125, 'little', buffer, 0); ASSERT.equal(0x3f, buffer[3]); ASSERT.equal(0x90, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); mod_ctype.wfloat(1.0000001192092896, 'big', buffer, 0); ASSERT.equal(0x3f, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x01, buffer[3]); mod_ctype.wfloat(1.0000001192092896, 'little', buffer, 0); ASSERT.equal(0x3f, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); mod_ctype.wfloat(1.0000001192092896, 'big', buffer, 0); ASSERT.equal(0x3f, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x01, buffer[3]); mod_ctype.wfloat(1.0000001192092896, 'little', buffer, 0); ASSERT.equal(0x3f, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); mod_ctype.wfloat(2.3283067140944524e-10, 'big', buffer, 0); ASSERT.equal(0x2f, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x01, buffer[3]); mod_ctype.wfloat(2.3283067140944524e-10, 'little', buffer, 0); ASSERT.equal(0x2f, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); /* ff34a2b0 -2.4010576103645774e+38 */ mod_ctype.wfloat(-2.4010576103645774e+38, 'big', buffer, 0); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0x34, buffer[1]); ASSERT.equal(0xa2, buffer[2]); ASSERT.equal(0xb0, buffer[3]); mod_ctype.wfloat(-2.4010576103645774e+38, 'little', buffer, 0); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0x34, buffer[2]); ASSERT.equal(0xa2, buffer[1]); ASSERT.equal(0xb0, buffer[0]); /* Denormalized tests */ /* 0003f89a +/- 3.6468792534053364e-40 */ mod_ctype.wfloat(3.6468792534053364e-40, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x03, buffer[1]); ASSERT.equal(0xf8, buffer[2]); ASSERT.equal(0x9a, buffer[3]); mod_ctype.wfloat(3.6468792534053364e-40, 'little', buffer, 0); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x03, buffer[2]); ASSERT.equal(0xf8, buffer[1]); ASSERT.equal(0x9a, buffer[0]); mod_ctype.wfloat(-3.6468792534053364e-40, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x03, buffer[1]); ASSERT.equal(0xf8, buffer[2]); ASSERT.equal(0x9a, buffer[3]); mod_ctype.wfloat(-3.6468792534053364e-40, 'little', buffer, 0); ASSERT.equal(0x80, buffer[3]); ASSERT.equal(0x03, buffer[2]); ASSERT.equal(0xf8, buffer[1]); ASSERT.equal(0x9a, buffer[0]); /* Maximum and minimum normalized and denormalized values */ /* Largest normalized number +/- 3.4028234663852886e+38 */ mod_ctype.wfloat(3.4028234663852886e+38, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0x7f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); mod_ctype.wfloat(3.4028234663852886e+38, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[3]); ASSERT.equal(0x7f, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); 
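/* 0x7f7fffff is the largest finite single-precision value; its negative counterpart 0xff7fffff is checked next. */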
mod_ctype.wfloat(-3.4028234663852886e+38, 'big', buffer, 0); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0x7f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); mod_ctype.wfloat(-3.4028234663852886e+38, 'little', buffer, 0); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0x7f, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); /* Smallest normalied number +/- 1.1754943508222875e-38 */ mod_ctype.wfloat(1.1754943508222875e-38, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wfloat(1.1754943508222875e-38, 'little', buffer, 0); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); mod_ctype.wfloat(-1.1754943508222875e-38, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x80, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); mod_ctype.wfloat(-1.1754943508222875e-38, 'little', buffer, 0); ASSERT.equal(0x80, buffer[3]); ASSERT.equal(0x80, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Smallest denormalized number 1.401298464324817e-45 */ mod_ctype.wfloat(1.401298464324817e-45, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x01, buffer[3]); mod_ctype.wfloat(1.401298464324817e-45, 'little', buffer, 0); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); mod_ctype.wfloat(-1.401298464324817e-45, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x01, buffer[3]); mod_ctype.wfloat(-1.401298464324817e-45, 'little', buffer, 0); ASSERT.equal(0x80, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); /* Largest denormalized value +/- 1.1754942106924411e-38 */ mod_ctype.wfloat(1.1754942106924411e-38, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x7f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); mod_ctype.wfloat(1.1754942106924411e-38, 'little', buffer, 0); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x7f, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); mod_ctype.wfloat(-1.1754942106924411e-38, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x7f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); mod_ctype.wfloat(-1.1754942106924411e-38, 'little', buffer, 0); ASSERT.equal(0x80, buffer[3]); ASSERT.equal(0x7f, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); /* Do some quick offset testing */ buffer = new Buffer(6); mod_ctype.wfloat(-1.2027516403607578e-32, 'big', buffer, 2); ASSERT.equal(0x8a, buffer[2]); ASSERT.equal(0x79, buffer[3]); ASSERT.equal(0xcd, buffer[4]); ASSERT.equal(0x3f, buffer[5]); mod_ctype.wfloat(-1.2027516403607578e-32, 'little', buffer, 2); ASSERT.equal(0x8a, buffer[5]); ASSERT.equal(0x79, buffer[4]); ASSERT.equal(0xcd, buffer[3]); ASSERT.equal(0x3f, buffer[2]); } function testdouble() { var buffer = new Buffer(10); /* Check 0 */ mod_ctype.wdouble(0, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, 
buffer[7]); mod_ctype.wdouble(0, 'little', buffer, 0); ASSERT.equal(0x00, buffer[7]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Check NaN */ /* Similar to floats we are only generating a subset of values */ mod_ctype.wdouble(NaN, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0xf0, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x17, buffer[7]); mod_ctype.wdouble(NaN, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[7]); ASSERT.equal(0xf0, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x17, buffer[0]); /* pos inf */ mod_ctype.wdouble(Number.POSITIVE_INFINITY, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0xf0, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(Number.POSITIVE_INFINITY, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[7]); ASSERT.equal(0xf0, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* neg inf */ mod_ctype.wdouble(Number.NEGATIVE_INFINITY, 'big', buffer, 0); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0xf0, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(Number.NEGATIVE_INFINITY, 'little', buffer, 0); ASSERT.equal(0xff, buffer[7]); ASSERT.equal(0xf0, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Simple normalized values */ /* +/- 1.125 */ mod_ctype.wdouble(1.125, 'big', buffer, 0); ASSERT.equal(0x3f, buffer[0]); ASSERT.equal(0xf2, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(1.125, 'little', buffer, 0); ASSERT.equal(0x3f, buffer[7]); ASSERT.equal(0xf2, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); mod_ctype.wdouble(-1.125, 'big', buffer, 0); ASSERT.equal(0xbf, buffer[0]); ASSERT.equal(0xf2, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(-1.125, 'little', buffer, 0); ASSERT.equal(0xbf, buffer[7]); ASSERT.equal(0xf2, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* +/- 1.4397318913736026e+283 */ mod_ctype.wdouble(1.4397318913736026e+283, 'big', 
buffer, 0); ASSERT.equal(0x7a, buffer[0]); ASSERT.equal(0xb8, buffer[1]); ASSERT.equal(0xc9, buffer[2]); ASSERT.equal(0x34, buffer[3]); ASSERT.equal(0x72, buffer[4]); ASSERT.equal(0x16, buffer[5]); ASSERT.equal(0xf9, buffer[6]); ASSERT.equal(0x0e, buffer[7]); mod_ctype.wdouble(1.4397318913736026e+283, 'little', buffer, 0); ASSERT.equal(0x7a, buffer[7]); ASSERT.equal(0xb8, buffer[6]); ASSERT.equal(0xc9, buffer[5]); ASSERT.equal(0x34, buffer[4]); ASSERT.equal(0x72, buffer[3]); ASSERT.equal(0x16, buffer[2]); ASSERT.equal(0xf9, buffer[1]); ASSERT.equal(0x0e, buffer[0]); mod_ctype.wdouble(-1.4397318913736026e+283, 'big', buffer, 0); ASSERT.equal(0xfa, buffer[0]); ASSERT.equal(0xb8, buffer[1]); ASSERT.equal(0xc9, buffer[2]); ASSERT.equal(0x34, buffer[3]); ASSERT.equal(0x72, buffer[4]); ASSERT.equal(0x16, buffer[5]); ASSERT.equal(0xf9, buffer[6]); ASSERT.equal(0x0e, buffer[7]); mod_ctype.wdouble(-1.4397318913736026e+283, 'little', buffer, 0); ASSERT.equal(0xfa, buffer[7]); ASSERT.equal(0xb8, buffer[6]); ASSERT.equal(0xc9, buffer[5]); ASSERT.equal(0x34, buffer[4]); ASSERT.equal(0x72, buffer[3]); ASSERT.equal(0x16, buffer[2]); ASSERT.equal(0xf9, buffer[1]); ASSERT.equal(0x0e, buffer[0]); /* Denormalized values */ /* +/- 8.82521232268344e-309 */ mod_ctype.wdouble(8.82521232268344e-309, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x06, buffer[1]); ASSERT.equal(0x58, buffer[2]); ASSERT.equal(0x94, buffer[3]); ASSERT.equal(0x13, buffer[4]); ASSERT.equal(0x27, buffer[5]); ASSERT.equal(0x8a, buffer[6]); ASSERT.equal(0xcd, buffer[7]); mod_ctype.wdouble(8.82521232268344e-309, 'little', buffer, 0); ASSERT.equal(0x00, buffer[7]); ASSERT.equal(0x06, buffer[6]); ASSERT.equal(0x58, buffer[5]); ASSERT.equal(0x94, buffer[4]); ASSERT.equal(0x13, buffer[3]); ASSERT.equal(0x27, buffer[2]); ASSERT.equal(0x8a, buffer[1]); ASSERT.equal(0xcd, buffer[0]); mod_ctype.wdouble(-8.82521232268344e-309, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x06, buffer[1]); ASSERT.equal(0x58, buffer[2]); ASSERT.equal(0x94, buffer[3]); ASSERT.equal(0x13, buffer[4]); ASSERT.equal(0x27, buffer[5]); ASSERT.equal(0x8a, buffer[6]); ASSERT.equal(0xcd, buffer[7]); mod_ctype.wdouble(-8.82521232268344e-309, 'little', buffer, 0); ASSERT.equal(0x80, buffer[7]); ASSERT.equal(0x06, buffer[6]); ASSERT.equal(0x58, buffer[5]); ASSERT.equal(0x94, buffer[4]); ASSERT.equal(0x13, buffer[3]); ASSERT.equal(0x27, buffer[2]); ASSERT.equal(0x8a, buffer[1]); ASSERT.equal(0xcd, buffer[0]); /* Edge cases, maximum and minimum values */ /* Smallest denormalized value 5e-324 */ mod_ctype.wdouble(5e-324, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x01, buffer[7]); mod_ctype.wdouble(5e-324, 'little', buffer, 0); ASSERT.equal(0x00, buffer[7]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); mod_ctype.wdouble(-5e-324, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x01, buffer[7]); mod_ctype.wdouble(-5e-324, 'little', buffer, 0); ASSERT.equal(0x80, buffer[7]); 
ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x01, buffer[0]); /* Largest denormalized value 2.225073858507201e-308 */ mod_ctype.wdouble(2.225073858507201e-308, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x0f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xff, buffer[7]); mod_ctype.wdouble(2.225073858507201e-308, 'little', buffer, 0); ASSERT.equal(0x00, buffer[7]); ASSERT.equal(0x0f, buffer[6]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); mod_ctype.wdouble(-2.225073858507201e-308, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x0f, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xff, buffer[7]); mod_ctype.wdouble(-2.225073858507201e-308, 'little', buffer, 0); ASSERT.equal(0x80, buffer[7]); ASSERT.equal(0x0f, buffer[6]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); /* Smallest normalized value 2.2250738585072014e-308 */ mod_ctype.wdouble(2.2250738585072014e-308, 'big', buffer, 0); ASSERT.equal(0x00, buffer[0]); ASSERT.equal(0x10, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(2.2250738585072014e-308, 'little', buffer, 0); ASSERT.equal(0x00, buffer[7]); ASSERT.equal(0x10, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); mod_ctype.wdouble(-2.2250738585072014e-308, 'big', buffer, 0); ASSERT.equal(0x80, buffer[0]); ASSERT.equal(0x10, buffer[1]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[6]); ASSERT.equal(0x00, buffer[7]); mod_ctype.wdouble(-2.2250738585072014e-308, 'little', buffer, 0); ASSERT.equal(0x80, buffer[7]); ASSERT.equal(0x10, buffer[6]); ASSERT.equal(0x00, buffer[5]); ASSERT.equal(0x00, buffer[4]); ASSERT.equal(0x00, buffer[3]); ASSERT.equal(0x00, buffer[2]); ASSERT.equal(0x00, buffer[1]); ASSERT.equal(0x00, buffer[0]); /* Largest normalized value 1.7976931348623157e+308 */ mod_ctype.wdouble(1.7976931348623157e+308, 'big', buffer, 0); ASSERT.equal(0x7f, buffer[0]); ASSERT.equal(0xef, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xff, buffer[7]); mod_ctype.wdouble(1.7976931348623157e+308, 'little', buffer, 0); ASSERT.equal(0x7f, buffer[7]); ASSERT.equal(0xef, buffer[6]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); mod_ctype.wdouble(-1.7976931348623157e+308, 'big', 
buffer, 0); ASSERT.equal(0xff, buffer[0]); ASSERT.equal(0xef, buffer[1]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[6]); ASSERT.equal(0xff, buffer[7]); mod_ctype.wdouble(-1.7976931348623157e+308, 'little', buffer, 0); ASSERT.equal(0xff, buffer[7]); ASSERT.equal(0xef, buffer[6]); ASSERT.equal(0xff, buffer[5]); ASSERT.equal(0xff, buffer[4]); ASSERT.equal(0xff, buffer[3]); ASSERT.equal(0xff, buffer[2]); ASSERT.equal(0xff, buffer[1]); ASSERT.equal(0xff, buffer[0]); /* Try offsets */ buffer[0] = 0xde; buffer[1] = 0xad; buffer[2] = 0xbe; buffer[3] = 0xef; buffer[4] = 0xba; buffer[5] = 0xdd; buffer[6] = 0xca; buffer[7] = 0xfe; buffer[8] = 0x16; buffer[9] = 0x79; mod_ctype.wdouble(-0.000015130017658081283, 'big', buffer, 2); ASSERT.equal(0xbe, buffer[2]); ASSERT.equal(0xef, buffer[3]); ASSERT.equal(0xba, buffer[4]); ASSERT.equal(0xdd, buffer[5]); ASSERT.equal(0xca, buffer[6]); ASSERT.equal(0xfe, buffer[7]); ASSERT.equal(0x16, buffer[8]); ASSERT.equal(0x79, buffer[9]); mod_ctype.wdouble(-0.000015130017658081283, 'little', buffer, 2); ASSERT.equal(0xbe, buffer[9]); ASSERT.equal(0xef, buffer[8]); ASSERT.equal(0xba, buffer[7]); ASSERT.equal(0xdd, buffer[6]); ASSERT.equal(0xca, buffer[5]); ASSERT.equal(0xfe, buffer[4]); ASSERT.equal(0x16, buffer[3]); ASSERT.equal(0x79, buffer[2]); } testfloat(); testdouble();
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/float.json000644 000766 000024 00000000345 12455173731 041614 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
{ "metadata": { "ctf2json_version": "1.0", "created_at": 1316563626, "derived_from": "/lib/libc.so", "ctf_version": 2, "requested_types": [ "float" ] }, "data": [ { "name": "float", "float": { "length": 4 } } ] }
npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/int.json000644 000766 000024 00000000363 12455173731 041301 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
{ "metadata": { "ctf2json_version": "1.0", "created_at": 1316563631, "derived_from": "/lib/libc.so", "ctf_version": 2, "requested_types": [ "int" ] }, "data": [ { "name": "int", "integer": { "length": 4, "signed": true } } ] }
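The two fixtures above follow the ctf2json layout consumed by the ctype CTF tests: a metadata block naming the requested types and a data array of type records. The sketch below, which uses only core io.js APIs and a hypothetical local path, shows how those fields are laid out; it is not part of the archive.

var fs = require('fs');

/* 'int.json' is a hypothetical path to a fixture like the one above. */
var ctf = JSON.parse(fs.readFileSync('int.json', 'utf8'));

console.log(ctf.metadata.requested_types); /* e.g. [ 'int' ] */
ctf.data.forEach(function (entry) {
	if (entry.integer)
		console.log(entry.name, 'length:', entry.integer.length, 'signed:', entry.integer.signed);
	else if (entry.float)
		console.log(entry.name, 'length:', entry.float.length);
	else if (entry.typedef)
		console.log(entry.name, '->', entry.typedef);
});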
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/psinfo.json���������000644 �000766 �000024 �00000010541 12455173731 042004� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "metadata": { "ctf2json_version": "1.0", "created_at": 1316563573, "derived_from": "/lib/libc.so", "ctf_version": 2, "requested_types": [ "psinfo_t" ] }, "data": [ { "name": "int", "integer": { "length": 4, "signed": true } }, { "name": "char", "integer": { "length": 1, "signed": true } }, { "name": "unsigned short", "integer": { "length": 2, "signed": false } }, { "name": "long", "integer": { "length": 4, "signed": true } }, { "name": "unsigned", "integer": { "length": 4, "signed": false } }, { "name": "size_t", "typedef": "unsigned" }, { "name": "unsigned long", "integer": { "length": 4, "signed": false } }, { "name": "time_t", "typedef": "long" }, { "name": "struct timespec", "struct": [ { "name": "tv_sec", "type": "time_t" }, { "name": "tv_nsec", "type": "long" } ] }, { "name": "zoneid_t", "typedef": "long" }, { "name": "taskid_t", "typedef": "long" }, { "name": "dev_t", "typedef": "unsigned long" }, { "name": "uid_t", "typedef": "unsigned" }, { "name": "gid_t", "typedef": "unsigned" }, { "name": "timestruc_t", "typedef": "struct timespec" }, { "name": "short", "integer": { "length": 2, "signed": true } }, { "name": "projid_t", "typedef": "long" }, { "name": "ushort_t", "typedef": "unsigned short" }, { "name": "poolid_t", "typedef": "long" }, { "name": "uintptr_t", "typedef": "unsigned" }, { "name": "id_t", "typedef": "long" }, { "name": "pid_t", "typedef": "long" }, { "name": "processorid_t", "typedef": "int" }, { "name": "psetid_t", "typedef": "int" }, { "name": "struct lwpsinfo", "struct": [ { "name": "pr_flag", "type": "int" }, { "name": "pr_lwpid", "type": "id_t" }, { "name": "pr_addr", "type": "uintptr_t" }, { "name": "pr_wchan", "type": "uintptr_t" }, { "name": "pr_stype", "type": "char" }, { "name": "pr_state", "type": "char" }, { "name": "pr_sname", "type": "char" }, { "name": "pr_nice", "type": "char" }, { "name": "pr_syscall", "type": "short" }, { "name": "pr_oldpri", "type": "char" }, { "name": "pr_cpu", "type": "char" }, { "name": "pr_pri", "type": "int" }, { "name": "pr_pctcpu", "type": "ushort_t" }, { "name": "pr_pad", "type": "ushort_t" }, { "name": "pr_start", "type": "timestruc_t" }, { "name": "pr_time", "type": "timestruc_t" }, { "name": "pr_clname", "type": "char [8]" }, { "name": "pr_name", "type": "char [16]" }, { "name": "pr_onpro", "type": "processorid_t" }, { "name": "pr_bindpro", "type": "processorid_t" }, { "name": "pr_bindpset", "type": "psetid_t" }, { "name": "pr_lgrp", "type": "int" }, { "name": "pr_filler", "type": "int [4]" } ] }, { "name": "lwpsinfo_t", "typedef": "struct lwpsinfo" }, { "name": "struct psinfo", "struct": [ { "name": "pr_flag", "type": "int" }, { "name": "pr_nlwp", "type": "int" }, { "name": "pr_pid", "type": "pid_t" }, { "name": "pr_ppid", "type": "pid_t" }, { "name": 
"pr_pgid", "type": "pid_t" }, { "name": "pr_sid", "type": "pid_t" }, { "name": "pr_uid", "type": "uid_t" }, { "name": "pr_euid", "type": "uid_t" }, { "name": "pr_gid", "type": "gid_t" }, { "name": "pr_egid", "type": "gid_t" }, { "name": "pr_addr", "type": "uintptr_t" }, { "name": "pr_size", "type": "size_t" }, { "name": "pr_rssize", "type": "size_t" }, { "name": "pr_pad1", "type": "size_t" }, { "name": "pr_ttydev", "type": "dev_t" }, { "name": "pr_pctcpu", "type": "ushort_t" }, { "name": "pr_pctmem", "type": "ushort_t" }, { "name": "pr_start", "type": "timestruc_t" }, { "name": "pr_time", "type": "timestruc_t" }, { "name": "pr_ctime", "type": "timestruc_t" }, { "name": "pr_fname", "type": "char [16]" }, { "name": "pr_psargs", "type": "char [80]" }, { "name": "pr_wstat", "type": "int" }, { "name": "pr_argc", "type": "int" }, { "name": "pr_argv", "type": "uintptr_t" }, { "name": "pr_envp", "type": "uintptr_t" }, { "name": "pr_dmodel", "type": "char" }, { "name": "pr_pad2", "type": "char [3]" }, { "name": "pr_taskid", "type": "taskid_t" }, { "name": "pr_projid", "type": "projid_t" }, { "name": "pr_nzomb", "type": "int" }, { "name": "pr_poolid", "type": "poolid_t" }, { "name": "pr_zoneid", "type": "zoneid_t" }, { "name": "pr_contract", "type": "id_t" }, { "name": "pr_filler", "type": "int [1]" }, { "name": "pr_lwp", "type": "lwpsinfo_t" } ] }, { "name": "psinfo_t", "typedef": "struct psinfo" } ] } ���������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/struct.json���������000644 �000766 �000024 �00000000750 12455173731 042033� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "metadata": { "ctf2json_version": "1.0", "created_at": 1316563648, "derived_from": "/lib/libc.so", "ctf_version": 2, "requested_types": [ "timestruc_t" ] }, "data": [ { "name": "long", "integer": { "length": 4, "signed": true } }, { "name": "time_t", "typedef": "long" }, { "name": "struct timespec", "struct": [ { "name": "tv_sec", "type": "time_t" }, { "name": "tv_nsec", "type": "long" } ] }, { "name": "timestruc_t", "typedef": "struct timespec" } ] } ������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.fail.js���������000644 �000766 �000024 �00000002272 12455173731 041677� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������/* * Test several conditions that should always cause us to throw. 
*/ var mod_assert = require('assert'); var mod_ctype = require('../../ctype.js'); var cases = [ { json: { }, msg: 'bad JSON - no metadata or data' }, { json: { metadata: {} }, msg: 'bad JSON - bad metadata section' }, { json: { metadata: { 'JSON version': [] } }, msg: 'bad JSON - bad JSON version' }, { json: { metadata: { 'JSON version': 2 } }, msg: 'bad JSON - bad JSON version' }, { json: { metadata: { 'JSON version': '100.20' } }, msg: 'bad JSON - bad JSON version' }, { json: { metadata: { 'JSON version': '1.0' } }, msg: 'missing data section' }, { json: { metadata: { 'JSON version': '1.0' }, data: 1 }, msg: 'invalid data section' }, { json: { metadata: { 'JSON version': '1.0' }, data: 1.1 }, msg: 'invalid data section' }, { json: { metadata: { 'JSON version': '1.0' }, data: '1.1' }, msg: 'invalid data section' }, { json: { metadata: { 'JSON version': '1.0' }, data: {} }, msg: 'invalid data section' } ]; function test() { var ii; for (ii = 0; ii < cases.length; ii++) { mod_assert.throws(function () { mod_ctype.parseCTF(cases[ii].json); }, Error, cases[ii].msg); } } test(); ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.float.js��������000644 �000766 �000024 �00000000505 12455173731 042066� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var mod_fs = require('fs'); var mod_ctype = require('../../ctype.js'); var mod_assert = require('assert'); function test() { var data, parser; data = JSON.parse(mod_fs.readFileSync('./float.json').toString()); parser = mod_ctype.parseCTF(data, { endian: 'big' }); mod_assert.deepEqual(parser.lstypes(), {}); } test(); �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.int.js����������000644 �000766 �000024 �00000000525 12455173731 041555� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var mod_fs = require('fs'); var mod_ctype = require('../../ctype.js'); var mod_assert = require('assert'); function test() { var data, parser; data = JSON.parse(mod_fs.readFileSync('./int.json').toString()); parser = mod_ctype.parseCTF(data, { endian: 'big' }); mod_assert.deepEqual(parser.lstypes(), { 'int': 'int32_t' }); } test(); 
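The tst.*.js files in this directory all feed a CTF JSON descriptor (such as int.json above) through `mod_ctype.parseCTF()` and inspect the resulting type table with `lstypes()`. A small sketch of the same call with the descriptor inlined rather than read from disk — the `require('ctype')` path is an assumption; the tests require `../../ctype.js`:

```js
/* Parse an inlined copy of the int.json descriptor and list the known types. */
var mod_assert = require('assert');
var mod_ctype = require('ctype'); /* assumed install path */

var descriptor = {
    metadata: {
        ctf2json_version: '1.0',
        created_at: 1316563631,
        derived_from: '/lib/libc.so',
        ctf_version: 2,
        requested_types: [ 'int' ]
    },
    data: [
        { name: 'int', integer: { length: 4, signed: true } }
    ]
};

var parser = mod_ctype.parseCTF(descriptor, { endian: 'big' });

/* A 4-byte signed integer is reported as the built-in int32_t type. */
mod_assert.deepEqual(parser.lstypes(), { 'int': 'int32_t' });
```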
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.psinfo.js�������000644 �000766 �000024 �00000000562 12455173731 042262� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var mod_fs = require('fs'); var mod_ctype = require('../../ctype.js'); var mod_assert = require('assert'); /* * This is too unwieldly to actually write out. Just make sure we can parse it * without errrors. */ function test() { var data; data = JSON.parse(mod_fs.readFileSync('./psinfo.json').toString()); mod_ctype.parseCTF(data, { endian: 'big' }); } test(); ����������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.struct.js�������000644 �000766 �000024 �00000001003 12455173731 042277� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var mod_fs = require('fs'); var mod_ctype = require('../../ctype.js'); var mod_assert = require('assert'); function test() { var data, parser; data = JSON.parse(mod_fs.readFileSync('./struct.json').toString()); parser = mod_ctype.parseCTF(data, { endian: 'big' }); mod_assert.deepEqual(parser.lstypes(), { 'long': 'int32_t', 'time_t': 'long', 'timestruc_t': 'struct timespec', 'struct timespec': [ { 'tv_sec': { 'type': 'time_t' } }, { 'tv_nsec': { 'type': 'long' } } ] }); } test(); �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/tst.typedef.js������000644 �000766 �000024 �00000000556 12455173731 042427� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var mod_fs = require('fs'); var mod_ctype = require('../../ctype.js'); var mod_assert = require('assert'); function test() { var data, parser; data = JSON.parse(mod_fs.readFileSync('./typedef.json').toString()); parser = mod_ctype.parseCTF(data, { endian: 'big' }); mod_assert.deepEqual(parser.lstypes(), { 'bar_t': 'int', 'int': 'int32_t' }); } 
test(); ��������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tst/ctf/typedef.json��������000644 �000766 �000024 �00000000436 12455173731 042150� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "metadata": { "ctf2json_version": "1.0", "created_at": 1316302348, "derived_from": "/lib/libc.so", "ctf_version": 2, "requested_types": [ "bar_t" ] }, "data": [ { "name": "int", "integer": { "length": 4, "signed": true } }, { "name": "bar_t", "typedef": "int" } ] } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsl.conf�000755 �000766 �000024 �00000014140 12455173731 041026� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������# # Configuration File for JavaScript Lint 0.3.0 # Developed by Matthias Miller (http://www.JavaScriptLint.com) # # This configuration file can be used to lint a collection of scripts, or to enable # or disable warnings for scripts that are linted via the command line. # ### Warnings # Enable or disable warnings based on requirements. # Use "+WarningName" to display or "-WarningName" to suppress. # +no_return_value # function {0} does not always return a value +duplicate_formal # duplicate formal argument {0} +equal_as_assign # test for equality (==) mistyped as assignment (=)?{0} +var_hides_arg # variable {0} hides argument +redeclared_var # redeclaration of {0} {1} +anon_no_return_value # anonymous function does not always return a value +missing_semicolon # missing semicolon +meaningless_block # meaningless block; curly braces have no impact +comma_separated_stmts # multiple statements separated by commas (use semicolons?) +unreachable_code # unreachable code +missing_break # missing break statement +missing_break_for_last_case # missing break statement for last case in switch +comparison_type_conv # comparisons against null, 0, true, false, or an empty string allowing implicit type conversion (use === or !==) -inc_dec_within_stmt # increment (++) and decrement (--) operators used as part of greater statement +useless_void # use of the void type may be unnecessary (void is always undefined) -useless_quotes # quotation marks are unnecessary +multiple_plus_minus # unknown order of operations for successive plus (e.g. x+++y) or minus (e.g. 
x---y) signs +use_of_label # use of label -block_without_braces # block statement without curly braces +leading_decimal_point # leading decimal point may indicate a number or an object member +trailing_decimal_point # trailing decimal point may indicate a number or an object member -octal_number # leading zeros make an octal number +nested_comment # nested comment +misplaced_regex # regular expressions should be preceded by a left parenthesis, assignment, colon, or comma +ambiguous_newline # unexpected end of line; it is ambiguous whether these lines are part of the same statement +empty_statement # empty statement or extra semicolon -missing_option_explicit # the "option explicit" control comment is missing +partial_option_explicit # the "option explicit" control comment, if used, must be in the first script tag +dup_option_explicit # duplicate "option explicit" control comment +useless_assign # useless assignment +ambiguous_nested_stmt # block statements containing block statements should use curly braces to resolve ambiguity +ambiguous_else_stmt # the else statement could be matched with one of multiple if statements (use curly braces to indicate intent) +missing_default_case # missing default case in switch statement +duplicate_case_in_switch # duplicate case in switch statements +default_not_at_end # the default case is not at the end of the switch statement +legacy_cc_not_understood # couldn't understand control comment using /*@keyword@*/ syntax +jsl_cc_not_understood # couldn't understand control comment using /*jsl:keyword*/ syntax +useless_comparison # useless comparison; comparing identical expressions +with_statement # with statement hides undeclared variables; use temporary variable instead +trailing_comma_in_array # extra comma is not recommended in array initializers +assign_to_function_call # assignment to a function call +parseint_missing_radix # parseInt missing radix parameter -unreferenced_argument # argument declared but never referenced: {name} ### Output format # Customize the format of the error message. # __FILE__ indicates current file path # __FILENAME__ indicates current file name # __LINE__ indicates current line # __ERROR__ indicates error message # # Visual Studio syntax (default): +output-format __FILE__(__LINE__): __ERROR__ # Alternative syntax: #+output-format __FILE__:__LINE__: __ERROR__ ### Context # Show the in-line position of the error. # Use "+context" to display or "-context" to suppress. # +context ### Semicolons # By default, assignments of an anonymous function to a variable or # property (such as a function prototype) must be followed by a semicolon. # #+lambda_assign_requires_semicolon # deprecated setting ### Control Comments # Both JavaScript Lint and the JScript interpreter confuse each other with the syntax for # the /*@keyword@*/ control comments and JScript conditional comments. (The latter is # enabled in JScript with @cc_on@). The /*jsl:keyword*/ syntax is preferred for this reason, # although legacy control comments are enabled by default for backward compatibility. # +legacy_control_comments ### JScript Function Extensions # JScript allows member functions to be defined like this: # function MyObj() { /*constructor*/ } # function MyObj.prototype.go() { /*member function*/ } # # It also allows events to be attached like this: # function window::onload() { /*init page*/ } # # This is a Microsoft-only JavaScript extension. Enable this setting to allow them. 
# #-jscript_function_extensions # deprecated setting ### Defining identifiers # By default, "option explicit" is enabled on a per-file basis. # To enable this for all files, use "+always_use_option_explicit" -always_use_option_explicit # Define certain identifiers of which the lint is not aware. # (Use this in conjunction with the "undeclared identifier" warning.) # # Common uses for webpages might be: #+define window #+define document +define require +define exports +define console +define Buffer +define JSON ### Files # Specify which files to lint # Use "+recurse" to enable recursion (disabled by default). # To add a set of files, use "+process FileName", "+process Folder\Path\*.js", # or "+process Folder\Path\*.htm". # #+process jsl-test.js ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/tools/jsstyle��000755 �000766 �000024 �00000052127 12455173731 041016� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env perl # # CDDL HEADER START # # The contents of this file are subject to the terms of the # Common Development and Distribution License (the "License"). # You may not use this file except in compliance with the License. # # You can obtain a copy of the license at usr/src/OPENSOLARIS.LICENSE # or http://www.opensolaris.org/os/licensing. # See the License for the specific language governing permissions # and limitations under the License. # # When distributing Covered Code, include this CDDL HEADER in each # file and include the License file at usr/src/OPENSOLARIS.LICENSE. # If applicable, add the following below this CDDL HEADER, with the # fields enclosed by brackets "[]" replaced with your own identifying # information: Portions Copyright [yyyy] [name of copyright owner] # # CDDL HEADER END # # # Copyright 2008 Sun Microsystems, Inc. All rights reserved. # Use is subject to license terms. # # Copyright 2011 Joyent, Inc. All rights reserved. # # jsstyle - check for some common stylistic errors. # # jsstyle is a sort of "lint" for Javascript coding style. This tool is # derived from the cstyle tool, used to check for the style used in the # Solaris kernel, sometimes known as "Bill Joy Normal Form". # # There's a lot this can't check for, like proper indentation of code # blocks. There's also a lot more this could check for. # # A note to the non perl literate: # # perl regular expressions are pretty much like egrep # regular expressions, with the following special symbols # # \s any space character # \S any non-space character # \w any "word" character [a-zA-Z0-9_] # \W any non-word character # \d a digit [0-9] # \D a non-digit # \b word boundary (between \w and \W) # \B non-word boundary # require 5.0; use IO::File; use Getopt::Std; use strict; my $usage = "usage: jsstyle [-chvC] [-o constructs] file ... 
-c check continuation indentation inside functions -h perform heuristic checks that are sometimes wrong -v verbose -C don't check anything in header block comments -o constructs allow a comma-seperated list of optional constructs: doxygen allow doxygen-style block comments (/** /*!) splint allow splint-style lint comments (/*@ ... @*/) "; my %opts; if (!getopts("cho:vC", \%opts)) { print $usage; exit 2; } my $check_continuation = $opts{'c'}; my $heuristic = $opts{'h'}; my $verbose = $opts{'v'}; my $ignore_hdr_comment = $opts{'C'}; my $doxygen_comments = 0; my $splint_comments = 0; if (defined($opts{'o'})) { for my $x (split /,/, $opts{'o'}) { if ($x eq "doxygen") { $doxygen_comments = 1; } elsif ($x eq "splint") { $splint_comments = 1; } else { print "jsstyle: unrecognized construct \"$x\"\n"; print $usage; exit 2; } } } my ($filename, $line, $prev); # shared globals my $fmt; my $hdr_comment_start; if ($verbose) { $fmt = "%s: %d: %s\n%s\n"; } else { $fmt = "%s: %d: %s\n"; } if ($doxygen_comments) { # doxygen comments look like "/*!" or "/**"; allow them. $hdr_comment_start = qr/^\s*\/\*[\!\*]?$/; } else { $hdr_comment_start = qr/^\s*\/\*$/; } # Note, following must be in single quotes so that \s and \w work right. my $lint_re = qr/\/\*(?: jsl:\w+?|ARGSUSED[0-9]*|NOTREACHED|LINTLIBRARY|VARARGS[0-9]*| CONSTCOND|CONSTANTCOND|CONSTANTCONDITION|EMPTY| FALLTHRU|FALLTHROUGH|LINTED.*?|PRINTFLIKE[0-9]*| PROTOLIB[0-9]*|SCANFLIKE[0-9]*|JSSTYLED.*? )\*\//x; my $splint_re = qr/\/\*@.*?@\*\//x; my $err_stat = 0; # exit status if ($#ARGV >= 0) { foreach my $arg (@ARGV) { my $fh = new IO::File $arg, "r"; if (!defined($fh)) { printf "%s: cannot open\n", $arg; } else { &jsstyle($arg, $fh); close $fh; } } } else { &jsstyle("<stdin>", *STDIN); } exit $err_stat; my $no_errs = 0; # set for JSSTYLED-protected lines sub err($) { my ($error) = @_; unless ($no_errs) { printf $fmt, $filename, $., $error, $line; $err_stat = 1; } } sub err_prefix($$) { my ($prevline, $error) = @_; my $out = $prevline."\n".$line; unless ($no_errs) { printf $fmt, $filename, $., $error, $out; $err_stat = 1; } } sub err_prev($) { my ($error) = @_; unless ($no_errs) { printf $fmt, $filename, $. - 1, $error, $prev; $err_stat = 1; } } sub jsstyle($$) { my ($fn, $filehandle) = @_; $filename = $fn; # share it globally my $in_cpp = 0; my $next_in_cpp = 0; my $in_comment = 0; my $in_header_comment = 0; my $comment_done = 0; my $in_function = 0; my $in_function_header = 0; my $in_declaration = 0; my $note_level = 0; my $nextok = 0; my $nocheck = 0; my $in_string = 0; my ($okmsg, $comment_prefix); $line = ''; $prev = ''; reset_indent(); line: while (<$filehandle>) { s/\r?\n$//; # strip return and newline # save the original line, then remove all text from within # double or single quotes, we do not want to check such text. $line = $_; # # C allows strings to be continued with a backslash at the end of # the line. We translate that into a quoted string on the previous # line followed by an initial quote on the next line. # # (we assume that no-one will use backslash-continuation with character # constants) # $_ = '"' . $_ if ($in_string && !$nocheck && !$in_comment); # # normal strings and characters # s/'([^\\']|\\.)*'/\'\'/g; s/"([^\\"]|\\.)*"/\"\"/g; # # detect string continuation # if ($nocheck || $in_comment) { $in_string = 0; } else { # # Now that all full strings are replaced with "", we check # for unfinished strings continuing onto the next line. 
# $in_string = (s/([^"](?:"")*)"([^\\"]|\\.)*\\$/$1""/ || s/^("")*"([^\\"]|\\.)*\\$/""/); } # # figure out if we are in a cpp directive # $in_cpp = $next_in_cpp || /^\s*#/; # continued or started $next_in_cpp = $in_cpp && /\\$/; # only if continued # strip off trailing backslashes, which appear in long macros s/\s*\\$//; # an /* END JSSTYLED */ comment ends a no-check block. if ($nocheck) { if (/\/\* *END *JSSTYLED *\*\//) { $nocheck = 0; } else { reset_indent(); next line; } } # a /*JSSTYLED*/ comment indicates that the next line is ok. if ($nextok) { if ($okmsg) { err($okmsg); } $nextok = 0; $okmsg = 0; if (/\/\* *JSSTYLED.*\*\//) { /^.*\/\* *JSSTYLED *(.*) *\*\/.*$/; $okmsg = $1; $nextok = 1; } $no_errs = 1; } elsif ($no_errs) { $no_errs = 0; } # check length of line. # first, a quick check to see if there is any chance of being too long. if (($line =~ tr/\t/\t/) * 7 + length($line) > 80) { # yes, there is a chance. # replace tabs with spaces and check again. my $eline = $line; 1 while $eline =~ s/\t+/' ' x (length($&) * 8 - length($`) % 8)/e; if (length($eline) > 80) { err("line > 80 characters"); } } # ignore NOTE(...) annotations (assumes NOTE is on lines by itself). if ($note_level || /\b_?NOTE\s*\(/) { # if in NOTE or this is NOTE s/[^()]//g; # eliminate all non-parens $note_level += s/\(//g - length; # update paren nest level next; } # a /* BEGIN JSSTYLED */ comment starts a no-check block. if (/\/\* *BEGIN *JSSTYLED *\*\//) { $nocheck = 1; } # a /*JSSTYLED*/ comment indicates that the next line is ok. if (/\/\* *JSSTYLED.*\*\//) { /^.*\/\* *JSSTYLED *(.*) *\*\/.*$/; $okmsg = $1; $nextok = 1; } if (/\/\/ *JSSTYLED/) { /^.*\/\/ *JSSTYLED *(.*)$/; $okmsg = $1; $nextok = 1; } # universal checks; apply to everything if (/\t +\t/) { err("spaces between tabs"); } if (/ \t+ /) { err("tabs between spaces"); } if (/\s$/) { err("space or tab at end of line"); } if (/[^ \t(]\/\*/ && !/\w\(\/\*.*\*\/\);/) { err("comment preceded by non-blank"); } # is this the beginning or ending of a function? # (not if "struct foo\n{\n") if (/^{$/ && $prev =~ /\)\s*(const\s*)?(\/\*.*\*\/\s*)?\\?$/) { $in_function = 1; $in_declaration = 1; $in_function_header = 0; $prev = $line; next line; } if (/^}\s*(\/\*.*\*\/\s*)*$/) { if ($prev =~ /^\s*return\s*;/) { err_prev("unneeded return at end of function"); } $in_function = 0; reset_indent(); # we don't check between functions $prev = $line; next line; } if (/^\w*\($/) { $in_function_header = 1; } # a blank line terminates the declarations within a function. # XXX - but still a problem in sub-blocks. if ($in_declaration && /^$/) { $in_declaration = 0; } if ($comment_done) { $in_comment = 0; $in_header_comment = 0; $comment_done = 0; } # does this looks like the start of a block comment? if (/$hdr_comment_start/) { if (!/^\t*\/\*/) { err("block comment not indented by tabs"); } $in_comment = 1; /^(\s*)\//; $comment_prefix = $1; if ($comment_prefix eq "") { $in_header_comment = 1; } $prev = $line; next line; } # are we still in the block comment? if ($in_comment) { if (/^$comment_prefix \*\/$/) { $comment_done = 1; } elsif (/\*\//) { $comment_done = 1; err("improper block comment close") unless ($ignore_hdr_comment && $in_header_comment); } elsif (!/^$comment_prefix \*[ \t]/ && !/^$comment_prefix \*$/) { err("improper block comment") unless ($ignore_hdr_comment && $in_header_comment); } } if ($in_header_comment && $ignore_hdr_comment) { $prev = $line; next line; } # check for errors that might occur in comments and in code. 
# allow spaces to be used to draw pictures in header comments. #if (/[^ ] / && !/".* .*"/ && !$in_header_comment) { # err("spaces instead of tabs"); #} #if (/^ / && !/^ \*[ \t\/]/ && !/^ \*$/ && # (!/^ \w/ || $in_function != 0)) { # err("indent by spaces instead of tabs"); #} if (/^ {2,}/ && !/^ [^ ]/) { err("indent by spaces instead of tabs"); } if (/^\t+ [^ \t\*]/ || /^\t+ \S/ || /^\t+ \S/) { err("continuation line not indented by 4 spaces"); } if (/^\s*\/\*./ && !/^\s*\/\*.*\*\// && !/$hdr_comment_start/) { err("improper first line of block comment"); } if ($in_comment) { # still in comment, don't do further checks $prev = $line; next line; } if ((/[^(]\/\*\S/ || /^\/\*\S/) && !(/$lint_re/ || ($splint_comments && /$splint_re/))) { err("missing blank after open comment"); } if (/\S\*\/[^)]|\S\*\/$/ && !(/$lint_re/ || ($splint_comments && /$splint_re/))) { err("missing blank before close comment"); } if (/\/\/\S/) { # C++ comments err("missing blank after start comment"); } # check for unterminated single line comments, but allow them when # they are used to comment out the argument list of a function # declaration. if (/\S.*\/\*/ && !/\S.*\/\*.*\*\// && !/\(\/\*/) { err("unterminated single line comment"); } if (/^(#else|#endif|#include)(.*)$/) { $prev = $line; next line; } # # delete any comments and check everything else. Note that # ".*?" is a non-greedy match, so that we don't get confused by # multiple comments on the same line. # s/\/\*.*?\*\///g; s/\/\/.*$//; # C++ comments # delete any trailing whitespace; we have already checked for that. s/\s*$//; # following checks do not apply to text in comments. if (/"/) { err("literal string using double-quote instead of single"); } if (/[^=!<>\s][!<>=]=/ || /[^<>!=][!<>=]==?[^\s,=]/ || (/[^->]>[^,=>\s]/ && !/[^->]>$/) || (/[^<]<[^,=<\s]/ && !/[^<]<$/) || /[^<\s]<[^<]/ || /[^->\s]>[^>]/) { err("missing space around relational operator"); } if (/\S>>=/ || /\S<<=/ || />>=\S/ || /<<=\S/ || /\S[-+*\/&|^%]=/ || (/[^-+*\/&|^%!<>=\s]=[^=]/ && !/[^-+*\/&|^%!<>=\s]=$/) || (/[^!<>=]=[^=\s]/ && !/[^!<>=]=$/)) { # XXX - should only check this for C++ code # XXX - there are probably other forms that should be allowed if (!/\soperator=/) { err("missing space around assignment operator"); } } if (/[,;]\S/ && !/\bfor \(;;\)/) { err("comma or semicolon followed by non-blank"); } # allow "for" statements to have empty "while" clauses if (/\s[,;]/ && !/^[\t]+;$/ && !/^\s*for \([^;]*; ;[^;]*\)/) { err("comma or semicolon preceded by blank"); } if (/^\s*(&&|\|\|)/) { err("improper boolean continuation"); } if (/\S *(&&|\|\|)/ || /(&&|\|\|) *\S/) { err("more than one space around boolean operator"); } if (/\b(delete|typeof|instanceOf|throw|with|catch|new|function|in|for|if|while|switch|return|case)\(/) { err("missing space between keyword and paren"); } if (/(\b(catch|for|if|with|while|switch|return)\b.*){2,}/) { # multiple "case" and "sizeof" allowed err("more than one keyword on line"); } if (/\b(delete|typeof|instanceOf|with|throw|catch|new|function|in|for|if|while|switch|return|case)\s\s+\(/ && !/^#if\s+\(/) { err("extra space between keyword and paren"); } # try to detect "func (x)" but not "if (x)" or # "#define foo (x)" or "int (*func)();" if (/\w\s\(/) { my $s = $_; # strip off all keywords on the line s/\b(delete|typeof|instanceOf|throw|with|catch|new|function|in|for|if|while|switch|return|case)\s\(/XXX(/g; s/#elif\s\(/XXX(/g; s/^#define\s+\w+\s+\(/XXX(/; # do not match things like "void (*f)();" # or "typedef void (func_t)();" s/\w\s\(+\*/XXX(*/g; 
s/\b(void)\s+\(+/XXX(/og; if (/\w\s\(/) { err("extra space between function name and left paren"); } $_ = $s; } if (/^\s*return\W[^;]*;/ && !/^\s*return\s*\(.*\);/) { err("unparenthesized return expression"); } if (/\btypeof\b/ && !/\btypeof\s*\(.*\)/) { err("unparenthesized typeof expression"); } if (/\(\s/) { err("whitespace after left paren"); } # allow "for" statements to have empty "continue" clauses if (/\s\)/ && !/^\s*for \([^;]*;[^;]*; \)/) { err("whitespace before right paren"); } if (/^\s*\(void\)[^ ]/) { err("missing space after (void) cast"); } if (/\S{/ && !/({|\(){/) { err("missing space before left brace"); } if ($in_function && /^\s+{/ && ($prev =~ /\)\s*$/ || $prev =~ /\bstruct\s+\w+$/)) { err("left brace starting a line"); } if (/}(else|while)/) { err("missing space after right brace"); } if (/}\s\s+(else|while)/) { err("extra space after right brace"); } if (/^\s+#/) { err("preprocessor statement not in column 1"); } if (/^#\s/) { err("blank after preprocessor #"); } # # We completely ignore, for purposes of indentation: # * lines outside of functions # * preprocessor lines # if ($check_continuation && $in_function && !$in_cpp) { process_indent($_); } if ($heuristic) { # cannot check this everywhere due to "struct {\n...\n} foo;" if ($in_function && !$in_declaration && /}./ && !/}\s+=/ && !/{.*}[;,]$/ && !/}(\s|)*$/ && !/} (else|while)/ && !/}}/) { err("possible bad text following right brace"); } # cannot check this because sub-blocks in # the middle of code are ok if ($in_function && /^\s+{/) { err("possible left brace starting a line"); } } if (/^\s*else\W/) { if ($prev =~ /^\s*}$/) { err_prefix($prev, "else and right brace should be on same line"); } } $prev = $line; } if ($prev eq "") { err("last line in file is blank"); } } # # Continuation-line checking # # The rest of this file contains the code for the continuation checking # engine. It's a pretty simple state machine which tracks the expression # depth (unmatched '('s and '['s). # # Keep in mind that the argument to process_indent() has already been heavily # processed; all comments have been replaced by control-A, and the contents of # strings and character constants have been elided. # my $cont_in; # currently inside of a continuation my $cont_off; # skipping an initializer or definition my $cont_noerr; # suppress cascading errors my $cont_start; # the line being continued my $cont_base; # the base indentation my $cont_first; # this is the first line of a statement my $cont_multiseg; # this continuation has multiple segments my $cont_special; # this is a C statement (if, for, etc.) my $cont_macro; # this is a macro my $cont_case; # this is a multi-line case my @cont_paren; # the stack of unmatched ( and [s we've seen sub reset_indent() { $cont_in = 0; $cont_off = 0; } sub delabel($) { # # replace labels with tabs. Note that there may be multiple # labels on a line. 
# local $_ = $_[0]; while (/^(\t*)( *(?:(?:\w+\s*)|(?:case\b[^:]*)): *)(.*)$/) { my ($pre_tabs, $label, $rest) = ($1, $2, $3); $_ = $pre_tabs; while ($label =~ s/^([^\t]*)(\t+)//) { $_ .= "\t" x (length($2) + length($1) / 8); } $_ .= ("\t" x (length($label) / 8)).$rest; } return ($_); } sub process_indent($) { require strict; local $_ = $_[0]; # preserve the global $_ s///g; # No comments s/\s+$//; # Strip trailing whitespace return if (/^$/); # skip empty lines # regexps used below; keywords taking (), macros, and continued cases my $special = '(?:(?:\}\s*)?else\s+)?(?:if|for|while|switch)\b'; my $macro = '[A-Z_][A-Z_0-9]*\('; my $case = 'case\b[^:]*$'; # skip over enumerations, array definitions, initializers, etc. if ($cont_off <= 0 && !/^\s*$special/ && (/(?:(?:\b(?:enum|struct|union)\s*[^\{]*)|(?:\s+=\s*)){/ || (/^\s*{/ && $prev =~ /=\s*(?:\/\*.*\*\/\s*)*$/))) { $cont_in = 0; $cont_off = tr/{/{/ - tr/}/}/; return; } if ($cont_off) { $cont_off += tr/{/{/ - tr/}/}/; return; } if (!$cont_in) { $cont_start = $line; if (/^\t* /) { err("non-continuation indented 4 spaces"); $cont_noerr = 1; # stop reporting } $_ = delabel($_); # replace labels with tabs # check if the statement is complete return if (/^\s*\}?$/); return if (/^\s*\}?\s*else\s*\{?$/); return if (/^\s*do\s*\{?$/); return if (/{$/); return if (/}[,;]?$/); # Allow macros on their own lines return if (/^\s*[A-Z_][A-Z_0-9]*$/); # cases we don't deal with, generally non-kosher if (/{/) { err("stuff after {"); return; } # Get the base line, and set up the state machine /^(\t*)/; $cont_base = $1; $cont_in = 1; @cont_paren = (); $cont_first = 1; $cont_multiseg = 0; # certain things need special processing $cont_special = /^\s*$special/? 1 : 0; $cont_macro = /^\s*$macro/? 1 : 0; $cont_case = /^\s*$case/? 1 : 0; } else { $cont_first = 0; # Strings may be pulled back to an earlier (half-)tabstop unless ($cont_noerr || /^$cont_base / || (/^\t*(?: )?(?:gettext\()?\"/ && !/^$cont_base\t/)) { err_prefix($cont_start, "continuation should be indented 4 spaces"); } } my $rest = $_; # keeps the remainder of the line # # The split matches 0 characters, so that each 'special' character # is processed separately. Parens and brackets are pushed and # popped off the @cont_paren stack. For normal processing, we wait # until a ; or { terminates the statement. "special" processing # (if/for/while/switch) is allowed to stop when the stack empties, # as is macro processing. Case statements are terminated with a : # and an empty paren stack. # foreach $_ (split /[^\(\)\[\]\{\}\;\:]*/) { next if (length($_) == 0); # rest contains the remainder of the line my $rxp = "[^\Q$_\E]*\Q$_\E"; $rest =~ s/^$rxp//; if (/\(/ || /\[/) { push @cont_paren, $_; } elsif (/\)/ || /\]/) { my $cur = $_; tr/\)\]/\(\[/; my $old = (pop @cont_paren); if (!defined($old)) { err("unexpected '$cur'"); $cont_in = 0; last; } elsif ($old ne $_) { err("'$cur' mismatched with '$old'"); $cont_in = 0; last; } # # If the stack is now empty, do special processing # for if/for/while/switch and macro statements. # next if (@cont_paren != 0); if ($cont_special) { if ($rest =~ /^\s*{?$/) { $cont_in = 0; last; } if ($rest =~ /^\s*;$/) { err("empty if/for/while body ". "not on its own line"); $cont_in = 0; last; } if (!$cont_first && $cont_multiseg == 1) { err_prefix($cont_start, "multiple statements continued ". "over multiple lines"); $cont_multiseg = 2; } elsif ($cont_multiseg == 0) { $cont_multiseg = 1; } # We've finished this section, start # processing the next. 
goto section_ended; } if ($cont_macro) { if ($rest =~ /^$/) { $cont_in = 0; last; } } } elsif (/\;/) { if ($cont_case) { err("unexpected ;"); } elsif (!$cont_special) { err("unexpected ;") if (@cont_paren != 0); if (!$cont_first && $cont_multiseg == 1) { err_prefix($cont_start, "multiple statements continued ". "over multiple lines"); $cont_multiseg = 2; } elsif ($cont_multiseg == 0) { $cont_multiseg = 1; } if ($rest =~ /^$/) { $cont_in = 0; last; } if ($rest =~ /^\s*special/) { err("if/for/while/switch not started ". "on its own line"); } goto section_ended; } } elsif (/\{/) { err("{ while in parens/brackets") if (@cont_paren != 0); err("stuff after {") if ($rest =~ /[^\s}]/); $cont_in = 0; last; } elsif (/\}/) { err("} while in parens/brackets") if (@cont_paren != 0); if (!$cont_special && $rest !~ /^\s*(while|else)\b/) { if ($rest =~ /^$/) { err("unexpected }"); } else { err("stuff after }"); } $cont_in = 0; last; } } elsif (/\:/ && $cont_case && @cont_paren == 0) { err("stuff after multi-line case") if ($rest !~ /$^/); $cont_in = 0; last; } next; section_ended: # End of a statement or if/while/for loop. Reset # cont_special and cont_macro based on the rest of the # line. $cont_special = ($rest =~ /^\s*$special/)? 1 : 0; $cont_macro = ($rest =~ /^\s*$macro/)? 1 : 0; $cont_case = 0; next; } $cont_noerr = 0 if (!$cont_in); } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/ctype/man/man3ctype/�000755 �000766 �000024 �00000000000 12456115120 040667� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/ctype/man/man3ctype/ctio.3ctype���000644 �000766 �000024 �00000020111 12455173731 042764� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������'\" te .\" Copyright (c) 2011, Robert Mustacchi. All Rights Reserved. .\" Copyright (c) 2011, Joyent, Inc. All Rights Reserved. .\" .\" Permission is hereby granted, free of charge, to any person obtaining a copy .\" of this software and associated documentation files (the "Software"), to .\" deal in the Software without restriction, including without limitation the .\" rights to use, copy, modify, merge, publish, distribute, sublicense, and/or .\" sell copies of the Software, and to permit persons to whom the Software is .\" furnished to do so, subject to the following conditions: .\" .\" The above copyright notice and this permission notice shall be included in .\" all copies or substantial portions of the Software. 
.\" .\" THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR .\" IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, .\" FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE .\" AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER .\" LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING .\" FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS .\" IN THE SOFTWARE. .TH CTIO 3CTYPE "December 12, 2011" .SH NAME ctio, ruint8, ruint16, ruint32, ruint64, wuint8, wuint16, wuint32, wuint64, rsint8, rsint16, rsint32, rsint64, wsint8, wsint16, wsint32, wsint64, rfloat, rdouble, wfloat, wdouble \- integer and float operations .SH SYNOPSIS .LP .nf var mod_ctype = require('ctype'); \fBNumber\fR \fBmod_ctype.ruint8\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.ruint16\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.ruint32\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber[2]\fR \fBmod_ctype.ruint64\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.rsint8\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.rsint16\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.rsint32\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber[2]\fR \fBmod_ctype.rsint64\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.rfloat\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBNumber\fR \fBmod_ctype.rdouble\fR(\fBBuffer\fR \fIbuf\fR, \fBString\fR \fIendian\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wuint8\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wuint16\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wuint32\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wuint64\fR(\fBNumber[2]\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wsint8\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wsint16\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wsint32\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wsint64\fR(\fBNumber[2]\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wfloat\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .LP .nf \fBvoid\fR \fBmod_ctype.wdouble\fR(\fBNumber\fR value, \fBString\fR \fIendian\fR, \fBBuffer\fR \fIbuf\fR, \fBNumber\fR \fIoffset\fR); .fi .SH 
DESCRIPTION .sp .LP The argument \fIbuf\fR refers to a valid buffer (from calling new Buffer()). The argument \fIendian\fR is either the string 'big' or 'little' and controls whether the data in the buffer is interpreted as big or little endian. The argument \fIoffset\fR indicates the starting index into the buffer to read or write. All functions ensure that starting at \fIoffset\fR does not overflow the end of the buffer. The argument \fIvalue\fR is a Number that is the valid type for the specific function. All functions that take \fIvalue\fR as an argument verify that the passed value is valid. .SS "\fBruint8()\fR, \fBruint16()\fR, \fBruint32()\fR" .sp .LP The \fBruint8()\fR, \fBruint16()\fR, and \fBruint32()\fR functions read an 8, 16, and 32-bit unsigned value from \fIbuf\fR and return it. The value read is influenced by the values of \fIoffset\fR and \fIendian\fR. .SS "\fBrsint8()\fR, \fBrsint16()\fR, \fBrsint32()\fR" .sp .LP The \fBrsint8()\fR, \fBrsint16()\fR, and \fBrsint32()\fR functions work just as \fBruint8()\fR, \fBruint16()\fR, and \fBruint32()\fR, except that they return signed integers. .SS "\fBruint64()\fR, \fBrsint64()\fR" .sp .LP The \fBruint64()\fR and \fBrsint64()\fR functions read unsigned and signed 64-bit integers respectively from \fBbuf\fR. Due to the limitations of ECMAScript's \fBNumber\fR type, they cannot be stored as one value without a loss of precision. Instead of returning the values as a single \fBNumber\fR, the functions return an array of two numbers. The first entry always contains the upper 32 bits and the second entry contains the lower 32 bits. The lossy transformation into a number would be \fIres[0]*Math.pow(2,32)+res[1]\fR. Note that, unless an entry is zero, both array entries are guaranteed to have the same sign. .SS "\fBwuint8()\fR, \fBwuint16()\fR, \fBwuint32()\fR" .sp .LP The functions \fBwuint8()\fR, \fBwuint16()\fR, and \fBwuint32()\fR modify the contents of \fBbuf\fR by writing an 8, 16, or 32-bit unsigned integer respectively to \fBbuf\fR. It is illegal to pass a number that is not an integer within the domain of the integer size; for example, for \fBwuint8()\fR the valid range is \fB[0, 255]\fR. The value will be written in either big or little endian format based upon the value of \fBendian\fR. .SS "\fBwsint8()\fR, \fBwsint16()\fR, \fBwsint32()\fR" .sp .LP The functions \fBwsint8()\fR, \fBwsint16()\fR, and \fBwsint32()\fR behave identically to \fBwuint8()\fR, \fBwuint16()\fR, and \fBwuint32()\fR, except that the valid domain for \fBvalue\fR is that of a signed number instead of an unsigned number. For example, \fBwsint8()\fR has a domain of \fB[-128, 127]\fR. .SS "\fBwuint64()\fR, \fBwsint64()\fR" .sp .LP The functions \fBwuint64()\fR and \fBwsint64()\fR write out 64-bit unsigned and signed integers to \fBbuf\fR. The \fBvalue\fR argument must be in the same format as described in \fBruint64()\fR and \fBrsint64()\fR. .SS "\fBrfloat()\fR, \fBrdouble()\fR" .sp .LP The \fBrfloat()\fR and \fBrdouble()\fR functions work like the other read functions, except that they read a single precision or double precision IEEE-754 floating point value instead. .SS "\fBwfloat()\fR, \fBwdouble()\fR" .sp .LP The \fBwfloat()\fR and \fBwdouble()\fR functions work like the other write functions, except that the domain for a float is that of a single precision 4 byte value. The domain for a double is any \fBNumber\fR in ECMAScript, which is defined to be represented by a double.
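Since the [upper 32 bits, lower 32 bits] convention described above is easy to get backwards, here is a minimal sketch of the 64-bit read/write calls from this page — the `require('ctype')` path is an assumption; everything else follows the signatures in the SYNOPSIS:

```js
var mod_ctype = require('ctype'); /* assumed install path */

/* Big-endian bytes for the 64-bit value 0x0000000100000002. */
var buf = new Buffer([0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x02]);

var res = mod_ctype.ruint64(buf, 'big', 0);
console.log(res);                               /* [ 1, 2 ] -> [ high, low ] */
console.log(res[0] * Math.pow(2, 32) + res[1]); /* 4294967298; lossy above 2^53 */

/* Writing takes the same [ high, low ] pair. */
var out = new Buffer(8);
mod_ctype.wuint64([0x1, 0x2], 'big', out, 0);
```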
.SH ATTRIBUTES .sp .LP See \fBattributes\fR(5) for descriptions of the following attributes: .sp .TS box; c | c l | l . ATTRIBUTE TYPE ATTRIBUTE VALUE _ Interface Stability Committed _ MT-Level See below. _ Standard Not standardized. .TE .sp .LP All functions are MT-safe in so far as there aren't shared memory MT concerns in most node programs. If one were to concoct such an environment, these functions wouldn't be MT-safe. .SH SEE ALSO .sp .LP
node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/assert.js
// Copyright (c) 2012, Mark Cavage. All rights reserved. var assert = require('assert'); var Stream = require('stream').Stream; var util = require('util'); ///--- Globals var NDEBUG = process.env.NODE_NDEBUG || false; ///--- Messages var ARRAY_TYPE_REQUIRED = '%s ([%s]) required'; var TYPE_REQUIRED = '%s (%s) is required'; ///--- Internal function capitalize(str) { return (str.charAt(0).toUpperCase() + str.slice(1)); } function uncapitalize(str) { return (str.charAt(0).toLowerCase() + str.slice(1)); } function _() { return (util.format.apply(util, arguments)); } function _assert(arg, type, name, stackFunc) { if (!NDEBUG) { name = name || type; stackFunc = stackFunc || _assert.caller; var t = typeof (arg); if (t !== type) { throw new assert.AssertionError({ message: _(TYPE_REQUIRED, name, type), actual: t, expected: type, operator: '===', stackStartFunction: stackFunc }); } } } ///--- API function array(arr, type, name) { if (!NDEBUG) { name = name || type; if (!Array.isArray(arr)) { throw new assert.AssertionError({ message: _(ARRAY_TYPE_REQUIRED, name, type), actual: typeof (arr), expected: 'array', operator: 'Array.isArray', stackStartFunction: array.caller }); } for (var i = 0; i < arr.length; i++) { _assert(arr[i], type, name, array); } } } function bool(arg, name) { _assert(arg, 'boolean', name, bool); } function buffer(arg, name) { if (!Buffer.isBuffer(arg)) { throw new assert.AssertionError({ message: _(TYPE_REQUIRED, name, 'buffer'), actual: typeof (arg), expected: 'buffer', operator: 'Buffer.isBuffer', stackStartFunction: buffer }); } } function func(arg, name) { _assert(arg, 'function', name); } function number(arg, name) { _assert(arg, 'number', name); } function object(arg, name) { _assert(arg, 'object', name); } function stream(arg, name) { if (!(arg instanceof Stream)) { throw new assert.AssertionError({ message: _(TYPE_REQUIRED, name, 'Stream'), actual: typeof (arg), expected: 'Stream', operator: 'instanceof', stackStartFunction: stream }); } } function string(arg, name) { _assert(arg, 'string', name); } ///--- Exports module.exports = { bool: bool, buffer: buffer, func: func, number: number, object:
object, stream: stream, string: string }; Object.keys(module.exports).forEach(function (k) { if (k === 'buffer') return; var name = 'arrayOf' + capitalize(k); if (k === 'bool') k = 'boolean'; if (k === 'func') k = 'function'; module.exports[name] = function (arg, name) { array(arg, k, name); }; }); Object.keys(module.exports).forEach(function (k) { var _name = 'optional' + capitalize(k); var s = uncapitalize(k.replace('arrayOf', '')); if (s === 'bool') s = 'boolean'; if (s === 'func') s = 'function'; if (k.indexOf('arrayOf') !== -1) { module.exports[_name] = function (arg, name) { if (!NDEBUG && arg !== undefined) { array(arg, s, name); } }; } else { module.exports[_name] = function (arg, name) { if (!NDEBUG && arg !== undefined) { _assert(arg, s, name); } }; } }); // Reexport built-in assertions Object.keys(assert).forEach(function (k) { if (k === 'AssertionError') { module.exports[k] = assert[k]; return; } module.exports[k] = function () { if (!NDEBUG) { assert[k].apply(assert[k], arguments); } }; }); ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/package.json����������000644 �000766 �000024 �00000011020 12455173731 041642� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mark Cavage", "email": "mcavage@gmail.com" }, "name": "assert-plus", "description": "Extra assertions on top of node's assert module", "version": "0.1.2", "main": "./assert.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": ">=0.6" }, "readme": "# node-assert-plus\n\nThis library is a super small wrapper over node's assert module that has two\nthings: (1) the ability to disable assertions with the environment variable\nNODE_NDEBUG, and (2) some API wrappers for argument testing. Like\n`assert.string(myArg, 'myArg')`. 
As a simple example, most of my code looks\nlike this:\n\n var assert = require('assert-plus');\n\n function fooAccount(options, callback) {\n\t assert.object(options, 'options');\n\t\tassert.number(options.id, 'options.id);\n\t\tassert.bool(options.isManager, 'options.isManager');\n\t\tassert.string(options.name, 'options.name');\n\t\tassert.arrayOfString(options.email, 'options.email');\n\t\tassert.func(callback, 'callback');\n\n // Do stuff\n\t\tcallback(null, {});\n }\n\n# API\n\nAll methods that *aren't* part of node's core assert API are simply assumed to\ntake an argument, and then a string 'name' that's not a message; `AssertionError`\nwill be thrown if the assertion fails with a message like:\n\n AssertionError: foo (string) is required\n\tat test (/home/mark/work/foo/foo.js:3:9)\n\tat Object.<anonymous> (/home/mark/work/foo/foo.js:15:1)\n\tat Module._compile (module.js:446:26)\n\tat Object..js (module.js:464:10)\n\tat Module.load (module.js:353:31)\n\tat Function._load (module.js:311:12)\n\tat Array.0 (module.js:484:10)\n\tat EventEmitter._tickCallback (node.js:190:38)\n\nfrom:\n\n function test(foo) {\n\t assert.string(foo, 'foo');\n }\n\nThere you go. You can check that arrays are of a homogenous type with `Arrayof$Type`:\n\n function test(foo) {\n\t assert.arrayOfString(foo, 'foo');\n }\n\nYou can assert IFF an argument is not `undefined` (i.e., an optional arg):\n\n assert.optionalString(foo, 'foo');\n\nLastly, you can opt-out of assertion checking altogether by setting the\nenvironment variable `NODE_NDEBUG=1`. This is pseudo-useful if you have\nlots of assertions, and don't want to pay `typeof ()` taxes to v8 in\nproduction.\n\nThe complete list of APIs is:\n\n* assert.bool\n* assert.buffer\n* assert.func\n* assert.number\n* assert.object\n* assert.string\n* assert.arrayOfBool\n* assert.arrayOfFunc\n* assert.arrayOfNumber\n* assert.arrayOfObject\n* assert.arrayOfString\n* assert.optionalBool\n* assert.optionalBuffer\n* assert.optionalFunc\n* assert.optionalNumber\n* assert.optionalObject\n* assert.optionalString\n* assert.optionalArrayOfBool\n* assert.optionalArrayOfFunc\n* assert.optionalArrayOfNumber\n* assert.optionalArrayOfObject\n* assert.optionalArrayOfString\n* assert.AssertionError\n* assert.fail\n* assert.ok\n* assert.equal\n* assert.notEqual\n* assert.deepEqual\n* assert.notDeepEqual\n* assert.strictEqual\n* assert.notStrictEqual\n* assert.throws\n* assert.doesNotThrow\n* assert.ifError\n\n# Installation\n\n npm install assert-plus\n\n## License\n\nThe MIT License (MIT)\nCopyright (c) 2012 Mark Cavage\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n## Bugs\n\nSee <https://github.com/mcavage/node-assert-plus/issues>.\n", "readmeFilename": "README.md", "_id": "assert-plus@0.1.2", "_shasum": "d93ffdbb67ac5507779be316a7d65146417beef8", "_resolved": "https://registry.npmjs.org/assert-plus/-/assert-plus-0.1.2.tgz", "_from": "assert-plus@0.1.2", "scripts": {} } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/assert-plus/README.md000644 �000766 �000024 �00000007432 12455173731 040647� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������# node-assert-plus This library is a super small wrapper over node's assert module that has two things: (1) the ability to disable assertions with the environment variable NODE_NDEBUG, and (2) some API wrappers for argument testing. Like `assert.string(myArg, 'myArg')`. As a simple example, most of my code looks like this: var assert = require('assert-plus'); function fooAccount(options, callback) { assert.object(options, 'options'); assert.number(options.id, 'options.id); assert.bool(options.isManager, 'options.isManager'); assert.string(options.name, 'options.name'); assert.arrayOfString(options.email, 'options.email'); assert.func(callback, 'callback'); // Do stuff callback(null, {}); } # API All methods that *aren't* part of node's core assert API are simply assumed to take an argument, and then a string 'name' that's not a message; `AssertionError` will be thrown if the assertion fails with a message like: AssertionError: foo (string) is required at test (/home/mark/work/foo/foo.js:3:9) at Object.<anonymous> (/home/mark/work/foo/foo.js:15:1) at Module._compile (module.js:446:26) at Object..js (module.js:464:10) at Module.load (module.js:353:31) at Function._load (module.js:311:12) at Array.0 (module.js:484:10) at EventEmitter._tickCallback (node.js:190:38) from: function test(foo) { assert.string(foo, 'foo'); } There you go. You can check that arrays are of a homogenous type with `Arrayof$Type`: function test(foo) { assert.arrayOfString(foo, 'foo'); } You can assert IFF an argument is not `undefined` (i.e., an optional arg): assert.optionalString(foo, 'foo'); Lastly, you can opt-out of assertion checking altogether by setting the environment variable `NODE_NDEBUG=1`. This is pseudo-useful if you have lots of assertions, and don't want to pay `typeof ()` taxes to v8 in production. 
The complete list of APIs is: * assert.bool * assert.buffer * assert.func * assert.number * assert.object * assert.string * assert.arrayOfBool * assert.arrayOfFunc * assert.arrayOfNumber * assert.arrayOfObject * assert.arrayOfString * assert.optionalBool * assert.optionalBuffer * assert.optionalFunc * assert.optionalNumber * assert.optionalObject * assert.optionalString * assert.optionalArrayOfBool * assert.optionalArrayOfFunc * assert.optionalArrayOfNumber * assert.optionalArrayOfObject * assert.optionalArrayOfString * assert.AssertionError * assert.fail * assert.ok * assert.equal * assert.notEqual * assert.deepEqual * assert.notDeepEqual * assert.strictEqual * assert.notStrictEqual * assert.throws * assert.doesNotThrow * assert.ifError # Installation npm install assert-plus ## License The MIT License (MIT) Copyright (c) 2012 Mark Cavage Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ## Bugs See <https://github.com/mcavage/node-assert-plus/issues>. 
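A minimal usage sketch tying the API above together (the `createUser` function, its option names, and the `app.js` filename are illustrative, not part of this package):

    var assert = require('assert-plus');

    function createUser(options, callback) {
      assert.object(options, 'options');                  // throws 'options (object) is required' if missing
      assert.string(options.login, 'options.login');
      assert.optionalNumber(options.uid, 'options.uid');  // only checked when not undefined
      assert.arrayOfString(options.groups, 'options.groups');
      assert.func(callback, 'callback');
      callback(null, { login: options.login });
    }

    createUser({ login: 'mark', groups: ['staff'] }, function (err, user) {
      if (err) throw err;
      console.log(user);
    });

Running the same script as `NODE_NDEBUG=1 node app.js` turns every `assert.*` call above into a no-op, as described in the README; `assert.AssertionError` itself is still exported either way.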
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/.npmignore��000644 �000766 �000024 �00000000023 12455173731 037734� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node_modules *.log �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/��������000755 �000766 �000024 �00000000000 12456115120 036475� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/LICENSE�����000644 �000766 �000024 �00000002064 12455173731 036751� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2011 Mark Cavage, All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/package.json000644 �000766 �000024 �00000004546 12455173731 040241� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mark Cavage", "email": "mcavage@gmail.com" }, "contributors": [ { "name": "David Gwynne", "email": "loki@animata.net" }, { "name": "Yunong Xiao", "email": "yunong@joyent.com" } ], "name": "asn1", "description": "Contains parsers and serializers for ASN.1 (currently BER only)", "version": "0.1.11", "repository": { "type": "git", "url": "git://github.com/mcavage/node-asn1.git" }, "main": "lib/index.js", "engines": { "node": ">=0.4.9" }, "dependencies": {}, "devDependencies": { "tap": "0.1.4" }, "scripts": { "pretest": "which gjslint; if [[ \"$?\" = 0 ]] ; then gjslint --nojsdoc -r lib -r tst; else echo \"Missing gjslint. Skipping lint\"; fi", "test": "tap ./tst" }, "readme": "node-asn1 is a library for encoding and decoding ASN.1 datatypes in pure JS.\nCurrently BER encoding is supported; at some point I'll likely have to do DER.\n\n## Usage\n\nMostly, if you're *actually* needing to read and write ASN.1, you probably don't\nneed this readme to explain what and why. 
If you have no idea what ASN.1 is,\nsee this: ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc\n\nThe source is pretty much self-explanatory, and has read/write methods for the\ncommon types out there.\n\n### Decoding\n\nThe following reads an ASN.1 sequence with a boolean.\n\n var Ber = require('asn1').Ber;\n\n var reader = new Ber.Reader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff]));\n\n reader.readSequence();\n console.log('Sequence len: ' + reader.length);\n if (reader.peek() === Ber.Boolean)\n console.log(reader.readBoolean());\n\n### Encoding\n\nThe following generates the same payload as above.\n\n var Ber = require('asn1').Ber;\n\n var writer = new Ber.Writer();\n\n writer.startSequence();\n writer.writeBoolean(true);\n writer.endSequence();\n\n console.log(writer.buffer);\n\n## Installation\n\n npm install asn1\n\n## License\n\nMIT.\n\n## Bugs\n\nSee <https://github.com/mcavage/node-asn1/issues>.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/mcavage/node-asn1/issues" }, "homepage": "https://github.com/mcavage/node-asn1", "_id": "asn1@0.1.11", "_shasum": "559be18376d08a4ec4dbe80877d27818639b2df7", "_resolved": "https://registry.npmjs.org/asn1/-/asn1-0.1.11.tgz", "_from": "asn1@0.1.11" } ����������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/README.md���000644 �000766 �000024 �00000002261 12455173731 037222� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node-asn1 is a library for encoding and decoding ASN.1 datatypes in pure JS. Currently BER encoding is supported; at some point I'll likely have to do DER. ## Usage Mostly, if you're *actually* needing to read and write ASN.1, you probably don't need this readme to explain what and why. If you have no idea what ASN.1 is, see this: ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc The source is pretty much self-explanatory, and has read/write methods for the common types out there. ### Decoding The following reads an ASN.1 sequence with a boolean. var Ber = require('asn1').Ber; var reader = new Ber.Reader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff])); reader.readSequence(); console.log('Sequence len: ' + reader.length); if (reader.peek() === Ber.Boolean) console.log(reader.readBoolean()); ### Encoding The following generates the same payload as above. var Ber = require('asn1').Ber; var writer = new Ber.Writer(); writer.startSequence(); writer.writeBoolean(true); writer.endSequence(); console.log(writer.buffer); ## Installation npm install asn1 ## License MIT. ## Bugs See <https://github.com/mcavage/node-asn1/issues>. 
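The decoding and encoding snippets above compose into a quick roundtrip check; this is a sketch using only calls shown in this README and in lib/ber:

    var Ber = require('asn1').Ber;

    var writer = new Ber.Writer();
    writer.startSequence();
    writer.writeBoolean(true);
    writer.writeInt(32);
    writer.endSequence();

    var reader = new Ber.Reader(writer.buffer);
    reader.readSequence();
    console.log(reader.readBoolean());  // true
    console.log(reader.readInt());      // 32

Because `endSequence()` back-patches the sequence length, `writer.buffer` is only safe to read once every `startSequence()` has been closed; the getter throws if sequences are still open.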
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/��������000755 �000766 �000024 �00000000000 12456115120 036541� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/����000755 �000766 �000024 �00000000000 12456115120 037311� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/reader.test.js�������000644 �000766 �000024 �00000011150 12455173731 042100� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. 
var test = require('tap').test; ///--- Globals var BerReader; ///--- Tests test('load library', function(t) { BerReader = require('../../lib/index').BerReader; t.ok(BerReader); try { new BerReader(); t.fail('Should have thrown'); } catch (e) { t.ok(e instanceof TypeError, 'Should have been a type error'); } t.end(); }); test('read byte', function(t) { var reader = new BerReader(new Buffer([0xde])); t.ok(reader); t.equal(reader.readByte(), 0xde, 'wrong value'); t.end(); }); test('read 1 byte int', function(t) { var reader = new BerReader(new Buffer([0x02, 0x01, 0x03])); t.ok(reader); t.equal(reader.readInt(), 0x03, 'wrong value'); t.equal(reader.length, 0x01, 'wrong length'); t.end(); }); test('read 2 byte int', function(t) { var reader = new BerReader(new Buffer([0x02, 0x02, 0x7e, 0xde])); t.ok(reader); t.equal(reader.readInt(), 0x7ede, 'wrong value'); t.equal(reader.length, 0x02, 'wrong length'); t.end(); }); test('read 3 byte int', function(t) { var reader = new BerReader(new Buffer([0x02, 0x03, 0x7e, 0xde, 0x03])); t.ok(reader); t.equal(reader.readInt(), 0x7ede03, 'wrong value'); t.equal(reader.length, 0x03, 'wrong length'); t.end(); }); test('read 4 byte int', function(t) { var reader = new BerReader(new Buffer([0x02, 0x04, 0x7e, 0xde, 0x03, 0x01])); t.ok(reader); t.equal(reader.readInt(), 0x7ede0301, 'wrong value'); t.equal(reader.length, 0x04, 'wrong length'); t.end(); }); test('read boolean true', function(t) { var reader = new BerReader(new Buffer([0x01, 0x01, 0xff])); t.ok(reader); t.equal(reader.readBoolean(), true, 'wrong value'); t.equal(reader.length, 0x01, 'wrong length'); t.end(); }); test('read boolean false', function(t) { var reader = new BerReader(new Buffer([0x01, 0x01, 0x00])); t.ok(reader); t.equal(reader.readBoolean(), false, 'wrong value'); t.equal(reader.length, 0x01, 'wrong length'); t.end(); }); test('read enumeration', function(t) { var reader = new BerReader(new Buffer([0x0a, 0x01, 0x20])); t.ok(reader); t.equal(reader.readEnumeration(), 0x20, 'wrong value'); t.equal(reader.length, 0x01, 'wrong length'); t.end(); }); test('read string', function(t) { var dn = 'cn=foo,ou=unit,o=test'; var buf = new Buffer(dn.length + 2); buf[0] = 0x04; buf[1] = Buffer.byteLength(dn); buf.write(dn, 2); var reader = new BerReader(buf); t.ok(reader); t.equal(reader.readString(), dn, 'wrong value'); t.equal(reader.length, dn.length, 'wrong length'); t.end(); }); test('read sequence', function(t) { var reader = new BerReader(new Buffer([0x30, 0x03, 0x01, 0x01, 0xff])); t.ok(reader); t.equal(reader.readSequence(), 0x30, 'wrong value'); t.equal(reader.length, 0x03, 'wrong length'); t.equal(reader.readBoolean(), true, 'wrong value'); t.equal(reader.length, 0x01, 'wrong length'); t.end(); }); test('anonymous LDAPv3 bind', function(t) { var BIND = new Buffer(14); BIND[0] = 0x30; // Sequence BIND[1] = 12; // len BIND[2] = 0x02; // ASN.1 Integer BIND[3] = 1; // len BIND[4] = 0x04; // msgid (make up 4) BIND[5] = 0x60; // Bind Request BIND[6] = 7; // len BIND[7] = 0x02; // ASN.1 Integer BIND[8] = 1; // len BIND[9] = 0x03; // v3 BIND[10] = 0x04; // String (bind dn) BIND[11] = 0; // len BIND[12] = 0x80; // ContextSpecific (choice) BIND[13] = 0; // simple bind // Start testing ^^ var ber = new BerReader(BIND); t.equal(ber.readSequence(), 48, 'Not an ASN.1 Sequence'); t.equal(ber.length, 12, 'Message length should be 12'); t.equal(ber.readInt(), 4, 'Message id should have been 4'); t.equal(ber.readSequence(), 96, 'Bind Request should have been 96'); t.equal(ber.length, 7, 'Bind length should have 
been 7'); t.equal(ber.readInt(), 3, 'LDAP version should have been 3'); t.equal(ber.readString(), '', 'Bind DN should have been empty'); t.equal(ber.length, 0, 'string length should have been 0'); t.equal(ber.readByte(), 0x80, 'Should have been ContextSpecific (choice)'); t.equal(ber.readByte(), 0, 'Should have been simple bind'); t.equal(null, ber.readByte(), 'Should be out of data'); t.end(); }); test('long string', function(t) { var buf = new Buffer(256); var o; var s = '2;649;CN=Red Hat CS 71GA Demo,O=Red Hat CS 71GA Demo,C=US;' + 'CN=RHCS Agent - admin01,UID=admin01,O=redhat,C=US [1] This is ' + 'Teena Vradmin\'s description.'; buf[0] = 0x04; buf[1] = 0x81; buf[2] = 0x94; buf.write(s, 3); var ber = new BerReader(buf.slice(0, 3 + s.length)); t.equal(ber.readString(), s); t.end(); }); ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/asn1/tst/ber/writer.test.js�������000644 �000766 �000024 �00000017071 12455173731 042162� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. 
var test = require('tap').test; var sys = require('sys'); ///--- Globals var BerWriter; var BerReader; ///--- Tests test('load library', function(t) { BerWriter = require('../../lib/index').BerWriter; t.ok(BerWriter); t.ok(new BerWriter()); t.end(); }); test('write byte', function(t) { var writer = new BerWriter(); writer.writeByte(0xC2); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 1, 'Wrong length'); t.equal(ber[0], 0xC2, 'value wrong'); t.end(); }); test('write 1 byte int', function(t) { var writer = new BerWriter(); writer.writeInt(0x7f); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 3, 'Wrong length for an int: ' + ber.length); t.equal(ber[0], 0x02, 'ASN.1 tag wrong (2) -> ' + ber[0]); t.equal(ber[1], 0x01, 'length wrong(1) -> ' + ber[1]); t.equal(ber[2], 0x7f, 'value wrong(3) -> ' + ber[2]); t.end(); }); test('write 2 byte int', function(t) { var writer = new BerWriter(); writer.writeInt(0x7ffe); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 4, 'Wrong length for an int'); t.equal(ber[0], 0x02, 'ASN.1 tag wrong'); t.equal(ber[1], 0x02, 'length wrong'); t.equal(ber[2], 0x7f, 'value wrong (byte 1)'); t.equal(ber[3], 0xfe, 'value wrong (byte 2)'); t.end(); }); test('write 3 byte int', function(t) { var writer = new BerWriter(); writer.writeInt(0x7ffffe); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 5, 'Wrong length for an int'); t.equal(ber[0], 0x02, 'ASN.1 tag wrong'); t.equal(ber[1], 0x03, 'length wrong'); t.equal(ber[2], 0x7f, 'value wrong (byte 1)'); t.equal(ber[3], 0xff, 'value wrong (byte 2)'); t.equal(ber[4], 0xfe, 'value wrong (byte 3)'); t.end(); }); test('write 4 byte int', function(t) { var writer = new BerWriter(); writer.writeInt(0x7ffffffe); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 6, 'Wrong length for an int'); t.equal(ber[0], 0x02, 'ASN.1 tag wrong'); t.equal(ber[1], 0x04, 'length wrong'); t.equal(ber[2], 0x7f, 'value wrong (byte 1)'); t.equal(ber[3], 0xff, 'value wrong (byte 2)'); t.equal(ber[4], 0xff, 'value wrong (byte 3)'); t.equal(ber[5], 0xfe, 'value wrong (byte 4)'); t.end(); }); test('write boolean', function(t) { var writer = new BerWriter(); writer.writeBoolean(true); writer.writeBoolean(false); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 6, 'Wrong length'); t.equal(ber[0], 0x01, 'tag wrong'); t.equal(ber[1], 0x01, 'length wrong'); t.equal(ber[2], 0xff, 'value wrong'); t.equal(ber[3], 0x01, 'tag wrong'); t.equal(ber[4], 0x01, 'length wrong'); t.equal(ber[5], 0x00, 'value wrong'); t.end(); }); test('write string', function(t) { var writer = new BerWriter(); writer.writeString('hello world'); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 13, 'wrong length'); t.equal(ber[0], 0x04, 'wrong tag'); t.equal(ber[1], 11, 'wrong length'); t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value'); t.end(); }); test('write buffer', function(t) { var writer = new BerWriter(); // write some stuff to start with writer.writeString('hello world'); var ber = writer.buffer; var buf = new Buffer([0x04, 0x0b, 0x30, 0x09, 0x02, 0x01, 0x0f, 0x01, 0x01, 0xff, 0x01, 0x01, 0xff]); writer.writeBuffer(buf.slice(2, buf.length), 0x04); ber = writer.buffer; t.ok(ber); t.equal(ber.length, 26, 'wrong length'); t.equal(ber[0], 0x04, 'wrong tag'); t.equal(ber[1], 11, 'wrong length'); t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value'); t.equal(ber[13], buf[0], 'wrong tag'); t.equal(ber[14], buf[1], 'wrong length'); for (var i = 13, j = 0; i < ber.length && j < buf.length; i++, j++) { 
t.equal(ber[i], buf[j], 'buffer contents not identical'); } t.end(); }); test('write string array', function(t) { var writer = new BerWriter(); writer.writeStringArray(['hello world', 'fubar!']); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 21, 'wrong length'); t.equal(ber[0], 0x04, 'wrong tag'); t.equal(ber[1], 11, 'wrong length'); t.equal(ber.slice(2, 13).toString('utf8'), 'hello world', 'wrong value'); t.equal(ber[13], 0x04, 'wrong tag'); t.equal(ber[14], 6, 'wrong length'); t.equal(ber.slice(15).toString('utf8'), 'fubar!', 'wrong value'); t.end(); }); test('resize internal buffer', function(t) { var writer = new BerWriter({size: 2}); writer.writeString('hello world'); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 13, 'wrong length'); t.equal(ber[0], 0x04, 'wrong tag'); t.equal(ber[1], 11, 'wrong length'); t.equal(ber.slice(2).toString('utf8'), 'hello world', 'wrong value'); t.end(); }); test('sequence', function(t) { var writer = new BerWriter({size: 25}); writer.startSequence(); writer.writeString('hello world'); writer.endSequence(); var ber = writer.buffer; t.ok(ber); console.log(ber); t.equal(ber.length, 15, 'wrong length'); t.equal(ber[0], 0x30, 'wrong tag'); t.equal(ber[1], 13, 'wrong length'); t.equal(ber[2], 0x04, 'wrong tag'); t.equal(ber[3], 11, 'wrong length'); t.equal(ber.slice(4).toString('utf8'), 'hello world', 'wrong value'); t.end(); }); test('nested sequence', function(t) { var writer = new BerWriter({size: 25}); writer.startSequence(); writer.writeString('hello world'); writer.startSequence(); writer.writeString('hello world'); writer.endSequence(); writer.endSequence(); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 30, 'wrong length'); t.equal(ber[0], 0x30, 'wrong tag'); t.equal(ber[1], 28, 'wrong length'); t.equal(ber[2], 0x04, 'wrong tag'); t.equal(ber[3], 11, 'wrong length'); t.equal(ber.slice(4, 15).toString('utf8'), 'hello world', 'wrong value'); t.equal(ber[15], 0x30, 'wrong tag'); t.equal(ber[16], 13, 'wrong length'); t.equal(ber[17], 0x04, 'wrong tag'); t.equal(ber[18], 11, 'wrong length'); t.equal(ber.slice(19, 30).toString('utf8'), 'hello world', 'wrong value'); t.end(); }); test('LDAP bind message', function(t) { var dn = 'cn=foo,ou=unit,o=test'; var writer = new BerWriter(); writer.startSequence(); writer.writeInt(3); // msgid = 3 writer.startSequence(0x60); // ldap bind writer.writeInt(3); // ldap v3 writer.writeString(dn); writer.writeByte(0x80); writer.writeByte(0x00); writer.endSequence(); writer.endSequence(); var ber = writer.buffer; t.ok(ber); t.equal(ber.length, 35, 'wrong length (buffer)'); t.equal(ber[0], 0x30, 'wrong tag'); t.equal(ber[1], 33, 'wrong length'); t.equal(ber[2], 0x02, 'wrong tag'); t.equal(ber[3], 1, 'wrong length'); t.equal(ber[4], 0x03, 'wrong value'); t.equal(ber[5], 0x60, 'wrong tag'); t.equal(ber[6], 28, 'wrong length'); t.equal(ber[7], 0x02, 'wrong tag'); t.equal(ber[8], 1, 'wrong length'); t.equal(ber[9], 0x03, 'wrong value'); t.equal(ber[10], 0x04, 'wrong tag'); t.equal(ber[11], dn.length, 'wrong length'); t.equal(ber.slice(12, 33).toString('utf8'), dn, 'wrong value'); t.equal(ber[33], 0x80, 'wrong tag'); t.equal(ber[34], 0x00, 'wrong len'); t.end(); }); test('Write OID', function(t) { var oid = '1.2.840.113549.1.1.1'; var writer = new BerWriter(); writer.writeOID(oid); var ber = writer.buffer; t.ok(ber); console.log(require('util').inspect(ber)); console.log(require('util').inspect(new Buffer([0x06, 0x09, 0x2a, 0x86, 0x48, 0x86, 0xf7, 0x0d, 0x01, 0x01, 0x01]))); t.end(); }); 
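The reader and writer tests above exercise the same anonymous LDAPv3 bind message from opposite ends; outside the tap harness, the roundtrip looks like this sketch (message id 4 is arbitrary, matching the hand-built buffer in reader.test.js):

    var Ber = require('asn1').Ber;

    var writer = new Ber.Writer();
    writer.startSequence();          // LDAPMessage
    writer.writeInt(4);              // messageID
    writer.startSequence(0x60);      // BindRequest
    writer.writeInt(3);              // LDAP protocol version 3
    writer.writeString('');          // empty bind DN (anonymous)
    writer.writeByte(0x80);          // context-specific choice: simple auth
    writer.writeByte(0x00);          // zero-length password
    writer.endSequence();
    writer.endSequence();

    var reader = new Ber.Reader(writer.buffer);
    reader.readSequence();                    // 0x30
    console.log(reader.readInt());            // 4
    reader.readSequence(0x60);
    console.log(reader.readInt());            // 3
    console.log(reader.readString() === '');  // true: anonymous bind

The 14 bytes produced here are identical to the `BIND` buffer assembled by hand in the 'anonymous LDAPv3 bind' test.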
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/����000755 �000766 �000024 �00000000000 12456115120 037245� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/index.js000644 �000766 �000024 �00000000500 12455173731 040150� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. // If you have no idea what ASN.1 or BER is, see this: // ftp://ftp.rsa.com/pub/pkcs/ascii/layman.asc var Ber = require('./ber/index'); ///--- Exported API module.exports = { Ber: Ber, BerReader: Ber.Reader, BerWriter: Ber.Writer }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/errors.js������������000644 �000766 �000024 �00000000357 12455173731 041137� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. 
module.exports = { newInvalidAsn1Error: function(msg) { var e = new Error(); e.name = 'InvalidAsn1Error'; e.message = msg || ''; return e; } }; ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/index.js000644 �000766 �000024 �00000000725 12455173731 040731� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. var errors = require('./errors'); var types = require('./types'); var Reader = require('./reader'); var Writer = require('./writer'); ///--- Exports module.exports = { Reader: Reader, Writer: Writer }; for (var t in types) { if (types.hasOwnProperty(t)) module.exports[t] = types[t]; } for (var e in errors) { if (errors.hasOwnProperty(e)) module.exports[e] = errors[e]; } �������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/reader.js������������000644 �000766 �000024 �00000013200 12455173731 041054� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. var assert = require('assert'); var ASN1 = require('./types'); var errors = require('./errors'); ///--- Globals var newInvalidAsn1Error = errors.newInvalidAsn1Error; ///--- API function Reader(data) { if (!data || !Buffer.isBuffer(data)) throw new TypeError('data must be a node Buffer'); this._buf = data; this._size = data.length; // These hold the "current" state this._len = 0; this._offset = 0; var self = this; this.__defineGetter__('length', function() { return self._len; }); this.__defineGetter__('offset', function() { return self._offset; }); this.__defineGetter__('remain', function() { return self._size - self._offset; }); this.__defineGetter__('buffer', function() { return self._buf.slice(self._offset); }); } /** * Reads a single byte and advances offset; you can pass in `true` to make this * a "peek" operation (i.e., get the byte, but don't advance the offset). * * @param {Boolean} peek true means don't move offset. * @return {Number} the next byte, null if not enough data. */ Reader.prototype.readByte = function(peek) { if (this._size - this._offset < 1) return null; var b = this._buf[this._offset] & 0xff; if (!peek) this._offset += 1; return b; }; Reader.prototype.peek = function() { return this.readByte(true); }; /** * Reads a (potentially) variable length off the BER buffer. This call is * not really meant to be called directly, as callers have to manipulate * the internal buffer afterwards. * * As a result of this call, you can call `Reader.length`, until the * next thing called that does a readLength. 
* * @return {Number} the amount of offset to advance the buffer. * @throws {InvalidAsn1Error} on bad ASN.1 */ Reader.prototype.readLength = function(offset) { if (offset === undefined) offset = this._offset; if (offset >= this._size) return null; var lenB = this._buf[offset++] & 0xff; if (lenB === null) return null; if ((lenB & 0x80) == 0x80) { lenB &= 0x7f; if (lenB == 0) throw newInvalidAsn1Error('Indefinite length not supported'); if (lenB > 4) throw newInvalidAsn1Error('encoding too long'); if (this._size - offset < lenB) return null; this._len = 0; for (var i = 0; i < lenB; i++) this._len = (this._len << 8) + (this._buf[offset++] & 0xff); } else { // Wasn't a variable length this._len = lenB; } return offset; }; /** * Parses the next sequence in this BER buffer. * * To get the length of the sequence, call `Reader.length`. * * @return {Number} the sequence's tag. */ Reader.prototype.readSequence = function(tag) { var seq = this.peek(); if (seq === null) return null; if (tag !== undefined && tag !== seq) throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) + ': got 0x' + seq.toString(16)); var o = this.readLength(this._offset + 1); // stored in `length` if (o === null) return null; this._offset = o; return seq; }; Reader.prototype.readInt = function() { return this._readTag(ASN1.Integer); }; Reader.prototype.readBoolean = function() { return (this._readTag(ASN1.Boolean) === 0 ? false : true); }; Reader.prototype.readEnumeration = function() { return this._readTag(ASN1.Enumeration); }; Reader.prototype.readString = function(tag, retbuf) { if (!tag) tag = ASN1.OctetString; var b = this.peek(); if (b === null) return null; if (b !== tag) throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) + ': got 0x' + b.toString(16)); var o = this.readLength(this._offset + 1); // stored in `length` if (o === null) return null; if (this.length > this._size - o) return null; this._offset = o; if (this.length === 0) return ''; var str = this._buf.slice(this._offset, this._offset + this.length); this._offset += this.length; return retbuf ? 
str : str.toString('utf8'); }; Reader.prototype.readOID = function(tag) { if (!tag) tag = ASN1.OID; var b = this.peek(); if (b === null) return null; if (b !== tag) throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) + ': got 0x' + b.toString(16)); var o = this.readLength(this._offset + 1); // stored in `length` if (o === null) return null; if (this.length > this._size - o) return null; this._offset = o; var values = []; var value = 0; for (var i = 0; i < this.length; i++) { var byte = this._buf[this._offset++] & 0xff; value <<= 7; value += byte & 0x7f; if ((byte & 0x80) == 0) { values.push(value); value = 0; } } value = values.shift(); values.unshift(value % 40); values.unshift((value / 40) >> 0); return values.join('.'); }; Reader.prototype._readTag = function(tag) { assert.ok(tag !== undefined); var b = this.peek(); if (b === null) return null; if (b !== tag) throw newInvalidAsn1Error('Expected 0x' + tag.toString(16) + ': got 0x' + b.toString(16)); var o = this.readLength(this._offset + 1); // stored in `length` if (o === null) return null; if (this.length > 4) throw newInvalidAsn1Error('Integer too long: ' + this.length); if (this.length > this._size - o) return null; this._offset = o; var fb = this._buf[this._offset++]; var value = 0; value = fb & 0x7F; for (var i = 1; i < this.length; i++) { value <<= 8; value |= (this._buf[this._offset++] & 0xff); } if ((fb & 0x80) == 0x80) value = -value; return value; }; ///--- Exported API module.exports = Reader; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/types.js000644 �000766 �000024 �00000001176 12455173731 040767� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. 
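// The values below are the universal ASN.1 tag numbers this package's BER
// reader and writer work with. `Constructor` (0x20) is the constructed-encoding
// bit and `Context` (0x80) marks the context-specific class; both are intended
// to be OR'd onto a base tag rather than used alone, e.g.
// ASN1.Sequence | ASN1.Constructor === 0x30, the default tag that
// Writer.prototype.startSequence() emits and Reader.prototype.readSequence()
// returns in the tests above.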
module.exports = { EOC: 0, Boolean: 1, Integer: 2, BitString: 3, OctetString: 4, Null: 5, OID: 6, ObjectDescriptor: 7, External: 8, Real: 9, // float Enumeration: 10, PDV: 11, Utf8String: 12, RelativeOID: 13, Sequence: 16, Set: 17, NumericString: 18, PrintableString: 19, T61String: 20, VideotexString: 21, IA5String: 22, UTCTime: 23, GeneralizedTime: 24, GraphicString: 25, VisibleString: 26, GeneralString: 28, UniversalString: 29, CharacterString: 30, BMPString: 31, Constructor: 32, Context: 128 }; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/request/node_modules/http-signature/node_modules/asn1/lib/ber/writer.js������������000644 �000766 �000024 �00000016663 12455173731 041146� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Mark Cavage <mcavage@gmail.com> All rights reserved. var assert = require('assert'); var ASN1 = require('./types'); var errors = require('./errors'); ///--- Globals var newInvalidAsn1Error = errors.newInvalidAsn1Error; var DEFAULT_OPTS = { size: 1024, growthFactor: 8 }; ///--- Helpers function merge(from, to) { assert.ok(from); assert.equal(typeof(from), 'object'); assert.ok(to); assert.equal(typeof(to), 'object'); var keys = Object.getOwnPropertyNames(from); keys.forEach(function(key) { if (to[key]) return; var value = Object.getOwnPropertyDescriptor(from, key); Object.defineProperty(to, key, value); }); return to; } ///--- API function Writer(options) { options = merge(DEFAULT_OPTS, options || {}); this._buf = new Buffer(options.size || 1024); this._size = this._buf.length; this._offset = 0; this._options = options; // A list of offsets in the buffer where we need to insert // sequence tag/len pairs. 
this._seq = []; var self = this; this.__defineGetter__('buffer', function() { if (self._seq.length) throw new InvalidAsn1Error(self._seq.length + ' unended sequence(s)'); return self._buf.slice(0, self._offset); }); } Writer.prototype.writeByte = function(b) { if (typeof(b) !== 'number') throw new TypeError('argument must be a Number'); this._ensure(1); this._buf[this._offset++] = b; }; Writer.prototype.writeInt = function(i, tag) { if (typeof(i) !== 'number') throw new TypeError('argument must be a Number'); if (typeof(tag) !== 'number') tag = ASN1.Integer; var sz = 4; while ((((i & 0xff800000) === 0) || ((i & 0xff800000) === 0xff800000)) && (sz > 1)) { sz--; i <<= 8; } if (sz > 4) throw new InvalidAsn1Error('BER ints cannot be > 0xffffffff'); this._ensure(2 + sz); this._buf[this._offset++] = tag; this._buf[this._offset++] = sz; while (sz-- > 0) { this._buf[this._offset++] = ((i & 0xff000000) >> 24); i <<= 8; } }; Writer.prototype.writeNull = function() { this.writeByte(ASN1.Null); this.writeByte(0x00); }; Writer.prototype.writeEnumeration = function(i, tag) { if (typeof(i) !== 'number') throw new TypeError('argument must be a Number'); if (typeof(tag) !== 'number') tag = ASN1.Enumeration; return this.writeInt(i, tag); }; Writer.prototype.writeBoolean = function(b, tag) { if (typeof(b) !== 'boolean') throw new TypeError('argument must be a Boolean'); if (typeof(tag) !== 'number') tag = ASN1.Boolean; this._ensure(3); this._buf[this._offset++] = tag; this._buf[this._offset++] = 0x01; this._buf[this._offset++] = b ? 0xff : 0x00; }; Writer.prototype.writeString = function(s, tag) { if (typeof(s) !== 'string') throw new TypeError('argument must be a string (was: ' + typeof(s) + ')'); if (typeof(tag) !== 'number') tag = ASN1.OctetString; var len = Buffer.byteLength(s); this.writeByte(tag); this.writeLength(len); if (len) { this._ensure(len); this._buf.write(s, this._offset); this._offset += len; } }; Writer.prototype.writeBuffer = function(buf, tag) { if (typeof(tag) !== 'number') throw new TypeError('tag must be a number'); if (!Buffer.isBuffer(buf)) throw new TypeError('argument must be a buffer'); this.writeByte(tag); this.writeLength(buf.length); this._ensure(buf.length); buf.copy(this._buf, this._offset, 0, buf.length); this._offset += buf.length; }; Writer.prototype.writeStringArray = function(strings) { if ((!strings instanceof Array)) throw new TypeError('argument must be an Array[String]'); var self = this; strings.forEach(function(s) { self.writeString(s); }); }; // This is really to solve DER cases, but whatever for now Writer.prototype.writeOID = function(s, tag) { if (typeof(s) !== 'string') throw new TypeError('argument must be a string'); if (typeof(tag) !== 'number') tag = ASN1.OID; if (!/^([0-9]+\.){3,}[0-9]+$/.test(s)) throw new Error('argument is not a valid OID string'); function encodeOctet(bytes, octet) { if (octet < 128) { bytes.push(octet); } else if (octet < 16384) { bytes.push((octet >>> 7) | 0x80); bytes.push(octet & 0x7F); } else if (octet < 2097152) { bytes.push((octet >>> 14) | 0x80); bytes.push(((octet >>> 7) | 0x80) & 0xFF); bytes.push(octet & 0x7F); } else if (octet < 268435456) { bytes.push((octet >>> 21) | 0x80); bytes.push(((octet >>> 14) | 0x80) & 0xFF); bytes.push(((octet >>> 7) | 0x80) & 0xFF); bytes.push(octet & 0x7F); } else { bytes.push(((octet >>> 28) | 0x80) & 0xFF); bytes.push(((octet >>> 21) | 0x80) & 0xFF); bytes.push(((octet >>> 14) | 0x80) & 0xFF); bytes.push(((octet >>> 7) | 0x80) & 0xFF); bytes.push(octet & 0x7F); } } var tmp = s.split('.'); 
var bytes = []; bytes.push(parseInt(tmp[0], 10) * 40 + parseInt(tmp[1], 10)); tmp.slice(2).forEach(function(b) { encodeOctet(bytes, parseInt(b, 10)); }); var self = this; this._ensure(2 + bytes.length); this.writeByte(tag); this.writeLength(bytes.length); bytes.forEach(function(b) { self.writeByte(b); }); }; Writer.prototype.writeLength = function(len) { if (typeof(len) !== 'number') throw new TypeError('argument must be a Number'); this._ensure(4); if (len <= 0x7f) { this._buf[this._offset++] = len; } else if (len <= 0xff) { this._buf[this._offset++] = 0x81; this._buf[this._offset++] = len; } else if (len <= 0xffff) { this._buf[this._offset++] = 0x82; this._buf[this._offset++] = len >> 8; this._buf[this._offset++] = len; } else if (len <= 0xffffff) { this._shift(start, len, 1); this._buf[this._offset++] = 0x83; this._buf[this._offset++] = len >> 16; this._buf[this._offset++] = len >> 8; this._buf[this._offset++] = len; } else { throw new InvalidAsn1ERror('Length too long (> 4 bytes)'); } }; Writer.prototype.startSequence = function(tag) { if (typeof(tag) !== 'number') tag = ASN1.Sequence | ASN1.Constructor; this.writeByte(tag); this._seq.push(this._offset); this._ensure(3); this._offset += 3; }; Writer.prototype.endSequence = function() { var seq = this._seq.pop(); var start = seq + 3; var len = this._offset - start; if (len <= 0x7f) { this._shift(start, len, -2); this._buf[seq] = len; } else if (len <= 0xff) { this._shift(start, len, -1); this._buf[seq] = 0x81; this._buf[seq + 1] = len; } else if (len <= 0xffff) { this._buf[seq] = 0x82; this._buf[seq + 1] = len >> 8; this._buf[seq + 2] = len; } else if (len <= 0xffffff) { this._shift(start, len, 1); this._buf[seq] = 0x83; this._buf[seq + 1] = len >> 16; this._buf[seq + 2] = len >> 8; this._buf[seq + 3] = len; } else { throw new InvalidAsn1Error('Sequence too long'); } }; Writer.prototype._shift = function(start, len, shift) { assert.ok(start !== undefined); assert.ok(len !== undefined); assert.ok(shift); this._buf.copy(this._buf, start + shift, start, start + len); this._offset += shift; }; Writer.prototype._ensure = function(len) { assert.ok(len); if (this._size - this._offset < len) { var sz = this._size * this._options.growthFactor; if (sz - this._offset < len) sz += len; var buf = new Buffer(sz); this._buf.copy(buf, 0, 0, this._offset); this._buf = buf; this._size = sz; } }; ///--- Exported API module.exports = Writer; �����������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/index.js������������������000644 �000766 �000024 �00000000771 12455173731 034643� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Joyent, Inc. All rights reserved. 
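// This entry point just aggregates the http-signature submodules: each API is
// exported under both a short alias and its full name (parse/parseRequest,
// sign/signRequest, verify/verifySignature), alongside the SSH key helpers
// sshKeyToPEM and sshKeyFingerprint from lib/util.js.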
var parser = require('./parser'); var signer = require('./signer'); var verify = require('./verify'); var util = require('./util'); ///--- API module.exports = { parse: parser.parseRequest, parseRequest: parser.parseRequest, sign: signer.signRequest, signRequest: signer.signRequest, sshKeyToPEM: util.sshKeyToPEM, sshKeyFingerprint: util.fingerprint, verify: verify.verifySignature, verifySignature: verify.verifySignature }; �������lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/parser.js�����������������000644 �000766 �000024 �00000021240 12455173731 035022� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2012 Joyent, Inc. All rights reserved. var assert = require('assert-plus'); var util = require('util'); ///--- Globals var Algorithms = { 'rsa-sha1': true, 'rsa-sha256': true, 'rsa-sha512': true, 'dsa-sha1': true, 'hmac-sha1': true, 'hmac-sha256': true, 'hmac-sha512': true }; var State = { New: 0, Params: 1 }; var ParamsState = { Name: 0, Quote: 1, Value: 2, Comma: 3 }; ///--- Specific Errors function HttpSignatureError(message, caller) { if (Error.captureStackTrace) Error.captureStackTrace(this, caller || HttpSignatureError); this.message = message; this.name = caller.name; } util.inherits(HttpSignatureError, Error); function ExpiredRequestError(message) { HttpSignatureError.call(this, message, ExpiredRequestError); } util.inherits(ExpiredRequestError, HttpSignatureError); function InvalidHeaderError(message) { HttpSignatureError.call(this, message, InvalidHeaderError); } util.inherits(InvalidHeaderError, HttpSignatureError); function InvalidParamsError(message) { HttpSignatureError.call(this, message, InvalidParamsError); } util.inherits(InvalidParamsError, HttpSignatureError); function MissingHeaderError(message) { HttpSignatureError.call(this, message, MissingHeaderError); } util.inherits(MissingHeaderError, HttpSignatureError); ///--- Exported API module.exports = { /** * Parses the 'Authorization' header out of an http.ServerRequest object. * * Note that this API will fully validate the Authorization header, and throw * on any error. It will not however check the signature, or the keyId format * as those are specific to your environment. You can use the options object * to pass in extra constraints. * * As a response object you can expect this: * * { * "scheme": "Signature", * "params": { * "keyId": "foo", * "algorithm": "rsa-sha256", * "headers": [ * "date" or "x-date", * "content-md5" * ], * "signature": "base64" * }, * "signingString": "ready to be passed to crypto.verify()" * } * * @param {Object} request an http.ServerRequest. * @param {Object} options an optional options object with: * - clockSkew: allowed clock skew in seconds (default 300). * - headers: required header names (def: date or x-date) * - algorithms: algorithms to support (default: all). * @return {Object} parsed out object (see above). * @throws {TypeError} on invalid input. * @throws {InvalidHeaderError} on an invalid Authorization header error. * @throws {InvalidParamsError} if the params in the scheme are invalid. 
* @throws {MissingHeaderError} if the params indicate a header not present, * either in the request headers from the params, * or not in the params from a required header * in options. * @throws {ExpiredRequestError} if the value of date or x-date exceeds skew. */ parseRequest: function parseRequest(request, options) { assert.object(request, 'request'); assert.object(request.headers, 'request.headers'); if (options === undefined) { options = {}; } if (options.headers === undefined) { options.headers = [request.headers['x-date'] ? 'x-date' : 'date']; } assert.object(options, 'options'); assert.arrayOfString(options.headers, 'options.headers'); assert.optionalNumber(options.clockSkew, 'options.clockSkew'); if (!request.headers.authorization) throw new MissingHeaderError('no authorization header present in ' + 'the request'); options.clockSkew = options.clockSkew || 300; var i = 0; var state = State.New; var substate = ParamsState.Name; var tmpName = ''; var tmpValue = ''; var parsed = { scheme: '', params: {}, signingString: '', get algorithm() { return this.params.algorithm.toUpperCase(); }, get keyId() { return this.params.keyId; } }; var authz = request.headers.authorization; for (i = 0; i < authz.length; i++) { var c = authz.charAt(i); switch (Number(state)) { case State.New: if (c !== ' ') parsed.scheme += c; else state = State.Params; break; case State.Params: switch (Number(substate)) { case ParamsState.Name: var code = c.charCodeAt(0); // restricted name of A-Z / a-z if ((code >= 0x41 && code <= 0x5a) || // A-Z (code >= 0x61 && code <= 0x7a)) { // a-z tmpName += c; } else if (c === '=') { if (tmpName.length === 0) throw new InvalidHeaderError('bad param format'); substate = ParamsState.Quote; } else { throw new InvalidHeaderError('bad param format'); } break; case ParamsState.Quote: if (c === '"') { tmpValue = ''; substate = ParamsState.Value; } else { throw new InvalidHeaderError('bad param format'); } break; case ParamsState.Value: if (c === '"') { parsed.params[tmpName] = tmpValue; substate = ParamsState.Comma; } else { tmpValue += c; } break; case ParamsState.Comma: if (c === ',') { tmpName = ''; substate = ParamsState.Name; } else { throw new InvalidHeaderError('bad param format'); } break; default: throw new Error('Invalid substate'); } break; default: throw new Error('Invalid substate'); } } if (!parsed.params.headers || parsed.params.headers === '') { if (request.headers['x-date']) { parsed.params.headers = ['x-date']; } else { parsed.params.headers = ['date']; } } else { parsed.params.headers = parsed.params.headers.split(' '); } // Minimally validate the parsed object if (!parsed.scheme || parsed.scheme !== 'Signature') throw new InvalidHeaderError('scheme was not "Signature"'); if (!parsed.params.keyId) throw new InvalidHeaderError('keyId was not specified'); if (!parsed.params.algorithm) throw new InvalidHeaderError('algorithm was not specified'); if (!parsed.params.signature) throw new InvalidHeaderError('signature was not specified'); // Check the algorithm against the official list parsed.params.algorithm = parsed.params.algorithm.toLowerCase(); if (!Algorithms[parsed.params.algorithm]) throw new InvalidParamsError(parsed.params.algorithm + ' is not supported'); // Build the signingString for (i = 0; i < parsed.params.headers.length; i++) { var h = parsed.params.headers[i].toLowerCase(); parsed.params.headers[i] = h; if (h !== 'request-line') { var value = request.headers[h]; if (!value) throw new MissingHeaderError(h + ' was not in the request'); parsed.signingString 
+= h + ': ' + value; } else { parsed.signingString += request.method + ' ' + request.url + ' HTTP/' + request.httpVersion; } if ((i + 1) < parsed.params.headers.length) parsed.signingString += '\n'; } // Check against the constraints var date; if (request.headers.date || request.headers['x-date']) { if (request.headers['x-date']) { date = new Date(request.headers['x-date']); } else { date = new Date(request.headers.date); } var now = new Date(); var skew = Math.abs(now.getTime() - date.getTime()); if (skew > options.clockSkew * 1000) { throw new ExpiredRequestError('clock skew of ' + (skew / 1000) + 's was greater than ' + options.clockSkew + 's'); } } options.headers.forEach(function (hdr) { // Remember that we already checked any headers in the params // were in the request, so if this passes we're good. if (parsed.params.headers.indexOf(hdr) < 0) throw new MissingHeaderError(hdr + ' was not a signed header'); }); if (options.algorithms) { if (options.algorithms.indexOf(parsed.params.algorithm) === -1) throw new InvalidParamsError(parsed.params.algorithm + ' is not a supported algorithm'); } return parsed; } }; ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/signer.js�����������������000644 �000766 �000024 �00000012143 12455173731 035017� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2012 Joyent, Inc. All rights reserved. var assert = require('assert-plus'); var crypto = require('crypto'); var http = require('http'); var sprintf = require('util').format; ///--- Globals var Algorithms = { 'rsa-sha1': true, 'rsa-sha256': true, 'rsa-sha512': true, 'dsa-sha1': true, 'hmac-sha1': true, 'hmac-sha256': true, 'hmac-sha512': true }; var Authorization = 'Signature keyId="%s",algorithm="%s",headers="%s",signature="%s"'; ///--- Specific Errors function MissingHeaderError(message) { this.name = 'MissingHeaderError'; this.message = message; this.stack = (new Error()).stack; } MissingHeaderError.prototype = new Error(); function InvalidAlgorithmError(message) { this.name = 'InvalidAlgorithmError'; this.message = message; this.stack = (new Error()).stack; } InvalidAlgorithmError.prototype = new Error(); ///--- Internal Functions function _pad(val) { if (parseInt(val, 10) < 10) { val = '0' + val; } return val; } function _rfc1123() { var date = new Date(); var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec']; var days = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat']; return days[date.getUTCDay()] + ', ' + _pad(date.getUTCDate()) + ' ' + months[date.getUTCMonth()] + ' ' + date.getUTCFullYear() + ' ' + _pad(date.getUTCHours()) + ':' + _pad(date.getUTCMinutes()) + ':' + _pad(date.getUTCSeconds()) + ' GMT'; } ///--- Exported API module.exports = { /** * Adds an 'Authorization' header to an http.ClientRequest object. 
* * Note that this API will add a Date header if it's not already set. Any * other headers in the options.headers array MUST be present, or this * will throw. * * You shouldn't need to check the return type; it's just there if you want * to be pedantic. * * @param {Object} request an instance of http.ClientRequest. * @param {Object} options signing parameters object: * - {String} keyId required. * - {String} key required (either a PEM or HMAC key). * - {Array} headers optional; defaults to ['date']. * - {String} algorithm optional; defaults to 'rsa-sha256'. * - {String} httpVersion optional; defaults to '1.1'. * @return {Boolean} true if Authorization (and optionally Date) were added. * @throws {TypeError} on bad parameter types (input). * @throws {InvalidAlgorithmError} if algorithm was bad. * @throws {MissingHeaderError} if a header to be signed was specified but * was not present. */ signRequest: function signRequest(request, options) { assert.object(request, 'request'); assert.object(options, 'options'); assert.optionalString(options.algorithm, 'options.algorithm'); assert.string(options.keyId, 'options.keyId'); assert.optionalArrayOfString(options.headers, 'options.headers'); assert.optionalString(options.httpVersion, 'options.httpVersion'); if (!request.getHeader('Date')) request.setHeader('Date', _rfc1123()); if (!options.headers) options.headers = ['date']; if (!options.algorithm) options.algorithm = 'rsa-sha256'; if (!options.httpVersion) options.httpVersion = '1.1'; options.algorithm = options.algorithm.toLowerCase(); if (!Algorithms[options.algorithm]) throw new InvalidAlgorithmError(options.algorithm + ' is not supported'); var i; var stringToSign = ''; for (i = 0; i < options.headers.length; i++) { if (typeof (options.headers[i]) !== 'string') throw new TypeError('options.headers must be an array of Strings'); var h = options.headers[i].toLowerCase(); if (h !== 'request-line') { var value = request.getHeader(h); if (!value) { throw new MissingHeaderError(h + ' was not in the request'); } stringToSign += h + ': ' + value; } else { value = stringToSign += request.method + ' ' + request.path + ' HTTP/' + options.httpVersion; } if ((i + 1) < options.headers.length) stringToSign += '\n'; } var alg = options.algorithm.match(/(hmac|rsa)-(\w+)/); var signature; if (alg[1] === 'hmac') { var hmac = crypto.createHmac(alg[2].toUpperCase(), options.key); hmac.update(stringToSign); signature = hmac.digest('base64'); } else { var signer = crypto.createSign(options.algorithm.toUpperCase()); signer.update(stringToSign); signature = signer.sign(options.key, 'base64'); } request.setHeader('Authorization', sprintf(Authorization, options.keyId, options.algorithm, options.headers.join(' '), signature)); return true; } }; �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/util.js�������������������000644 �000766 �000024 �00000011653 12455173731 034512� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
�iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2012 Joyent, Inc. All rights reserved. var assert = require('assert-plus'); var crypto = require('crypto'); var asn1 = require('asn1'); var ctype = require('ctype'); ///--- Helpers function readNext(buffer, offset) { var len = ctype.ruint32(buffer, 'big', offset); offset += 4; var newOffset = offset + len; return { data: buffer.slice(offset, newOffset), offset: newOffset }; } function writeInt(writer, buffer) { writer.writeByte(0x02); // ASN1.Integer writer.writeLength(buffer.length); for (var i = 0; i < buffer.length; i++) writer.writeByte(buffer[i]); return writer; } function rsaToPEM(key) { var buffer; var der; var exponent; var i; var modulus; var newKey = ''; var offset = 0; var type; var tmp; try { buffer = new Buffer(key.split(' ')[1], 'base64'); tmp = readNext(buffer, offset); type = tmp.data.toString(); offset = tmp.offset; if (type !== 'ssh-rsa') throw new Error('Invalid ssh key type: ' + type); tmp = readNext(buffer, offset); exponent = tmp.data; offset = tmp.offset; tmp = readNext(buffer, offset); modulus = tmp.data; } catch (e) { throw new Error('Invalid ssh key: ' + key); } // DER is a subset of BER der = new asn1.BerWriter(); der.startSequence(); der.startSequence(); der.writeOID('1.2.840.113549.1.1.1'); der.writeNull(); der.endSequence(); der.startSequence(0x03); // bit string der.writeByte(0x00); // Actual key der.startSequence(); writeInt(der, modulus); writeInt(der, exponent); der.endSequence(); // bit string der.endSequence(); der.endSequence(); tmp = der.buffer.toString('base64'); for (i = 0; i < tmp.length; i++) { if ((i % 64) === 0) newKey += '\n'; newKey += tmp.charAt(i); } if (!/\\n$/.test(newKey)) newKey += '\n'; return '-----BEGIN PUBLIC KEY-----' + newKey + '-----END PUBLIC KEY-----\n'; } function dsaToPEM(key) { var buffer; var offset = 0; var tmp; var der; var newKey = ''; var type; var p; var q; var g; var y; try { buffer = new Buffer(key.split(' ')[1], 'base64'); tmp = readNext(buffer, offset); type = tmp.data.toString(); offset = tmp.offset; /* JSSTYLED */ if (!/^ssh-ds[as].*/.test(type)) throw new Error('Invalid ssh key type: ' + type); tmp = readNext(buffer, offset); p = tmp.data; offset = tmp.offset; tmp = readNext(buffer, offset); q = tmp.data; offset = tmp.offset; tmp = readNext(buffer, offset); g = tmp.data; offset = tmp.offset; tmp = readNext(buffer, offset); y = tmp.data; } catch (e) { console.log(e.stack); throw new Error('Invalid ssh key: ' + key); } // DER is a subset of BER der = new asn1.BerWriter(); der.startSequence(); der.startSequence(); der.writeOID('1.2.840.10040.4.1'); der.startSequence(); writeInt(der, p); writeInt(der, q); writeInt(der, g); der.endSequence(); der.endSequence(); der.startSequence(0x03); // bit string der.writeByte(0x00); writeInt(der, y); der.endSequence(); der.endSequence(); tmp = der.buffer.toString('base64'); for (var i = 0; i < tmp.length; i++) { if ((i % 64) === 0) newKey += '\n'; newKey += tmp.charAt(i); } if (!/\\n$/.test(newKey)) newKey += '\n'; return '-----BEGIN PUBLIC KEY-----' + newKey + '-----END PUBLIC KEY-----\n'; } ///--- API module.exports = { /** * Converts an OpenSSH public key (rsa only) to a PKCS#8 PEM file. * * The intent of this module is to interoperate with OpenSSL only, * specifically the node crypto module's `verify` method. * * @param {String} key an OpenSSH public key. 
* @return {String} PEM encoded form of the RSA public key. * @throws {TypeError} on bad input. * @throws {Error} on invalid ssh key formatted data. */ sshKeyToPEM: function sshKeyToPEM(key) { assert.string(key, 'ssh_key'); /* JSSTYLED */ if (/^ssh-rsa.*/.test(key)) return rsaToPEM(key); /* JSSTYLED */ if (/^ssh-ds[as].*/.test(key)) return dsaToPEM(key); throw new Error('Only RSA and DSA public keys are allowed'); }, /** * Generates an OpenSSH fingerprint from an ssh public key. * * @param {String} key an OpenSSH public key. * @return {String} key fingerprint. * @throws {TypeError} on bad input. * @throws {Error} if what you passed doesn't look like an ssh public key. */ fingerprint: function fingerprint(key) { assert.string(key, 'ssh_key'); var pieces = key.split(' '); if (!pieces || !pieces.length || pieces.length < 2) throw new Error('invalid ssh key'); var data = new Buffer(pieces[1], 'base64'); var hash = crypto.createHash('md5'); hash.update(data); var digest = hash.digest('hex'); var fp = ''; for (var i = 0; i < digest.length; i++) { if (i && i % 2 === 0) fp += ':'; fp += digest[i]; } return fp; } }; �������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/http-signature/lib/verify.js�����������������000644 �000766 �000024 �00000002600 12455173731 035031� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Copyright 2011 Joyent, Inc. All rights reserved. var assert = require('assert-plus'); var crypto = require('crypto'); ///--- Exported API module.exports = { /** * Simply wraps up the node crypto operations for you, and returns * true or false. You are expected to pass in an object that was * returned from `parse()`. * * @param {Object} parsedSignature the object you got from `parse`. * @param {String} key either an RSA private key PEM or HMAC secret. * @return {Boolean} true if valid, false otherwise. * @throws {TypeError} if you pass in bad arguments. 
 */
  verifySignature: function verifySignature(parsedSignature, key) {
    assert.object(parsedSignature, 'parsedSignature');
    assert.string(key, 'key');

    var alg = parsedSignature.algorithm.match(/(HMAC|RSA|DSA)-(\w+)/);
    if (!alg || alg.length !== 3)
      throw new TypeError('parsedSignature: unsupported algorithm ' +
                          parsedSignature.algorithm);

    if (alg[1] === 'HMAC') {
      var hmac = crypto.createHmac(alg[2].toUpperCase(), key);
      hmac.update(parsedSignature.signingString);
      return (hmac.digest('base64') === parsedSignature.params.signature);
    } else {
      var verify = crypto.createVerify(alg[0]);
      verify.update(parsedSignature.signingString);
      return verify.verify(key, parsedSignature.params.signature, 'base64');
    }
  }

};
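Taken together, the parser and verifier above cover the receiving side of a signed request. A minimal usage sketch follows, assuming the package's usual top-level exports (`parseRequest`, `verifySignature`) and a hypothetical `publicKeys` lookup table; neither appears in the files above, so treat both as assumptions rather than the module's documented surface:

```javascript
var http = require('http');
var httpSignature = require('http-signature');

// Hypothetical key store: keyId -> PEM-encoded public key (contents elided).
var publicKeys = {
  'example-key-id': '-----BEGIN PUBLIC KEY-----\n...\n-----END PUBLIC KEY-----\n'
};

http.createServer(function (req, res) {
  try {
    // parseRequest throws MissingHeaderError / InvalidHeaderError /
    // ExpiredRequestError on malformed or stale requests.
    var parsed = httpSignature.parseRequest(req);
    var pem = publicKeys[parsed.keyId];
    if (pem && httpSignature.verifySignature(parsed, pem)) {
      res.writeHead(200);
      res.end('signature ok');
    } else {
      res.writeHead(401);
      res.end('invalid signature');
    }
  } catch (e) {
    res.writeHead(400);
    res.end(e.message);
  }
}).listen(8080);
```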
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/.npmignore

.idea
*.iml
npm-debug.log
dump.rdb
node_modules
results.tap
results.xml
npm-shrinkwrap.json
config.json
.DS_Store
*/.DS_Store
*/*/.DS_Store
._*
*/._*
*/*/._*
coverage.*
lib-cov

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/.travis.yml

language: node_js

node_js:
  - 0.10

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/example/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/index.js

module.exports = require('./lib');

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/LICENSE

Copyright (c) 2012-2013, Eran Hammer.
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
    * Neither the name of Eran Hammer nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/Makefile

test:
	@node node_modules/lab/bin/lab
test-cov:
	@node node_modules/lab/bin/lab -r threshold -t 100
test-cov-html:
	@node node_modules/lab/bin/lab -r html -o coverage.html
complexity:
	@node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib

.PHONY: test test-cov test-cov-html complexity

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/package.json

{
  "name": "hawk",
  "description": "HTTP Hawk Authentication Scheme",
  "version": "1.1.1",
  "author": {
    "name": "Eran Hammer",
    "email": "eran@hueniverse.com",
    "url": "http://hueniverse.com"
  },
  "contributors": [],
  "repository": {
    "type": "git",
    "url": "git://github.com/hueniverse/hawk"
  },
  "main": "index",
  "keywords": [
    "http",
    "authentication",
    "scheme",
    "hawk"
  ],
  "engines": {
    "node": ">=0.8.0"
  },
  "dependencies": {
    "hoek": "0.9.x",
    "boom": "0.4.x",
    "cryptiles": "0.2.x",
    "sntp": "0.2.x"
  },
  "devDependencies": {
    "lab": "0.1.x",
    "complexity-report": "0.x.x",
    "localStorage": "1.0.x"
  },
  "scripts": {
    "test": "make test-cov"
  },
  "licenses": [
    {
      "type": "BSD",
      "url": "http://github.com/hueniverse/hawk/raw/master/LICENSE"
    }
  ],
  "bugs": {
    "url": "https://github.com/hueniverse/hawk/issues"
  },
  "_id": "hawk@1.1.1",
  "dist": {
    "shasum": "87cd491f9b46e4e2aeaca335416766885d2d1ed9",
    "tarball": "http://registry.npmjs.org/hawk/-/hawk-1.1.1.tgz"
  },
  "_from": "hawk@1.1.1",
"1.3.8", "_npmUser": { "name": "hueniverse", "email": "eran@hueniverse.com" }, "maintainers": [ { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "directories": {}, "_shasum": "87cd491f9b46e4e2aeaca335416766885d2d1ed9", "_resolved": "https://registry.npmjs.org/hawk/-/hawk-1.1.1.tgz", "readme": "ERROR: No README data found!", "homepage": "https://github.com/hueniverse/hawk" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/README.md��������000755 �000766 �000024 �00000073123 12455173731 031746� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������![hawk Logo](https://raw.github.com/hueniverse/hawk/master/images/hawk.png) <img align="right" src="https://raw.github.com/hueniverse/hawk/master/images/logo.png" /> **Hawk** is an HTTP authentication scheme using a message authentication code (MAC) algorithm to provide partial HTTP request cryptographic verification. For more complex use cases such as access delegation, see [Oz](https://github.com/hueniverse/oz). Current version: **1.0** [![Build Status](https://secure.travis-ci.org/hueniverse/hawk.png)](http://travis-ci.org/hueniverse/hawk) # Table of Content - [**Introduction**](#introduction) - [Replay Protection](#replay-protection) - [Usage Example](#usage-example) - [Protocol Example](#protocol-example) - [Payload Validation](#payload-validation) - [Response Payload Validation](#response-payload-validation) - [Browser Support and Considerations](#browser-support-and-considerations) <p></p> - [**Single URI Authorization**](#single-uri-authorization) - [Usage Example](#bewit-usage-example) <p></p> - [**Security Considerations**](#security-considerations) - [MAC Keys Transmission](#mac-keys-transmission) - [Confidentiality of Requests](#confidentiality-of-requests) - [Spoofing by Counterfeit Servers](#spoofing-by-counterfeit-servers) - [Plaintext Storage of Credentials](#plaintext-storage-of-credentials) - [Entropy of Keys](#entropy-of-keys) - [Coverage Limitations](#coverage-limitations) - [Future Time Manipulation](#future-time-manipulation) - [Client Clock Poisoning](#client-clock-poisoning) - [Bewit Limitations](#bewit-limitations) - [Host Header Forgery](#host-header-forgery) <p></p> - [**Frequently Asked Questions**](#frequently-asked-questions) <p></p> - [**Acknowledgements**](#acknowledgements) # Introduction **Hawk** is an HTTP authentication scheme providing mechanisms for making authenticated HTTP requests with partial cryptographic verification of the request and response, covering the HTTP method, request URI, host, and optionally the request payload. 
Similar to the HTTP [Digest access authentication schemes](http://www.ietf.org/rfc/rfc2617.txt), **Hawk** uses a set of client credentials which include an identifier (e.g. username) and key (e.g. password). Likewise, just as with the Digest scheme, the key is never included in authenticated requests. Instead, it is used to calculate a request MAC value which is included in its place.

However, **Hawk** has several differences from Digest. In particular, while both use a nonce to limit the possibility of replay attacks, in **Hawk** the client generates the nonce and uses it in combination with a timestamp, leading to less "chattiness" (interaction with the server).

Also unlike Digest, this scheme is not intended to protect the key itself (the password in Digest) because the client and server must both have access to the key material in the clear.

The primary design goals of this scheme are to:
* simplify and improve HTTP authentication for services that are unwilling or unable to deploy TLS for all resources,
* secure credentials against leakage (e.g., when the client uses some form of dynamic configuration to determine where to send an authenticated request), and
* avoid the exposure of credentials sent to a malicious server over an unauthenticated secure channel due to client failure to validate the server's identity as part of its TLS handshake.

In addition, **Hawk** supports a method for granting third parties temporary access to individual resources using a query parameter called _bewit_ (in falconry, a leather strap used to attach a tracking device to the leg of a hawk).

The **Hawk** scheme requires the establishment of a shared symmetric key between the client and the server, which is beyond the scope of this module. Typically, the shared credentials are established via an initial TLS-protected phase or derived from some other shared confidential information available to both the client and the server.

## Replay Protection

Without replay protection, an attacker can use a compromised (but otherwise valid and authenticated) request more than once, gaining access to a protected resource. To mitigate this, clients include both a nonce and a timestamp when making requests. This gives the server enough information to prevent replay attacks.

The nonce is generated by the client, and is a string unique across all requests with the same timestamp and key identifier combination.

The timestamp enables the server to restrict the validity period of the credentials; requests occurring afterwards are rejected. It also removes the need for the server to retain an unbounded number of nonce values for future checks. By default, **Hawk** uses a time window of 1 minute to allow for time skew between the client and server (which in practice translates to a maximum of 2 minutes as the skew can be positive or negative).

Using a timestamp requires the client's clock to be in sync with the server's clock. **Hawk** requires both the client clock and the server clock to use NTP to ensure synchronization. However, given that some client types (e.g. browsers) cannot deploy NTP, the server provides the client with its current time (in seconds precision) in response to a bad timestamp. There is no expectation that the client will adjust its system clock to match the server (in fact, this would be a potential attack vector). Instead, the client only uses the server's time to calculate an offset used only for communications with that particular server. The protocol rewards clients with synchronized clocks by reducing the number of round trips required to authenticate the first request.
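To make the window concrete, here is a small sketch of the freshness check described above. It is an illustration only, not Hawk's actual implementation:

```javascript
// Illustration only: accept a request timestamp if it is within the allowed
// skew window (60 seconds by default, per the description above).
function isFreshTimestamp(tsSeconds, windowSeconds) {
    windowSeconds = windowSeconds || 60;
    var nowSeconds = Math.floor(Date.now() / 1000);
    return Math.abs(nowSeconds - tsSeconds) <= windowSeconds;
}

console.log(isFreshTimestamp(Math.floor(Date.now() / 1000)));        // true
console.log(isFreshTimestamp(Math.floor(Date.now() / 1000) - 3600)); // false (stale)
```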
## Usage Example

Server code:

```javascript
var Http = require('http');
var Hawk = require('hawk');

// Credentials lookup function

var credentialsFunc = function (id, callback) {

    var credentials = {
        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
        algorithm: 'sha256',
        user: 'Steve'
    };

    return callback(null, credentials);
};

// Create HTTP server

var handler = function (req, res) {

    // Authenticate incoming request

    Hawk.server.authenticate(req, credentialsFunc, {}, function (err, credentials, artifacts) {

        // Prepare response

        var payload = (!err ? 'Hello ' + credentials.user + ' ' + artifacts.ext : 'Shoosh!');
        var headers = { 'Content-Type': 'text/plain' };

        // Generate Server-Authorization response header

        var header = Hawk.server.header(credentials, artifacts, { payload: payload, contentType: headers['Content-Type'] });
        headers['Server-Authorization'] = header;

        // Send the response back

        res.writeHead(!err ? 200 : 401, headers);
        res.end(payload);
    });
};

// Start server

Http.createServer(handler).listen(8000, 'example.com');
```

Client code:

```javascript
var Request = require('request');
var Hawk = require('hawk');

// Client credentials

var credentials = {
    id: 'dh37fgj492je',
    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
    algorithm: 'sha256'
};

// Request options

var requestOptions = {
    uri: 'http://example.com:8000/resource/1?b=1&a=2',
    method: 'GET',
    headers: {}
};

// Generate Authorization request header

var header = Hawk.client.header('http://example.com:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'some-app-data' });
requestOptions.headers.Authorization = header.field;

// Send authenticated request

Request(requestOptions, function (error, response, body) {

    // Authenticate the server's response

    var isValid = Hawk.client.authenticate(response, credentials, header.artifacts, { payload: body });

    // Output results

    console.log(response.statusCode + ': ' + body + (isValid ? ' (valid)' : ' (invalid)'));
});
```

**Hawk** utilizes the [**SNTP**](https://github.com/hueniverse/sntp) module for time sync management. By default, the local machine time is used. To automatically retrieve and synchronize the clock within the application, use the SNTP `start()` method.

```javascript
Hawk.sntp.start();
```

## Protocol Example

The client attempts to access a protected resource without authentication, sending the following HTTP request to the resource server:

```
GET /resource/1?b=1&a=2 HTTP/1.1
Host: example.com:8000
```

The resource server returns an authentication challenge.

```
HTTP/1.1 401 Unauthorized
WWW-Authenticate: Hawk
```

The client has previously obtained a set of **Hawk** credentials for accessing resources on the "http://example.com/" server. The **Hawk** credentials issued to the client include the following attributes:

* Key identifier: dh37fgj492je
* Key: werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn
* Algorithm: sha256

The client generates the authentication header by calculating a timestamp (e.g. the number of seconds since January 1, 1970 00:00:00 GMT), generating a nonce, and constructing the normalized request string (each value followed by a newline character):

```
hawk.1.header
1353832234
j4h3g2
GET
/resource/1?b=1&a=2
example.com
8000

some-app-ext-data
```

The request MAC is calculated using HMAC with the specified hash algorithm "sha256" and the key over the normalized request string.
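For illustration, here is roughly what that computation looks like with Node's `crypto` module, using the example key and values above. This is a sketch of the calculation described in the text, not Hawk's internal code; it assumes the empty line in the normalized string is the unused payload-hash slot, as the payload-validation example further below suggests:

```javascript
var crypto = require('crypto');

var key = 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn';

// Normalized request string from the walk-through; each value is
// newline-terminated and the empty entry is the (unused) payload-hash slot.
var normalized = [
    'hawk.1.header',
    '1353832234',
    'j4h3g2',
    'GET',
    '/resource/1?b=1&a=2',
    'example.com',
    '8000',
    '',
    'some-app-ext-data',
    ''
].join('\n');

var mac = crypto.createHmac('sha256', key).update(normalized).digest('base64');
console.log(mac); // should match the request MAC quoted next
```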
The result is base64-encoded to produce the request MAC:

```
6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE=
```

The client includes the **Hawk** key identifier, timestamp, nonce, application specific data, and request MAC with the request using the HTTP `Authorization` request header field:

```
GET /resource/1?b=1&a=2 HTTP/1.1
Host: example.com:8000
Authorization: Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", ext="some-app-ext-data", mac="6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE="
```

The server validates the request by calculating the request MAC again based on the request received and verifies the validity and scope of the **Hawk** credentials. If valid, the server responds with the requested resource.

### Payload Validation

**Hawk** provides optional payload validation. When generating the authentication header, the client calculates a payload hash using the specified hash algorithm. The hash is calculated over the concatenated value of (each followed by a newline character):
* `hawk.1.payload`
* the content-type in lowercase, without any parameters (e.g. `application/json`)
* the request payload prior to any content encoding (the exact representation requirements should be specified by the server for payloads other than simple single-part ascii to ensure interoperability)

For example:

* Payload: `Thank you for flying Hawk`
* Content Type: `text/plain`
* Hash (sha256): `Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=`

Results in the following input to the payload hash function (newline terminated values):

```
hawk.1.payload
text/plain
Thank you for flying Hawk
```

Which produces the following hash value:

```
Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=
```
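As a quick illustration (a sketch of the description above, not Hawk's internal code), the same hash can be reproduced with Node's `crypto` module:

```javascript
var crypto = require('crypto');

// Each part is followed by a newline, as described above.
var input = 'hawk.1.payload\n' +
            'text/plain\n' +
            'Thank you for flying Hawk\n';

var hash = crypto.createHash('sha256').update(input).digest('base64');
console.log(hash); // expected: Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=
```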
The client constructs the normalized request string (newline terminated values):

```
hawk.1.header
1353832234
j4h3g2
POST
/resource/1?a=1&b=2
example.com
8000
Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=
some-app-ext-data
```

Then calculates the request MAC and includes the **Hawk** key identifier, timestamp, nonce, payload hash, application specific data, and request MAC, with the request using the HTTP `Authorization` request header field:

```
POST /resource/1?a=1&b=2 HTTP/1.1
Host: example.com:8000
Authorization: Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", hash="Yi9LfIIFRtBEPt74PVmbTF/xVAwPn7ub15ePICfgnuY=", ext="some-app-ext-data", mac="aSe1DERmZuRl3pI36/9BdZmnErTw3sNzOOAUlfeKjVw="
```

It is up to the server if and when it validates the payload for any given request, based solely on its security policy and the nature of the data included.

If the payload is available at the time of authentication, the server uses the hash value provided by the client to construct the normalized string and validates the MAC. If the MAC is valid, the server calculates the payload hash and compares the value with the provided payload hash in the header. In many cases, checking the MAC first is faster than calculating the payload hash.

However, if the payload is not available at authentication time (e.g. too large to fit in memory, streamed elsewhere, or processed at a different stage in the application), the server may choose to defer payload validation for later by retaining the hash value provided by the client after validating the MAC. It is important to note that MAC validation does not mean the hash value provided by the client is valid, only that the value included in the header was not modified. Without calculating the payload hash on the server and comparing it to the value provided by the client, the payload may be modified by an attacker.

## Response Payload Validation

**Hawk** provides partial response payload validation. The server includes the `Server-Authorization` response header which enables the client to authenticate the response and ensure it is talking to the right server. **Hawk** defines the HTTP `Server-Authorization` header as a response header using the exact same syntax as the `Authorization` request header field.

The header is constructed using the same process as the client's request header. The server uses the same credentials and other artifacts provided by the client to construct the normalized request string. The `ext` and `hash` values are replaced with new values based on the server response. The rest are identical to those used by the client.

The resulting MAC digest is included with the optional `hash` and `ext` values:

```
Server-Authorization: Hawk mac="XIJRsMl/4oL+nn+vKoeVZPdCHXB4yJkNnBbTbHFZUYE=", hash="f9cDF/TDm7TkYRLnGwRMfeDzT6LixQVLvrIKhh0vgmM=", ext="response-specific"
```

## Browser Support and Considerations

A browser script, suitable for inclusion via a `<script>` tag, is provided in [lib/browser.js](/lib/browser.js).

**Hawk** relies on the _Server-Authorization_ and _WWW-Authenticate_ headers in its response to communicate with the client. Therefore, in case of CORS requests, it is important to consider sending _Access-Control-Expose-Headers_ with the value _"WWW-Authenticate, Server-Authorization"_ on each response from your server. As explained in the [specifications](http://www.w3.org/TR/cors/#access-control-expose-headers-response-header), it will indicate that these headers can safely be accessed by the client (using getResponseHeader() on the XmlHttpRequest object). Otherwise you will be met with a ["simple response header"](http://www.w3.org/TR/cors/#simple-response-header) which excludes these fields and would prevent the Hawk client from authenticating the requests. You can read more about the why and how in this [article](http://www.html5rocks.com/en/tutorials/cors/#toc-adding-cors-support-to-the-server).
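For example, a minimal sketch for a plain Node HTTP server (adapt to whatever framework you use; the origin below is hypothetical):

```javascript
var Http = require('http');

Http.createServer(function (req, res) {

    // Allow browser code to read the headers Hawk uses in its responses
    res.setHeader('Access-Control-Allow-Origin', 'https://app.example.com');   // hypothetical origin
    res.setHeader('Access-Control-Expose-Headers', 'WWW-Authenticate, Server-Authorization');

    res.end('ok');
}).listen(8000);
```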
# Single URI Authorization

There are cases in which limited and short-term access to a protected resource is granted to a third party which does not have access to the shared credentials. For example, displaying a protected image on a web page accessed by anyone. **Hawk** provides limited support for such URIs in the form of a _bewit_ - a URI query parameter appended to the request URI which contains the necessary credentials to authenticate the request.

Because of the significant security risks involved in issuing such access, bewit usage is purposely limited only to GET requests and for a finite period of time. Both the client and server can issue bewit credentials, however, the server should not use the same credentials as the client to maintain clear traceability as to who issued which credentials.

In order to simplify implementation, bewit credentials do not support single-use policy and can be replayed multiple times within the granted access timeframe.

## Bewit Usage Example

Server code:

```javascript
var Http = require('http');
var Hawk = require('hawk');

// Credentials lookup function

var credentialsFunc = function (id, callback) {

    var credentials = {
        key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
        algorithm: 'sha256'
    };

    return callback(null, credentials);
};

// Create HTTP server

var handler = function (req, res) {

    Hawk.uri.authenticate(req, credentialsFunc, {}, function (err, credentials, attributes) {

        res.writeHead(!err ? 200 : 401, { 'Content-Type': 'text/plain' });
        res.end(!err ? 'Access granted' : 'Shoosh!');
    });
};

Http.createServer(handler).listen(8000, 'example.com');
```

Bewit code generation:

```javascript
var Request = require('request');
var Hawk = require('hawk');

// Client credentials

var credentials = {
    id: 'dh37fgj492je',
    key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
    algorithm: 'sha256'
};

// Generate bewit

var duration = 60 * 5;      // 5 minutes

var bewit = Hawk.uri.getBewit('http://example.com:8000/resource/1?b=1&a=2', { credentials: credentials, ttlSec: duration, ext: 'some-app-data' });
var uri = 'http://example.com:8000/resource/1?b=1&a=2' + '&bewit=' + bewit;
```

# Security Considerations

The greatest sources of security risks are usually found not in **Hawk** but in the policies and procedures surrounding its use. Implementers are strongly encouraged to assess how this module addresses their security requirements. This section includes an incomplete list of security considerations that must be reviewed and understood before deploying **Hawk** on the server. Many of the protections provided in **Hawk** depend on whether and how they are used.

### MAC Keys Transmission

**Hawk** does not provide any mechanism for obtaining or transmitting the set of shared credentials required. Any mechanism used to obtain **Hawk** credentials must ensure that these transmissions are protected using transport-layer mechanisms such as TLS.

### Confidentiality of Requests

While **Hawk** provides a mechanism for verifying the integrity of HTTP requests, it provides no guarantee of request confidentiality. Unless other precautions are taken, eavesdroppers will have full access to the request content. Servers should carefully consider the types of data likely to be sent as part of such requests, and employ transport-layer security mechanisms to protect sensitive resources.

### Spoofing by Counterfeit Servers

**Hawk** provides limited verification of the server authenticity. When returning a response to the client, the server may choose to include a `Server-Authorization` header which the client can use to verify the response. However, it is up to the server to determine when such a measure is included, and up to the client to enforce that policy.

A hostile party could take advantage of this by intercepting the client's requests and returning misleading or otherwise incorrect responses. Service providers should consider such attacks when developing services using this protocol, and should require transport-layer security for any requests where the authenticity of the resource server or of server responses is an issue.

### Plaintext Storage of Credentials

The **Hawk** key functions the same way passwords do in traditional authentication systems. In order to compute the request MAC, the server must have access to the key in plaintext form. This is in contrast, for example, to modern operating systems, which store only a one-way hash of user credentials.
If an attacker were to gain access to these keys - or worse, to the server's database of all such keys - he or she would be able to perform any action on behalf of any resource owner. Accordingly, it is critical that servers protect these keys from unauthorized access.

### Entropy of Keys

Unless a transport-layer security protocol is used, eavesdroppers will have full access to authenticated requests and request MAC values, and will thus be able to mount offline brute-force attacks to recover the key used. Servers should be careful to assign keys which are long enough, and random enough, to resist such attacks for at least the length of time that the **Hawk** credentials are valid. For example, if the credentials are valid for two weeks, servers should ensure that it is not possible to mount a brute force attack that recovers the key in less than two weeks. Of course, servers are urged to err on the side of caution, and use the longest key reasonable.

It is equally important that the pseudo-random number generator (PRNG) used to generate these keys be of sufficiently high quality. Many PRNG implementations generate number sequences that may appear to be random, but which nevertheless exhibit patterns or other weaknesses which make cryptanalysis or brute force attacks easier. Implementers should be careful to use cryptographically secure PRNGs to avoid these problems.

### Coverage Limitations

The request MAC only covers the HTTP `Host` header and optionally the `Content-Type` header. It does not cover any other headers which can often affect how the request body is interpreted by the server. If the server behavior is influenced by the presence or value of such headers, an attacker can manipulate the request headers without being detected. Implementers should use the `ext` feature to pass application-specific information via the `Authorization` header which is protected by the request MAC.

The response authentication, when performed, only covers the response payload, content-type, and the request information provided by the client in its request (method, resource, timestamp, nonce, etc.). It does not cover the HTTP status code or any other response header field (e.g. Location) which can affect the client's behaviour.

### Future Time Manipulation

The protocol relies on a clock sync between the client and server. To accomplish this, the server informs the client of its current time when an invalid timestamp is received.

If an attacker is able to manipulate this information and cause the client to use an incorrect time, it would be able to cause the client to generate authenticated requests using time in the future. Such requests will fail when sent by the client, and will not likely leave a trace on the server (given the common implementation of nonce, if at all enforced). The attacker will then be able to replay the request at the correct time without detection.

The client must only use the time information provided by the server if:
* it was delivered over a TLS connection and the server identity has been verified, or
* the `tsm` MAC digest calculated using the same client credentials over the timestamp has been verified.

### Client Clock Poisoning

When receiving a request with a bad timestamp, the server provides the client with its current time. The client must never use the time received from the server to adjust its own clock, and must only use it to calculate an offset for communicating with that particular server.
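As a rough illustration of that rule (a sketch only, not Hawk's client code), a client can track a per-server offset instead of ever touching the system clock:

```javascript
// Illustration only: remember a per-host offset derived from the server's
// reported time, and apply it when generating timestamps for that host.
var offsets = {};   // host -> offset in seconds

function noteServerTime(host, serverTimeSeconds) {
    offsets[host] = serverTimeSeconds - Math.floor(Date.now() / 1000);
}

function timestampFor(host) {
    return Math.floor(Date.now() / 1000) + (offsets[host] || 0);
}
```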
### Bewit Limitations

Special care must be taken when issuing bewit credentials to third parties. Bewit credentials are valid until expiration and cannot be revoked or limited without using other means. Whatever resource they grant access to will be completely exposed to anyone with access to the bewit credentials which act as bearer credentials for that particular resource. While bewit usage is limited to GET requests only and therefore cannot be used to perform transactions or change server state, it can still be used to expose private and sensitive information.

### Host Header Forgery

Hawk validates the incoming request MAC against the incoming HTTP Host header. However, unless the optional `host` and `port` options are used with `server.authenticate()`, a malicious client can mint new host names pointing to the server's IP address and use them to craft an attack by sending a valid request that is meant for a hostname other than the one used by the server. Server implementors must manually verify that the host header received matches their expectation (or use the options mentioned above).

# Frequently Asked Questions

### Where is the protocol specification?

If you are looking for some prose explaining how all this works, **this is it**. **Hawk** is being developed as an open source project instead of a standard. In other words, the [code](/hueniverse/hawk/tree/master/lib) is the specification. Not sure about something? Open an issue!

### Is it done?

As of version 0.10.0, **Hawk** is feature-complete. However, until this module reaches version 1.0.0 it is considered experimental and is likely to change. This also means your feedback and contribution are very welcome. Feel free to open issues with questions and suggestions.

### Where can I find **Hawk** implementations in other languages?

**Hawk**'s only reference implementation is provided in JavaScript as a node.js module. However, it has been ported to other languages. The full list is maintained [here](https://github.com/hueniverse/hawk/issues?labels=port&state=closed). Please add an issue if you are working on another port. A cross-platform test-suite is in the works.

### Why isn't the algorithm part of the challenge or dynamically negotiated?

The algorithm used is closely related to the key issued as different algorithms require different key sizes (and other requirements). While some keys can be used for multiple algorithms, the protocol is designed to closely bind the key and algorithm together as part of the issued credentials.

### Why are Host and Content-Type the only headers covered by the request MAC?

It is really hard to include other headers. Headers can be changed by proxies and other intermediaries and there is no well-established way to normalize them. Many platforms change the case of header field names and values. The only straight-forward solution is to include the headers in some blob (say, base64 encoded JSON) and include that with the request, an approach taken by JWT and other such formats. However, that design violates the HTTP header boundaries, repeats information, and introduces other security issues because firewalls will not be aware of these "hidden" headers. In addition, any information repeated must be compared to the duplicated information in the header and therefore only moves the problem elsewhere.

### Why not just use HTTP Digest?

Digest requires pre-negotiation to establish a nonce. This means you can't just make a request - you must first send a protocol handshake to the server.
This pattern has become unacceptable for most web services, especially on mobile where extra round trips are costly.

### Why bother with all this nonce and timestamp business?

**Hawk** is an attempt to find a reasonable, practical compromise between security and usability. OAuth 1.0 got timestamp and nonces halfway right but failed when it came to scalability and consistent developer experience. **Hawk** addresses it by requiring the client to sync its clock, but provides it with tools to accomplish it.

In general, replay protection is a matter of application-specific threat model. It is less of an issue on a TLS-protected system where the clients are implemented using best practices and are under the control of the server. Instead of dropping replay protection, **Hawk** offers a required time window and an optional nonce verification. Together, it provides developers with the ability to decide how to enforce their security policy without impacting the client's implementation.

### What are `app` and `dlg` in the authorization header and normalized MAC string?

The original motivation for **Hawk** was to replace the OAuth 1.0 use cases. This included both a simple client-server mode which this module is specifically designed for, and a delegated access mode which is being developed separately in [Oz](https://github.com/hueniverse/oz). In addition to the **Hawk** use cases, Oz requires another attribute: the application id `app`. This provides binding between the credentials and the application in a way that prevents an attacker from tricking an application into using credentials issued to someone else. It also has an optional 'delegated-by' attribute `dlg` which is the application id of the application the credentials were directly issued to. The goal of these two additions is to allow Oz to utilize **Hawk** directly, but with the additional security of delegated credentials.

### What is the purpose of the static strings used in each normalized MAC input?

When calculating a hash or MAC, a static prefix (tag) is added. The prefix is used to prevent MAC values from being used or reused for a purpose other than what they were created for (i.e. prevents switching MAC values between the request, response, and bewit use cases). It also protects against exploits created after a potential change in how the protocol creates the normalized string. For example, if a future version switched the order of nonce and timestamp, it could create an exploit opportunity for cases where the nonce is similar in format to a timestamp.

### Does **Hawk** have anything to do with OAuth?

Short answer: no.

**Hawk** was originally proposed as the OAuth MAC Token specification. However, the OAuth working group in its consistent incompetence failed to produce a final, usable solution to address one of the most popular use cases of OAuth 1.0 - using it to authenticate simple client-server transactions (i.e. two-legged). As you can guess, the OAuth working group is still hard at work to produce more garbage.

**Hawk** provides a simple HTTP authentication scheme for making client-server requests. It does not address the OAuth use case of delegating access to a third party. If you are looking for an OAuth alternative, check out [Oz](https://github.com/hueniverse/oz).
# Acknowledgements

**Hawk** is a derivative work of the [HTTP MAC Authentication Scheme](http://tools.ietf.org/html/draft-hammer-oauth-v2-mac-token-05) proposal co-authored by Ben Adida, Adam Barth, and Eran Hammer, which in turn was based on the OAuth 1.0 community specification.

Special thanks to Ben Laurie for his always insightful feedback and advice.

The **Hawk** logo was created by [Chris Carrasco](http://chriscarrasco.com).

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.npmignore

.idea
*.iml
npm-debug.log
dump.rdb
node_modules
results.tap
results.xml
npm-shrinkwrap.json
config.json
.DS_Store
*/.DS_Store
*/*/.DS_Store
._*
*/._*
*/*/._*
coverage.*
lib-cov

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/.travis.yml
language: node_js
node_js:
  - 0.10

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/index.js
module.exports = require('./lib');

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/lib/
lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/LICENSE
Copyright (c) 2012-2013, Eran Hammer. All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
    * Neither the name of Eran Hammer nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
����������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/Makefile��������������000755 �000766 �000024 �00000000422 12455173731 035501� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������test: @./node_modules/.bin/lab test-cov: @./node_modules/.bin/lab -r threshold -t 100 test-cov-html: @./node_modules/.bin/lab -r html -o coverage.html complexity: @./node_modules/.bin/cr -o complexity.md -f markdown lib .PHONY: test test-cov test-cov-html complexity ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/package.json����������000755 �000766 �000024 �00000002632 12455173731 036334� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "sntp", "description": "SNTP Client", "version": "0.2.4", "author": { "name": "Eran Hammer", "email": "eran@hueniverse.com", "url": "http://hueniverse.com" }, "contributors": [], "repository": { "type": "git", "url": "git://github.com/hueniverse/sntp" }, "main": "index", "keywords": [ "sntp", "ntp", "time" ], "engines": { "node": ">=0.8.0" }, "dependencies": { "hoek": "0.9.x" }, "devDependencies": { "lab": "0.1.x", "complexity-report": "0.x.x" }, "scripts": { "test": "make test-cov" }, "licenses": [ { "type": "BSD", "url": "http://github.com/hueniverse/sntp/raw/master/LICENSE" } ], "_id": "sntp@0.2.4", "dist": { "shasum": "fb885f18b0f3aad189f824862536bceeec750900", "tarball": "http://registry.npmjs.org/sntp/-/sntp-0.2.4.tgz" }, "_from": "sntp@>=0.2.0 <0.3.0", "_npmVersion": "1.2.18", "_npmUser": { "name": "hueniverse", "email": "eran@hueniverse.com" }, "maintainers": [ { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "directories": {}, "_shasum": "fb885f18b0f3aad189f824862536bceeec750900", "_resolved": "https://registry.npmjs.org/sntp/-/sntp-0.2.4.tgz", "bugs": { "url": "https://github.com/hueniverse/sntp/issues" }, "readme": "ERROR: No README data found!", "homepage": "https://github.com/hueniverse/sntp" } ������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/README.md�������������000755 �000766 �000024 �00000003552 12455173731 035327� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# sntp An SNTP v4 client (RFC4330) for node. 
Simply connects to the NTP or SNTP server requested and returns the server time along with the roundtrip duration and clock offset. To adjust the local time to the NTP time, add the returned `t` offset to the local time.

[![Build Status](https://secure.travis-ci.org/hueniverse/sntp.png)](http://travis-ci.org/hueniverse/sntp)

# Usage

```javascript
var Sntp = require('sntp');

// All options are optional
var options = {
    host: 'nist1-sj.ustiming.org',  // Defaults to pool.ntp.org
    port: 123,                      // Defaults to 123 (NTP)
    resolveReference: true,         // Defaults to false (not resolving)
    timeout: 1000                   // Defaults to zero (no timeout)
};

// Request server time
Sntp.time(options, function (err, time) {

    if (err) {
        console.log('Failed: ' + err.message);
        process.exit(1);
    }

    console.log('Local clock is off by: ' + time.t + ' milliseconds');
    process.exit(0);
});
```

If an application needs to maintain continuous time synchronization, the module provides a stateful method for querying the current offset only when the last one is too old (defaults to daily).

```javascript
// Request offset once
Sntp.offset(function (err, offset) {

    console.log(offset);                    // New (served fresh)

    // Request offset again
    Sntp.offset(function (err, offset) {

        console.log(offset);                // Identical (served from cache)
    });
});
```

To set a background offset refresh, start the interval and use the provided now() method. If for any reason the client fails to obtain an up-to-date offset, the current system clock is used.

```javascript
var before = Sntp.now();                    // System time without offset

Sntp.start(function () {

    var now = Sntp.now();                   // With offset
    Sntp.stop();
});
```

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/lib/index.js
// Load modules

var Dgram = require('dgram');
var Dns = require('dns');
var Hoek = require('hoek');


// Declare internals

var internals = {};


exports.time = function (options, callback) {

    if (arguments.length !== 2) {
        callback = arguments[0];
        options = {};
    }

    var settings = Hoek.clone(options);
    settings.host = settings.host || 'pool.ntp.org';
    settings.port = settings.port || 123;
    settings.resolveReference = settings.resolveReference || false;

    // Declare variables used by callback

    var timeoutId = 0;
    var sent = 0;

    // Ensure callback is only called once

    var isFinished = false;
    var finish = function (err, result) {

        if (timeoutId) {
            clearTimeout(timeoutId);
            timeoutId = 0;
        }

        if (!isFinished) {
            isFinished = true;
            socket.removeAllListeners();
            socket.close();
            return callback(err, result);
        }
    };

    // Create UDP socket

    var socket = Dgram.createSocket('udp4');

    socket.once('error', function (err) {

        return finish(err);
    });

    // Listen to incoming messages

    socket.on('message', function (buffer, rinfo) {

        var received = Date.now();

        var message = new internals.NtpMessage(buffer);
        if (!message.isValid) {
            return finish(new Error('Invalid server response'), message);
        }

        if (message.originateTimestamp !== sent) {
            return finish(new Error('Wrong originate timestamp'),
message); } // Timestamp Name ID When Generated // ------------------------------------------------------------ // Originate Timestamp T1 time request sent by client // Receive Timestamp T2 time request received by server // Transmit Timestamp T3 time reply sent by server // Destination Timestamp T4 time reply received by client // // The roundtrip delay d and system clock offset t are defined as: // // d = (T4 - T1) - (T3 - T2) t = ((T2 - T1) + (T3 - T4)) / 2 var T1 = message.originateTimestamp; var T2 = message.receiveTimestamp; var T3 = message.transmitTimestamp; var T4 = received; message.d = (T4 - T1) - (T3 - T2); message.t = ((T2 - T1) + (T3 - T4)) / 2; message.receivedLocally = received; if (!settings.resolveReference || message.stratum !== 'secondary') { return finish(null, message); } // Resolve reference IP address Dns.reverse(message.referenceId, function (err, domains) { if (!err) { message.referenceHost = domains[0]; } return finish(null, message); }); }); // Set timeout if (settings.timeout) { timeoutId = setTimeout(function () { timeoutId = 0; return finish(new Error('Timeout')); }, settings.timeout); } // Construct NTP message var message = new Buffer(48); for (var i = 0; i < 48; i++) { // Zero message message[i] = 0; } message[0] = (0 << 6) + (4 << 3) + (3 << 0) // Set version number to 4 and Mode to 3 (client) sent = Date.now(); internals.fromMsecs(sent, message, 40); // Set transmit timestamp (returns as originate) // Send NTP request socket.send(message, 0, message.length, settings.port, settings.host, function (err, bytes) { if (err || bytes !== 48) { return finish(err || new Error('Could not send entire message')); } }); }; internals.NtpMessage = function (buffer) { this.isValid = false; // Validate if (buffer.length !== 48) { return; } // Leap indicator var li = (buffer[0] >> 6); switch (li) { case 0: this.leapIndicator = 'no-warning'; break; case 1: this.leapIndicator = 'last-minute-61'; break; case 2: this.leapIndicator = 'last-minute-59'; break; case 3: this.leapIndicator = 'alarm'; break; } // Version var vn = ((buffer[0] & 0x38) >> 3); this.version = vn; // Mode var mode = (buffer[0] & 0x7); switch (mode) { case 1: this.mode = 'symmetric-active'; break; case 2: this.mode = 'symmetric-passive'; break; case 3: this.mode = 'client'; break; case 4: this.mode = 'server'; break; case 5: this.mode = 'broadcast'; break; case 0: case 6: case 7: this.mode = 'reserved'; break; } // Stratum var stratum = buffer[1]; if (stratum === 0) { this.stratum = 'death'; } else if (stratum === 1) { this.stratum = 'primary'; } else if (stratum <= 15) { this.stratum = 'secondary'; } else { this.stratum = 'reserved'; } // Poll interval (msec) this.pollInterval = Math.round(Math.pow(2, buffer[2])) * 1000; // Precision (msecs) this.precision = Math.pow(2, buffer[3]) * 1000; // Root delay (msecs) var rootDelay = 256 * (256 * (256 * buffer[4] + buffer[5]) + buffer[6]) + buffer[7]; this.rootDelay = 1000 * (rootDelay / 0x10000); // Root dispersion (msecs) this.rootDispersion = ((buffer[8] << 8) + buffer[9] + ((buffer[10] << 8) + buffer[11]) / Math.pow(2, 16)) * 1000; // Reference identifier this.referenceId = ''; switch (this.stratum) { case 'death': case 'primary': this.referenceId = String.fromCharCode(buffer[12]) + String.fromCharCode(buffer[13]) + String.fromCharCode(buffer[14]) + String.fromCharCode(buffer[15]); break; case 'secondary': this.referenceId = '' + buffer[12] + '.' + buffer[13] + '.' + buffer[14] + '.' 
+ buffer[15]; break; } // Reference timestamp this.referenceTimestamp = internals.toMsecs(buffer, 16); // Originate timestamp this.originateTimestamp = internals.toMsecs(buffer, 24); // Receive timestamp this.receiveTimestamp = internals.toMsecs(buffer, 32); // Transmit timestamp this.transmitTimestamp = internals.toMsecs(buffer, 40); // Validate if (this.version === 4 && this.stratum !== 'reserved' && this.mode === 'server' && this.originateTimestamp && this.receiveTimestamp && this.transmitTimestamp) { this.isValid = true; } return this; }; internals.toMsecs = function (buffer, offset) { var seconds = 0; var fraction = 0; for (var i = 0; i < 4; ++i) { seconds = (seconds * 256) + buffer[offset + i]; } for (i = 4; i < 8; ++i) { fraction = (fraction * 256) + buffer[offset + i]; } return ((seconds - 2208988800 + (fraction / Math.pow(2, 32))) * 1000); }; internals.fromMsecs = function (ts, buffer, offset) { var seconds = Math.floor(ts / 1000) + 2208988800; var fraction = Math.round((ts % 1000) / 1000 * Math.pow(2, 32)); buffer[offset + 0] = (seconds & 0xFF000000) >> 24; buffer[offset + 1] = (seconds & 0x00FF0000) >> 16; buffer[offset + 2] = (seconds & 0x0000FF00) >> 8; buffer[offset + 3] = (seconds & 0x000000FF); buffer[offset + 4] = (fraction & 0xFF000000) >> 24; buffer[offset + 5] = (fraction & 0x00FF0000) >> 16; buffer[offset + 6] = (fraction & 0x0000FF00) >> 8; buffer[offset + 7] = (fraction & 0x000000FF); }; // Offset singleton internals.last = { offset: 0, expires: 0, host: '', port: 0 }; exports.offset = function (options, callback) { if (arguments.length !== 2) { callback = arguments[0]; options = {}; } var now = Date.now(); var clockSyncRefresh = options.clockSyncRefresh || 24 * 60 * 60 * 1000; // Daily if (internals.last.offset && internals.last.host === options.host && internals.last.port === options.port && now < internals.last.expires) { process.nextTick(function () { callback(null, internals.last.offset); }); return; } exports.time(options, function (err, time) { if (err) { return callback(err, 0); } internals.last = { offset: Math.round(time.t), expires: now + clockSyncRefresh, host: options.host, port: options.port }; return callback(null, internals.last.offset); }); }; // Now singleton internals.now = { intervalId: 0 }; exports.start = function (options, callback) { if (arguments.length !== 2) { callback = arguments[0]; options = {}; } if (internals.now.intervalId) { process.nextTick(function () { callback(); }); return; } exports.offset(options, function (err, offset) { internals.now.intervalId = setInterval(function () { exports.offset(options, function () { }); }, options.clockSyncRefresh || 24 * 60 * 60 * 1000); // Daily return callback(); }); }; exports.stop = function () { if (!internals.now.intervalId) { return; } clearInterval(internals.now.intervalId); internals.now.intervalId = 0; }; exports.isLive = function () { return !!internals.now.intervalId; }; exports.now = function () { var now = Date.now(); if (!exports.isLive() || now >= internals.last.expires) { return now; } return now + internals.last.offset; }; 
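As a quick sanity check of the roundtrip delay and clock offset formulas in the code comments above, here is a tiny worked example with made-up timestamps:

```javascript
// Hypothetical millisecond timestamps for a single exchange
var T1 = 1000;   // client sent the request
var T2 = 1600;   // server received the request
var T3 = 1610;   // server sent the reply
var T4 = 1030;   // client received the reply

var d = (T4 - T1) - (T3 - T2);        // roundtrip delay: 30 - 10 = 20 ms
var t = ((T2 - T1) + (T3 - T4)) / 2;  // clock offset: (600 + 580) / 2 = 590 ms

console.log('delay:', d, 'offset:', t);   // the local clock is ~590 ms behind the server
```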
lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/offset.js
var Sntp = require('../lib');

// Request offset once
Sntp.offset(function (err, offset) {

    console.log(offset);                    // New (served fresh)

    // Request offset again
    Sntp.offset(function (err, offset) {

        console.log(offset);                // Identical (served from cache)
    });
});

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/sntp/examples/time.js
var Sntp = require('../lib');

// All options are optional
var options = {
    host: 'nist1-sj.ustiming.org',  // Defaults to pool.ntp.org
    port: 123,                      // Defaults to 123 (NTP)
    resolveReference: true,         // Defaults to false (not resolving)
    timeout: 1000                   // Defaults to zero (no timeout)
};

// Request server time
Sntp.time(options, function (err, time) {

    if (err) {
        console.log('Failed: ' + err.message);
        process.exit(1);
    }

    console.log(time);
    console.log('Local clock is off by: ' + time.t + ' milliseconds');
    process.exit(0);
});

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.npmignore
.idea
*.iml
npm-debug.log
dump.rdb
node_modules
results.tap
results.xml
npm-shrinkwrap.json
config.json
.DS_Store
*/.DS_Store
*/*/.DS_Store
._*
*/._*
*/*/._*
coverage.*
lib-cov
complexity.md

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/.travis.yml
language: node_js
node_js:
  - 0.10

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/images/

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/index.js
module.exports = require('./lib');

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/LICENSE
Copyright (c) 2011-2013, Walmart. All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
    * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
    * Neither the name of Walmart nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL WALMART BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

                                  *   *   *

Portions of this project were initially based on Postmile, Copyright (c) 2011, Yahoo Inc. Postmile is published at https://github.com/yahoo/postmile and its licensing terms are published at https://github.com/yahoo/postmile/blob/master/LICENSE.
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/Makefile��������������000755 �000766 �000024 �00000000505 12455173731 035445� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������test: @node node_modules/lab/bin/lab test-cov: @node node_modules/lab/bin/lab -r threshold -t 100 test-cov-html: @node node_modules/lab/bin/lab -r html -o coverage.html complexity: @node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib .PHONY: test test-cov test-cov-html complexity �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/package.json����������000755 �000766 �000024 �00000003035 12455173731 036274� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "hoek", "description": "General purpose node utilities", "version": "0.9.1", "author": { "name": "Eran Hammer", "email": "eran@hueniverse.com", "url": "http://hueniverse.com" }, "contributors": [ { "name": "Van Nguyen", "email": "the.gol.effect@gmail.com" } ], "repository": { "type": "git", "url": "git://github.com/spumko/hoek" }, "main": "index", "keywords": [ "utilities" ], "engines": { "node": ">=0.8.0" }, "dependencies": {}, "devDependencies": { "lab": "0.1.x", "complexity-report": "0.x.x" }, "scripts": { "test": "make test-cov" }, "licenses": [ { "type": "BSD", "url": "http://github.com/spumko/hoek/raw/master/LICENSE" } ], "_id": "hoek@0.9.1", "dist": { "shasum": "3d322462badf07716ea7eb85baf88079cddce505", "tarball": "http://registry.npmjs.org/hoek/-/hoek-0.9.1.tgz" }, "_from": "hoek@>=0.9.0 <0.10.0", "_npmVersion": "1.2.18", "_npmUser": { "name": "hueniverse", "email": "eran@hueniverse.com" }, "maintainers": [ { "name": "hueniverse", "email": "eran@hueniverse.com" }, { "name": "thegoleffect", "email": "thegoleffect@gmail.com" } ], "directories": {}, "_shasum": "3d322462badf07716ea7eb85baf88079cddce505", "_resolved": "https://registry.npmjs.org/hoek/-/hoek-0.9.1.tgz", "bugs": { "url": "https://github.com/spumko/hoek/issues" }, "readme": "ERROR: No README data found!", "homepage": "https://github.com/spumko/hoek" } 
lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/README.md
<a href="https://github.com/spumko"><img src="https://raw.github.com/spumko/spumko/master/images/from.png" align="right" /></a>
![hoek Logo](https://raw.github.com/spumko/hoek/master/images/hoek.png)

General purpose node utilities

[![Build Status](https://secure.travis-ci.org/spumko/hoek.png)](http://travis-ci.org/spumko/hoek)

# Table of Contents

* [Introduction](#introduction "Introduction")
* [Object](#object "Object")
  * [clone](#cloneobj "clone")
  * [merge](#mergetarget-source-isnulloverride-ismergearrays "merge")
  * [applyToDefaults](#applytodefaultsdefaults-options "applyToDefaults")
  * [unique](#uniquearray-key "unique")
  * [mapToObject](#maptoobjectarray-key "mapToObject")
  * [intersect](#intersectarray1-array2 "intersect")
  * [matchKeys](#matchkeysobj-keys "matchKeys")
  * [flatten](#flattenarray-target "flatten")
  * [removeKeys](#removekeysobject-keys "removeKeys")
  * [reach](#reachobj-chain "reach")
  * [inheritAsync](#inheritasyncself-obj-keys "inheritAsync")
  * [rename](#renameobj-from-to "rename")
* [Timer](#timer "Timer")
* [Binary Encoding/Decoding](#binary "Binary Encoding/Decoding")
  * [base64urlEncode](#binary64urlEncodevalue "binary64urlEncode")
  * [base64urlDecode](#binary64urlDecodevalue "binary64urlDecode")
* [Escaping Characters](#escaped "Escaping Characters")
  * [escapeHtml](#escapeHtmlstring "escapeHtml")
  * [escapeHeaderAttribute](#escapeHeaderAttributeattribute "escapeHeaderAttribute")
  * [escapeRegex](#escapeRegexstring "escapeRegex")
* [Errors](#errors "Errors")
  * [assert](#assertmessage "assert")
  * [abort](#abortmessage "abort")
  * [displayStack](#displayStackslice "displayStack")
  * [callStack](#callStackslice "callStack")
  * [toss](#tosscondition "toss")
* [Load files](#load-files "Load Files")
  * [loadPackage](#loadPackagedir "loadpackage")
  * [loadDirModules](#loadDirModulespath-excludefiles-target "loaddirmodules")

# Introduction

The *Hoek* general purpose node utilities library helps with a variety of common tasks. It comes with useful methods for Arrays (clone, merge, applyToDefaults), Objects (removeKeys, copy), Asserting and more.

For example, to use Hoek to set configuration with default options:

```javascript
var Hoek = require('hoek');

var defaults = {url : "www.github.com", port : "8000", debug : true};

var config = Hoek.applyToDefaults(defaults, {port : "3000", admin : true});

// In this case, config would be { url: 'www.github.com', port: '3000', debug: true, admin: true }
```

Under each of the sections (such as Array), there are subsections which correspond to Hoek methods.
Each subsection will explain how to use the corresponding method. In each js excerpt below, the `var Hoek = require('hoek')` is omitted for brevity.

## Object

Hoek provides several helpful methods for objects and arrays.

### clone(obj)

This method is used to clone an object or an array. A *deep copy* is made (duplicates everything, including values that are objects).

```javascript
var nestedObj = {
    w: /^something$/ig,
    x: {
        a: [1, 2, 3],
        b: 123456,
        c: new Date()
    },
    y: 'y',
    z: new Date()
};

var copy = Hoek.clone(nestedObj);

copy.x.b = 100;

console.log(copy.y)            // results in 'y'
console.log(nestedObj.x.b)     // results in 123456
console.log(copy.x.b)          // results in 100
```

### merge(target, source, isNullOverride, isMergeArrays)
isNullOverride, isMergeArrays default to true

Merge all the properties of source into target; source wins in conflict, and by default null and undefined from source are applied.

```javascript
var target = {a: 1, b : 2}
var source = {a: 0, c: 5}
var source2 = {a: null, c: 5}

var targetArray = [1, 2, 3];
var sourceArray = [4, 5];

var newTarget = Hoek.merge(target, source);          // results in {a: 0, b: 2, c: 5}
newTarget = Hoek.merge(target, source2);             // results in {a: null, b: 2, c: 5}
newTarget = Hoek.merge(target, source2, false);      // results in {a: 1, b: 2, c: 5}

newTarget = Hoek.merge(targetArray, sourceArray)               // results in [1, 2, 3, 4, 5]
newTarget = Hoek.merge(targetArray, sourceArray, true, false)  // results in [4, 5]
```

### applyToDefaults(defaults, options)

Apply options to a copy of the defaults

```javascript
var defaults = {host: "localhost", port: 8000};
var options = {port: 8080};

var config = Hoek.applyToDefaults(defaults, options);   // results in {host: "localhost", port: 8080}
```

### unique(array, key)

Remove duplicate items from Array

```javascript
var array = [1, 2, 2, 3, 3, 4, 5, 6];

var newArray = Hoek.unique(array);     // results in [1,2,3,4,5,6]

array = [{id: 1}, {id: 1}, {id: 2}];

newArray = Hoek.unique(array, "id")    // results in [{id: 1}, {id: 2}]
```

### mapToObject(array, key)

Convert an Array into an Object

```javascript
var array = [1,2,3];
var newObject = Hoek.mapToObject(array);     // results in {"1": true, "2": true, "3": true}

array = [{id: 1}, {id: 2}];
newObject = Hoek.mapToObject(array, "id")    // results in {"1": true, "2": true}
```

### intersect(array1, array2)

Find the common unique items in two arrays

```javascript
var array1 = [1, 2, 3];
var array2 = [1, 4, 5];

var newArray = Hoek.intersect(array1, array2)    // results in [1]
```

### matchKeys(obj, keys)

Find which keys are present

```javascript
var obj = {a: 1, b: 2, c: 3};
var keys = ["a", "e"];

Hoek.matchKeys(obj, keys)   // returns ["a"]
```

### flatten(array, target)

Flatten an array

```javascript
var array = [1, 2, 3];
var target = [4, 5];

var flattenedArray = Hoek.flatten(array, target)   // results in [4, 5, 1, 2, 3]
```

### removeKeys(object, keys)

Remove keys

```javascript
var object = {a: 1, b: 2, c: 3, d: 4};
var keys = ["a", "b"];

Hoek.removeKeys(object, keys)   // object is now {c: 3, d: 4}
```

### reach(obj, chain)

Converts an object key chain string to reference

```javascript
var chain = 'a.b.c';
var obj = {a : {b : { c : 1}}};

Hoek.reach(obj, chain)   // returns 1
```

### inheritAsync(self, obj, keys)

Inherits a selected set of methods from an object, wrapping functions in asynchronous syntax and catching errors

```javascript
var targetFunc = function () { };

var proto = {
    a: function () { return 'a!'; },
    b: function () { return 'b!'; },
    c: function () { throw new Error('c!'); }
};
var keys = ['a', 'c'];

Hoek.inheritAsync(targetFunc, proto, keys);

var target = new targetFunc();

target.a(function (err, result) { console.log(result); });   // returns 'a!'
target.c(function (err, result) { console.log(result); });   // returns undefined
target.b(function (err, result) { console.log(result); });   // gives error: Object [object Object] has no method 'b'
```

### rename(obj, from, to)

Rename a key of an object

```javascript
var obj = {a : 1, b : 2};

Hoek.rename(obj, "a", "c");   // obj is now {c : 1, b : 2}
```

# Timer

A Timer object. Initializing a new timer object sets the ts to the number of milliseconds elapsed since 1 January 1970 00:00:00 UTC.

```javascript
var timerObj = new Hoek.Timer();
console.log("Time is now: " + timerObj.ts);
console.log("Elapsed time from initialization: " + timerObj.elapsed() + ' milliseconds');
```

# Binary Encoding/Decoding

### base64urlEncode(value)

Encodes value into base64url (RFC 4648) encoding.

### base64urlDecode(value)

Decodes data from base64url (RFC 4648) encoding.

# Escaping Characters

Hoek provides convenient methods for escaping html characters. The escaped characters are as follows:

```javascript
internals.htmlEscaped = {
    '&': '&amp;',
    '<': '&lt;',
    '>': '&gt;',
    '"': '&quot;',
    "'": '&#x27;',
    '`': '&#x60;'
};
```

### escapeHtml(string)

```javascript
var string = '<html> hey </html>';
var escapedString = Hoek.escapeHtml(string);   // returns &lt;html&gt; hey &lt;/html&gt;
```

### escapeHeaderAttribute(attribute)

Escape attribute value for use in HTTP header

```javascript
var a = Hoek.escapeHeaderAttribute('I said "go w\\o me"');   // returns I said \"go w\\o me\"
```

### escapeRegex(string)

Escape string for Regex construction

```javascript
var a = Hoek.escapeRegex('4^f$s.4*5+-_?%=#!:@|~\\/`"(>)[<]d{}s,');   // returns 4\^f\$s\.4\*5\+\-_\?%\=#\!\:@\|~\\\/`"\(>\)\[<\]d\{\}s\,
```

# Errors

### assert(message)

```javascript
var a = 1, b = 2;

Hoek.assert(a === b, 'a should equal b');   // throws Error: a should equal b
```

### abort(message)

First checks if process.env.NODE_ENV === 'test', and if so, throws error message. Otherwise, displays most recent stack and then exits process.

### displayStack(slice)

Displays the trace stack

```javascript
var stack = Hoek.displayStack();
console.log(stack);   // returns something like:

[ 'null (/Users/user/Desktop/hoek/test.js:4:18)',
  'Module._compile (module.js:449:26)',
  'Module._extensions..js (module.js:467:10)',
  'Module.load (module.js:356:32)',
  'Module._load (module.js:312:12)',
  'Module.runMain (module.js:492:10)',
  'startup.processNextTick.process._tickCallback (node.js:244:9)' ]
```

### callStack(slice)

Returns a trace stack array.

```javascript
var stack = Hoek.callStack();
console.log(stack);   // returns something like:

[ [ '/Users/user/Desktop/hoek/test.js', 4, 18, null, false ],
  [ 'module.js', 449, 26, 'Module._compile', false ],
  [ 'module.js', 467, 10, 'Module._extensions..js', false ],
  [ 'module.js', 356, 32, 'Module.load', false ],
  [ 'module.js', 312, 12, 'Module._load', false ],
  [ 'module.js', 492, 10, 'Module.runMain', false ],
  [ 'node.js', 244, 9, 'startup.processNextTick.process._tickCallback', false ] ]
```

### toss(condition)

toss(condition /*, [message], callback */)

Return an error as first argument of a callback

# Load Files

### loadPackage(dir)

Load and parse the package.json at the process root or a given directory

```javascript
var pack = Hoek.loadPackage();   // pack.name === 'hoek'
```

### loadDirModules(path, excludeFiles, target)

Loads modules from a given path; option to exclude files (array).
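The README stops short of an example for `loadDirModules`; based on the implementation in `lib/index.js` further below, a minimal sketch (the `handlers` directory and module names are hypothetical):

```javascript
var Hoek = require('hoek');

// Hypothetical layout: ./handlers/user.js and ./handlers/admin.js
var target = {};
Hoek.loadDirModules(__dirname + '/handlers', ['index'], target);
// target.User and target.Admin now hold the required modules

// Alternatively, pass a function to be called with each module's path and names
Hoek.loadDirModules(__dirname + '/handlers', [], function (filename, name, capName) {

    console.log('found ' + name + ' (' + capName + ') at ' + filename);
});
```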
lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/escape.js
// Declare internals

var internals = {};


exports.escapeJavaScript = function (input) {

    if (!input) {
        return '';
    }

    var escaped = '';

    for (var i = 0, il = input.length; i < il; ++i) {

        var charCode = input.charCodeAt(i);

        if (internals.isSafe(charCode)) {
            escaped += input[i];
        }
        else {
            escaped += internals.escapeJavaScriptChar(charCode);
        }
    }

    return escaped;
};


exports.escapeHtml = function (input) {

    if (!input) {
        return '';
    }

    var escaped = '';

    for (var i = 0, il = input.length; i < il; ++i) {

        var charCode = input.charCodeAt(i);

        if (internals.isSafe(charCode)) {
            escaped += input[i];
        }
        else {
            escaped += internals.escapeHtmlChar(charCode);
        }
    }

    return escaped;
};


internals.escapeJavaScriptChar = function (charCode) {

    if (charCode >= 256) {
        return '\\u' + internals.padLeft('' + charCode, 4);
    }

    var hexValue = new Buffer(String.fromCharCode(charCode), 'ascii').toString('hex');
    return '\\x' + internals.padLeft(hexValue, 2);
};


internals.escapeHtmlChar = function (charCode) {

    var namedEscape = internals.namedHtml[charCode];
    if (typeof namedEscape !== 'undefined') {
        return namedEscape;
    }

    if (charCode >= 256) {
        return '&#' + charCode + ';';
    }

    var hexValue = new Buffer(String.fromCharCode(charCode), 'ascii').toString('hex');
    return '&#x' + internals.padLeft(hexValue, 2) + ';';
};


internals.padLeft = function (str, len) {

    while (str.length < len) {
        str = '0' + str;
    }

    return str;
};


internals.isSafe = function (charCode) {

    return (typeof internals.safeCharCodes[charCode] !== 'undefined');
};


internals.namedHtml = {
    '38': '&amp;',
    '60': '&lt;',
    '62': '&gt;',
    '34': '&quot;',
    '160': '&nbsp;',
    '162': '&cent;',
    '163': '&pound;',
    '164': '&curren;',
    '169': '&copy;',
    '174': '&reg;'
};


internals.safeCharCodes = (function () {

    var safe = {};

    for (var i = 32; i < 123; ++i) {

        if ((i >= 97 && i <= 122) ||    // a-z
            (i >= 65 && i <= 90) ||     // A-Z
            (i >= 48 && i <= 57) ||     // 0-9
            i === 32 ||                 // space
            i === 46 ||                 // .
i === 44 || // , i === 45 || // - i === 58 || // : i === 95) { // _ safe[i] = null; } } return safe; }());�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/lib/index.js����������000755 �000766 �000024 �00000033623 12455173731 036227� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Fs = require('fs'); var Escape = require('./escape'); // Declare internals var internals = {}; // Clone object or array exports.clone = function (obj, seen) { if (typeof obj !== 'object' || obj === null) { return obj; } seen = seen || { orig: [], copy: [] }; var lookup = seen.orig.indexOf(obj); if (lookup !== -1) { return seen.copy[lookup]; } var newObj = (obj instanceof Array) ? [] : {}; seen.orig.push(obj); seen.copy.push(newObj); for (var i in obj) { if (obj.hasOwnProperty(i)) { if (obj[i] instanceof Buffer) { newObj[i] = new Buffer(obj[i]); } else if (obj[i] instanceof Date) { newObj[i] = new Date(obj[i].getTime()); } else if (obj[i] instanceof RegExp) { var flags = '' + (obj[i].global ? 'g' : '') + (obj[i].ignoreCase ? 'i' : '') + (obj[i].multiline ? 'm' : ''); newObj[i] = new RegExp(obj[i].source, flags); } else { newObj[i] = exports.clone(obj[i], seen); } } } return newObj; }; // Merge all the properties of source into target, source wins in conflic, and by default null and undefined from source are applied exports.merge = function (target, source, isNullOverride /* = true */, isMergeArrays /* = true */) { exports.assert(target && typeof target == 'object', 'Invalid target value: must be an object'); exports.assert(source === null || source === undefined || typeof source === 'object', 'Invalid source value: must be null, undefined, or an object'); if (!source) { return target; } if (source instanceof Array) { exports.assert(target instanceof Array, 'Cannot merge array onto an object'); if (isMergeArrays === false) { // isMergeArrays defaults to true target.length = 0; // Must not change target assignment } for (var i = 0, il = source.length; i < il; ++i) { target.push(source[i]); } return target; } var keys = Object.keys(source); for (var k = 0, kl = keys.length; k < kl; ++k) { var key = keys[k]; var value = source[key]; if (value && typeof value === 'object') { if (!target[key] || typeof target[key] !== 'object') { target[key] = exports.clone(value); } else { exports.merge(target[key], source[key], isNullOverride, isMergeArrays); } } else { if (value !== null && value !== undefined) { // Explicit to preserve empty strings target[key] = value; } else if (isNullOverride !== false) { // Defaults to true target[key] = value; } } } return target; }; // Apply options to a copy of the defaults exports.applyToDefaults = function (defaults, options) { exports.assert(defaults && typeof defaults == 'object', 'Invalid defaults value: must be an object'); exports.assert(!options || options === true || typeof options === 'object', 'Invalid options value: must be true, falsy or an object'); if (!options) { // 
If no options, return null return null; } var copy = exports.clone(defaults); if (options === true) { // If options is set to true, use defaults return copy; } return exports.merge(copy, options, false, false); }; // Remove duplicate items from array exports.unique = function (array, key) { var index = {}; var result = []; for (var i = 0, il = array.length; i < il; ++i) { var id = (key ? array[i][key] : array[i]); if (index[id] !== true) { result.push(array[i]); index[id] = true; } } return result; }; // Convert array into object exports.mapToObject = function (array, key) { if (!array) { return null; } var obj = {}; for (var i = 0, il = array.length; i < il; ++i) { if (key) { if (array[i][key]) { obj[array[i][key]] = true; } } else { obj[array[i]] = true; } } return obj; }; // Find the common unique items in two arrays exports.intersect = function (array1, array2, justFirst) { if (!array1 || !array2) { return []; } var common = []; var hash = (array1 instanceof Array ? exports.mapToObject(array1) : array1); var found = {}; for (var i = 0, il = array2.length; i < il; ++i) { if (hash[array2[i]] && !found[array2[i]]) { if (justFirst) { return array2[i]; } common.push(array2[i]); found[array2[i]] = true; } } return (justFirst ? null : common); }; // Find which keys are present exports.matchKeys = function (obj, keys) { var matched = []; for (var i = 0, il = keys.length; i < il; ++i) { if (obj.hasOwnProperty(keys[i])) { matched.push(keys[i]); } } return matched; }; // Flatten array exports.flatten = function (array, target) { var result = target || []; for (var i = 0, il = array.length; i < il; ++i) { if (Array.isArray(array[i])) { exports.flatten(array[i], result); } else { result.push(array[i]); } } return result; }; // Remove keys exports.removeKeys = function (object, keys) { for (var i = 0, il = keys.length; i < il; i++) { delete object[keys[i]]; } }; // Convert an object key chain string ('a.b.c') to reference (object[a][b][c]) exports.reach = function (obj, chain) { var path = chain.split('.'); var ref = obj; for (var i = 0, il = path.length; i < il; ++i) { if (ref) { ref = ref[path[i]]; } } return ref; }; // Inherits a selected set of methods from an object, wrapping functions in asynchronous syntax and catching errors exports.inheritAsync = function (self, obj, keys) { keys = keys || null; for (var i in obj) { if (obj.hasOwnProperty(i)) { if (keys instanceof Array && keys.indexOf(i) < 0) { continue; } self.prototype[i] = (function (fn) { return function (next) { var result = null; try { result = fn(); } catch (err) { return next(err); } return next(null, result); }; })(obj[i]); } } }; exports.formatStack = function (stack) { var trace = []; for (var i = 0, il = stack.length; i < il; ++i) { var item = stack[i]; trace.push([item.getFileName(), item.getLineNumber(), item.getColumnNumber(), item.getFunctionName(), item.isConstructor()]); } return trace; }; exports.formatTrace = function (trace) { var display = []; for (var i = 0, il = trace.length; i < il; ++i) { var row = trace[i]; display.push((row[4] ? 
'new ' : '') + row[3] + ' (' + row[0] + ':' + row[1] + ':' + row[2] + ')'); } return display; }; exports.callStack = function (slice) { // http://code.google.com/p/v8/wiki/JavaScriptStackTraceApi var v8 = Error.prepareStackTrace; Error.prepareStackTrace = function (err, stack) { return stack; }; var capture = {}; Error.captureStackTrace(capture, arguments.callee); var stack = capture.stack; Error.prepareStackTrace = v8; var trace = exports.formatStack(stack); if (slice) { return trace.slice(slice); } return trace; }; exports.displayStack = function (slice) { var trace = exports.callStack(slice === undefined ? 1 : slice + 1); return exports.formatTrace(trace); }; exports.abortThrow = false; exports.abort = function (message, hideStack) { if (process.env.NODE_ENV === 'test' || exports.abortThrow === true) { throw new Error(message || 'Unknown error'); } var stack = ''; if (!hideStack) { stack = exports.displayStack(1).join('\n\t'); } console.log('ABORT: ' + message + '\n\t' + stack); process.exit(1); }; exports.assert = function (condition /*, msg1, msg2, msg3 */) { if (condition) { return; } var msgs = Array.prototype.slice.call(arguments, 1); msgs = msgs.map(function (msg) { return typeof msg === 'string' ? msg : msg instanceof Error ? msg.message : JSON.stringify(msg); }); throw new Error(msgs.join(' ') || 'Unknown error'); }; exports.loadDirModules = function (path, excludeFiles, target) { // target(filename, name, capName) var exclude = {}; for (var i = 0, il = excludeFiles.length; i < il; ++i) { exclude[excludeFiles[i] + '.js'] = true; } var files = Fs.readdirSync(path); for (i = 0, il = files.length; i < il; ++i) { var filename = files[i]; if (/\.js$/.test(filename) && !exclude[filename]) { var name = filename.substr(0, filename.lastIndexOf('.')); var capName = name.charAt(0).toUpperCase() + name.substr(1).toLowerCase(); if (typeof target !== 'function') { target[capName] = require(path + '/' + name); } else { target(path + '/' + name, name, capName); } } } }; exports.rename = function (obj, from, to) { obj[to] = obj[from]; delete obj[from]; }; exports.Timer = function () { this.reset(); }; exports.Timer.prototype.reset = function () { this.ts = Date.now(); }; exports.Timer.prototype.elapsed = function () { return Date.now() - this.ts; }; // Load and parse package.json process root or given directory exports.loadPackage = function (dir) { var result = {}; var filepath = (dir || process.env.PWD) + '/package.json'; if (Fs.existsSync(filepath)) { try { result = JSON.parse(Fs.readFileSync(filepath)); } catch (e) { } } return result; }; // Escape string for Regex construction exports.escapeRegex = function (string) { // Escape ^$.*+-?=!:|\/()[]{}, return string.replace(/[\^\$\.\*\+\-\?\=\!\:\|\\\/\(\)\[\]\{\}\,]/g, '\\$&'); }; // Return an error as first argument of a callback exports.toss = function (condition /*, [message], next */) { var message = (arguments.length === 3 ? arguments[1] : ''); var next = (arguments.length === 3 ? arguments[2] : arguments[1]); var err = (message instanceof Error ? message : (message ? new Error(message) : (condition instanceof Error ? 
condition : new Error())));

    if (condition instanceof Error || !condition) {
        return next(err);
    }
};


// Base64url (RFC 4648) encode

exports.base64urlEncode = function (value) {

    return (new Buffer(value, 'binary')).toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/\=/g, '');
};


// Base64url (RFC 4648) decode

exports.base64urlDecode = function (encoded) {

    if (encoded &&
        !encoded.match(/^[\w\-]*$/)) {

        return new Error('Invalid character');
    }

    try {
        // Map base64url characters back to standard base64 before decoding
        return (new Buffer(encoded.replace(/-/g, '+').replace(/_/g, '/'), 'base64')).toString('binary');
    }
    catch (err) {
        return err;
    }
};


// Escape attribute value for use in HTTP header

exports.escapeHeaderAttribute = function (attribute) {

    // Allowed value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9, \, "

    exports.assert(attribute.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~\"\\]*$/), 'Bad attribute value (' + attribute + ')');

    return attribute.replace(/\\/g, '\\\\').replace(/\"/g, '\\"');                             // Escape quotes and slash
};


exports.escapeHtml = function (string) {

    return Escape.escapeHtml(string);
};


exports.escapeJavaScript = function (string) {

    return Escape.escapeJavaScript(string);
};

/*
var event = {
    timestamp: now.getTime(),
    tags: ['tag'],
    data: { some: 'data' }
};
*/

exports.consoleFunc = console.log;

exports.printEvent = function (event) {

    var pad = function (value) {

        return (value < 10 ? '0' : '') + value;
    };

    var now = new Date(event.timestamp);
    var timestring = (now.getYear() - 100).toString() +
        pad(now.getMonth() + 1) +
        pad(now.getDate()) +
        '/' +
        pad(now.getHours()) +
        pad(now.getMinutes()) +
        pad(now.getSeconds()) +
        '.' +
        now.getMilliseconds();

    var data = event.data;
    if (typeof event.data !== 'string') {
        try {
            data = JSON.stringify(event.data);
        }
        catch (e) {
            data = 'JSON Error: ' + e.message;
        }
    }

    var output = timestring + ', ' + event.tags[0] + ', ' + data;

    exports.consoleFunc(output);
};


exports.nextTick = function (callback) {

    return function () {

        var args = arguments;
        process.nextTick(function () {

            callback.apply(null, args);
        });
    };
};

lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/hoek/images/hoek.png
[binary PNG image data omitted]
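The base64url helpers near the end of `lib/index.js` above can be exercised with a quick round trip; the input string here is arbitrary:

```javascript
var Hoek = require('hoek');

var encoded = Hoek.base64urlEncode('ingredient: salt & pepper?');
console.log(encoded);                                     // URL-safe: no '+', '/' or '=' characters

var decoded = Hoek.base64urlDecode(encoded);
console.log(decoded === 'ingredient: salt & pepper?');    // true

// Invalid input returns an Error rather than throwing
console.log(Hoek.base64urlDecode('not*valid') instanceof Error);   // true
```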
5]+IR@6?} Y:O߽>?u~ XLߍi]{ַ�Zd̽ren)MkgN՘EKW?l':bDdb|hȻMS[70 @-%TSɩΌq˺Vd*/0aYP 4&:1B}-u{e^+gmM=x,Ƀwtvi=y]ǣh P5LK%+4ذ,9)ә)D풠!Dl, d3GNJx{ټ¥*^ 寗0_V)=߲&C`_ ;꣔;\^Ob>"W\}ݾvT`/obMݨ>`P(F\vX<ݶuRWkFVx4,uL%Eve;6qk@o[CM(XNÒ*n73Mrj0³O TQ_SID,d<M4%/ЖNn\T2@Mk1M7Un'OE "<x:VP]08)=gzJBp�Y:7k@%t%K@=jm>_U!s̓OF^Be4_kw+h$3dgE^W#;%S?)ᓧ@P4M3N떹iܲ�kZYZۭ,TIer缀X2O01~cS)+4m R# F)z5 U~}*<ϼtŲyKL"̱h?+jlqÊ&NMS]u/ ÓE_N^J^@m07*f9FZ*+%_b t=v"E#O?/EwCZنy rw#d:[y9/c~6pҦ㽵Fbn,s?s"O䲹TxP4VWf:'PU͘% Q(5tjqw 5t*`Z[=Lcm54.)(.YdrGݛi"W00-KGhD`u2j5{?/Zף7ĴcSz1Rr=y=zolڏlbk Wnd,{oүs1hҋFb/A R[;~w3:c(hOg3Ж͕e{nNiInӖhdIU7/ѰLr2IZ*XB`F Oϕ uIslY߆ϫ΍4T/4VS4,6lZjLE^PD-{9](w2 D۾8y$ej xS?"}z[r9_罳+]ZdY`5da{EW# a y[Y.>nq ?^_J.]8fNu{c[cs5 ~KXXucd(E[8HMu9iRYŲ^/UUD:yL /yUO^8»vPö m O%83^@;QЭOڶډw{i:!YB_$^`|*{2 ߳NtQlm7W,O`Fcl^~#ĭiɊ^>6njPl!5*|nEΌO0qE#5UeLRT{ߴ`F3jﮛVƪzc)"IJ!z].SB6KEeu\̓!rhיՀc / 6N2ۉ.ǻxЊD&/Et*f˼/oZsTN�Shc 0 Rdl4-BJnKvÚVQ S(rv2hdhWwNW;Ӆ7L"@nm"/0^/Fb7B-y,l)}jb ψe 򹃹|r- 7w׷אNdbiHB UU({XRnsE bx"nn:LALMd"g$2=iv%S>Jk2aY�2r֮H�Btg-ػ,=,z#q 7ugCOǁ*4}nס}V޴_p%膅(6nzG LLx^7,_/4H3Pn TUMR'mc5],!L 7 ͭ9kys!XȤLm0:8\._* .rwl}JL^=z$m f*`Uuˏ{K ͥ*" tb`*\eYX&DR8u켟W ~<o  u~w>jEWEMf{K5#I)-%)Ϡem~d[U^P�!S tSL4QFXw}ÊR>UE6NvnNwvYX@qY98Na^PfXģ2Ȯ A"c{!q|/EpppUk$^oFNr]Hx id(H888̃ppppp~QK H888888F1ppppppc$#p�P'�����IENDB`�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.npmignore�������000644 �000766 �000024 �00000000304 12455173731 037066� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������.idea *.iml npm-debug.log dump.rdb node_modules results.tap results.xml npm-shrinkwrap.json config.json .DS_Store */.DS_Store */*/.DS_Store ._* */._* */*/._* coverage.* lib-cov ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/.travis.yml������000755 �000766 �000024 �00000000053 12455173731 037204� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.10 
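The hoek utility module whose lib/index.js appears earlier in this archive exposes small general-purpose helpers such as escapeRegex, base64urlEncode/base64urlDecode, Timer, assert and nextTick. The snippet below is a minimal usage sketch rather than part of the bundled sources; it assumes that require('hoek') resolves to that lib/index.js through the package's main entry.

// Illustrative only -- not part of the bundled hoek sources.
var Hoek = require('hoek');

// Escape untrusted input before embedding it in a RegExp
var re = new RegExp(Hoek.escapeRegex('price ($)') + '\\d+');
console.log(re.test('price ($)42'));                              // true

// URL-safe base64 round trip (RFC 4648); decode returns an Error object on bad input
var encoded = Hoek.base64urlEncode('some?query=string');
var decoded = Hoek.base64urlDecode(encoded);
Hoek.assert(decoded === 'some?query=string', 'base64url round trip failed');

// Simple elapsed-time measurement; nextTick defers the callback to the next tick
var timer = new Hoek.Timer();
setTimeout(Hoek.nextTick(function () {

    console.log('elapsed ms:', timer.elapsed());
}), 50);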
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/index.js���������000755 �000766 �000024 �00000000042 12455173731 036536� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require('./lib');����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/lib/�������������000755 �000766 �000024 �00000000000 12456115120 035625� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/LICENSE����������000755 �000766 �000024 �00000002722 12455173731 036105� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2012-2013, Eran Hammer. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Eran Hammer nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL ERAN HAMMER BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/Makefile���������000755 �000766 �000024 �00000000422 12455173731 036533� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������test: @./node_modules/.bin/lab test-cov: @./node_modules/.bin/lab -r threshold -t 100 test-cov-html: @./node_modules/.bin/lab -r html -o coverage.html complexity: @./node_modules/.bin/cr -o complexity.md -f markdown lib .PHONY: test test-cov test-cov-html complexity ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/package.json�����000755 �000766 �000024 �00000002767 12455173731 037377� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "cryptiles", "description": "General purpose crypto utilities", "version": "0.2.2", "author": { "name": "Eran Hammer", "email": "eran@hueniverse.com", "url": "http://hueniverse.com" }, "contributors": [], "repository": { "type": "git", "url": "git://github.com/hueniverse/cryptiles" }, "main": "index", "keywords": [ "cryptography", "security", "utilites" ], "engines": { "node": ">=0.8.0" }, "dependencies": { "boom": "0.4.x" }, "devDependencies": { "lab": "0.1.x", "complexity-report": "0.x.x" }, "scripts": { "test": "make test-cov" }, "licenses": [ { "type": "BSD", "url": "http://github.com/hueniverse/cryptiles/raw/master/LICENSE" } ], "bugs": { "url": "https://github.com/hueniverse/cryptiles/issues" }, "_id": "cryptiles@0.2.2", "dist": { "shasum": "ed91ff1f17ad13d3748288594f8a48a0d26f325c", "tarball": "http://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz" }, "_from": "cryptiles@>=0.2.0 <0.3.0", "_npmVersion": "1.2.24", "_npmUser": { "name": "hueniverse", "email": "eran@hueniverse.com" }, "maintainers": [ { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "directories": {}, "_shasum": "ed91ff1f17ad13d3748288594f8a48a0d26f325c", "_resolved": "https://registry.npmjs.org/cryptiles/-/cryptiles-0.2.2.tgz", "readme": "ERROR: No README data found!", "homepage": "https://github.com/hueniverse/cryptiles" } ���������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/README.md��������000644 �000766 �000024 �00000000253 
12455173731 036351� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������cryptiles ========= General purpose crypto utilities [![Build Status](https://secure.travis-ci.org/hueniverse/cryptiles.png)](http://travis-ci.org/hueniverse/cryptiles) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/cryptiles/lib/index.js�����000755 �000766 �000024 �00000002536 12455173731 037316� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Crypto = require('crypto'); var Boom = require('boom'); // Declare internals var internals = {}; // Generate a cryptographically strong pseudo-random data exports.randomString = function (size) { var buffer = exports.randomBits((size + 1) * 6); if (buffer instanceof Error) { return buffer; } var string = buffer.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/\=/g, ''); return string.slice(0, size); }; exports.randomBits = function (bits) { if (!bits || bits < 0) { return Boom.internal('Invalid random bits count'); } var bytes = Math.ceil(bits / 8); try { return Crypto.randomBytes(bytes); } catch (err) { return Boom.internal('Failed generating random bits: ' + err.message); } }; // Compare two strings using fixed time algorithm (to prevent time-based analysis of MAC digest match) exports.fixedTimeComparison = function (a, b) { if (typeof a !== 'string' || typeof b !== 'string') { return false; } var mismatch = (a.length === b.length ? 0 : 1); if (mismatch) { b = a; } for (var i = 0, il = a.length; i < il; ++i) { var ac = a.charCodeAt(i); var bc = b.charCodeAt(i); mismatch += (ac === bc ? 
0 : 1); } return (mismatch === 0); }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.npmignore������������000644 �000766 �000024 �00000000262 12455173731 036007� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������.idea *.iml npm-debug.log dump.rdb node_modules results.tap results.xml npm-shrinkwrap.json config.json .DS_Store */.DS_Store */*/.DS_Store ._* */._* */*/._* coverage.* lib-cov ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/.travis.yml�����������000755 �000766 �000024 �00000000046 12455173731 036124� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.10 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/images/���������������000755 �000766 �000024 �00000000000 12456115120 035242� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/index.js��������������000755 �000766 �000024 �00000000042 12455173731 035454� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = 
require('./lib');����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/lib/������������������000755 �000766 �000024 �00000000000 12456115120 034543� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/LICENSE���������������000755 �000766 �000024 �00000002706 12455173731 035025� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2012-2013, Walmart. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Walmart nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL WALMART BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
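The cryptiles lib/index.js reproduced just before the boom files wraps Node's crypto module in three small helpers: randomString, randomBits and fixedTimeComparison. Below is a minimal usage sketch, not part of the bundled sources, assuming require('cryptiles') resolves to that implementation.

// Illustrative only -- not part of the bundled cryptiles sources.
var Cryptiles = require('cryptiles');

// 32-character random string (returns an Error object if entropy is unavailable)
var secret = Cryptiles.randomString(32);

// 128 random bits as a Buffer (returns a Boom error for an invalid bit count)
var bits = Cryptiles.randomBits(128);

// Constant-time comparison of two MAC digests, to avoid leaking timing information
var received = 'deadbeef';
var expected = 'deadbeef';
console.log(Cryptiles.fixedTimeComparison(received, expected));    // true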
����������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/Makefile��������������000755 �000766 �000024 �00000000474 12455173731 035460� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������test: @node node_modules/lab/bin/lab test-cov: @node node_modules/lab/bin/lab -r threshold -t 100 test-cov-html: @node node_modules/lab/bin/lab -r html -o coverage.html complexity: @node node_modules/complexity-report/src/cli.js -o complexity.md -f markdown lib .PHONY: test test-cov test-cov-html complexity ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/package.json����������000755 �000766 �000024 �00000002620 12455173731 036301� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "boom", "description": "HTTP-friendly error objects", "version": "0.4.2", "author": { "name": "Eran Hammer", "email": "eran@hueniverse.com", "url": "http://hueniverse.com" }, "contributors": [], "repository": { "type": "git", "url": "git://github.com/spumko/boom" }, "main": "index", "keywords": [ "error", "http" ], "engines": { "node": ">=0.8.0" }, "dependencies": { "hoek": "0.9.x" }, "devDependencies": { "lab": "0.1.x", "complexity-report": "0.x.x" }, "scripts": { "test": "make test-cov" }, "licenses": [ { "type": "BSD", "url": "http://github.com/spumko/boom/raw/master/LICENSE" } ], "_id": "boom@0.4.2", "dist": { "shasum": "7a636e9ded4efcefb19cef4947a3c67dfaee911b", "tarball": "http://registry.npmjs.org/boom/-/boom-0.4.2.tgz" }, "_from": "boom@>=0.4.0 <0.5.0", "_npmVersion": "1.2.18", "_npmUser": { "name": "hueniverse", "email": "eran@hueniverse.com" }, "maintainers": [ { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "directories": {}, "_shasum": "7a636e9ded4efcefb19cef4947a3c67dfaee911b", "_resolved": "https://registry.npmjs.org/boom/-/boom-0.4.2.tgz", "bugs": { "url": "https://github.com/spumko/boom/issues" }, "readme": "ERROR: No README data found!", "homepage": "https://github.com/spumko/boom" } ����������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/README.md�������������000755 �000766 �000024 �00000000510 12455173731 035266� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������<a href="https://github.com/spumko"><img 
src="https://raw.github.com/spumko/spumko/master/images/from.png" align="right" /></a> ![boom Logo](https://raw.github.com/spumko/boom/master/images/boom.png) HTTP-friendly error objects [![Build Status](https://secure.travis-ci.org/spumko/boom.png)](http://travis-ci.org/spumko/boom) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/lib/index.js����������000755 �000766 �000024 �00000011014 12455173731 036223� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Http = require('http'); var NodeUtil = require('util'); var Hoek = require('hoek'); // Declare internals var internals = {}; exports = module.exports = internals.Boom = function (/* (new Error) or (code, message) */) { var self = this; Hoek.assert(this.constructor === internals.Boom, 'Error must be instantiated using new'); Error.call(this); this.isBoom = true; this.response = { code: 0, payload: {}, headers: {} // type: 'content-type' }; if (arguments[0] instanceof Error) { // Error var error = arguments[0]; this.data = error; this.response.code = error.code || 500; if (error.message) { this.message = error.message; } } else { // code, message var code = arguments[0]; var message = arguments[1]; Hoek.assert(!isNaN(parseFloat(code)) && isFinite(code) && code >= 400, 'First argument must be a number (400+)'); this.response.code = code; if (message) { this.message = message; } } // Response format this.reformat(); return this; }; NodeUtil.inherits(internals.Boom, Error); internals.Boom.prototype.reformat = function () { this.response.payload.code = this.response.code; this.response.payload.error = Http.STATUS_CODES[this.response.code] || 'Unknown'; if (this.message) { this.response.payload.message = Hoek.escapeHtml(this.message); // Prevent XSS from error message } }; // Utilities internals.Boom.badRequest = function (message) { return new internals.Boom(400, message); }; internals.Boom.unauthorized = function (message, scheme, attributes) { // Or function (message, wwwAuthenticate[]) var err = new internals.Boom(401, message); if (!scheme) { return err; } var wwwAuthenticate = ''; if (typeof scheme === 'string') { // function (message, scheme, attributes) wwwAuthenticate = scheme; if (attributes) { var names = Object.keys(attributes); for (var i = 0, il = names.length; i < il; ++i) { if (i) { wwwAuthenticate += ','; } var value = attributes[names[i]]; if (value === null || value === undefined) { // Value can be zero value = ''; } wwwAuthenticate += ' ' + names[i] + '="' + Hoek.escapeHeaderAttribute(value.toString()) + '"'; } } if (message) { if (attributes) { wwwAuthenticate += ','; } wwwAuthenticate += ' error="' + Hoek.escapeHeaderAttribute(message) + '"'; } else { err.isMissing = true; } } else { // function (message, wwwAuthenticate[]) var wwwArray = scheme; for (var i = 0, il = wwwArray.length; i < il; ++i) { if (i) { wwwAuthenticate += ', '; } wwwAuthenticate += wwwArray[i]; } } err.response.headers['WWW-Authenticate'] = wwwAuthenticate; return err; }; internals.Boom.clientTimeout = function (message) { 
return new internals.Boom(408, message); }; internals.Boom.serverTimeout = function (message) { return new internals.Boom(503, message); }; internals.Boom.forbidden = function (message) { return new internals.Boom(403, message); }; internals.Boom.notFound = function (message) { return new internals.Boom(404, message); }; internals.Boom.internal = function (message, data) { var err = new internals.Boom(500, message); if (data && data.stack) { err.trace = data.stack.split('\n'); err.outterTrace = Hoek.displayStack(1); } else { err.trace = Hoek.displayStack(1); } err.data = data; err.response.payload.message = 'An internal server error occurred'; // Hide actual error from user return err; }; internals.Boom.passThrough = function (code, payload, contentType, headers) { var err = new internals.Boom(500, 'Pass-through'); // 500 code is only used to initialize err.data = { code: code, payload: payload, type: contentType }; err.response.code = code; err.response.type = contentType; err.response.headers = headers; err.response.payload = payload; return err; }; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/hawk/node_modules/boom/images/boom.png�������000755 �000766 �000024 �00000071447 12455173731 036737� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������PNG  ��� IHDR��9������vZ=B��� pHYs����+�� OiCCPPhotoshop ICC profile��xڝSgTS=BKKoR RB&*! J!QEEȠQ, !{kּ> H3Q5 B.@ $p�d!s#�~<<+"�x �M0B\t8K�@zB�@F&S��`cb�P-�`'�{�[!� eD�h;�VE�X0�fK9�-�0IWfH�� � �0Q)�{�`##x��FW<+*��x<$9E[-qWW.(I+6aa@.y24��x6_-"bbϫp@��t~,/;m%h^ uf@�Wp~<<EJB[aW}g_Wl~<$2]GLϒ bG "IbX*QqD2"B)%d,>5�j>{-]cK'Xt��o(hw?G%�fIq��^D$.Tʳ?��D*A, `6B$BB dr`)B(Ͱ*`/@4Qhp.U=pa( Aa!ڈbX#!H$ ɈQ"K5H1RT UH=r9\F;�2G1Q= C7F dt1r=6Ыhڏ>C03l0.B8, c˱" VcϱwE 6wB aAHXLXNH $4 7 Q'"K&b21XH,#/{C7$C2'ITFnR#,4H#dk9, +ȅ3![ b@qS(RjJ4e2AURݨT5ZBRQ4u9̓IKhhitݕNWGw Ljg(gwLӋT071oUX**| J&*/Tު UUT^S}FU3S ԖUPSSg;goT?~YYLOCQ_ cx,!k u5&|v*=9C3J3WRf?qtN (~))4L1e\kXHQG6EYAJ'\'GgSSݧ M=:.kDwn^Loy}/TmG X $ <5qo</QC]@Caaᄑ<FFi\$mmƣ&&!&KMMRM);L;L֙͢5=12כ߷`ZxZ,eIZYnZ9YXUZ]F%ֻNNgðɶۮm}agbgŮ}}= Z~sr:V:ޚΜ?}/gX3)iSGggs󈋉K.>.ȽJtq]zۯ6iܟ4)Y3sCQ? 
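The boom lib/index.js shown above turns HTTP errors into objects that carry a ready-to-send response (code, payload and headers), plus convenience constructors such as notFound, unauthorized and internal. The following is a minimal usage sketch, not part of the bundled sources, assuming require('boom') resolves to that implementation.

// Illustrative only -- not part of the bundled boom sources.
var Boom = require('boom');

// 404 with a custom message; the ready-to-send response lives on err.response
var notFound = Boom.notFound('No such resource');
console.log(notFound.response.code);        // 404
console.log(notFound.response.payload);     // { code: 404, error: 'Not Found', message: 'No such resource' }

// 401 that also builds a WWW-Authenticate challenge from scheme and attributes
var unauthorized = Boom.unauthorized('Invalid password', 'Basic', { realm: 'example' });
console.log(unauthorized.response.headers['WWW-Authenticate']);
// -> Basic realm="example", error="Invalid password"

// Wrapping a lower-level error hides its message from the client-facing payload
var internal = Boom.internal('Database unreachable', new Error('connect ECONNREFUSED'));
console.log(internal.response.payload.message);    // 'An internal server error occurred'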
PV+o> Bwd#jP;Cưm wx?R 81Z;0o IL$(W*KڋŠw1Ǥ'<ݺeˑ {ER4zƧ>DMe:;[r],Ɩ`bpx;]ྼtrss$Nwi,.nԢ!==uX|f"Wԉ[0�xcTU6YG[ ?XBG)c;%'qf>>&W^r{zzJ\�Olp·bf};vUOi<ap𴳼_U~Y0Cݻu;MY5&rů. }ѣv{ 4;�jTt/y;-!(U<ooB֚L:\BH}1U&- 0 <o[v=ԧtZؓurFOu_XQiKuhq/aFVy{LaeK|L FENCrFk5*5Ֆ_MFjfl lHy:IoQW!- ?9<|L)e+LC(ĤD=ϾNlnNlҭ[?S(\ }bM�, dIg]/u~e;,V~|z[+\zRÅ{ Χ|o ;sGc1i-%jd2фoc75Hוσa,{u|j}z˸_Fke|gO~8-v>i[YKkq~wMÏ!ln29|ѸDNVkگZI&p\0Hm`ǝ9kI؂Jjp})9ZcI'Yq+%$)Es:m _CZ<o}\Uhj6^A�oYr%t 7[m G-Ҿ;,R8Ưj1�aЫx,4Z3q;瞲j,&Z6iRRQazUp;rs -J4=*qL\GZLT&pB x%_ pUs5眽$ ﹋ƪR#Z~w Mp8}޶~xb޾qr#,6n$"~6YڗFDNBѭ>kzh�w //$Leْ]eY$b1VD:# Њ)jZ5C5 l۶,kgjRO+j1@Vt= zn9} xl\€Π 65vhdt͜]/}pq <z9G/\T|kiL p2M] ,a1<9r犒Hr^Z)T"zdrjۥl|щ&̃sׅKQk(} #Oy%Lmӵ\J>lxQT,c�J^x"z \N4"rΩ& g:>E5OǶqgEfVI)L!4l uoa5\ Kʽ-x%r^|B'赡E܇>ϢCKפm@ݭM /' \VV}?'dͱ{0ZSq{} [vhޜsKZ\aZ`NzB¢=X6cSu2Qmg|ɯ[�*�m%"[Tֽ <AK  4$zza:c+h~'z*|s/asѓw6vX<;\ <4F~9Fjt_v!^27!t'2|<RZ`Zj/빌TmM2 U(+)"!L1\yUNRX,, l?h9/`rM9cq#]JlY76JrjBoV[aqpFy̿B<aϗ3 ߤ}A6;.vj0*)%s|lUӄ¢ >FXIJ7([mJl @e_ ˲{\ə[Gf{#a '/9R(:y^@z¬z"޺ �e8cՄ[lfiT\DS,�_< O% B(//(FJ8ZMk֓N[i%-=(e5jSī=6|̲RU}\_!{+a&| cP" H;Vf X3{�92HV~4it\E"\!lE}"g{[7O,̴�dw]rA'8\~3-)tl} etbDf55__c C`|hhO}곕RۗjsܼNB# |?@,C_jmoviGEGOyiƻykO8c&G tZȖ)X}ygy>kb/p]KYMԌa<c="ո ؖ-7ZBǶ,J MQ5[n\/؛P,9{[kk=Zm؏b~ײb9}JvO c2:{GݾB<u-hDt_vk]K/ sNFe144{,w|lm~*=`kv2 + B'tut{޲C, sypn%<+r~:,rb BXg֋/UW^T2YZ4G /%BZb49[L40fY  7q)w?EU?r;l\H$?ԪƝzE/(tMų _ ^DZ.ZNe8͡cG82| Ƕh"rt$lb{zܶ.իV^{6\Ҵ9AXk䙄Y)c˩/э KTzc0^<y8ㄅ0S|y p'P>`aAQw3=.yZ[ h</9wdS+NM= `m-wCӔGP۲<}sd<oY�˲.3B%8'KFxU>`w43Wlޯ+~-)HxR$x#L_Nn7ͷQ& KCtlK&d_';"�-"笵r-8RkMd,m˲v]/\k'( !lymۦ=-/"LO] j[_ҢpCxsH{!jZcYE6}Ͷc+{ GX.' F0Y:a;[x'H[_A#p aߢf\<DŽGZB6+ViIsJU:OmoOO꫆n:yl*KH<4%̱m aӺt>j)[@kAD=Ҧ׸?[SAV$̜'SCV5k>qر+7#S^E0[duS6 K;͍@X8afhj7埣E0KhbGdS3cttl=xcrvšBc3"EΩb?˾pΝu/X[,lld�gOimmq#tlۦZ^֯^GOge^&!:Д*%Va5V1iSk| y:'7tG&d2OR_R#aVvʀۨV7'a9g:kkȤVعqK[oHɏ�tt#glÕIW <MgY>>@T@J88vĢ/E p}B$ml!)s5(pNM  LHIggWUJ͵lX!"-XVw]!vpB6K=uG\޳qǪT}mm鈥q?6g:B)83tLv@k4!$6b8jڥE*j[Nؕ?ʴWi؃3~0%|LV#GH)k`09DuI۾7}/o׮Z >Љ^fsqvթn(]H)R3f3h4X/"Is}v}6zgD%-nc#K@cK=Fj9b=_VS+vNQɶɏtRjP(4`0V<gˉ !c+Ne'n9|[oߛ5=SS:ؙ?XEڜ$hPEєԣKf3ᒋ>=KZRA@ PVZ\rt/NLڌTs/E Mn_JU}}>40 �{c0 9xsrEάXPJ!nKoѷ}w/& ODBg6K<j�x-t`.v GҚ`.N&8Kf9;NTPժLL/e)h @iK6bKJPq֏8n$Hv;ܳfpCCV(N �9<Z~bdru?*U,nZI6´:')?Z逃 F+S%:OK_+h:۶;|r6ՔI "|O͏B".S 8L)M8p�K$p{6~1L~>|߈`0,Dpy#rf \}2W*='SUk5S;WIB1s:puv'Ew<CW,C&.m@+ԢҖTjeT)NKHaoic6'/@_(a@)o_jfz#8mn 9JkoQtd7o~ǟ{jn:KvQ!Qhf23^GNqڜ$NXlj֨([jXQ::(תĜزGA8DkF9X!ʑ9HXlvO_w $߬J;e=k`X駬֞"Tc3_ή]Vï)@xM劉@SQ:ZT@QU(W$%cN e'IV*,kf>ARJ&g)WXDBn[&(zUrG$0LA]d |kU<;)qT)GHi.E`0,șL9EάX �T58wg2[3jtyvOX?8qBhP5UW΄(&a9$i;AڎĤ-eˢ a GLV oOe#gUM}RZ)E]թ>UNޯ#Q#LR*+v#\wt2N)�0 aiE4a;�wiKyz6ȳy*۟8<2rŘ֗>*uG^,aV0%vЀ|<'- ++NN$qiȂ >)>D γSV;EȈӈ-t:y"*GQ|P'D;˄_f�7Cbc:ƺzzwYOlիܵ.E今aa`0b= <~>ى :61%N_^ M3?XJJK5m�M=2@I@h8~ vibdNXFBR@i-BBhlUaU BK8 QGQ h\}AJi%D.q>P=^q<˶RqDzFBlveRz}ϥ_<2:6*`0VAPB ~qhiYy=>@___g/;v*Jڵkrl&Kx*D)Zz| {U Ip @[[ B %@ktZh _kܰ#:ٙLQ8EBƓd^+UXfT:31Ǚ}/} p=LC H<FTdiZse1q/-MW'RT]Hٓ bd:,8M RH$RAħR)H@kzZZ<ԅ%[r|=LԻR%-L&R{ỏUT׭ZUR j{{y^~ D\Bz6`8O�q !����IENDB`�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/browser.js���000755 �000766 �000024 �00000053751 12455173731 033263� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������/* HTTP Hawk Authentication Scheme Copyright (c) 2012-2013, Eran Hammer <eran@hueniverse.com> MIT Licensed */ // Declare namespace var hawk = {}; // Export if used as a module if (typeof module !== "undefined" && module.exports) { module.exports = hawk; } hawk.client = { // Generate an Authorization header for a given request /* uri: 'http://example.com/resource?a=b' method: HTTP verb (e.g. 'GET', 'POST') options: { // Required credentials: { id: 'dh37fgj492je', key: 'aoijedoaijsdlaksjdl', algorithm: 'sha256' // 'sha1', 'sha256' }, // Optional ext: 'application-specific', // Application specific data sent via the ext attribute timestamp: Date.now() / 1000, // A pre-calculated timestamp in seconds nonce: '2334f34f', // A pre-generated nonce localtimeOffsetMsec: 400, // Time offset to sync with server time (ignored if timestamp provided) payload: '{"some":"payload"}', // UTF-8 encoded string for body hash generation (ignored if hash provided) contentType: 'application/json', // Payload content-type (ignored if hash provided) hash: 'U4MKKSmiVxk37JCCrAVIjV=', // Pre-calculated payload hash app: '24s23423f34dx', // Oz application id dlg: '234sz34tww3sd' // Oz delegated-by application id } */ header: function (uri, method, options) { var result = { field: '', artifacts: {} }; // Validate inputs if (!uri || (typeof uri !== 'string' && typeof uri !== 'object') || !method || typeof method !== 'string' || !options || typeof options !== 'object') { result.err = 'Invalid argument type'; return result; } // Application time var timestamp = options.timestamp || Math.floor((hawk.utils.now() + (options.localtimeOffsetMsec || 0)) / 1000) // Validate credentials var credentials = options.credentials; if (!credentials || !credentials.id || !credentials.key || !credentials.algorithm) { result.err = 'Invalid credential object'; return result; } if (hawk.crypto.algorithms.indexOf(credentials.algorithm) === -1) { result.err = 'Unknown algorithm'; return result; } // Parse URI if (typeof uri === 'string') { uri = hawk.utils.parseUri(uri); } // Calculate signature var artifacts = { ts: timestamp, nonce: options.nonce || hawk.utils.randomString(6), method: method, resource: uri.relative, host: uri.hostname, port: uri.port, hash: options.hash, ext: options.ext, app: options.app, dlg: options.dlg }; result.artifacts = artifacts; // Calculate payload hash if (!artifacts.hash && options.hasOwnProperty('payload')) { artifacts.hash = hawk.crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType); } var mac = hawk.crypto.calculateMac('header', credentials, artifacts); // Construct header var hasExt = artifacts.ext !== null && artifacts.ext !== undefined && artifacts.ext !== ''; // Other falsey values allowed var header = 'Hawk id="' + credentials.id + '", ts="' + artifacts.ts + '", nonce="' + artifacts.nonce + (artifacts.hash ? '", hash="' + artifacts.hash : '') + (hasExt ? '", ext="' + hawk.utils.escapeHeaderAttribute(artifacts.ext) : '') + '", mac="' + mac + '"'; if (artifacts.app) { header += ', app="' + artifacts.app + (artifacts.dlg ? 
'", dlg="' + artifacts.dlg : '') + '"'; } result.field = header; return result; }, // Validate server response /* request: object created via 'new XMLHttpRequest()' after response received artifacts: object recieved from header().artifacts options: { payload: optional payload received required: specifies if a Server-Authorization header is required. Defaults to 'false' } */ authenticate: function (request, credentials, artifacts, options) { options = options || {}; if (request.getResponseHeader('www-authenticate')) { // Parse HTTP WWW-Authenticate header var attributes = hawk.utils.parseAuthorizationHeader(request.getResponseHeader('www-authenticate'), ['ts', 'tsm', 'error']); if (!attributes) { return false; } if (attributes.ts) { var tsm = hawk.crypto.calculateTsMac(attributes.ts, credentials); if (tsm !== attributes.tsm) { return false; } hawk.utils.setNtpOffset(attributes.ts - Math.floor(Date.now() / 1000)); // Keep offset at 1 second precision } } // Parse HTTP Server-Authorization header if (!request.getResponseHeader('server-authorization') && !options.required) { return true; } var attributes = hawk.utils.parseAuthorizationHeader(request.getResponseHeader('server-authorization'), ['mac', 'ext', 'hash']); if (!attributes) { return false; } var modArtifacts = { ts: artifacts.ts, nonce: artifacts.nonce, method: artifacts.method, resource: artifacts.resource, host: artifacts.host, port: artifacts.port, hash: attributes.hash, ext: attributes.ext, app: artifacts.app, dlg: artifacts.dlg }; var mac = hawk.crypto.calculateMac('response', credentials, modArtifacts); if (mac !== attributes.mac) { return false; } if (!options.hasOwnProperty('payload')) { return true; } if (!attributes.hash) { return false; } var calculatedHash = hawk.crypto.calculatePayloadHash(options.payload, credentials.algorithm, request.getResponseHeader('content-type')); return (calculatedHash === attributes.hash); }, message: function (host, port, message, options) { // Validate inputs if (!host || typeof host !== 'string' || !port || typeof port !== 'number' || message === null || message === undefined || typeof message !== 'string' || !options || typeof options !== 'object') { return null; } // Application time var timestamp = options.timestamp || Math.floor((hawk.utils.now() + (options.localtimeOffsetMsec || 0)) / 1000) // Validate credentials var credentials = options.credentials; if (!credentials || !credentials.id || !credentials.key || !credentials.algorithm) { // Invalid credential object return null; } if (hawk.crypto.algorithms.indexOf(credentials.algorithm) === -1) { return null; } // Calculate signature var artifacts = { ts: timestamp, nonce: options.nonce || hawk.utils.randomString(6), host: host, port: port, hash: hawk.crypto.calculatePayloadHash(message, credentials.algorithm) }; // Construct authorization var result = { id: credentials.id, ts: artifacts.ts, nonce: artifacts.nonce, hash: artifacts.hash, mac: hawk.crypto.calculateMac('message', credentials, artifacts) }; return result; }, authenticateTimestamp: function (message, credentials, updateClock) { // updateClock defaults to true var tsm = hawk.crypto.calculateTsMac(message.ts, credentials); if (tsm !== message.tsm) { return false; } if (updateClock !== false) { hawk.utils.setNtpOffset(message.ts - Math.floor(Date.now() / 1000)); // Keep offset at 1 second precision } return true; } }; hawk.crypto = { headerVersion: '1', algorithms: ['sha1', 'sha256'], calculateMac: function (type, credentials, options) { var normalized = 
hawk.crypto.generateNormalizedString(type, options); var hmac = CryptoJS['Hmac' + credentials.algorithm.toUpperCase()](normalized, credentials.key); return hmac.toString(CryptoJS.enc.Base64); }, generateNormalizedString: function (type, options) { var normalized = 'hawk.' + hawk.crypto.headerVersion + '.' + type + '\n' + options.ts + '\n' + options.nonce + '\n' + (options.method || '').toUpperCase() + '\n' + (options.resource || '') + '\n' + options.host.toLowerCase() + '\n' + options.port + '\n' + (options.hash || '') + '\n'; if (options.ext) { normalized += options.ext.replace('\\', '\\\\').replace('\n', '\\n'); } normalized += '\n'; if (options.app) { normalized += options.app + '\n' + (options.dlg || '') + '\n'; } return normalized; }, calculatePayloadHash: function (payload, algorithm, contentType) { var hash = CryptoJS.algo[algorithm.toUpperCase()].create(); hash.update('hawk.' + hawk.crypto.headerVersion + '.payload\n'); hash.update(hawk.utils.parseContentType(contentType) + '\n'); hash.update(payload || ''); hash.update('\n'); return hash.finalize().toString(CryptoJS.enc.Base64); }, calculateTsMac: function (ts, credentials) { var hash = CryptoJS['Hmac' + credentials.algorithm.toUpperCase()]('hawk.' + hawk.crypto.headerVersion + '.ts\n' + ts + '\n', credentials.key); return hash.toString(CryptoJS.enc.Base64); } }; hawk.utils = { storage: { // localStorage compatible interface _cache: {}, setItem: function (key, value) { hawk.utils.storage._cache[key] = value; }, getItem: function (key) { return hawk.utils.storage._cache[key]; } }, setStorage: function (storage) { var ntpOffset = hawk.utils.getNtpOffset() || 0; hawk.utils.storage = storage; hawk.utils.setNtpOffset(ntpOffset); }, setNtpOffset: function (offset) { try { hawk.utils.storage.setItem('hawk_ntp_offset', offset); } catch (err) { console.error('[hawk] could not write to storage.'); console.error(err); } }, getNtpOffset: function () { return parseInt(hawk.utils.storage.getItem('hawk_ntp_offset') || '0', 10); }, now: function () { return Date.now() + hawk.utils.getNtpOffset(); }, escapeHeaderAttribute: function (attribute) { return attribute.replace(/\\/g, '\\\\').replace(/\"/g, '\\"'); }, parseContentType: function (header) { if (!header) { return ''; } return header.split(';')[0].trim().toLowerCase(); }, parseAuthorizationHeader: function (header, keys) { if (!header) { return null; } var headerParts = header.match(/^(\w+)(?:\s+(.*))?$/); // Header: scheme[ something] if (!headerParts) { return null; } var scheme = headerParts[1]; if (scheme.toLowerCase() !== 'hawk') { return null; } var attributesString = headerParts[2]; if (!attributesString) { return null; } var attributes = {}; var verify = attributesString.replace(/(\w+)="([^"\\]*)"\s*(?:,\s*|$)/g, function ($0, $1, $2) { // Check valid attribute names if (keys.indexOf($1) === -1) { return; } // Allowed attribute value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9 if ($2.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~]+$/) === null) { return; } // Check for duplicates if (attributes.hasOwnProperty($1)) { return; } attributes[$1] = $2; return ''; }); if (verify !== '') { return null; } return attributes; }, randomString: function (size) { var randomSource = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789'; var len = randomSource.length; var result = []; for (var i = 0; i < size; ++i) { result[i] = randomSource[Math.floor(Math.random() * len)]; } return result.join(''); }, parseUri: function (input) { // Based on: parseURI 
1.2.2 // http://blog.stevenlevithan.com/archives/parseuri // (c) Steven Levithan <stevenlevithan.com> // MIT License var keys = ['source', 'protocol', 'authority', 'userInfo', 'user', 'password', 'hostname', 'port', 'resource', 'relative', 'pathname', 'directory', 'file', 'query', 'fragment']; var uriRegex = /^(?:([^:\/?#]+):)?(?:\/\/((?:(([^:@]*)(?::([^:@]*))?)?@)?([^:\/?#]*)(?::(\d*))?))?(((((?:[^?#\/]*\/)*)([^?#]*))(?:\?([^#]*))?)(?:#(.*))?)/; var uriByNumber = uriRegex.exec(input); var uri = {}; var i = 15; while (i--) { uri[keys[i]] = uriByNumber[i] || ''; } if (uri.port === null || uri.port === '') { uri.port = (uri.protocol.toLowerCase() === 'http' ? '80' : (uri.protocol.toLowerCase() === 'https' ? '443' : '')); } return uri; } }; // Based on: Crypto-JS v3.1.2 // Copyright (c) 2009-2013, Jeff Mott. All rights reserved. // http://code.google.com/p/crypto-js/ // http://code.google.com/p/crypto-js/wiki/License var CryptoJS=CryptoJS||function(h,r){var k={},l=k.lib={},n=function(){},f=l.Base={extend:function(a){n.prototype=this;var b=new n;a&&b.mixIn(a);b.hasOwnProperty("init")||(b.init=function(){b.$super.init.apply(this,arguments)});b.init.prototype=b;b.$super=this;return b},create:function(){var a=this.extend();a.init.apply(a,arguments);return a},init:function(){},mixIn:function(a){for(var b in a)a.hasOwnProperty(b)&&(this[b]=a[b]);a.hasOwnProperty("toString")&&(this.toString=a.toString)},clone:function(){return this.init.prototype.extend(this)}},j=l.WordArray=f.extend({init:function(a,b){a=this.words=a||[];this.sigBytes=b!=r?b:4*a.length},toString:function(a){return(a||s).stringify(this)},concat:function(a){var b=this.words,d=a.words,c=this.sigBytes;a=a.sigBytes;this.clamp();if(c%4)for(var e=0;e<a;e++)b[c+e>>>2]|=(d[e>>>2]>>>24-8*(e%4)&255)<<24-8*((c+e)%4);else if(65535<d.length)for(e=0;e<a;e+=4)b[c+e>>>2]=d[e>>>2];else b.push.apply(b,d);this.sigBytes+=a;return this},clamp:function(){var a=this.words,b=this.sigBytes;a[b>>>2]&=4294967295<<32-8*(b%4);a.length=h.ceil(b/4)},clone:function(){var a=f.clone.call(this);a.words=this.words.slice(0);return a},random:function(a){for(var b=[],d=0;d<a;d+=4)b.push(4294967296*h.random()|0);return new j.init(b,a)}}),m=k.enc={},s=m.Hex={stringify:function(a){var b=a.words;a=a.sigBytes;for(var d=[],c=0;c<a;c++){var e=b[c>>>2]>>>24-8*(c%4)&255;d.push((e>>>4).toString(16));d.push((e&15).toString(16))}return d.join("")},parse:function(a){for(var b=a.length,d=[],c=0;c<b;c+=2)d[c>>>3]|=parseInt(a.substr(c,2),16)<<24-4*(c%8);return new j.init(d,b/2)}},p=m.Latin1={stringify:function(a){var b=a.words;a=a.sigBytes;for(var d=[],c=0;c<a;c++)d.push(String.fromCharCode(b[c>>>2]>>>24-8*(c%4)&255));return d.join("")},parse:function(a){for(var b=a.length,d=[],c=0;c<b;c++)d[c>>>2]|=(a.charCodeAt(c)&255)<<24-8*(c%4);return new j.init(d,b)}},t=m.Utf8={stringify:function(a){try{return decodeURIComponent(escape(p.stringify(a)))}catch(b){throw Error("Malformed UTF-8 data");}},parse:function(a){return p.parse(unescape(encodeURIComponent(a)))}},q=l.BufferedBlockAlgorithm=f.extend({reset:function(){this._data=new j.init;this._nDataBytes=0},_append:function(a){"string"==typeof a&&(a=t.parse(a));this._data.concat(a);this._nDataBytes+=a.sigBytes},_process:function(a){var b=this._data,d=b.words,c=b.sigBytes,e=this.blockSize,f=c/(4*e),f=a?h.ceil(f):h.max((f|0)-this._minBufferSize,0);a=f*e;c=h.min(4*a,c);if(a){for(var g=0;g<a;g+=e)this._doProcessBlock(d,g);g=d.splice(0,a);b.sigBytes-=c}return new j.init(g,c)},clone:function(){var 
a=f.clone.call(this);a._data=this._data.clone();return a},_minBufferSize:0});l.Hasher=q.extend({cfg:f.extend(),init:function(a){this.cfg=this.cfg.extend(a);this.reset()},reset:function(){q.reset.call(this);this._doReset()},update:function(a){this._append(a);this._process();return this},finalize:function(a){a&&this._append(a);return this._doFinalize()},blockSize:16,_createHelper:function(a){return function(b,d){return(new a.init(d)).finalize(b)}},_createHmacHelper:function(a){return function(b,d){return(new u.HMAC.init(a,d)).finalize(b)}}});var u=k.algo={};return k}(Math); (function () { var k = CryptoJS, b = k.lib, m = b.WordArray, l = b.Hasher, d = [], b = k.algo.SHA1 = l.extend({ _doReset: function () { this._hash = new m.init([1732584193, 4023233417, 2562383102, 271733878, 3285377520]) }, _doProcessBlock: function (n, p) { for (var a = this._hash.words, e = a[0], f = a[1], h = a[2], j = a[3], b = a[4], c = 0; 80 > c; c++) { if (16 > c) d[c] = n[p + c] | 0; else { var g = d[c - 3] ^ d[c - 8] ^ d[c - 14] ^ d[c - 16]; d[c] = g << 1 | g >>> 31 } g = (e << 5 | e >>> 27) + b + d[c]; g = 20 > c ? g + ((f & h | ~f & j) + 1518500249) : 40 > c ? g + ((f ^ h ^ j) + 1859775393) : 60 > c ? g + ((f & h | f & j | h & j) - 1894007588) : g + ((f ^ h ^ j) - 899497514); b = j; j = h; h = f << 30 | f >>> 2; f = e; e = g } a[0] = a[0] + e | 0; a[1] = a[1] + f | 0; a[2] = a[2] + h | 0; a[3] = a[3] + j | 0; a[4] = a[4] + b | 0 }, _doFinalize: function () { var b = this._data, d = b.words, a = 8 * this._nDataBytes, e = 8 * b.sigBytes; d[e >>> 5] |= 128 << 24 - e % 32; d[(e + 64 >>> 9 << 4) + 14] = Math.floor(a / 4294967296); d[(e + 64 >>> 9 << 4) + 15] = a; b.sigBytes = 4 * d.length; this._process(); return this._hash }, clone: function () { var b = l.clone.call(this); b._hash = this._hash.clone(); return b } }); k.SHA1 = l._createHelper(b); k.HmacSHA1 = l._createHmacHelper(b) })(); (function (k) { for (var g = CryptoJS, h = g.lib, v = h.WordArray, j = h.Hasher, h = g.algo, s = [], t = [], u = function (q) { return 4294967296 * (q - (q | 0)) | 0 }, l = 2, b = 0; 64 > b;) { var d; a: { d = l; for (var w = k.sqrt(d), r = 2; r <= w; r++) if (!(d % r)) { d = !1; break a } d = !0 } d && (8 > b && (s[b] = u(k.pow(l, 0.5))), t[b] = u(k.pow(l, 1 / 3)), b++); l++ } var n = [], h = h.SHA256 = j.extend({ _doReset: function () { this._hash = new v.init(s.slice(0)) }, _doProcessBlock: function (q, h) { for (var a = this._hash.words, c = a[0], d = a[1], b = a[2], k = a[3], f = a[4], g = a[5], j = a[6], l = a[7], e = 0; 64 > e; e++) { if (16 > e) n[e] = q[h + e] | 0; else { var m = n[e - 15], p = n[e - 2]; n[e] = ((m << 25 | m >>> 7) ^ (m << 14 | m >>> 18) ^ m >>> 3) + n[e - 7] + ((p << 15 | p >>> 17) ^ (p << 13 | p >>> 19) ^ p >>> 10) + n[e - 16] } m = l + ((f << 26 | f >>> 6) ^ (f << 21 | f >>> 11) ^ (f << 7 | f >>> 25)) + (f & g ^ ~f & j) + t[e] + n[e]; p = ((c << 30 | c >>> 2) ^ (c << 19 | c >>> 13) ^ (c << 10 | c >>> 22)) + (c & d ^ c & b ^ d & b); l = j; j = g; g = f; f = k + m | 0; k = b; b = d; d = c; c = m + p | 0 } a[0] = a[0] + c | 0; a[1] = a[1] + d | 0; a[2] = a[2] + b | 0; a[3] = a[3] + k | 0; a[4] = a[4] + f | 0; a[5] = a[5] + g | 0; a[6] = a[6] + j | 0; a[7] = a[7] + l | 0 }, _doFinalize: function () { var d = this._data, b = d.words, a = 8 * this._nDataBytes, c = 8 * d.sigBytes; b[c >>> 5] |= 128 << 24 - c % 32; b[(c + 64 >>> 9 << 4) + 14] = k.floor(a / 4294967296); b[(c + 64 >>> 9 << 4) + 15] = a; d.sigBytes = 4 * b.length; this._process(); return this._hash }, clone: function () { var b = 
j.clone.call(this); b._hash = this._hash.clone(); return b } }); g.SHA256 = j._createHelper(h); g.HmacSHA256 = j._createHmacHelper(h) })(Math); (function(){var c=CryptoJS,k=c.enc.Utf8;c.algo.HMAC=c.lib.Base.extend({init:function(a,b){a=this._hasher=new a.init;"string"==typeof b&&(b=k.parse(b));var c=a.blockSize,e=4*c;b.sigBytes>e&&(b=a.finalize(b));b.clamp();for(var f=this._oKey=b.clone(),g=this._iKey=b.clone(),h=f.words,j=g.words,d=0;d<c;d++)h[d]^=1549556828,j[d]^=909522486;f.sigBytes=g.sigBytes=e;this.reset()},reset:function(){var a=this._hasher;a.reset();a.update(this._iKey)},update:function(a){this._hasher.update(a);return this},finalize:function(a){var b=this._hasher;a=b.finalize(a);b.reset();return b.finalize(this._oKey.clone().concat(a))}})})(); (function(){var h=CryptoJS,j=h.lib.WordArray;h.enc.Base64={stringify:function(b){var e=b.words,f=b.sigBytes,c=this._map;b.clamp();b=[];for(var a=0;a<f;a+=3)for(var d=(e[a>>>2]>>>24-8*(a%4)&255)<<16|(e[a+1>>>2]>>>24-8*((a+1)%4)&255)<<8|e[a+2>>>2]>>>24-8*((a+2)%4)&255,g=0;4>g&&a+0.75*g<f;g++)b.push(c.charAt(d>>>6*(3-g)&63));if(e=c.charAt(64))for(;b.length%4;)b.push(e);return b.join("")},parse:function(b){var e=b.length,f=this._map,c=f.charAt(64);c&&(c=b.indexOf(c),-1!=c&&(e=c));for(var c=[],a=0,d=0;d<e;d++)if(d%4){var g=f.indexOf(b.charAt(d-1))<<2*(d%4),h=f.indexOf(b.charAt(d))>>>6-2*(d%4);c[a>>>2]|=(g|h)<<24-8*(a%4);a++}return j.create(c,a)},_map:"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/="}})(); �����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/client.js����000755 �000766 �000024 �00000024260 12455173731 033047� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Url = require('url'); var Hoek = require('hoek'); var Cryptiles = require('cryptiles'); var Crypto = require('./crypto'); var Utils = require('./utils'); // Declare internals var internals = {}; // Generate an Authorization header for a given request /* uri: 'http://example.com/resource?a=b' or object from Url.parse() method: HTTP verb (e.g. 
'GET', 'POST') options: { // Required credentials: { id: 'dh37fgj492je', key: 'aoijedoaijsdlaksjdl', algorithm: 'sha256' // 'sha1', 'sha256' }, // Optional ext: 'application-specific', // Application specific data sent via the ext attribute timestamp: Date.now(), // A pre-calculated timestamp nonce: '2334f34f', // A pre-generated nonce localtimeOffsetMsec: 400, // Time offset to sync with server time (ignored if timestamp provided) payload: '{"some":"payload"}', // UTF-8 encoded string for body hash generation (ignored if hash provided) contentType: 'application/json', // Payload content-type (ignored if hash provided) hash: 'U4MKKSmiVxk37JCCrAVIjV=', // Pre-calculated payload hash app: '24s23423f34dx', // Oz application id dlg: '234sz34tww3sd' // Oz delegated-by application id } */ exports.header = function (uri, method, options) { var result = { field: '', artifacts: {} }; // Validate inputs if (!uri || (typeof uri !== 'string' && typeof uri !== 'object') || !method || typeof method !== 'string' || !options || typeof options !== 'object') { result.err = 'Invalid argument type'; return result; } // Application time var timestamp = options.timestamp || Math.floor((Utils.now() + (options.localtimeOffsetMsec || 0)) / 1000) // Validate credentials var credentials = options.credentials; if (!credentials || !credentials.id || !credentials.key || !credentials.algorithm) { result.err = 'Invalid credential object'; return result; } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { result.err = 'Unknown algorithm'; return result; } // Parse URI if (typeof uri === 'string') { uri = Url.parse(uri); } // Calculate signature var artifacts = { ts: timestamp, nonce: options.nonce || Cryptiles.randomString(6), method: method, resource: uri.pathname + (uri.search || ''), // Maintain trailing '?' host: uri.hostname, port: uri.port || (uri.protocol === 'http:' ? 80 : 443), hash: options.hash, ext: options.ext, app: options.app, dlg: options.dlg }; result.artifacts = artifacts; // Calculate payload hash if (!artifacts.hash && options.hasOwnProperty('payload')) { artifacts.hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType); } var mac = Crypto.calculateMac('header', credentials, artifacts); // Construct header var hasExt = artifacts.ext !== null && artifacts.ext !== undefined && artifacts.ext !== ''; // Other falsey values allowed var header = 'Hawk id="' + credentials.id + '", ts="' + artifacts.ts + '", nonce="' + artifacts.nonce + (artifacts.hash ? '", hash="' + artifacts.hash : '') + (hasExt ? '", ext="' + Utils.escapeHeaderAttribute(artifacts.ext) : '') + '", mac="' + mac + '"'; if (artifacts.app) { header += ', app="' + artifacts.app + (artifacts.dlg ? '", dlg="' + artifacts.dlg : '') + '"'; } result.field = header; return result; }; // Validate server response /* res: node's response object artifacts: object recieved from header().artifacts options: { payload: optional payload received required: specifies if a Server-Authorization header is required. 
Defaults to 'false' } */ exports.authenticate = function (res, credentials, artifacts, options) { artifacts = Hoek.clone(artifacts); options = options || {}; if (res.headers['www-authenticate']) { // Parse HTTP WWW-Authenticate header var attributes = Utils.parseAuthorizationHeader(res.headers['www-authenticate'], ['ts', 'tsm', 'error']); if (attributes instanceof Error) { return false; } // Validate server timestamp (not used to update clock since it is done via the SNPT client) if (attributes.ts) { var tsm = Crypto.calculateTsMac(attributes.ts, credentials); if (tsm !== attributes.tsm) { return false; } } } // Parse HTTP Server-Authorization header if (!res.headers['server-authorization'] && !options.required) { return true; } var attributes = Utils.parseAuthorizationHeader(res.headers['server-authorization'], ['mac', 'ext', 'hash']); if (attributes instanceof Error) { return false; } artifacts.ext = attributes.ext; artifacts.hash = attributes.hash; var mac = Crypto.calculateMac('response', credentials, artifacts); if (mac !== attributes.mac) { return false; } if (!options.hasOwnProperty('payload')) { return true; } if (!attributes.hash) { return false; } var calculatedHash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, res.headers['content-type']); return (calculatedHash === attributes.hash); }; // Generate a bewit value for a given URI /* * credentials is an object with the following keys: 'id, 'key', 'algorithm'. * options is an object with the following optional keys: 'ext', 'localtimeOffsetMsec' */ /* uri: 'http://example.com/resource?a=b' or object from Url.parse() options: { // Required credentials: { id: 'dh37fgj492je', key: 'aoijedoaijsdlaksjdl', algorithm: 'sha256' // 'sha1', 'sha256' }, ttlSec: 60 * 60, // TTL in seconds // Optional ext: 'application-specific', // Application specific data sent via the ext attribute localtimeOffsetMsec: 400 // Time offset to sync with server time }; */ exports.getBewit = function (uri, options) { // Validate inputs if (!uri || (typeof uri !== 'string' && typeof uri !== 'object') || !options || typeof options !== 'object' || !options.ttlSec) { return ''; } options.ext = (options.ext === null || options.ext === undefined ? '' : options.ext); // Zero is valid value // Application time var now = Utils.now() + (options.localtimeOffsetMsec || 0); // Validate credentials var credentials = options.credentials; if (!credentials || !credentials.id || !credentials.key || !credentials.algorithm) { return ''; } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return ''; } // Parse URI if (typeof uri === 'string') { uri = Url.parse(uri); } // Calculate signature var exp = Math.floor(now / 1000) + options.ttlSec; var mac = Crypto.calculateMac('bewit', credentials, { ts: exp, nonce: '', method: 'GET', resource: uri.pathname + (uri.search || ''), // Maintain trailing '?' host: uri.hostname, port: uri.port || (uri.protocol === 'http:' ? 
80 : 443), ext: options.ext }); // Construct bewit: id\exp\mac\ext var bewit = credentials.id + '\\' + exp + '\\' + mac + '\\' + options.ext; return Utils.base64urlEncode(bewit); }; // Generate an authorization string for a message /* host: 'example.com', port: 8000, message: '{"some":"payload"}', // UTF-8 encoded string for body hash generation options: { // Required credentials: { id: 'dh37fgj492je', key: 'aoijedoaijsdlaksjdl', algorithm: 'sha256' // 'sha1', 'sha256' }, // Optional timestamp: Date.now(), // A pre-calculated timestamp nonce: '2334f34f', // A pre-generated nonce localtimeOffsetMsec: 400, // Time offset to sync with server time (ignored if timestamp provided) } */ exports.message = function (host, port, message, options) { // Validate inputs if (!host || typeof host !== 'string' || !port || typeof port !== 'number' || message === null || message === undefined || typeof message !== 'string' || !options || typeof options !== 'object') { return null; } // Application time var timestamp = options.timestamp || Math.floor((Utils.now() + (options.localtimeOffsetMsec || 0)) / 1000) // Validate credentials var credentials = options.credentials; if (!credentials || !credentials.id || !credentials.key || !credentials.algorithm) { // Invalid credential object return null; } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return null; } // Calculate signature var artifacts = { ts: timestamp, nonce: options.nonce || Cryptiles.randomString(6), host: host, port: port, hash: Crypto.calculatePayloadHash(message, credentials.algorithm) }; // Construct authorization var result = { id: credentials.id, ts: artifacts.ts, nonce: artifacts.nonce, hash: artifacts.hash, mac: Crypto.calculateMac('message', credentials, artifacts) }; return result; }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/crypto.js����000755 �000766 �000024 �00000006325 12455173731 033113� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Crypto = require('crypto'); var Url = require('url'); var Utils = require('./utils'); // Declare internals var internals = {}; // MAC normalization format version exports.headerVersion = '1'; // Prevent comparison of mac values generated with different normalized string formats // Supported HMAC algorithms exports.algorithms = ['sha1', 'sha256']; // Calculate the request MAC /* type: 'header', // 'header', 'bewit', 'response' credentials: { key: 'aoijedoaijsdlaksjdl', algorithm: 'sha256' // 'sha1', 'sha256' }, options: { method: 'GET', resource: '/resource?a=1&b=2', host: 'example.com', port: 8080, ts: 1357718381034, nonce: 'd3d345f', hash: 'U4MKKSmiVxk37JCCrAVIjV/OhB3y+NdwoCr6RShbVkE=', ext: 'app-specific-data', app: 'hf48hd83qwkj', // Application id (Oz) dlg: 'd8djwekds9cj' // Delegated by application id (Oz), requires options.app } */ exports.calculateMac = function (type, credentials, 
options) { var normalized = exports.generateNormalizedString(type, options); var hmac = Crypto.createHmac(credentials.algorithm, credentials.key).update(normalized); var digest = hmac.digest('base64'); return digest; }; exports.generateNormalizedString = function (type, options) { var normalized = 'hawk.' + exports.headerVersion + '.' + type + '\n' + options.ts + '\n' + options.nonce + '\n' + (options.method || '').toUpperCase() + '\n' + (options.resource || '') + '\n' + options.host.toLowerCase() + '\n' + options.port + '\n' + (options.hash || '') + '\n'; if (options.ext) { normalized += options.ext.replace('\\', '\\\\').replace('\n', '\\n'); } normalized += '\n'; if (options.app) { normalized += options.app + '\n' + (options.dlg || '') + '\n'; } return normalized; }; exports.calculatePayloadHash = function (payload, algorithm, contentType) { var hash = exports.initializePayloadHash(algorithm, contentType); hash.update(payload || ''); return exports.finalizePayloadHash(hash); }; exports.initializePayloadHash = function (algorithm, contentType) { var hash = Crypto.createHash(algorithm); hash.update('hawk.' + exports.headerVersion + '.payload\n'); hash.update(Utils.parseContentType(contentType) + '\n'); return hash; }; exports.finalizePayloadHash = function (hash) { hash.update('\n'); return hash.digest('base64'); }; exports.calculateTsMac = function (ts, credentials) { var hmac = Crypto.createHmac(credentials.algorithm, credentials.key); hmac.update('hawk.' + exports.headerVersion + '.ts\n' + ts + '\n'); return hmac.digest('base64'); }; exports.timestampMessage = function (credentials, localtimeOffsetMsec) { var now = Math.floor((Utils.now() + (localtimeOffsetMsec || 0)) / 1000); var tsm = exports.calculateTsMac(now, credentials); return { ts: now, tsm: tsm }; }; �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/index.js�����000755 �000766 �000024 �00000000556 12455173731 032702� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Export sub-modules exports.error = exports.Error = require('boom'); exports.sntp = require('sntp'); exports.server = require('./server'); exports.client = require('./client'); exports.crypto = require('./crypto'); exports.utils = require('./utils'); exports.uri = { authenticate: exports.server.authenticateBewit, getBewit: exports.client.getBewit }; ��������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/server.js����000755 �000766 �000024 �00000042202 12455173731 033073� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
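// ---------------------------------------------------------------------------
// Illustrative usage sketch, not part of the distributed tarball: generating a
// client-side Authorization header with the header() and getBewit() functions
// from lib/client.js above. The URI, credentials, and TTL values are
// hypothetical placeholders; only the option names documented in the comments
// above are assumed.
// ---------------------------------------------------------------------------

var Hawk = require('hawk');

var credentials = {
    id: 'dh37fgj492je',                 // key identifier known to the server
    key: 'aoijedoaijsdlaksjdl',         // shared secret
    algorithm: 'sha256'                 // must be one of Crypto.algorithms
};

// header() returns { field, artifacts } on success or { err } on bad input.
var result = Hawk.client.header('http://example.com/resource?a=b', 'GET', {
    credentials: credentials,
    ext: 'some-app-data',
    payload: '{"some":"payload"}',      // hashed and bound into the MAC
    contentType: 'application/json'
});

if (!result.err) {
    console.log(result.field);          // value for the HTTP Authorization header
    console.log(result.artifacts);      // ts/nonce/hash kept to verify the response
}

// getBewit() produces a URL token granting time-limited GET access; append it
// to the query string as bewit=<token> before handing the URL out.
var bewit = Hawk.client.getBewit('http://example.com/resource?a=b', {
    credentials: credentials,
    ttlSec: 60
});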
������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Boom = require('boom'); var Hoek = require('hoek'); var Cryptiles = require('cryptiles'); var Crypto = require('./crypto'); var Utils = require('./utils'); // Declare internals var internals = {}; // Hawk authentication /* req: node's HTTP request object or an object as follows: var request = { method: 'GET', url: '/resource/4?a=1&b=2', host: 'example.com', port: 8080, authorization: 'Hawk id="dh37fgj492je", ts="1353832234", nonce="j4h3g2", ext="some-app-ext-data", mac="6R4rV5iE+NPoym+WwjeHzjAGXUtLNIxmo1vpMofpLAE="' }; credentialsFunc: required function to lookup the set of Hawk credentials based on the provided credentials id. The credentials include the MAC key, MAC algorithm, and other attributes (such as username) needed by the application. This function is the equivalent of verifying the username and password in Basic authentication. var credentialsFunc = function (id, callback) { // Lookup credentials in database db.lookup(id, function (err, item) { if (err || !item) { return callback(err); } var credentials = { // Required key: item.key, algorithm: item.algorithm, // Application specific user: item.user }; return callback(null, credentials); }); }; options: { hostHeaderName: optional header field name, used to override the default 'Host' header when used behind a cache of a proxy. Apache2 changes the value of the 'Host' header while preserving the original (which is what the module must verify) in the 'x-forwarded-host' header field. Only used when passed a node Http.ServerRequest object. nonceFunc: optional nonce validation function. The function signature is function(nonce, ts, callback) where 'callback' must be called using the signature function(err). timestampSkewSec: optional number of seconds of permitted clock skew for incoming timestamps. Defaults to 60 seconds. Provides a +/- skew which means actual allowed window is double the number of seconds. localtimeOffsetMsec: optional local clock time offset express in a number of milliseconds (positive or negative). Defaults to 0. payload: optional payload for validation. The client calculates the hash value and includes it via the 'hash' header attribute. The server always ensures the value provided has been included in the request MAC. When this option is provided, it validates the hash value itself. Validation is done by calculating a hash value over the entire payload (assuming it has already be normalized to the same format and encoding used by the client to calculate the hash on request). If the payload is not available at the time of authentication, the authenticatePayload() method can be used by passing it the credentials and attributes.hash returned in the authenticate callback. host: optional host name override. Only used when passed a node request object. port: optional port override. Only used when passed a node request object. 
} callback: function (err, credentials, artifacts) { } */ exports.authenticate = function (req, credentialsFunc, options, callback) { callback = Utils.nextTick(callback); // Default options options.nonceFunc = options.nonceFunc || function (nonce, ts, nonceCallback) { return nonceCallback(); }; // No validation options.timestampSkewSec = options.timestampSkewSec || 60; // 60 seconds // Application time var now = Utils.now() + (options.localtimeOffsetMsec || 0); // Measure now before any other processing // Convert node Http request object to a request configuration object var request = Utils.parseRequest(req, options); if (request instanceof Error) { return callback(Boom.badRequest(request.message)); } // Parse HTTP Authorization header var attributes = Utils.parseAuthorizationHeader(request.authorization); if (attributes instanceof Error) { return callback(attributes); } // Construct artifacts container var artifacts = { method: request.method, host: request.host, port: request.port, resource: request.url, ts: attributes.ts, nonce: attributes.nonce, hash: attributes.hash, ext: attributes.ext, app: attributes.app, dlg: attributes.dlg, mac: attributes.mac, id: attributes.id }; // Verify required header attributes if (!attributes.id || !attributes.ts || !attributes.nonce || !attributes.mac) { return callback(Boom.badRequest('Missing attributes'), null, artifacts); } // Fetch Hawk credentials credentialsFunc(attributes.id, function (err, credentials) { if (err) { return callback(err, credentials || null, artifacts); } if (!credentials) { return callback(Boom.unauthorized('Unknown credentials', 'Hawk'), null, artifacts); } if (!credentials.key || !credentials.algorithm) { return callback(Boom.internal('Invalid credentials'), credentials, artifacts); } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return callback(Boom.internal('Unknown algorithm'), credentials, artifacts); } // Calculate MAC var mac = Crypto.calculateMac('header', credentials, artifacts); if (!Cryptiles.fixedTimeComparison(mac, attributes.mac)) { return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials, artifacts); } // Check payload hash if (options.payload !== null && options.payload !== undefined) { // '' is valid if (!attributes.hash) { return callback(Boom.unauthorized('Missing required payload hash', 'Hawk'), credentials, artifacts); } var hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, request.contentType); if (!Cryptiles.fixedTimeComparison(hash, attributes.hash)) { return callback(Boom.unauthorized('Bad payload hash', 'Hawk'), credentials, artifacts); } } // Check nonce options.nonceFunc(attributes.nonce, attributes.ts, function (err) { if (err) { return callback(Boom.unauthorized('Invalid nonce', 'Hawk'), credentials, artifacts); } // Check timestamp staleness if (Math.abs((attributes.ts * 1000) - now) > (options.timestampSkewSec * 1000)) { var tsm = Crypto.timestampMessage(credentials, options.localtimeOffsetMsec); return callback(Boom.unauthorized('Stale timestamp', 'Hawk', tsm), credentials, artifacts); } // Successful authentication return callback(null, credentials, artifacts); }); }); }; // Authenticate payload hash - used when payload cannot be provided during authenticate() /* payload: raw request payload credentials: from authenticate callback artifacts: from authenticate callback contentType: req.headers['content-type'] */ exports.authenticatePayload = function (payload, credentials, artifacts, contentType) { var calculatedHash = 
Crypto.calculatePayloadHash(payload, credentials.algorithm, contentType); return Cryptiles.fixedTimeComparison(calculatedHash, artifacts.hash); }; // Generate a Server-Authorization header for a given response /* credentials: {}, // Object received from authenticate() artifacts: {} // Object received from authenticate(); 'mac', 'hash', and 'ext' - ignored options: { ext: 'application-specific', // Application specific data sent via the ext attribute payload: '{"some":"payload"}', // UTF-8 encoded string for body hash generation (ignored if hash provided) contentType: 'application/json', // Payload content-type (ignored if hash provided) hash: 'U4MKKSmiVxk37JCCrAVIjV=' // Pre-calculated payload hash } */ exports.header = function (credentials, artifacts, options) { // Prepare inputs options = options || {}; if (!artifacts || typeof artifacts !== 'object' || typeof options !== 'object') { return ''; } artifacts = Hoek.clone(artifacts); delete artifacts.mac; artifacts.hash = options.hash; artifacts.ext = options.ext; // Validate credentials if (!credentials || !credentials.key || !credentials.algorithm) { // Invalid credential object return ''; } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return ''; } // Calculate payload hash if (!artifacts.hash && options.hasOwnProperty('payload')) { artifacts.hash = Crypto.calculatePayloadHash(options.payload, credentials.algorithm, options.contentType); } var mac = Crypto.calculateMac('response', credentials, artifacts); // Construct header var header = 'Hawk mac="' + mac + '"' + (artifacts.hash ? ', hash="' + artifacts.hash + '"' : ''); if (artifacts.ext !== null && artifacts.ext !== undefined && artifacts.ext !== '') { // Other falsey values allowed header += ', ext="' + Utils.escapeHeaderAttribute(artifacts.ext) + '"'; } return header; }; /* * Arguments and options are the same as authenticate() with the exception that the only supported options are: * 'hostHeaderName', 'localtimeOffsetMsec', 'host', 'port' */ exports.authenticateBewit = function (req, credentialsFunc, options, callback) { callback = Utils.nextTick(callback); // Application time var now = Utils.now() + (options.localtimeOffsetMsec || 0); // Convert node Http request object to a request configuration object var request = Utils.parseRequest(req, options); if (request instanceof Error) { return callback(Boom.badRequest(request.message)); } // Extract bewit // 1 2 3 4 var resource = request.url.match(/^(\/.*)([\?&])bewit\=([^&$]*)(?:&(.+))?$/); if (!resource) { return callback(Boom.unauthorized(null, 'Hawk')); } // Bewit not empty if (!resource[3]) { return callback(Boom.unauthorized('Empty bewit', 'Hawk')); } // Verify method is GET if (request.method !== 'GET' && request.method !== 'HEAD') { return callback(Boom.unauthorized('Invalid method', 'Hawk')); } // No other authentication if (request.authorization) { return callback(Boom.badRequest('Multiple authentications', 'Hawk')); } // Parse bewit var bewitString = Utils.base64urlDecode(resource[3]); if (bewitString instanceof Error) { return callback(Boom.badRequest('Invalid bewit encoding')); } // Bewit format: id\exp\mac\ext ('\' is used because it is a reserved header attribute character) var bewitParts = bewitString.split('\\'); if (!bewitParts || bewitParts.length !== 4) { return callback(Boom.badRequest('Invalid bewit structure')); } var bewit = { id: bewitParts[0], exp: parseInt(bewitParts[1], 10), mac: bewitParts[2], ext: bewitParts[3] || '' }; if (!bewit.id || !bewit.exp || !bewit.mac) { return 
callback(Boom.badRequest('Missing bewit attributes')); } // Construct URL without bewit var url = resource[1]; if (resource[4]) { url += resource[2] + resource[4]; } // Check expiration if (bewit.exp * 1000 <= now) { return callback(Boom.unauthorized('Access expired', 'Hawk'), null, bewit); } // Fetch Hawk credentials credentialsFunc(bewit.id, function (err, credentials) { if (err) { return callback(err, credentials || null, bewit.ext); } if (!credentials) { return callback(Boom.unauthorized('Unknown credentials', 'Hawk'), null, bewit); } if (!credentials.key || !credentials.algorithm) { return callback(Boom.internal('Invalid credentials'), credentials, bewit); } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return callback(Boom.internal('Unknown algorithm'), credentials, bewit); } // Calculate MAC var mac = Crypto.calculateMac('bewit', credentials, { ts: bewit.exp, nonce: '', method: 'GET', resource: url, host: request.host, port: request.port, ext: bewit.ext }); if (!Cryptiles.fixedTimeComparison(mac, bewit.mac)) { return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials, bewit); } // Successful authentication return callback(null, credentials, bewit); }); }; /* * options are the same as authenticate() with the exception that the only supported options are: * 'nonceFunc', 'timestampSkewSec', 'localtimeOffsetMsec' */ exports.authenticateMessage = function (host, port, message, authorization, credentialsFunc, options, callback) { callback = Utils.nextTick(callback); // Default options options.nonceFunc = options.nonceFunc || function (nonce, ts, nonceCallback) { return nonceCallback(); }; // No validation options.timestampSkewSec = options.timestampSkewSec || 60; // 60 seconds // Application time var now = Utils.now() + (options.localtimeOffsetMsec || 0); // Measure now before any other processing // Validate authorization if (!authorization.id || !authorization.ts || !authorization.nonce || !authorization.hash || !authorization.mac) { return callback(Boom.badRequest('Invalid authorization')) } // Fetch Hawk credentials credentialsFunc(authorization.id, function (err, credentials) { if (err) { return callback(err, credentials || null); } if (!credentials) { return callback(Boom.unauthorized('Unknown credentials', 'Hawk')); } if (!credentials.key || !credentials.algorithm) { return callback(Boom.internal('Invalid credentials'), credentials); } if (Crypto.algorithms.indexOf(credentials.algorithm) === -1) { return callback(Boom.internal('Unknown algorithm'), credentials); } // Construct artifacts container var artifacts = { ts: authorization.ts, nonce: authorization.nonce, host: host, port: port, hash: authorization.hash }; // Calculate MAC var mac = Crypto.calculateMac('message', credentials, artifacts); if (!Cryptiles.fixedTimeComparison(mac, authorization.mac)) { return callback(Boom.unauthorized('Bad mac', 'Hawk'), credentials); } // Check payload hash var hash = Crypto.calculatePayloadHash(message, credentials.algorithm); if (!Cryptiles.fixedTimeComparison(hash, authorization.hash)) { return callback(Boom.unauthorized('Bad message hash', 'Hawk'), credentials); } // Check nonce options.nonceFunc(authorization.nonce, authorization.ts, function (err) { if (err) { return callback(Boom.unauthorized('Invalid nonce', 'Hawk'), credentials); } // Check timestamp staleness if (Math.abs((authorization.ts * 1000) - now) > (options.timestampSkewSec * 1000)) { return callback(Boom.unauthorized('Stale timestamp'), credentials); } // Successful authentication return 
callback(null, credentials); }); }); }; ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/lib/utils.js�����000755 �000766 �000024 �00000007713 12455173731 032735� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Load modules var Hoek = require('hoek'); var Sntp = require('sntp'); var Boom = require('boom'); // Declare internals var internals = {}; // Import Hoek Utilities internals.import = function () { for (var i in Hoek) { if (Hoek.hasOwnProperty(i)) { exports[i] = Hoek[i]; } } }; internals.import(); // Hawk version exports.version = function () { return exports.loadPackage(__dirname + '/..').version; }; // Extract host and port from request exports.parseHost = function (req, hostHeaderName) { hostHeaderName = (hostHeaderName ? hostHeaderName.toLowerCase() : 'host'); var hostHeader = req.headers[hostHeaderName]; if (!hostHeader) { return null; } var hostHeaderRegex; if (hostHeader[0] === '[') { hostHeaderRegex = /^(?:(?:\r\n)?\s)*(\[[^\]]+\])(?::(\d+))?(?:(?:\r\n)?\s)*$/; // IPv6 } else { hostHeaderRegex = /^(?:(?:\r\n)?\s)*([^:]+)(?::(\d+))?(?:(?:\r\n)?\s)*$/; // IPv4, hostname } var hostParts = hostHeader.match(hostHeaderRegex); if (!hostParts || hostParts.length !== 3 || !hostParts[1]) { return null; } return { name: hostParts[1], port: (hostParts[2] ? hostParts[2] : (req.connection && req.connection.encrypted ? 
443 : 80)) }; }; // Parse Content-Type header content exports.parseContentType = function (header) { if (!header) { return ''; } return header.split(';')[0].trim().toLowerCase(); }; // Convert node's to request configuration object exports.parseRequest = function (req, options) { if (!req.headers) { return req; } // Obtain host and port information if (!options.host || !options.port) { var host = exports.parseHost(req, options.hostHeaderName); if (!host) { return new Error('Invalid Host header'); } } var request = { method: req.method, url: req.url, host: options.host || host.name, port: options.port || host.port, authorization: req.headers.authorization, contentType: req.headers['content-type'] || '' }; return request; }; exports.now = function () { return Sntp.now(); }; // Parse Hawk HTTP Authorization header exports.parseAuthorizationHeader = function (header, keys) { keys = keys || ['id', 'ts', 'nonce', 'hash', 'ext', 'mac', 'app', 'dlg']; if (!header) { return Boom.unauthorized(null, 'Hawk'); } var headerParts = header.match(/^(\w+)(?:\s+(.*))?$/); // Header: scheme[ something] if (!headerParts) { return Boom.badRequest('Invalid header syntax'); } var scheme = headerParts[1]; if (scheme.toLowerCase() !== 'hawk') { return Boom.unauthorized(null, 'Hawk'); } var attributesString = headerParts[2]; if (!attributesString) { return Boom.badRequest('Invalid header syntax'); } var attributes = {}; var errorMessage = ''; var verify = attributesString.replace(/(\w+)="([^"\\]*)"\s*(?:,\s*|$)/g, function ($0, $1, $2) { // Check valid attribute names if (keys.indexOf($1) === -1) { errorMessage = 'Unknown attribute: ' + $1; return; } // Allowed attribute value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9 if ($2.match(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~]+$/) === null) { errorMessage = 'Bad attribute value: ' + $1; return; } // Check for duplicates if (attributes.hasOwnProperty($1)) { errorMessage = 'Duplicate attribute: ' + $1; return; } attributes[$1] = $2; return ''; }); if (verify !== '') { return Boom.badRequest(errorMessage || 'Bad header format'); } return attributes; }; exports.unauthorized = function (message) { return Boom.unauthorized(message, 'Hawk'); }; �����������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/hawk.png��000755 �000766 �000024 �00000015441 12455173731 033373� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������PNG  ��� IHDR��)���g���\*��� pHYs��\F��\FCA�� OiCCPPhotoshop ICC profile��xڝSgTS=BKKoR RB&*! 
[binary PNG data for images/hawk.png omitted]
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/images/logo.png
[binary PNG data for images/logo.png omitted]
|մRieMKz%luM-K'O]DžGQQ!syđ<Sgw%�RZ|CX@~ZZ_Qt%!S</z[Uai] ⩅TICB^fXPi9ÖNb-&( {IgIV-F(YvviM׌a<}ohj3\AQ=חlt4tk|$u-i*>xD~x&Fy;r ' \vLnK)e),,w-O=$+{/<cg>C?�ͥ}5ҶGimYyٻgϫ0!c~-%A7{|?kx]|ĩliaؒsUrH2⌚^|E5H$ܸ!~%^ad@7lfT/6vpkCKM 'E| t2CzI+.î8ݳ5b@ôI|,\Wo[1#BpH]=oRGЙYbѤBںx'))/ȅGQP8yr:1*I�͏L*pB]19eB"m| t'uKᱭL- Q_!1Л0xj{'u!B^ + DeX2 ֯Kk,!hݴ(zX<^vve=)2Mcky[n C⸢3*BY<7QBi`fy/{ -)>m@Жҗy7=;늸[6m&~+/:B&'Ceui/ =$2栕RU llN(6[+Keݾ>8&[; mVr&Uћ6MTEq);vv-N4s-s7[<\SYܱ׶PLcn`{TRV(wo#AѴ߭.@Anpbc ˦#˦R7<-Oʔ|?կ8cHkkag䲟 !edl~ X%y~NQJ,m %U( {<gWpĤ|bxkK0N8ķlnJm"RBupOzmpR%ytK >uvɚ=}\*ѴI 7v_E%3-ifR?k#MAUĈqueg\zz=/皨ץgjgNa]s/Old>Tpǜy iQ_G/b3r&cov�,i$T0=-,)䙥DiBc }ukH.ޣ)4xnWe!4M`X+G p)ʰ&,Xf crǚ{Ҹ̴q /v,= m1p ث wC\n"wǏG6*}#gIȘNiIie%̙5޶V>s'o'}ylڰ;o_ .zE,:h>Oش{O@(sv16=[ Z6ɖZcr- z5zS:߸e5>x"񌉦6hI4A&9.)wOT7hk[Kx\Xeࡍ ْGVhKlt22S;;Y  Ű,.})Ƙ^/œ|)!hE@{\">yd~Ng~u>ͽB$u>~l /{ x<O-uʷv2\QƮn̙GTӝ x5Θ_۰G}G}Ө,.�)If4CWO+v_UO(`t g`0y&ӼD &[x4{Є`nׇ5~2,IƔ#I$.EsNfNJd,<݃'pSR* 9nJ1Wo'Qt9Uo3K'3B_u=m aku3-KkY6/޼}) azYR3cr5_k<'JgoK#t2ݼj>d2sʂ4S?S+gڵ]==ϩpO(WTE'ܗCa*p 0ةE݃t$F<^h⼹yKx4xdgG�y>`vimZi =U[vv -4[Sdt&uqBFcg y-)3MR KM3wA`gmltƲ鹳0-ܾ*pX<TI|0,a&ru�P yF<tXrm"(iiQ_S'fP.-ED&]9�@RZ( xX1k )6/JMa%IJ}>$ux]3{v4]IMC{ KNZ~<#MQK%MT@(2LHe, zA|uڢ{R"eHeL<.M+II؃K; tuC~q,V6usK͔FT"+3nqJ^2%)J%عk|hғ H,e2,17XmLo/g[RPaJIaɕY~X5[ iNp!b]TdZI~fU2jN"u܊<)10Û0,K pɦ8\x~&/^bi0˦p}<<v#aNYƕn'aP:1!7iRи9<+FaЃgꖓ`Z89ģ=O`#knܚ*=ttPa_w])4lIT2|%8!Yrp=Ze>Msc)s֖-m1fF^"i VҺCڨ.QMZUZ:}8jTib3|=ۯkԗ"y<=]I[C" e x1]@[o2k8S09cv9߿km4!$PX 'Nݛ,x]#Roif=!·Vq&]Jyoؒ2�?"O=P.wX՚fGkg*>9rue [HRJ[RZP@q!@]63flBd&ٽ33g7cy4GR3ӓIY}vo=$C]Cqqk(6p6{tpf / @1$kb|yڶTW]iTIo[R;�I$r ;m%!L)ؽYvkg<ƌ)HQa)s65H9xM/2x7P7~~~X^_5ò!c;::f /'O-fQue{+]NAڲ*'Zǂ%9ܳXYū6l<I@ ئg {S� 8R^l.!g4F\qXC'ՃBZbXb\yȍK8rnփ됶>W:7_6;Rn"Pry^T?XKADP<:5�VPXXGÂa&,a<MѽR6Z 1pcxi%AS'7j"cО 7iCCdǟx .uQ|s@6%"{ooK;qT 8єڸ흵>q9DR)psooh .飢 7cmK\AEt/vL(<$щvIK,%rYu-,oo2el1󿹵 5pwRWm{n�!L͂H*ޣ(YXUH$>S뻅L 9FSUx2`*{d6B)|} :�'7PR6m kdmn8Tk$6n/uL}ZEA'.!̼;8TVl- Rr5 cGIZ[\C$-4:9zx1 V DSU*!7Hfu8D~N{<ӧ|JAaM,pC 6is|a,k<χGkP@YYؔҹtwʉ'`!YйzJB^.:ֵ̂,t߭tFh -_LQ~!k5yu >W'L̥ZVskrJB>>vH) _\n |ؐ1khx MʵJg�Oc H9ȪU\�}Dީ!%Ml/gQM/<G##[m=dH.I[{Tm[0mK)LU;I7 &v Lkh<zeᴵ!wdlW, k c xpjur) &d;feݟTII�64wnawjQRRӞ]pI\U-B \'NH[SSrNW5PL:y~7#r+(^vE ॹxu !K[ gַ�D6䕔뮥V3 =MѤ`@׷5Le\:[Q+7H=. -g/GAo5hB04H*c:R蚠ai}l= ^]\s; t0lO:Iۊe~W^z3?߾h<34G\7Yq/o99TU #Ɍ띾pz%-_]ܧ/;44s.w į]):JB^k<\s+Ե4 lQ!ZjYح^Orsjē),amc`~Qq(F%)/~o^|zwR0hRj|dlմ&="?>(5$S/b6F) yHm\r9b_u)F"c3aA|_JTk :WPw/y,~,^]\æ:YU̥lN{tFo2~4qn0r0ֶd,{0 )\o�)6s=/9)ضCeAt<UkȐA)yzqzy%9<k)siՙ>W_|eqBzn!y~?7޵$?@g$ś^M$3-4"[ȶv\!0K}wu )'l¶$Ki\/y^6ٓ +;f"Ae~Inק@MfXC]=p0u<e9nvbUM<ϭf~JB<ozE€]-햿v?qE4S&{a5w}[~yMs||Eٕ̱rǼ^C*+�/G)rdsO_J7W'1Jiͣ'qNK$+0 ko8G+ x`&tw CYf>|k1Ƃm>Bp<=^T +_ ʢ:Җcy|]A쵿-.o�0axnNn|ve40"נ)uQ,ᑇ&Mp1FyL;f,%HRUjbRxXSbŲL!qT)Ʌx4L:"!v|pueTxweo#<3흝eGͲrAeモ#:#::.]N8Ej:qa;nYq1�j"! hfU$ٓ+܀ |\:eA#i{"Mlh`pՕ?ds&Wۗ05Ξ<DWwsÞ(ǃA[G�yWpG1dd% ! +|}\HP/ZicxkˬAhO3[qW;o˯ʡg]»9ܻ$=KMw‹.w#UA!"IvKk]c|4m)א5?( z26߸o.ǜ7STӿr)K>$ȼZ<)*|MNRYЅ`PAm|{)7oqNۓ.EJݚ�O> = s朳NscXswiǝZ<eqPU> %Kfɦ0y]m1t>?O)ń/nj(',-YsHr'` Z[]) ?!'uMZ%5a4]g[_\R׾sWOw@@f26+p҉'Gd,\ŋ2g[W7qarwfrN<^M iZRz 3He,q,JFOM\?w۶$J|(~ӟqUW31FQ[2oh (P99| }ྭmE}}C47eZA"m34;o'xexL,=ҺC$ ǣ̘~(Qnlh,׋B@$ 7WC˘zaZ.?l0#KrΣ X\IyU'q< meR5| w%ZtM e Täۖyc& J1ګ{464ّ. 2lHCo7P"n~8|:{kHf|sY $gI|{o0eW&! 
�宻7 p;k\\,Zofcsx2Mmkѳ˹ʫyW).)>DMtXZ$>ʄ^{ͦM/_RN0�u4Sxml^ cZiUȃQI>% R|'2,kH3n-c"PW5FSc.=Π Ve<hD.DY<G}9GՐOۖzI}Xev N.'$kmn&YoguF8aop"Mvҗ⸣bnmYkZ(q N<Oڲ?`}T& {+s lg'v x( ?X|)^.*A17_m]UgsɽG17Y¿ ߿=^MQ4uUV]%aַ( 8c=z^eFaE4S<5S:PX 1*/ɧ¿﹗pݓA@�uVׇis3,Gu<Oϫk\v`ix}oqz{%h 'mr2ya�񅼸\XѠGC4%x|[_կW֠:VpN1nw+5ֶmhBM78C~ X͂TDa<n[ޢ%4'g !,a,]8멫+͒qr^~Unn~yO~AV,YW4_ƚ(!*=,썄<:QQdj7V\wߥqqIe 3ۈ|ҵ7h?\+y y}?~r15m z15%~Sɏcx\p?1MikPTMVqKXe@o)\XH*ϻeΣ=)')-w7zwHC<̮qjHvtI'rG(|iW?BQH29aɌ0axc)OqBXKR?eACx#;*O<2ίa`%Ar&BtM`hF ǧ79+xG5zV^M,0epK:[^]I6"i>ӦnY6m G1=c>ɓ&KUqCyw} ;P@JI}Gח5Gx o=\ɌCUq./-܈NݭkVs 0tɄ!,dCs<y~Cx hW4Qg|+8ui,]7-鹛0 Ing.ugչu7?fN>^8'|dbj#΋k :c+yuRK.G[顱7z w״`HIE7u CRkC={a!ow7f;jD_xDSRH�(="(r+6W'egBCC:)e__cTiW^~9xUrPԇ#KC )q4)9ky:GuhV4Mx&!m[RIVִWW,#~ZonEjȲbTS0xnaK\0t.>[RSwp!CIԇYή|wֵ-FU / ґ`;9#<|>tʳ!$;fϞ͏~Cá#K-REAdLiMqV5GՍQV6Dp};O/c][%!jrQGRV^%\B3̬yM#+:)3[VoqM}r_whu2xEy) z,!F["kXRƐF14I*c&ABQz?~9m?͹PB1[ldѦvbB7u.lg?zFL{﹧xN�=�tSyRΑ}|w ?9aG! !n֯*|n@* |X==S68+ZoΜɽo̤ åjuGCoӗx˯rrl>k_?z( Ŷ1t c)5t$3*T/-Zܼ>Y,Cs|SAvXmD2$-3q`.Sj$RvmkHRi_>a 2b8,[GE,3\ `rbN[Ÿ[H$8+lq.Z}?}�{\XSVs=G8Pkj:mQ:diKWW0y \O> \vhM,ghIHvk0JpҠY`(g[;f3JӜ<i0jֶE(D}mugCp]'ОU6~?pD"wY0KAM[mcGU&�*7]opTbTbC`�3N[Ÿ@ qtɼxŗ84+mu|#9:=^9QC(h_Cqz:fϚG1bh )-gر[lu_WGkRuA/_>z>q}j>Sґp;XM> XMM卷B/??}4AV=pk;h'R6iKm=%mWW[>=(3f}Ws>pR-܂MKY8Y/iq5k IR<cywlq7qPpN9xN=d9ܝ(c'L(}EA5<$¿﹋P_q%M©**i[B^^5w&Y< n't^/F<msY+߾uoQqP(tMr`r2'릊};nьǒRht .c,'ow>\WXzE)ЌkmEII9򩤲džB~-&Mܧ~s_jKės,=&G,b9,Q^_ĦGn{lRx́k=S{k+c'L07ND{,]Rt(,G7?`RO/ Η|5_7$c[<g7QW$R69]�IǐYdlTra=oÚKgJYMͦj6a< >|X,ƄpD= y|jx<Jʸ k�8�N8|?9>#[p]+?|m^~\ÏOOˠ"?$|XҖ47\-կ0?k] :9 YowcxI60rC|8g.:_}Gm3xe([% f[w҇͡&;r_AM Q 7}"GօEK,B?)Ve'8f[M3nGbz<{ԯ_?(C5mgdkMfxfzޯqSU5C[k+6TaZy dBlOMO/SN+fƍq k}iIUCYl ^V.s>SYȡC15|)᣹~{=\u|iF*{BygCSN[gc9&R[S)c$�BMJ?1= ˵I9kSO8~9fU!{g81R{ @]GggZU({g-^J9n~T_WLjIGX{\Q 1XݵM9 ʃIK8tX!9:l[1NdA5?яo晧A s-1BMg{v<s8̛ »fdтu }GT?^zEN=4t & .$5%McFXP4>U9hAtvW@qlPΆ[-2(<J)Tzg<o[M2+!۲ WOm,APRM�T\2>QcVE:8mt Ԯ؎B 3n2;MeHliq04I x4RC[4ƶ'.eӖA)ݘ͞Ld8tx1jZ~ԩ8sƙ3g.8|T + -vrfECp&qWK$ qpL$X|9/ipYg1xȐ]i[+Du3g:4E9<j~pݵ-nssIYy9Ægy|E;گ@Y;ތw4>Nvtz)f6>!P�Jc) :q][GS]vo/vQbJwm:Q вa>'{Fs']'IVq3s,'I){frv2\xp_^]O>ig'Q9::{idY¥ILxt)yorE7^IS2i 2@CuZRT UG7PSS?q]1w!gCfb=[O+oUX"A,KGa~u/�ѤښBCz+}͝c0ߺS). knCNUrZ\F2coB;J=uX-=mkiAEϠτ<1}ely.!k/zSB9a|u6z[<k2w\Ɨ-G1FSE[,+rum|eIr3pcil։9fxw.ﻐO+m!է(OYbkztcK3׭_c%}>owW._Gˋ/>+;0hjٮ3C`|E.>t) tF20ߟH{ 8u}.m;26]2(Ȅy 1$m%>%AƔ0tLց"9>3J[&~& {462Sd�ہxǕsB[Ч붾uLST A"\t NSÏ<igEֶ̮v�G2h[n.Uoބرl:=Ocn혈_R{>lMxvH15 qNy-^¡Jnn6]~<0}2֭YՇ Qń1Waly.>Sòr_CsKx bceUh10ǰ y>[98V GMTn,Kcez�CУ3cTc xW)**dر;f6L[n/pδmXY@2ǞG:G}Tlw}^zFR4{Lvn' //}7ޘg*_o!fI!^]f0m y~jKWN,z)2 -p|&f,<u*&nَQQu%mPnErpl$!s'bz<#4ɛ9nkf`Prѡ^=a$5kȊ-'*D iRݭ>u>Ct7(ǃpOZP% d,%=+.C~>׻};ig's;>JPo4\фEkUZ{ﻟ+/l77MΙ8Ǖ(E$ez=, Y60<8p˛~qSgPOS0ko!Ssc%l ds@3�@B椕sB&欳w] &k֬'/+sп`tJ!ЅPUs|Qx[oޘ98n�4␗Jj08Ajz]LC0!gtϻ0u Q0ǀ\?^fֈwfv9^+ .I,iٔy1 7/O<rl^ai߰a-޻7-?SpQ#*t8m5 QD$<JEEӦM#px7xXa}s>YiU1*+xuۜp1~-;k_2OR>TOLlܛ/n%v>;C^ʶZ}W^ŝ xqaѢſIyI`׸Y 0�qL(uC#+te)EvI5)VU}!<5fC[dzZц&:͑o/gvڳ&GfA--47S_WQC 9m�r k;98 M7e,lNAgǬWq<2oR\8C5djBoe (A/(]}_ 9)?,NIu[~^l+mDO0ό3#wOvwpM7QW_RWyP�7/2ԁ?vL1NZrb|E.>wԎvy xbam'iRMZ4t&p]3:) i0qh34Hrlzg8*[3E . z [k+җ_Ov^JViCFqp/w\5s̵,$48lh>u Jx4ɬ5ml숡K-L0W.bs~M=AW^ᄓOF+>PP:�T]j VylJO= 4rG׬Zō? ?&"/*yOT^{! 娼ęC+떕5) RvN:NX(-v�bDnAcs^1#K8jD1G.ere.X&qّl*瘴2Yw& E~m^xƌCBҭe}+GSRrRL)YfYCV7GYcyC%ua:N]ʍere.ycFSϼ“Ͼ 4{ ~}.r:_P;R]P7Uྐ�wSq\ܧ``Yi)t2FVλik4~9y+>�US@±>zXԋ#XSvQ*a;]YY{uc8CұBTɂhLX5i%Aܾ o?6;D╜p l??82_>k\NmGlв@N␇sq#A9W'1x}y{y 51ɵ]E]#<ʟ2네q#3]e@USZ٨z~ VwR?:w�~b7�0Ѹ\މ#D*3v4) yu(.nIlaECh&*unOA6mΠ1r) xY[Ե'ܪk 7ixfn5s\pL?hT Dgg-cY. 
q( DZޅ5A!1 tQ,CK$_KZ)-*$ʌa ;V{+ݪ�k DR�[9BiFYeLl `/3WSƖ <krygwynuFQ\I2Á:�SXntAP$۩r"(ɡ$!m9Iɚ LMjrZ |%a@f;AytD}!//ɭ8>XRN4chy>Csm.b]k~h�?{�ث�#8?IJ{t>GG ԹѥDHcSGzw9 3he9e{N^ G)ܰ!C$6TXx"HӞ_um -3F5ϧ 閶N,e7ZVԾE?�Xq{gm;mS'HօX2z qr|Lg; _vVu/ RIZݞޜ;j] EA#JlT/lBQ @D<:9 . 2"|/okh1d<T6XW"c78i\BG(gG׀6[7Pd q�a)nOrBF7vN6Ӻ6R3(x04-YRMX"T<Ȓ*|$xʮz]v> ̸ )v,a-d7[TiYm\1&ɄGqBUwLY)Z~�[�-8nWvƭIKI[٨EfLr}&\Bl/;de^FxP6'x tC,ew+:Zct9ql)R^쵃DNDaLcnu{EK"}1]�{�I!2RZmGC1, \A][У ztB>("߇m B,6YBH�SgcK$N(CD6mJM / rؐB:ȹi !�p@B.L8lK5ȭ>;�(37QV+ L &|}@vhCw,fQ=SJ M]{Ɣd;)єEIK( zY\9%ՑR�}{,4l|n€\FI{\@WRi{{ں35  .L )eګI&!f%!/|q!If>.a<q=xH+3-vAF)iJN�p?@ )ƔbݠַKԤgj|YPP8CIKie!/-'o0uP'[~5Xѥ8,i?::b`}uUxF?~L 2 UCSe&7LCٗ&L)AΜ |'he9|D!K,hK/Ґ+�P|pk=}H!6Yä<I�?AU+\״)ƉJޕOیet3ϧ9�HcZ'p洛3ceCtË(MZ{4gm~S#xva=G.&gPAp Fa;JqbZ9y6[T~� ̖DՍQQ\bʲAp?l[Qe M/,! %8jd1C><Μ 1W Q}MX?� :rƎVG̓Ǖ5$Ɍ ,>. yiM0{];gL*wk2[!GO;LUT6v/K+H~�}��iBlЄ;GG*&vtȲ]ky`G>Đ"?c&0%>~t)!!m8>B.~�+�_4�NH!8 M<<H] lG10ϒN>Ιtg쯒iC6IXRK]$lդO?�!�~̩;yCIcW"d@u= L?ys{t Ǐ)9lQxN Q}{|�(k2|6؇e( MYȳ5GR+IelUux4e79vT 97ǶҗZJ-SXo_ݐ�D[ ^NgN|o]k䁹 .M/-c+: .QE=:rbʠ|NW`.ots£ A��wS>�HَT Q̀<ӏEK;9<1 1LdtF!\I6z U´2R\F9S4D`�g� b v[2}h1 ")?gx?�RMb)sʸR<F>W !HY6Iˡ(Q#68Eӓ:R3J2Qw o@3AYh^ugڠnIpwŸ\SCq8at)D�@)  3ǔ3uͱW t:�?�CWoѦolMetYM d(s<8 ] cK Rrbs$EE6c$Trn~�[7a7|=ꖲ*IW?7,je6˷0#Hflb)Ґ#qА|qxm[쬴r5)M R�n[#WW4 եµ;c'ҖCa.p#T0HeD&e9T8y|C FӃ7秕sT" H�f<2�wj|�^=d+goh=$9vT1$2 TKfl&㵥hp޴ iOKx&m;LDZ#KT'ks9@քQ��jؘc'-UR9&�?ڥL!^TºqF#%$3'.ڶ"o<Q5S*^ԅ- !pK&Uq˜JC^6E2JLmT=JuM�1MFc/ MZ<g3$'x/�P*Lα=Bmu%!z0[~�I]' ! m;?Nؑ%xr-TwIr<:_\NG"Mr>cV &MՑx̃bZE@գJmwv#2p1pq(o| /)Sp𿧞VAZqTK;!;\=av& eh+y8=Yvy~A~:?0p2}Dy~GsbRnOK8(<mBK@F壯97?(-*:K/? /|@CI翏=&RQ)VH�G"Q[ʺ4I9al1M4iF~gRL}l!KjCX(Ir})fMswV0seZc�:x+sR@R*յ^$@^` QGUW_͙gEAAA}Zp!:ev`g%O�,�u*j[BGrAXJNd[O@OY{7ߟ?p Q 4!xt5fΆv>ƪIZi@ā@�Ij`G9bر{q}Yۗwz?Ƕ^Jv�OrVf AmrqṟWV~wNu;<qBX"i9bmo4) )FWp̨"NWF]g31"+ v&҄W e0H¢1dusMmm>C-5&rr <\~G!c[v]0 0 YOl�؟- M@veԋ/^Y54TVM~ ܛ%!7Ǻ(a ?ss޵ M`겻=l.r*8i`;1$y>L92ﮝ?ޜIfrp qm'q&�% JZO  @ ^/*s//'4'H? Фe>vY]@JArRӵ~KͲC㐡<&:NSJ3똭$JЫhRC,m8XCrphgHRg`\IQ.c2aƎ㹷,R<7c"R 'm떤rAX^R2裎r)m7՞+BOާ֧c)XmjM[̴AMB^mӸ! m~spn~yǎ,aB;hǔ&.pê Z$ϧa;X&Q , =U5qP߃ٽJ{RJ‘0MMt0fEld`)Sos'QZVs|O�PΧsъbsPRb(9-y5.ؑ%L9*y~<;Ln=5t$9oj//m.Fߤ#ԧ.jҝu1h|k-CLË7 pל(28f)2@qI1�±m,&X4J*{ﺛ7~ cgj:'r2\r)!8Pzٳ%rt{6@H޴_H6Sl Ņqe]+MK)IAϠ gR|Z_!N$iq2,_/L[,Mv>57XcVU3k 28 UE~MFk c9|/}I&1:Q$SIZZihj$eYuaơqgr±1Czy>S1>^?6�w �~'$mo{ƺ`nD.=:&¥Udw`P+# #*% p *\ǐ6um ʛ+iHҞLchVXA:@ǐG$Uk;1%32fq8Gs &Mbx�ic5Ǐ3U@? Q*M�r8mX+rULGA$BP7 |LNDM Aqß^ZŃm䟗ġ H0Ķ]q /?\l*x .%R͑MK$EM{-qڢiTn5|k8B|~?DUU`&%%nZOB{44P`Iˍ N9jx1G,oR!g0 G2jr۹c\ɵ]Ȼk[ץS9thu$Bg35ԆG3a`Ɍzt~3޵^5!01.ѥLUgYXZugy|{el.bIԯ{/ FVYo0׶u_5Dڧ ִުV<0H$LIX-]x ]lq vfr&8ql)k; ) 02hrX tMZ)FֵF^(h5{5P,U整H[R.`rpY|^}u/BbD"~m~i4xP'-C*�b"�q((5V/:ǕIfm (Kvhi%mR$"|*EY֩cYYZd|){?DL}=1{0Zb69^W[05 PY`;W] B%-d1幌(pk C̘q%L~z.\H$GeExX$W�P?哔�խ*�vr./y#F32ѥ9DSojᣍm4uuAYK'34ERDRY xI4)wQ{F@J:8& $6&8cR9zF:8xv3t s r:üF7H6&) xt 5psTOuhq.o-a|Wcn 6rJ0 C]cc̣VJulw�nnOz �٤moY>,e+r<Lc<2bQM'ֵ>) y8JMӞHӑd1@B! 
3xcy3Xbʠ|u ( e9䤱ep{6Dz"D6`YUpRNXȒi՝gΊ^[IJ0qb^iK[SJ &-Z_E;T*Emm-Vbڵ`YQV1xz(%&qy3K�(@s.i1tu( s9tX!rllK0sUk:Imr}&%9 ]L;$2 m4dХہ]kZHgǎ.r 1fuXp2TwMBr"P؄X"Pߣ3< cJ8c� Χ @̲h V4ZօIIDI0un&\h7V53dPJ v˲m۶,t:Mmm--D"x^rrrH# _GKGG [`F[S3C/iR>e/^Ȟ[t-'h_x./k-8v#rg 7aUS+YZƣI& 娑ŔznS*cwK]== D U2pk`CMȵ'VpB;�PQꘚ qHM$7gq 9v8A[4.D]Jr.)xXXw׶&j;,ȕ=T(Q8 `3uL]b; ]76sd"R qm'kQJ"Nc&9 a[Hy`ޜ6@jK5d2R)?%�Rq?�@p�p}kC <dו1$^Còi^Y m�T|^ȈJC&^C#iل,IL]3%^\0;7ֽoa{} ~ǗKgQ)7=Ar^‰ ]5|{=zw-Ë -MpRNu)0 Z ̰%F}gaVԇ# -  Tyj~5q{)"irǕ%pM%Z5ֱrJ̝Ǻݟ0f,OΑ!U蚎?`@�no~K@4�n) e,֟߭{ ҎwCƱC,MۙnMD XZfQM'bԇ]޻ѥ9.aTy VOK[v7 tMzyu̢d "~n9O.iiԲ[7C|E,Ō*!c75 &/-ng8xZH$Tweٸ?C 34jRמ5Ƣ0+aԖٷƌ\DZhmm:VZܜq8'NdA8 _̃<BP)�-} ;{M;G)4/>dhhRMfiO9ʂGB0(?A)ϡ,En:e+&Y/r5SM !\�,{xt~ XP5 ëhۍ_4=a[-a<xӑpf ]}ZO)QQז~Μlk3ߠaU ')'1cPQރ<e]L�kg�wG<;yAN"mtΧKL] 6qn`imTG\2Gk "e9S֖\]+ %nhB򳧗R˦3z@θ4IA~V`JB_\ϼ,ӖR)U6|&z�Q|=5^|9N9Fahlh ֓2 R1p ` HYY;ݗV+=#RB\C�ܞT^ѽYrR8/_:}G*5\0]i.iKpS'˛4vpP< )34ĐbE9tM: T6k8M59yT)܄R ԥ gp3xii! / ɮh-1614Ie~ e+>kTv &W5"kW}4W&LB4R;ش0}I_5!3J]O԰]RnJ* ! ЪQ(mjum,ґ`gxMU)ܗ&],>Bu{3' k.|DԘ%2\A{,C2c34PJB^Hvn}֚[WNYA>XMyy2}ݳy!dg?Tҧg+"%fou[ZGڒ I`ݵg;ah`P>$?]3šN_ʢMa:4GRxD{3$onalY塝f(wLxM -02۴q2 J} |_W|OL&Yp!VxH'x4Ɵn`k�cg�tB,s:9Wgj{\DkJ,or3t)gҠ\)`\yy~Һ:yom+;X\ɺ+"TqTX\fE}SǕibWM+:7"˯onm_o<5g e8biPSS hmm%'"?/_g#|B.R| Hvo}Ձ.8Z:#(k2v@xzHVN=$qӿJBT|ƕ(x hZh3oc;+ֵǘX~%vM;R\ĈtM-˨z[z#ZWYnh6~ڞw>+"zņڪ^אC޵-!tKAW'dTyJD)A mOp{լ - ܢza *nN)&-Ο>pqCmZ&U~|F*;v)=<FV[KC}=7ndժUR)LDu*Vo[6~BF3ݳe/ ڝoh&II-m,$^]R75lTȟZ)؟2U=^,*r8-5(w&XR3u&w\pK^bݖ>-Zmx<9AJKKxb5%ԫ@9�&$bάFxOl,(vJp];I۞둲/iǩ ض"`jZuˀ3脼:Ը!uԓ\֝hbECuQ2“eIV%(hiۡ>djۄa-[|@SSi1MN~hoo'(:Jݱ`L]!� "jYW$S; McSQZ\c2oC//kD(]] 2uĩ!/) yMM0I:c2C"2,%A_K<e3$ֱO#܆& &m}|/ڦi8pf֯_φ Xv-!0M?g7f{q H-;Ǫ}>lxYZ?(ѣhiuU7ZҖsOʇ{dc qyxk1+r^1g!8=8{,k9$x<WZӌIΝ6p"&e!y>Ugҝ-lO# wL&I$XEgg'tp8L2$JN%]1 ׋zYf =?0t/O7T-} ]*Qjj׺?׏Q,-'(:y5HĿRlǫi׵U MI,mu,! o¢*ȲHZ):�=<( 6W34=3 rL@`p];(>(K4]os ^+ţk>s S8tƏg٤RnOe~99x=t@T&9yjR@ܶ-~ʺWύ}{z�{f=„cb5spA5W':{tI<cϙ 'Z=h?14-![S*YchLTJQASϯ%u /2*yyIg)ecg|^CQxnӝ z%% M|^ԥkl B^eT}(w5f1{.:|!XR")\~`uC?㞻b؈ᴴtcx}>^/^X<F3LKs3˖-ֶv7PnuM�->ē~vB}x'C^,[`(x3(6M\zh%~7_<7[0QIR-m=s7vPNhgR q'i4t$H]Ae"V ҶM,eZV!8kkE״ES.o.^48d3$2k%!悁RK:nA!u#+AȖ؃Sf _wQ_@kh;17ilǕn-G!4cwBp[k�?q454RRR₯qhg,M475t<4 X{}1rT-/u{ICz W2n@ J)$SM;[8[�Uݢ5zT)-!Ķ mMsK�1p0;%A^߆幌,ˡ-&ޒp RW؆pSq/oOs>NA>I 9\IA,ˀ</F0 ,zsK`O[?r'[cwk!p;<zTd[sTuIWjT= $hR3$iy5DS}\r٥,zjkkm}ee#QRt�,=vMj$P0ІPaV_6U]P1bz7W1cG#%,o sBbYucp|c,m3 hV[KpM11X/B Y8}b9 "<AF9*:]q ||a>X"ϸ.r#r񔅩K4Eh& Jl[)Ql+X!:1 U 򂚁#M,7вS߬v+5v^R[O) X󮐢.p.g;(!pnf^G9?Wd:BRY7vkd}KV6 :l±(?x4Bܶ- X 4mMjr+T{cu쾳*^�JS�Z9" ~B1e`RcѦv�VSi­!M q|{,q#J8ub)9^Յ5^e{N_ 97V 's"jt7w܀~`qǺ 8ؑ%|TiIdl &"|2cr8n\) -2Ljyox~)-5Pb UfY)xZY )QM=JQ4qħkJ 0uGGLf04G#IelY'+uR.9`ѴrkH7(Vӕn;D^7щm‰ ]cXDƦ3qT7�26v7fY)zX:bgꀰuX m@+X'zI=JBl|�(�MSX,,r88ɼԈ[iF! Zc^5r);ZϊHSP%\ gpESǗ2cx!w7~!;>T"c3sn<m4L@S8ŵ/fֺ PbjNK:)/Ji;& A2c󳧖ʚ&& )`V�B+CL;[߼Z4%i[( RJ` D`@�Wj9 t!M=[sC`. $^] #uAM܍]Q<&!rlHB)G*m!Вp46y Pɬ_dtepը! &ܚJ" t(Gťt-*+IMTlG�B3}mK�d+�tU'B|; H+IË(*s}\0A:B MpYJ}(RK8 8~d1RZ՜ҐfQ>@ qlԶPSKq;Бy<Sm3e :6qewbsFp1Oc{;[ 3@'P!LW!+5z?ggOꢉrJ�Y DAE@,1 t5d%*.ݵ G@0)gT aj�TDV+0s* ` A! 
kfo6=(>ځ\� �Hv 8!ٙA ݷ!ӹqȰ%5{<Wgi]ހ&Y5L[ʾ34@)u kEmYP(1*X83tB0Jm=_;(7D_ŧ 'W?ߪ#J-d)32OJV ux+c:�nX{-rbv@Ϸl"l�D[_FB=~/}M; �poy{S$ȠĎmgW|%6wvp, LIUy_Xxev j;I$Sv{)VGhIFyvJX%xʐu\`$p?в+k%VSQ"%QYOG"H*{hY?|)^Ok lytRλsx{ACTCPBe8gÞsO=„[qs ؉l&z_kz{bG}.zbk@.{}~pǏWd$'1*x"g]ҵp@(&JWǂ;nILAIˌ�RJuK0[!8XrP)]?R?eN1 /LbCJ"׌K?Ւv;0Z.gv4 =IlB:#)[LT?=3F$nݏ�J .32H)l>QYMGf_{*% .:e~IZN"eut�[`?!b}4!\Fi(ZTUioOV)k;<B"Kn{M뀷*%r?=Od%,fL-gDb)P@ Jl �~᪏t.#GRQXL{4E<iJ٤R ɐ(5QTli^ �/�@|KU`"Q%8AM1):Rn6(ϋGɌO?�`?}:=.cWtSkLY y�a4lBlt4@s`|8A;λ@z ڊXbvK(U%RI$#8gꘚf$2vx\F6{?�VPP$#gqָaDp #;f Gl@b?CnLYm5ݿo}v!eAJJ.� 6dld8E}S`ܷ7Gi\WټWzr-N>o %7)e#�1\x qw|DWJσ`?Ss5N '! |3T()T!$D Elp yת4ԩV~Bg!+  U}0տ)O+]ꓦF}KFWjѤM}k[BS=#n!d.I3xR)?A�7es/~�mn̘mFw8!m柰<0DUE^1+_m"7"vV8*h1VL+%}4]ц�~g+uH̱#OJL[+e*pԑrG)Fm |E_ [d m?-Q/'xѿF?k3G#g!ГOxQ*(TǯP �a7~P&N-spjb5:[ʖW~՛;UgNx8Ft#8'D`` ` /9ˮJI]d+(A?m]~wM+Q-A=3''6N^?g�Su^Mi⠸rz{tçsa|)?G6hN |)n]bP^U��Qg$oZ⨩ +aB><hMQ)y߿CQ :Q~�<U=�o Uu=؂,q8g_`Puʥ2AA |ϔEXbBR"3Bk+(Bg1m6?+[VfJ,|3@I)(v�% I>p֤S8!\I/$Unq:3`@= j㋁xZګ�[S`?�]@! QCʞq0 HQ_i^/yyyؖK"*&C~:,zmUAB# \HZVY1GOB?k;9Ф-@N(qR$t {%�`#u-pVi_9^1q Wڳ;,;45( ~Y37m6<^ }*X^@�x`w�q.K\hP^NhFn(r/@hrs㮸gy% >]ȴ9yb9~u^=b{>JAqHg ~XIS>4)++'mtlbղe;3͚~#�0K 9ρ || \phF+{.JWs+yZ�{,ʔrMS} ~�Ɠc�< NVچi0y4+PRRJn^ H$R)8ph4(4MCjCC&^xyN^FAPљ\CݰjA!W3[Ĺ睃;vlWxH%,?uVa2R�B8#RG )*lǙ M9WMb\+]LCJ4胴CN/i%M_MӞp HPM1�T�0�P@,:FΠ:| =L:C,#ΤQ RV&TФ'IH]}kW"v*7<4Ms -w� w?y>O< 2iH)يhD, XTn=xPOtWeLu}"ky4`hd ЫkXRBG{;Lf'MU6` -žW?IGڶuaw`RA k4uv9z ˖.m �Rcx6c9ѣ9t*+)(* -g7|Rk?;7x8Y{[WK;p CsXh'A:%Q@R2hǃ4L�흝,[yK93>"i1!'<9)R >N:(>ϿM9~$ .RlR}.c 09 ((-cA y_:?ot  ) D#~k֭ 7SN;!CRv"dt&\}4wޝ]7>jۗ !JS{MjrL<HSwY2~q81kYy/m+JIq#F0ut 3i(T M׻A4 =,ϣ7_} ]n ]Ζ3 |\Kp"J,ZijO?MkG{\%]`ERen&A3>N^(W_{mK@N~>+W `fPT_/=Osk+'w\v9ťeutP_@SSXd2I&!NH&(^]3S48bW dc]eKX4ƩڑD:DxuWTv Mu.J9quф;:ijo㐃sW' 8T ضɐΪe,^+k9c=6Ca-ɸM<8x| raf;q:E;; G"TUUqQS[{|ұ4HI_i? SP $Cq>OQY9W>B$saW1M,)]:>Ko2l0L<M5YhVzݺ^q@hr۸.Y vHl  jc^īj؊/%>)uZ;t) Ҹ_=O4ǜpe\pɥYkIe= -$ +KW#Q1YS9Y+g`4 $dv#44ǩiQ=iH>|v%)ӦzmMH&444&rƍ'46m)g)q{u{la�?I ?K!c %KxR)^~RbnPRI`&s? 3}A\zeDb1͝G&0 4Mۮ% NSX\y{*6.iDi$4I%m,k33kAiB?QYGWQU"T# ?C,]uq&r]w7:i2^r u464z]N?!mӨR8JQ1%LVo`^j$ju|_|`9uBxS#crM;5QP M8=As{9 . SKK{膎ra856ʈ#_w'8?j[EAMƽI?I� E>)4Mko՗^ p4M2 m3JR ćb_0 cqHss3˗�T; (&I).ͣN8dr))CLScźv fΩtC^- U7G/_3LUYUy䯳s9M3#_ شid2Y/-}>% ^AHX>-a6(t"r2Z*+eX+inORR#ΜFo KL^4D"!Ndb˙q ȃ<ptSK~!&|uD4 "OCh6n`,SN9nKRttvb[ea[qH%S.Yi:͊ի(/-/\J,c(0͏UJmXVyXV+c(E:BtaGj茤ysN/hd::ciLl(^�_vsʯA-S'Ms^cI\xS[JHq"'6aݦ�˲q͆Ma%=! W0etO4:c%nMlФͶ$$$KZۓ45pxLG_\snd*juvB23b2>h(,Z(**XcȻu473j(wx ]r >?I!~Q?_۽J/`_gu)qD QVF9mP 2_Q'9shmm4-$mD&A X0z}DE5u.ʠo 蝐A:3%aF\U|ڠ2ݵ\?ߟ(T:Ţd2x<}x<r hZkc8 ]׉Ezo7~q+B_|22C'?{gu{.<@(J"#K6EmZvt%kDZ'Jlk7XJRvu%XK.ںeiuH&E p=13@OzE�`^z[=<~ YϑB "Bs }ysrh0pF5'1#P�;g/``Z6m>-QyVXN{{;'e3MʺǦOkxg4/�PF}x-,XbP LC>gl,ͮ7- K P2֨|?wͭxk'm_k%I`;NJT<ɤRD)T= ;Hǔ1"cPȚ7͞Q,Ɵh|kDb1vET"(RvÔ <Ec!e"(Ll&/b.˖ UŜjN^j{0Wa۞F GGJ rQlc0^dM_"#ԃP=W_ Иo`n�SMP @Y,\%Khji!hii N4+ /}ߥoh,KӺ D31;g8ߘc^vKRmryMX$&(~gGqig'sΥJ),Rh,_pt`|RH⣍F!QO@t5+;е.o$U HGNii]/fy455/~P$Žv՜F$RXS3A`_~cЦ.qY([?}г~9y+ɏ_#-(*lߛqWXY9#d.C"}<z- crf #aepա;=\jchiiA)zb 里\'m_DyѼ'~'Nhmb:{6: BQޮ$)b߯FDo\ L&huyG6! سd*Yt)�[CmhLĔ9ME!N^WObJp"җ3-7S. 
[binary image data (PNG) omitted]

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/hawk/example/usage.js

// Load modules

var Http = require('http');
var Request = require('request');
var Hawk = require('../lib');


// Declare internals

var internals = {
    credentials: {
        dh37fgj492je: {
            id: 'dh37fgj492je',          // Required by Hawk.client.header
            key: 'werxhqb98rpaxn39848xrunpaw3489ruxnpa98w4rxn',
            algorithm: 'sha256',
            user: 'Steve'
        }
    }
};


// Credentials lookup function

var credentialsFunc = function (id, callback) {
    return callback(null, internals.credentials[id]);
};


// Create HTTP server

var handler = function (req, res) {

    Hawk.server.authenticate(req, credentialsFunc, {}, function (err, credentials, artifacts) {

        var payload = (!err ? 'Hello ' + credentials.user + ' ' + artifacts.ext : 'Shoosh!');
        var headers = {
            'Content-Type': 'text/plain',
            'Server-Authorization': Hawk.server.header(credentials, artifacts, { payload: payload, contentType: 'text/plain' })
        };

        res.writeHead(!err ? 200 : 401, headers);
        res.end(payload);
    });
};

Http.createServer(handler).listen(8000, '127.0.0.1');


// Send unauthenticated request

Request('http://127.0.0.1:8000/resource/1?b=1&a=2', function (error, response, body) {

    console.log(response.statusCode + ': ' + body);
});


// Send authenticated request

credentialsFunc('dh37fgj492je', function (err, credentials) {

    var header = Hawk.client.header('http://127.0.0.1:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'and welcome!' });
    var options = {
        uri: 'http://127.0.0.1:8000/resource/1?b=1&a=2',
        method: 'GET',
        headers: {
            authorization: header.field
        }
    };

    Request(options, function (error, response, body) {

        var isValid = Hawk.client.authenticate(response, credentials, header.artifacts, { payload: body });
        console.log(response.statusCode + ': ' + body + (isValid ? ' (valid)' : ' (invalid)'));
        process.exit(0);
    });
});
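The example above drives the authenticated request through the `request` package. As a minimal editor's sketch (not part of the original file), the same request can be issued with Node's core `http` client, reusing the endpoint, `credentialsFunc` and the `Hawk.client.header` call from usage.js; the `hawk` module is assumed to be installed under its published name rather than loaded via `../lib`.

``` javascript
var Http = require('http');
var Hawk = require('hawk');   // usage.js above loads the same module via '../lib'

credentialsFunc('dh37fgj492je', function (err, credentials) {

    // Build the Hawk Authorization header exactly as in the example above
    var header = Hawk.client.header('http://127.0.0.1:8000/resource/1?b=1&a=2', 'GET', { credentials: credentials, ext: 'and welcome!' });

    var req = Http.request({
        host: '127.0.0.1',
        port: 8000,
        path: '/resource/1?b=1&a=2',
        method: 'GET',
        headers: { authorization: header.field }
    }, function (res) {

        var body = '';
        res.on('data', function (chunk) { body += chunk; });
        res.on('end', function () {
            console.log(res.statusCode + ': ' + body);
        });
    });

    req.end();
});
```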
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/lib/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/License

Copyright (c) 2012 Felix Geisendörfer (felix@debuggable.com) and contributors

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/package.json
{ "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" }, "name": "form-data", "description": "A module to create readable \"multipart/form-data\" streams. Can be used to submit forms and file uploads to other web applications.", "version": "0.2.0", "repository": { "type": "git", "url": "git://github.com/felixge/node-form-data.git" }, "main": "./lib/form_data", "scripts": { "test": "node test/run.js" }, "engines": { "node": ">= 0.8" }, "dependencies": { "async": "~0.9.0", "combined-stream": "~0.0.4", "mime-types": "~2.0.3" }, "licenses": [ { "type": "MIT", "url": "https://raw.github.com/felixge/node-form-data/master/License" } ], "devDependencies": { "fake": "~0.2.2", "far": "~0.0.7", "formidable": "~1.0.14", "request": "~2.36.0" }, "gitHead": "dfc1a2aef40b97807e2ffe477da06cb2c37e259f", "bugs": { "url": "https://github.com/felixge/node-form-data/issues" }, "homepage": "https://github.com/felixge/node-form-data", "_id": "form-data@0.2.0", "_shasum": "26f8bc26da6440e299cbdcfb69035c4f77a6e466", "_from": "form-data@>=0.2.0 <0.3.0", "_npmVersion": "1.4.28", "_npmUser": { "name": "alexindigo", "email": "iam@alexindigo.com" }, "maintainers": [ { "name": "felixge", "email": "felix@debuggable.com" }, { "name": "idralyuk", "email": "igor@buran.us" }, { "name": "alexindigo", "email": "iam@alexindigo.com" }, { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, { "name": "celer", "email": "dtyree77@gmail.com" } ], "dist": { "shasum": "26f8bc26da6440e299cbdcfb69035c4f77a6e466", "tarball": "http://registry.npmjs.org/form-data/-/form-data-0.2.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/form-data/-/form-data-0.2.0.tgz", "readme": "ERROR: No README data found!" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/Readme.md

# Form-Data

[![Build Status](https://travis-ci.org/felixge/node-form-data.png?branch=master)](https://travis-ci.org/felixge/node-form-data) [![Dependency Status](https://gemnasium.com/felixge/node-form-data.png)](https://gemnasium.com/felixge/node-form-data)

A module to create readable ```"multipart/form-data"``` streams. Can be used to submit forms and file uploads to other web applications.

The API of this module is inspired by the [XMLHttpRequest-2 FormData Interface][xhr2-fd].

[xhr2-fd]: http://dev.w3.org/2006/webapi/XMLHttpRequest-2/Overview.html#the-formdata-interface
[streams2-thing]: http://nodejs.org/api/stream.html#stream_compatibility_with_older_node_versions

## Install

```
npm install form-data
```

## Usage

In this example we are constructing a form with 3 fields that contain a string, a buffer and a file stream.
``` javascript
var FormData = require('form-data');
var fs = require('fs');

var form = new FormData();
form.append('my_field', 'my value');
form.append('my_buffer', new Buffer(10));
form.append('my_file', fs.createReadStream('/foo/bar.jpg'));
```

Also you can use http-response stream:

``` javascript
var FormData = require('form-data');
var http = require('http');

var form = new FormData();

http.request('http://nodejs.org/images/logo.png', function(response) {
  form.append('my_field', 'my value');
  form.append('my_buffer', new Buffer(10));
  form.append('my_logo', response);
});
```

Or @mikeal's request stream:

``` javascript
var FormData = require('form-data');
var request = require('request');

var form = new FormData();
form.append('my_field', 'my value');
form.append('my_buffer', new Buffer(10));
form.append('my_logo', request('http://nodejs.org/images/logo.png'));
```

In order to submit this form to a web application, call ```submit(url, [callback])``` method:

``` javascript
form.submit('http://example.org/', function(err, res) {
  // res – response object (http.IncomingMessage)
  // res.resume(); // for node-0.10.x
});
```

For more advanced request manipulations ```submit()``` method returns ```http.ClientRequest``` object, or you can choose from one of the alternative submission methods.

### Alternative submission methods

You can use node's http client interface:

``` javascript
var http = require('http');

var request = http.request({
  method: 'post',
  host: 'example.org',
  path: '/upload',
  headers: form.getHeaders()
});

form.pipe(request);

request.on('response', function(res) {
  console.log(res.statusCode);
});
```

Or if you would prefer the `'Content-Length'` header to be set for you:

``` javascript
form.submit('example.org/upload', function(err, res) {
  console.log(res.statusCode);
});
```

To use custom headers and pre-known length in parts:

``` javascript
var CRLF = '\r\n';
var form = new FormData();

var options = {
  header: CRLF + '--' + form.getBoundary() + CRLF + 'X-Custom-Header: 123' + CRLF + CRLF,
  knownLength: 1
};

form.append('my_buffer', buffer, options);

form.submit('http://example.com/', function(err, res) {
  if (err) throw err;
  console.log('Done');
});
```

Form-Data can recognize and fetch all the required information from common types of streams (```fs.readStream```, ```http.response``` and ```mikeal's request```), for some other types of streams you'd need to provide "file"-related information manually:

``` javascript
someModule.stream(function(err, stdout, stderr) {
  if (err) throw err;

  var form = new FormData();

  form.append('file', stdout, {
    filename: 'unicycle.jpg',
    contentType: 'image/jpg',
    knownLength: 19806
  });

  form.submit('http://example.com/', function(err, res) {
    if (err) throw err;
    console.log('Done');
  });
});
```

For edge cases, like POST request to URL with query string or to pass HTTP auth credentials, object can be passed to `form.submit()` as first parameter:

``` javascript
form.submit({
  host: 'example.com',
  path: '/probably.php?extra=params',
  auth: 'username:password'
}, function(err, res) {
  console.log(res.statusCode);
});
```

In case you need to also send custom HTTP headers with the POST request, you can use the `headers` key in first parameter of `form.submit()`:

``` javascript
form.submit({
  host: 'example.com',
  path: '/surelynot.php',
  headers: {'x-test-header': 'test-header-value'}
}, function(err, res) {
  console.log(res.statusCode);
});
```

## Notes

- ```getLengthSync()``` method DOESN'T calculate length for streams, use ```knownLength``` options as workaround (see the sketch after this list).
- If it feels like FormData hangs after submit and you're on ```node-0.10```, please check [Compatibility with Older Node Versions][streams2-thing]
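A minimal sketch of that ```knownLength``` workaround (editor's illustration; the file path and endpoint are placeholders, not part of the original docs): the size of a file stream can be read up front with `fs.stat`, after which ```getLengthSync()``` can supply a `Content-Length` for the raw http interface shown earlier.

``` javascript
var FormData = require('form-data');
var fs = require('fs');
var http = require('http');

var form = new FormData();

// fs.stat tells us the file size up front, so the stream's length is "known"
fs.stat('/foo/bar.jpg', function(err, stats) {
  if (err) throw err;

  form.append('my_file', fs.createReadStream('/foo/bar.jpg'), {
    knownLength: stats.size
  });

  // With every part's length known, getLengthSync() can be used for Content-Length
  var headers = form.getHeaders();
  headers['content-length'] = form.getLengthSync();

  var request = http.request({
    method: 'post',
    host: 'example.org',
    path: '/upload',
    headers: headers
  });

  form.pipe(request);

  request.on('response', function(res) {
    console.log(res.statusCode);
  });
});
```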
## TODO

- Add new streams (0.10) support and try really hard not to break it for 0.8.x.

## License

Form-Data is licensed under the MIT license.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/HISTORY.md

2.0.4 / 2014-12-10
==================

  * deps: mime-db@~1.3.0
    - Add new mime types

2.0.3 / 2014-11-09
==================

  * deps: mime-db@~1.2.0
    - Add new mime types

2.0.2 / 2014-09-28
==================

  * deps: mime-db@~1.1.0
    - Add new mime types
    - Add additional compressible
    - Update charsets

2.0.1 / 2014-09-07
==================

  * Support Node.js 0.6

2.0.0 / 2014-09-02
==================

  * Use `mime-db`
  * Remove `.define()`

1.0.2 / 2014-08-04
==================

  * Set charset=utf-8 for `text/javascript`

1.0.1 / 2014-06-24
==================

  * Add `text/jsx` type

1.0.0 / 2014-05-12
==================

  * Return `false` for unknown types
  * Set charset=utf-8 for `application/json`

0.1.0 / 2014-05-02
==================

  * Initial release

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/index.js
var db = require('mime-db')

// types[extension] = type
exports.types = Object.create(null)
// extensions[type] = [extensions]
exports.extensions = Object.create(null)

Object.keys(db).forEach(function (name) {
  var mime = db[name]
  var exts = mime.extensions
  if (!exts || !exts.length) return
  exports.extensions[name] = exts
  exts.forEach(function (ext) {
    exports.types[ext] = name
  })
})

exports.lookup = function (string) {
  if (!string || typeof string !== "string") return false
  // remove any leading paths, though we should just use path.basename
  string = string.replace(/.*[\.\/\\]/, '').toLowerCase()
  if (!string) return false
  return exports.types[string] || false
}

exports.extension = function (type) {
  if (!type || typeof type !== "string") return false
  // to do: use media-typer
  type = type.match(/^\s*([^;\s]*)(?:;|\s|$)/)
  if (!type) return false
  var exts = exports.extensions[type[1].toLowerCase()]
  if (!exts || !exts.length) return false
  return exts[0]
}

// type has to be an exact mime type
exports.charset = function (type) {
  var mime = db[type]
  if (mime && mime.charset) return mime.charset
  // default text/* to utf-8
  if (/^text\//.test(type)) return 'UTF-8'
  return false
}

// backwards compatibility
exports.charsets = {
  lookup: exports.charset
}

// to do: maybe use set-type module or something
exports.contentType = function (type) {
  if (!type || typeof type !== "string") return false
  if (!~type.indexOf('/')) type = exports.lookup(type)
  if (!type) return false
  if (!~type.indexOf('charset')) {
    var charset = exports.charset(type)
    if (charset) type += '; charset=' + charset.toLowerCase()
  }
  return type
}
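As a quick, illustrative check of the exports defined above (assuming the module is installed under its published name, `mime-types`), the documented lookups behave like this:

```js
var mime = require('mime-types')          // i.e. the index.js shown above

console.log(mime.lookup('folder/file.html'))      // 'text/html'
console.log(mime.contentType('markdown'))         // 'text/x-markdown; charset=utf-8'
console.log(mime.extension('application/json'))   // 'json'
console.log(mime.charset('text/x-markdown'))      // 'UTF-8'
console.log(mime.lookup('unknown-extension'))     // false
```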
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/LICENSE

The MIT License (MIT)

Copyright (c) 2014 Jonathan Ong me@jongleberry.com

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/package.json

{ "name": "mime-types", "description": "The ultimate javascript content-type utility.", "version": "2.0.4", "contributors": [ { "name": "Jeremiah Senkpiel", "email": "fishrock123@rocketmail.com", "url": "https://searchbeam.jit.su" }, { "name": "Jonathan Ong", "email": "me@jongleberry.com", "url": "http://jongleberry.com" } ], "license": "MIT", "keywords": [ "mime", "types" ], "repository": { "type": "git", "url": "https://github.com/jshttp/mime-types" }, "dependencies": { "mime-db": "~1.3.0" }, "devDependencies": { "istanbul": "0", "mocha": "~1.21.5" }, "files": [ "HISTORY.md", "LICENSE", "index.js" ], "engines": { "node": ">= 0.6" }, "scripts": { "test": "mocha --reporter spec test/test.js", "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot test/test.js", "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter dot test/test.js" }, "gitHead": "63a9b82e6e364d62428ed5459e5486504c489bf2", "bugs": { "url": "https://github.com/jshttp/mime-types/issues" }, "homepage": "https://github.com/jshttp/mime-types", "_id": "mime-types@2.0.4", "_shasum": "855a612979141d806ba5104294a28c731c6ea790", "_from": "mime-types@>=2.0.3 <2.1.0", "_npmVersion": "1.4.21", "_npmUser": { "name": "dougwilson", "email": "doug@somethingdoug.com" }, "maintainers": [ { "name": "jongleberry", "email": "jonathanrichardong@gmail.com" }, { "name": "fishrock123", "email": "fishrock123@rocketmail.com" }, { "name": "dougwilson", "email": "doug@somethingdoug.com" } ], "dist": { "shasum": "855a612979141d806ba5104294a28c731c6ea790", "tarball": "http://registry.npmjs.org/mime-types/-/mime-types-2.0.4.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.0.4.tgz", "readme": "ERROR: No README data found!" }
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/README.md

# mime-types

[![NPM Version][npm-image]][npm-url]
[![NPM Downloads][downloads-image]][downloads-url]
[![Node.js Version][node-version-image]][node-version-url]
[![Build Status][travis-image]][travis-url]
[![Test Coverage][coveralls-image]][coveralls-url]

The ultimate javascript content-type utility.

Similar to [node-mime](https://github.com/broofa/node-mime), except:

- __No fallbacks.__ Instead of naively returning the first available type, `mime-types` simply returns `false`, so do `var type = mime.lookup('unrecognized') || 'application/octet-stream'`.
- No `new Mime()` business, so you could do `var lookup = require('mime-types').lookup`.
- Additional mime types are added such as jade and stylus via [mime-db](https://github.com/jshttp/mime-db)
- No `.define()` functionality

Otherwise, the API is compatible.

## Install

```sh
$ npm install mime-types
```

## Adding Types

All mime types are based on [mime-db](https://github.com/jshttp/mime-db), so open a PR there if you'd like to add mime types.

## API

```js
var mime = require('mime-types')
```

All functions return `false` if input is invalid or not found.

### mime.lookup(path)

Lookup the content-type associated with a file.

```js
mime.lookup('json')           // 'application/json'
mime.lookup('.md')            // 'text/x-markdown'
mime.lookup('file.html')      // 'text/html'
mime.lookup('folder/file.js') // 'application/javascript'
mime.lookup('cats')           // false
```

### mime.contentType(type)

Create a full content-type header given a content-type or extension.

```js
mime.contentType('markdown')  // 'text/x-markdown; charset=utf-8'
mime.contentType('file.json') // 'application/json; charset=utf-8'
```

### mime.extension(type)

Get the default extension for a content-type.

```js
mime.extension('application/octet-stream') // 'bin'
```

### mime.charset(type)

Lookup the implied default charset of a content-type.

```js
mime.charset('text/x-markdown') // 'UTF-8'
```

### var type = mime.types[extension]

A map of content-types by extension.

### [extensions...] = mime.extensions[type]

A map of extensions by content-type.
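For instance (an illustrative sketch, not from the upstream README), `lookup` and `contentType` can be combined to set the response header in a tiny static file server; the `./public` root and port 3000 are placeholders, and error handling for missing files is omitted for brevity.

```js
var http = require('http')
var fs = require('fs')
var mime = require('mime-types')

http.createServer(function (req, res) {
  var path = './public' + req.url                            // hypothetical document root
  var type = mime.lookup(path) || 'application/octet-stream'

  // contentType() appends a charset when one is known (e.g. 'text/html; charset=utf-8')
  res.setHeader('Content-Type', mime.contentType(type) || type)
  fs.createReadStream(path).pipe(res)
}).listen(3000)
```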
## License

[MIT](LICENSE)

[npm-image]: https://img.shields.io/npm/v/mime-types.svg?style=flat
[npm-url]: https://npmjs.org/package/mime-types
[node-version-image]: https://img.shields.io/badge/node.js-%3E%3D_0.6-brightgreen.svg?style=flat
[node-version-url]: http://nodejs.org/download/
[travis-image]: https://img.shields.io/travis/jshttp/mime-types.svg?style=flat
[travis-url]: https://travis-ci.org/jshttp/mime-types
[coveralls-image]: https://img.shields.io/coveralls/jshttp/mime-types.svg?style=flat
[coveralls-url]: https://coveralls.io/r/jshttp/mime-types
[downloads-image]: https://img.shields.io/npm/dm/mime-types.svg?style=flat
[downloads-url]: https://npmjs.org/package/mime-types

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/db.json

{ "application/1d-interleaved-parityfec": { "source": "iana" }, "application/3gpdash-qoe-report+xml": { "source": "iana" }, "application/3gpp-ims+xml": { "source": "iana" }, "application/a2l": { "source": "iana" }, "application/activemessage": { "source": "iana" }, "application/alto-costmap+json": { "source": "iana", "compressible": true }, "application/alto-costmapfilter+json": { "source": "iana", "compressible": true }, "application/alto-directory+json": { "source": "iana", "compressible": true }, "application/alto-endpointcost+json": { "source": "iana", "compressible": true }, "application/alto-endpointcostparams+json": { "source": "iana", "compressible": true }, "application/alto-endpointprop+json": { "source": "iana", "compressible": true }, "application/alto-endpointpropparams+json": { "source": "iana", "compressible": true }, "application/alto-error+json": { "source": "iana", "compressible": true }, "application/alto-networkmap+json": { "source": "iana", "compressible": true }, "application/alto-networkmapfilter+json": { "source": "iana", "compressible": true }, "application/aml": { "source": "iana" }, "application/andrew-inset": { "source": "iana", "extensions": ["ez"] }, "application/applefile": { "source": "iana" }, "application/applixware": { "source": "apache", "extensions": ["aw"] }, "application/atf": { "source": "iana" }, "application/atfx": { "source": "iana" }, "application/atom+xml": { "source": "iana", "compressible": true, "extensions": ["atom"] }, "application/atomcat+xml": { "source": "iana", "extensions": ["atomcat"] },
"application/atomdeleted+xml": { "source": "iana" }, "application/atomicmail": { "source": "iana" }, "application/atomsvc+xml": { "source": "iana", "extensions": ["atomsvc"] }, "application/atxml": { "source": "iana" }, "application/auth-policy+xml": { "source": "iana" }, "application/bacnet-xdd+zip": { "source": "iana" }, "application/batch-smtp": { "source": "iana" }, "application/beep+xml": { "source": "iana" }, "application/calendar+json": { "source": "iana", "compressible": true }, "application/calendar+xml": { "source": "iana" }, "application/call-completion": { "source": "iana" }, "application/cals-1840": { "source": "iana" }, "application/cbor": { "source": "iana" }, "application/ccmp+xml": { "source": "iana" }, "application/ccxml+xml": { "source": "iana", "extensions": ["ccxml"] }, "application/cdfx+xml": { "source": "iana" }, "application/cdmi-capability": { "source": "iana", "extensions": ["cdmia"] }, "application/cdmi-container": { "source": "iana", "extensions": ["cdmic"] }, "application/cdmi-domain": { "source": "iana", "extensions": ["cdmid"] }, "application/cdmi-object": { "source": "iana", "extensions": ["cdmio"] }, "application/cdmi-queue": { "source": "iana", "extensions": ["cdmiq"] }, "application/cea": { "source": "iana" }, "application/cea-2018+xml": { "source": "iana" }, "application/cellml+xml": { "source": "iana" }, "application/cfw": { "source": "iana" }, "application/cms": { "source": "iana" }, "application/cnrp+xml": { "source": "iana" }, "application/coap-group+json": { "source": "iana", "compressible": true }, "application/commonground": { "source": "iana" }, "application/conference-info+xml": { "source": "iana" }, "application/cpl+xml": { "source": "iana" }, "application/csrattrs": { "source": "iana" }, "application/csta+xml": { "source": "iana" }, "application/cstadata+xml": { "source": "iana" }, "application/cu-seeme": { "source": "apache", "extensions": ["cu"] }, "application/cybercash": { "source": "iana" }, "application/dart": { "compressible": true }, "application/dash+xml": { "source": "iana", "extensions": ["mdp"] }, "application/dashdelta": { "source": "iana" }, "application/davmount+xml": { "source": "iana", "extensions": ["davmount"] }, "application/dca-rft": { "source": "iana" }, "application/dcd": { "source": "iana" }, "application/dec-dx": { "source": "iana" }, "application/dialog-info+xml": { "source": "iana" }, "application/dicom": { "source": "iana" }, "application/dii": { "source": "iana" }, "application/dit": { "source": "iana" }, "application/dns": { "source": "iana" }, "application/docbook+xml": { "source": "apache", "extensions": ["dbk"] }, "application/dskpp+xml": { "source": "iana" }, "application/dssc+der": { "source": "iana", "extensions": ["dssc"] }, "application/dssc+xml": { "source": "iana", "extensions": ["xdssc"] }, "application/dvcs": { "source": "iana" }, "application/ecmascript": { "source": "iana", "compressible": true, "extensions": ["ecma"] }, "application/edi-consent": { "source": "iana" }, "application/edi-x12": { "source": "iana", "compressible": false }, "application/edifact": { "source": "iana", "compressible": false }, "application/emma+xml": { "source": "iana", "extensions": ["emma"] }, "application/emotionml+xml": { "source": "iana" }, "application/encaprtp": { "source": "iana" }, "application/epp+xml": { "source": "iana" }, "application/epub+zip": { "source": "iana", "extensions": ["epub"] }, "application/eshop": { "source": "iana" }, "application/example": { "source": "iana" }, "application/exi": { "source": 
"iana", "extensions": ["exi"] }, "application/fastinfoset": { "source": "iana" }, "application/fastsoap": { "source": "iana" }, "application/fdt+xml": { "source": "iana" }, "application/fits": { "source": "iana" }, "application/font-sfnt": { "source": "iana" }, "application/font-tdpfr": { "source": "iana", "extensions": ["pfr"] }, "application/font-woff": { "source": "iana", "compressible": false, "extensions": ["woff"] }, "application/font-woff2": { "compressible": false, "extensions": ["woff2"] }, "application/framework-attributes+xml": { "source": "iana" }, "application/gml+xml": { "source": "apache", "extensions": ["gml"] }, "application/gpx+xml": { "source": "apache", "extensions": ["gpx"] }, "application/gxf": { "source": "apache", "extensions": ["gxf"] }, "application/gzip": { "source": "iana", "compressible": false }, "application/h224": { "source": "iana" }, "application/held+xml": { "source": "iana" }, "application/http": { "source": "iana" }, "application/hyperstudio": { "source": "iana", "extensions": ["stk"] }, "application/ibe-key-request+xml": { "source": "iana" }, "application/ibe-pkg-reply+xml": { "source": "iana" }, "application/ibe-pp-data": { "source": "iana" }, "application/iges": { "source": "iana" }, "application/im-iscomposing+xml": { "source": "iana" }, "application/index": { "source": "iana" }, "application/index.cmd": { "source": "iana" }, "application/index.obj": { "source": "iana" }, "application/index.response": { "source": "iana" }, "application/index.vnd": { "source": "iana" }, "application/inkml+xml": { "source": "iana", "extensions": ["ink","inkml"] }, "application/iotp": { "source": "iana" }, "application/ipfix": { "source": "iana", "extensions": ["ipfix"] }, "application/ipp": { "source": "iana" }, "application/isup": { "source": "iana" }, "application/its+xml": { "source": "iana" }, "application/java-archive": { "source": "apache", "compressible": false, "extensions": ["jar"] }, "application/java-serialized-object": { "source": "apache", "compressible": false, "extensions": ["ser"] }, "application/java-vm": { "source": "apache", "compressible": false, "extensions": ["class"] }, "application/javascript": { "source": "iana", "charset": "UTF-8", "compressible": true, "extensions": ["js"] }, "application/jrd+json": { "source": "iana", "compressible": true }, "application/json": { "source": "iana", "charset": "UTF-8", "compressible": true, "extensions": ["json","map"] }, "application/json-patch+json": { "source": "iana", "compressible": true }, "application/jsonml+json": { "source": "apache", "compressible": true, "extensions": ["jsonml"] }, "application/kpml-request+xml": { "source": "iana" }, "application/kpml-response+xml": { "source": "iana" }, "application/ld+json": { "source": "iana", "compressible": true }, "application/link-format": { "source": "iana" }, "application/load-control+xml": { "source": "iana" }, "application/lost+xml": { "source": "iana", "extensions": ["lostxml"] }, "application/lostsync+xml": { "source": "iana" }, "application/lxf": { "source": "iana" }, "application/mac-binhex40": { "source": "iana", "extensions": ["hqx"] }, "application/mac-compactpro": { "source": "apache", "extensions": ["cpt"] }, "application/macwriteii": { "source": "iana" }, "application/mads+xml": { "source": "iana", "extensions": ["mads"] }, "application/marc": { "source": "iana", "extensions": ["mrc"] }, "application/marcxml+xml": { "source": "iana", "extensions": ["mrcx"] }, "application/mathematica": { "source": "iana", "extensions": ["ma","nb","mb"] }, 
"application/mathml+xml": { "source": "iana", "extensions": ["mathml"] }, "application/mathml-content+xml": { "source": "iana" }, "application/mathml-presentation+xml": { "source": "iana" }, "application/mbms-associated-procedure-description+xml": { "source": "iana" }, "application/mbms-deregister+xml": { "source": "iana" }, "application/mbms-envelope+xml": { "source": "iana" }, "application/mbms-msk+xml": { "source": "iana" }, "application/mbms-msk-response+xml": { "source": "iana" }, "application/mbms-protection-description+xml": { "source": "iana" }, "application/mbms-reception-report+xml": { "source": "iana" }, "application/mbms-register+xml": { "source": "iana" }, "application/mbms-register-response+xml": { "source": "iana" }, "application/mbms-schedule+xml": { "source": "iana" }, "application/mbms-user-service-description+xml": { "source": "iana" }, "application/mbox": { "source": "apache", "extensions": ["mbox"] }, "application/mbox+xml": { "source": "iana" }, "application/media-policy-dataset+xml": { "source": "iana" }, "application/media_control+xml": { "source": "iana" }, "application/mediaservercontrol+xml": { "source": "iana", "extensions": ["mscml"] }, "application/merge-patch+json": { "source": "iana", "compressible": true }, "application/metalink+xml": { "source": "apache", "extensions": ["metalink"] }, "application/metalink4+xml": { "source": "iana", "extensions": ["meta4"] }, "application/mets+xml": { "source": "iana", "extensions": ["mets"] }, "application/mf4": { "source": "iana" }, "application/mikey": { "source": "iana" }, "application/mods+xml": { "source": "iana", "extensions": ["mods"] }, "application/moss-keys": { "source": "iana" }, "application/moss-signature": { "source": "iana" }, "application/mosskey-data": { "source": "iana" }, "application/mosskey-request": { "source": "iana" }, "application/mp21": { "source": "iana", "extensions": ["m21","mp21"] }, "application/mp4": { "source": "iana", "extensions": ["mp4s","m4p"] }, "application/mpeg4-generic": { "source": "iana" }, "application/mpeg4-iod": { "source": "iana" }, "application/mpeg4-iod-xmt": { "source": "iana" }, "application/mrb-consumer+xml": { "source": "iana" }, "application/mrb-publish+xml": { "source": "iana" }, "application/msc-ivr+xml": { "source": "iana" }, "application/msc-mixer+xml": { "source": "iana" }, "application/msword": { "source": "iana", "compressible": false, "extensions": ["doc","dot"] }, "application/mxf": { "source": "iana", "extensions": ["mxf"] }, "application/nasdata": { "source": "iana" }, "application/news-checkgroups": { "source": "iana" }, "application/news-groupinfo": { "source": "iana" }, "application/news-transmission": { "source": "iana" }, "application/nlsml+xml": { "source": "iana" }, "application/nss": { "source": "iana" }, "application/ocsp-request": { "source": "iana" }, "application/ocsp-response": { "source": "apache" }, "application/octet-stream": { "source": "iana", "compressible": false, "extensions": ["bin","dms","lrf","mar","so","dist","distz","pkg","bpk","dump","elc","deploy","buffer"] }, "application/oda": { "source": "iana", "extensions": ["oda"] }, "application/odx": { "source": "iana" }, "application/oebps-package+xml": { "source": "iana", "extensions": ["opf"] }, "application/ogg": { "source": "iana", "compressible": false, "extensions": ["ogx"] }, "application/omdoc+xml": { "source": "apache", "extensions": ["omdoc"] }, "application/onenote": { "source": "apache", "extensions": ["onetoc","onetoc2","onetmp","onepkg"] }, "application/oscp-response": { 
"source": "iana" }, "application/oxps": { "source": "iana", "extensions": ["oxps"] }, "application/p2p-overlay+xml": { "source": "iana" }, "application/parityfec": { "source": "iana" }, "application/patch-ops-error+xml": { "source": "iana", "extensions": ["xer"] }, "application/pdf": { "source": "iana", "compressible": false, "extensions": ["pdf"] }, "application/pdx": { "source": "iana" }, "application/pgp-encrypted": { "source": "iana", "compressible": false, "extensions": ["pgp"] }, "application/pgp-keys": { "source": "iana" }, "application/pgp-signature": { "source": "iana", "extensions": ["asc","sig"] }, "application/pics-rules": { "source": "apache", "extensions": ["prf"] }, "application/pidf+xml": { "source": "iana" }, "application/pidf-diff+xml": { "source": "iana" }, "application/pkcs10": { "source": "iana", "extensions": ["p10"] }, "application/pkcs7-mime": { "source": "iana", "extensions": ["p7m","p7c"] }, "application/pkcs7-signature": { "source": "iana", "extensions": ["p7s"] }, "application/pkcs8": { "source": "iana", "extensions": ["p8"] }, "application/pkix-attr-cert": { "source": "iana", "extensions": ["ac"] }, "application/pkix-cert": { "source": "iana", "extensions": ["cer"] }, "application/pkix-crl": { "source": "iana", "extensions": ["crl"] }, "application/pkix-pkipath": { "source": "iana", "extensions": ["pkipath"] }, "application/pkixcmp": { "source": "iana", "extensions": ["pki"] }, "application/pls+xml": { "source": "iana", "extensions": ["pls"] }, "application/poc-settings+xml": { "source": "iana" }, "application/postscript": { "source": "iana", "compressible": true, "extensions": ["ai","eps","ps"] }, "application/provenance+xml": { "source": "iana" }, "application/prs.alvestrand.titrax-sheet": { "source": "iana" }, "application/prs.cww": { "source": "iana", "extensions": ["cww"] }, "application/prs.hpub+zip": { "source": "iana" }, "application/prs.nprend": { "source": "iana" }, "application/prs.plucker": { "source": "iana" }, "application/prs.rdf-xml-crypt": { "source": "iana" }, "application/prs.xsf+xml": { "source": "iana" }, "application/pskc+xml": { "source": "iana", "extensions": ["pskcxml"] }, "application/qsig": { "source": "iana" }, "application/raptorfec": { "source": "iana" }, "application/rdf+xml": { "source": "iana", "compressible": true, "extensions": ["rdf"] }, "application/reginfo+xml": { "source": "iana", "extensions": ["rif"] }, "application/relax-ng-compact-syntax": { "source": "iana", "extensions": ["rnc"] }, "application/remote-printing": { "source": "iana" }, "application/reputon+json": { "source": "iana", "compressible": true }, "application/resource-lists+xml": { "source": "iana", "extensions": ["rl"] }, "application/resource-lists-diff+xml": { "source": "iana", "extensions": ["rld"] }, "application/riscos": { "source": "iana" }, "application/rlmi+xml": { "source": "iana" }, "application/rls-services+xml": { "source": "iana", "extensions": ["rs"] }, "application/rpki-ghostbusters": { "source": "iana", "extensions": ["gbr"] }, "application/rpki-manifest": { "source": "iana", "extensions": ["mft"] }, "application/rpki-roa": { "source": "iana", "extensions": ["roa"] }, "application/rpki-updown": { "source": "iana" }, "application/rsd+xml": { "source": "apache", "extensions": ["rsd"] }, "application/rss+xml": { "source": "apache", "compressible": true, "extensions": ["rss"] }, "application/rtf": { "source": "iana", "compressible": true, "extensions": ["rtf"] }, "application/rtploopback": { "source": "iana" }, "application/rtx": { "source": 
"iana" }, "application/samlassertion+xml": { "source": "iana" }, "application/samlmetadata+xml": { "source": "iana" }, "application/sbml+xml": { "source": "iana", "extensions": ["sbml"] }, "application/scaip+xml": { "source": "iana" }, "application/scvp-cv-request": { "source": "iana", "extensions": ["scq"] }, "application/scvp-cv-response": { "source": "iana", "extensions": ["scs"] }, "application/scvp-vp-request": { "source": "iana", "extensions": ["spq"] }, "application/scvp-vp-response": { "source": "iana", "extensions": ["spp"] }, "application/sdp": { "source": "iana", "extensions": ["sdp"] }, "application/sep+xml": { "source": "iana" }, "application/sep-exi": { "source": "iana" }, "application/session-info": { "source": "iana" }, "application/set-payment": { "source": "iana" }, "application/set-payment-initiation": { "source": "iana", "extensions": ["setpay"] }, "application/set-registration": { "source": "iana" }, "application/set-registration-initiation": { "source": "iana", "extensions": ["setreg"] }, "application/sgml": { "source": "iana" }, "application/sgml-open-catalog": { "source": "iana" }, "application/shf+xml": { "source": "iana", "extensions": ["shf"] }, "application/sieve": { "source": "iana" }, "application/simple-filter+xml": { "source": "iana" }, "application/simple-message-summary": { "source": "iana" }, "application/simplesymbolcontainer": { "source": "iana" }, "application/slate": { "source": "iana" }, "application/smil": { "source": "iana" }, "application/smil+xml": { "source": "iana", "extensions": ["smi","smil"] }, "application/smpte336m": { "source": "iana" }, "application/soap+fastinfoset": { "source": "iana" }, "application/soap+xml": { "source": "iana", "compressible": true }, "application/sparql-query": { "source": "iana", "extensions": ["rq"] }, "application/sparql-results+xml": { "source": "iana", "extensions": ["srx"] }, "application/spirits-event+xml": { "source": "iana" }, "application/sql": { "source": "iana" }, "application/srgs": { "source": "iana", "extensions": ["gram"] }, "application/srgs+xml": { "source": "iana", "extensions": ["grxml"] }, "application/sru+xml": { "source": "iana", "extensions": ["sru"] }, "application/ssdl+xml": { "source": "apache", "extensions": ["ssdl"] }, "application/ssml+xml": { "source": "iana", "extensions": ["ssml"] }, "application/tamp-apex-update": { "source": "iana" }, "application/tamp-apex-update-confirm": { "source": "iana" }, "application/tamp-community-update": { "source": "iana" }, "application/tamp-community-update-confirm": { "source": "iana" }, "application/tamp-error": { "source": "iana" }, "application/tamp-sequence-adjust": { "source": "iana" }, "application/tamp-sequence-adjust-confirm": { "source": "iana" }, "application/tamp-status-query": { "source": "iana" }, "application/tamp-status-response": { "source": "iana" }, "application/tamp-update": { "source": "iana" }, "application/tamp-update-confirm": { "source": "iana" }, "application/tar": { "compressible": true }, "application/tei+xml": { "source": "iana", "extensions": ["tei","teicorpus"] }, "application/thraud+xml": { "source": "iana", "extensions": ["tfi"] }, "application/timestamp-query": { "source": "iana" }, "application/timestamp-reply": { "source": "iana" }, "application/timestamped-data": { "source": "iana", "extensions": ["tsd"] }, "application/ttml+xml": { "source": "iana" }, "application/tve-trigger": { "source": "iana" }, "application/ulpfec": { "source": "iana" }, "application/urc-grpsheet+xml": { "source": "iana" }, 
"application/urc-ressheet+xml": { "source": "iana" }, "application/urc-targetdesc+xml": { "source": "iana" }, "application/urc-uisocketdesc+xml": { "source": "iana" }, "application/vcard+json": { "source": "iana", "compressible": true }, "application/vcard+xml": { "source": "iana" }, "application/vemmi": { "source": "iana" }, "application/vividence.scriptfile": { "source": "apache" }, "application/vnd-acucobol": { "source": "iana" }, "application/vnd-curl": { "source": "iana" }, "application/vnd-dart": { "source": "iana" }, "application/vnd-dxr": { "source": "iana" }, "application/vnd-fdf": { "source": "iana" }, "application/vnd-mif": { "source": "iana" }, "application/vnd-sema": { "source": "iana" }, "application/vnd-wap-wmlc": { "source": "iana" }, "application/vnd.3gpp.bsf+xml": { "source": "iana" }, "application/vnd.3gpp.pic-bw-large": { "source": "iana", "extensions": ["plb"] }, "application/vnd.3gpp.pic-bw-small": { "source": "iana", "extensions": ["psb"] }, "application/vnd.3gpp.pic-bw-var": { "source": "iana", "extensions": ["pvb"] }, "application/vnd.3gpp.sms": { "source": "iana" }, "application/vnd.3gpp2.bcmcsinfo+xml": { "source": "iana" }, "application/vnd.3gpp2.sms": { "source": "iana" }, "application/vnd.3gpp2.tcap": { "source": "iana", "extensions": ["tcap"] }, "application/vnd.3m.post-it-notes": { "source": "iana", "extensions": ["pwn"] }, "application/vnd.accpac.simply.aso": { "source": "iana", "extensions": ["aso"] }, "application/vnd.accpac.simply.imp": { "source": "iana", "extensions": ["imp"] }, "application/vnd.acucobol": { "source": "apache", "extensions": ["acu"] }, "application/vnd.acucorp": { "source": "iana", "extensions": ["atc","acutc"] }, "application/vnd.adobe.air-application-installer-package+zip": { "source": "apache", "extensions": ["air"] }, "application/vnd.adobe.flash-movie": { "source": "iana" }, "application/vnd.adobe.formscentral.fcdt": { "source": "iana", "extensions": ["fcdt"] }, "application/vnd.adobe.fxp": { "source": "iana", "extensions": ["fxp","fxpl"] }, "application/vnd.adobe.partial-upload": { "source": "iana" }, "application/vnd.adobe.xdp+xml": { "source": "iana", "extensions": ["xdp"] }, "application/vnd.adobe.xfdf": { "source": "iana", "extensions": ["xfdf"] }, "application/vnd.aether.imp": { "source": "iana" }, "application/vnd.ah-barcode": { "source": "iana" }, "application/vnd.ahead.space": { "source": "iana", "extensions": ["ahead"] }, "application/vnd.airzip.filesecure.azf": { "source": "iana", "extensions": ["azf"] }, "application/vnd.airzip.filesecure.azs": { "source": "iana", "extensions": ["azs"] }, "application/vnd.amazon.ebook": { "source": "apache", "extensions": ["azw"] }, "application/vnd.americandynamics.acc": { "source": "iana", "extensions": ["acc"] }, "application/vnd.amiga.ami": { "source": "iana", "extensions": ["ami"] }, "application/vnd.amundsen.maze+xml": { "source": "iana" }, "application/vnd.android.package-archive": { "source": "apache", "compressible": false, "extensions": ["apk"] }, "application/vnd.anser-web-certificate-issue-initiation": { "source": "iana", "extensions": ["cii"] }, "application/vnd.anser-web-funds-transfer-initiation": { "source": "apache", "extensions": ["fti"] }, "application/vnd.antix.game-component": { "source": "iana", "extensions": ["atx"] }, "application/vnd.apache.thrift.binary": { "source": "iana" }, "application/vnd.apache.thrift.compact": { "source": "iana" }, "application/vnd.apache.thrift.json": { "source": "iana" }, "application/vnd.api+json": { "source": "iana", "compressible": 
true }, "application/vnd.apple.installer+xml": { "source": "iana", "extensions": ["mpkg"] }, "application/vnd.apple.mpegurl": { "source": "iana", "extensions": ["m3u8"] }, "application/vnd.arastra.swi": { "source": "iana" }, "application/vnd.aristanetworks.swi": { "source": "iana", "extensions": ["swi"] }, "application/vnd.artsquare": { "source": "iana" }, "application/vnd.astraea-software.iota": { "source": "iana", "extensions": ["iota"] }, "application/vnd.audiograph": { "source": "iana", "extensions": ["aep"] }, "application/vnd.autopackage": { "source": "iana" }, "application/vnd.avistar+xml": { "source": "iana" }, "application/vnd.balsamiq.bmml+xml": { "source": "iana" }, "application/vnd.bekitzur-stech+json": { "source": "iana", "compressible": true }, "application/vnd.blueice.multipass": { "source": "iana", "extensions": ["mpm"] }, "application/vnd.bluetooth.ep.oob": { "source": "iana" }, "application/vnd.bluetooth.le.oob": { "source": "iana" }, "application/vnd.bmi": { "source": "iana", "extensions": ["bmi"] }, "application/vnd.businessobjects": { "source": "iana", "extensions": ["rep"] }, "application/vnd.cab-jscript": { "source": "iana" }, "application/vnd.canon-cpdl": { "source": "iana" }, "application/vnd.canon-lips": { "source": "iana" }, "application/vnd.cendio.thinlinc.clientconf": { "source": "iana" }, "application/vnd.century-systems.tcp_stream": { "source": "iana" }, "application/vnd.chemdraw+xml": { "source": "iana", "extensions": ["cdxml"] }, "application/vnd.chipnuts.karaoke-mmd": { "source": "iana", "extensions": ["mmd"] }, "application/vnd.cinderella": { "source": "iana", "extensions": ["cdy"] }, "application/vnd.cirpack.isdn-ext": { "source": "iana" }, "application/vnd.claymore": { "source": "iana", "extensions": ["cla"] }, "application/vnd.cloanto.rp9": { "source": "iana", "extensions": ["rp9"] }, "application/vnd.clonk.c4group": { "source": "iana", "extensions": ["c4g","c4d","c4f","c4p","c4u"] }, "application/vnd.cluetrust.cartomobile-config": { "source": "iana", "extensions": ["c11amc"] }, "application/vnd.cluetrust.cartomobile-config-pkg": { "source": "iana", "extensions": ["c11amz"] }, "application/vnd.coffeescript": { "source": "iana" }, "application/vnd.collection+json": { "source": "iana", "compressible": true }, "application/vnd.collection.doc+json": { "source": "iana", "compressible": true }, "application/vnd.collection.next+json": { "source": "iana", "compressible": true }, "application/vnd.commerce-battelle": { "source": "iana" }, "application/vnd.commonspace": { "source": "iana", "extensions": ["csp"] }, "application/vnd.contact.cmsg": { "source": "iana", "extensions": ["cdbcmsg"] }, "application/vnd.cosmocaller": { "source": "iana", "extensions": ["cmc"] }, "application/vnd.crick.clicker": { "source": "iana", "extensions": ["clkx"] }, "application/vnd.crick.clicker.keyboard": { "source": "iana", "extensions": ["clkk"] }, "application/vnd.crick.clicker.palette": { "source": "iana", "extensions": ["clkp"] }, "application/vnd.crick.clicker.template": { "source": "iana", "extensions": ["clkt"] }, "application/vnd.crick.clicker.wordbank": { "source": "iana", "extensions": ["clkw"] }, "application/vnd.criticaltools.wbs+xml": { "source": "iana", "extensions": ["wbs"] }, "application/vnd.ctc-posml": { "source": "iana", "extensions": ["pml"] }, "application/vnd.ctct.ws+xml": { "source": "iana" }, "application/vnd.cups-pdf": { "source": "iana" }, "application/vnd.cups-postscript": { "source": "iana" }, "application/vnd.cups-ppd": { "source": "iana", 
"extensions": ["ppd"] }, "application/vnd.cups-raster": { "source": "iana" }, "application/vnd.cups-raw": { "source": "iana" }, "application/vnd.curl": { "source": "apache" }, "application/vnd.curl.car": { "source": "apache", "extensions": ["car"] }, "application/vnd.curl.pcurl": { "source": "apache", "extensions": ["pcurl"] }, "application/vnd.cyan.dean.root+xml": { "source": "iana" }, "application/vnd.cybank": { "source": "iana" }, "application/vnd.dart": { "source": "apache", "compressible": true, "extensions": ["dart"] }, "application/vnd.data-vision.rdz": { "source": "iana", "extensions": ["rdz"] }, "application/vnd.debian.binary-package": { "source": "iana" }, "application/vnd.dece-zip": { "source": "iana" }, "application/vnd.dece.data": { "source": "iana", "extensions": ["uvf","uvvf","uvd","uvvd"] }, "application/vnd.dece.ttml+xml": { "source": "iana", "extensions": ["uvt","uvvt"] }, "application/vnd.dece.unspecified": { "source": "iana", "extensions": ["uvx","uvvx"] }, "application/vnd.dece.zip": { "source": "apache", "extensions": ["uvz","uvvz"] }, "application/vnd.denovo.fcselayout-link": { "source": "iana", "extensions": ["fe_launch"] }, "application/vnd.desmume-movie": { "source": "iana" }, "application/vnd.dir-bi.plate-dl-nosuffix": { "source": "iana" }, "application/vnd.dm.delegation+xml": { "source": "iana" }, "application/vnd.dna": { "source": "iana", "extensions": ["dna"] }, "application/vnd.document+json": { "source": "iana", "compressible": true }, "application/vnd.dolby.mlp": { "source": "apache", "extensions": ["mlp"] }, "application/vnd.dolby.mobile.1": { "source": "iana" }, "application/vnd.dolby.mobile.2": { "source": "iana" }, "application/vnd.doremir.scorecloud-binary-document": { "source": "iana" }, "application/vnd.dpgraph": { "source": "iana", "extensions": ["dpg"] }, "application/vnd.dreamfactory": { "source": "iana", "extensions": ["dfac"] }, "application/vnd.ds-keypoint": { "source": "apache", "extensions": ["kpxx"] }, "application/vnd.dtg.local": { "source": "iana" }, "application/vnd.dtg.local.flash": { "source": "iana" }, "application/vnd.dtg.local.html": { "source": "iana" }, "application/vnd.dvb.ait": { "source": "iana", "extensions": ["ait"] }, "application/vnd.dvb.dvbj": { "source": "iana" }, "application/vnd.dvb.esgcontainer": { "source": "iana" }, "application/vnd.dvb.ipdcdftnotifaccess": { "source": "iana" }, "application/vnd.dvb.ipdcesgaccess": { "source": "iana" }, "application/vnd.dvb.ipdcesgaccess2": { "source": "iana" }, "application/vnd.dvb.ipdcesgpdd": { "source": "iana" }, "application/vnd.dvb.ipdcroaming": { "source": "iana" }, "application/vnd.dvb.iptv.alfec-base": { "source": "iana" }, "application/vnd.dvb.iptv.alfec-enhancement": { "source": "iana" }, "application/vnd.dvb.notif-aggregate-root+xml": { "source": "iana" }, "application/vnd.dvb.notif-container+xml": { "source": "iana" }, "application/vnd.dvb.notif-generic+xml": { "source": "iana" }, "application/vnd.dvb.notif-ia-msglist+xml": { "source": "iana" }, "application/vnd.dvb.notif-ia-registration-request+xml": { "source": "iana" }, "application/vnd.dvb.notif-ia-registration-response+xml": { "source": "iana" }, "application/vnd.dvb.notif-init+xml": { "source": "iana" }, "application/vnd.dvb.pfr": { "source": "iana" }, "application/vnd.dvb.service": { "source": "apache", "extensions": ["svc"] }, "application/vnd.dvb_service": { "source": "iana" }, "application/vnd.dxr": { "source": "apache" }, "application/vnd.dynageo": { "source": "iana", "extensions": ["geo"] }, 
"application/vnd.dzr": { "source": "iana" }, "application/vnd.easykaraoke.cdgdownload": { "source": "iana" }, "application/vnd.ecdis-update": { "source": "iana" }, "application/vnd.ecowin.chart": { "source": "iana", "extensions": ["mag"] }, "application/vnd.ecowin.filerequest": { "source": "iana" }, "application/vnd.ecowin.fileupdate": { "source": "iana" }, "application/vnd.ecowin.series": { "source": "iana" }, "application/vnd.ecowin.seriesrequest": { "source": "iana" }, "application/vnd.ecowin.seriesupdate": { "source": "iana" }, "application/vnd.emclient.accessrequest+xml": { "source": "iana" }, "application/vnd.enliven": { "source": "iana", "extensions": ["nml"] }, "application/vnd.enphase.envoy": { "source": "iana" }, "application/vnd.eprints.data+xml": { "source": "iana" }, "application/vnd.epson.esf": { "source": "iana", "extensions": ["esf"] }, "application/vnd.epson.msf": { "source": "iana", "extensions": ["msf"] }, "application/vnd.epson.quickanime": { "source": "iana", "extensions": ["qam"] }, "application/vnd.epson.salt": { "source": "iana", "extensions": ["slt"] }, "application/vnd.epson.ssf": { "source": "iana", "extensions": ["ssf"] }, "application/vnd.ericsson.quickcall": { "source": "iana" }, "application/vnd.eszigno3+xml": { "source": "iana", "extensions": ["es3","et3"] }, "application/vnd.etsi.aoc+xml": { "source": "iana" }, "application/vnd.etsi.asic-e+zip": { "source": "iana" }, "application/vnd.etsi.asic-s+zip": { "source": "iana" }, "application/vnd.etsi.cug+xml": { "source": "iana" }, "application/vnd.etsi.iptvcommand+xml": { "source": "iana" }, "application/vnd.etsi.iptvdiscovery+xml": { "source": "iana" }, "application/vnd.etsi.iptvprofile+xml": { "source": "iana" }, "application/vnd.etsi.iptvsad-bc+xml": { "source": "iana" }, "application/vnd.etsi.iptvsad-cod+xml": { "source": "iana" }, "application/vnd.etsi.iptvsad-npvr+xml": { "source": "iana" }, "application/vnd.etsi.iptvservice+xml": { "source": "iana" }, "application/vnd.etsi.iptvsync+xml": { "source": "iana" }, "application/vnd.etsi.iptvueprofile+xml": { "source": "iana" }, "application/vnd.etsi.mcid+xml": { "source": "iana" }, "application/vnd.etsi.mheg5": { "source": "iana" }, "application/vnd.etsi.overload-control-policy-dataset+xml": { "source": "iana" }, "application/vnd.etsi.pstn+xml": { "source": "iana" }, "application/vnd.etsi.sci+xml": { "source": "iana" }, "application/vnd.etsi.simservs+xml": { "source": "iana" }, "application/vnd.etsi.timestamp-token": { "source": "iana" }, "application/vnd.etsi.tsl+xml": { "source": "iana" }, "application/vnd.etsi.tsl.der": { "source": "iana" }, "application/vnd.eudora.data": { "source": "iana" }, "application/vnd.ezpix-album": { "source": "iana", "extensions": ["ez2"] }, "application/vnd.ezpix-package": { "source": "iana", "extensions": ["ez3"] }, "application/vnd.f-secure.mobile": { "source": "iana" }, "application/vnd.fdf": { "source": "apache", "extensions": ["fdf"] }, "application/vnd.fdsn.mseed": { "source": "iana", "extensions": ["mseed"] }, "application/vnd.fdsn.seed": { "source": "iana", "extensions": ["seed","dataless"] }, "application/vnd.ffsns": { "source": "iana" }, "application/vnd.fints": { "source": "iana" }, "application/vnd.flographit": { "source": "iana", "extensions": ["gph"] }, "application/vnd.fluxtime.clip": { "source": "iana", "extensions": ["ftc"] }, "application/vnd.font-fontforge-sfd": { "source": "iana" }, "application/vnd.framemaker": { "source": "iana", "extensions": ["fm","frame","maker","book"] }, "application/vnd.frogans.fnc": { 
"source": "iana", "extensions": ["fnc"] }, "application/vnd.frogans.ltf": { "source": "iana", "extensions": ["ltf"] }, "application/vnd.fsc.weblaunch": { "source": "iana", "extensions": ["fsc"] }, "application/vnd.fujitsu.oasys": { "source": "iana", "extensions": ["oas"] }, "application/vnd.fujitsu.oasys2": { "source": "iana", "extensions": ["oa2"] }, "application/vnd.fujitsu.oasys3": { "source": "iana", "extensions": ["oa3"] }, "application/vnd.fujitsu.oasysgp": { "source": "iana", "extensions": ["fg5"] }, "application/vnd.fujitsu.oasysprs": { "source": "iana", "extensions": ["bh2"] }, "application/vnd.fujixerox.art-ex": { "source": "iana" }, "application/vnd.fujixerox.art4": { "source": "iana" }, "application/vnd.fujixerox.ddd": { "source": "iana", "extensions": ["ddd"] }, "application/vnd.fujixerox.docuworks": { "source": "iana", "extensions": ["xdw"] }, "application/vnd.fujixerox.docuworks.binder": { "source": "iana", "extensions": ["xbd"] }, "application/vnd.fujixerox.docuworks.container": { "source": "iana" }, "application/vnd.fujixerox.hbpl": { "source": "iana" }, "application/vnd.fut-misnet": { "source": "iana" }, "application/vnd.fuzzysheet": { "source": "iana", "extensions": ["fzs"] }, "application/vnd.genomatix.tuxedo": { "source": "iana", "extensions": ["txd"] }, "application/vnd.geo+json": { "source": "iana", "compressible": true }, "application/vnd.geocube+xml": { "source": "iana" }, "application/vnd.geogebra.file": { "source": "iana", "extensions": ["ggb"] }, "application/vnd.geogebra.tool": { "source": "iana", "extensions": ["ggt"] }, "application/vnd.geometry-explorer": { "source": "iana", "extensions": ["gex","gre"] }, "application/vnd.geonext": { "source": "iana", "extensions": ["gxt"] }, "application/vnd.geoplan": { "source": "iana", "extensions": ["g2w"] }, "application/vnd.geospace": { "source": "iana", "extensions": ["g3w"] }, "application/vnd.globalplatform.card-content-mgt": { "source": "iana" }, "application/vnd.globalplatform.card-content-mgt-response": { "source": "iana" }, "application/vnd.gmx": { "source": "iana", "extensions": ["gmx"] }, "application/vnd.google-earth.kml+xml": { "source": "iana", "compressible": true, "extensions": ["kml"] }, "application/vnd.google-earth.kmz": { "source": "iana", "compressible": false, "extensions": ["kmz"] }, "application/vnd.gov.sk.e-form+zip": { "source": "iana" }, "application/vnd.grafeq": { "source": "iana", "extensions": ["gqf","gqs"] }, "application/vnd.gridmp": { "source": "iana" }, "application/vnd.groove-account": { "source": "iana", "extensions": ["gac"] }, "application/vnd.groove-help": { "source": "iana", "extensions": ["ghf"] }, "application/vnd.groove-identity-message": { "source": "iana", "extensions": ["gim"] }, "application/vnd.groove-injector": { "source": "iana", "extensions": ["grv"] }, "application/vnd.groove-tool-message": { "source": "iana", "extensions": ["gtm"] }, "application/vnd.groove-tool-template": { "source": "iana", "extensions": ["tpl"] }, "application/vnd.groove-vcard": { "source": "iana", "extensions": ["vcg"] }, "application/vnd.hal+json": { "source": "iana", "compressible": true }, "application/vnd.hal+xml": { "source": "iana", "extensions": ["hal"] }, "application/vnd.handheld-entertainment+xml": { "source": "iana", "extensions": ["zmm"] }, "application/vnd.hbci": { "source": "iana", "extensions": ["hbci"] }, "application/vnd.hcl-bireports": { "source": "iana" }, "application/vnd.heroku+json": { "source": "iana", "compressible": true }, "application/vnd.hhe.lesson-player": { "source": 
"iana", "extensions": ["les"] }, "application/vnd.hp-hpgl": { "source": "iana", "extensions": ["hpgl"] }, "application/vnd.hp-hpid": { "source": "iana", "extensions": ["hpid"] }, "application/vnd.hp-hps": { "source": "iana", "extensions": ["hps"] }, "application/vnd.hp-jlyt": { "source": "iana", "extensions": ["jlt"] }, "application/vnd.hp-pcl": { "source": "iana", "extensions": ["pcl"] }, "application/vnd.hp-pclxl": { "source": "iana", "extensions": ["pclxl"] }, "application/vnd.httphone": { "source": "iana" }, "application/vnd.hydrostatix.sof-data": { "source": "iana" }, "application/vnd.hzn-3d-crossword": { "source": "iana" }, "application/vnd.ibm.afplinedata": { "source": "iana" }, "application/vnd.ibm.electronic-media": { "source": "iana" }, "application/vnd.ibm.minipay": { "source": "iana", "extensions": ["mpy"] }, "application/vnd.ibm.modcap": { "source": "iana", "extensions": ["afp","listafp","list3820"] }, "application/vnd.ibm.rights-management": { "source": "iana", "extensions": ["irm"] }, "application/vnd.ibm.secure-container": { "source": "iana", "extensions": ["sc"] }, "application/vnd.iccprofile": { "source": "iana", "extensions": ["icc","icm"] }, "application/vnd.ieee.1905": { "source": "iana" }, "application/vnd.igloader": { "source": "iana", "extensions": ["igl"] }, "application/vnd.immervision-ivp": { "source": "iana", "extensions": ["ivp"] }, "application/vnd.immervision-ivu": { "source": "iana", "extensions": ["ivu"] }, "application/vnd.ims.imsccv1p1": { "source": "iana" }, "application/vnd.ims.lis.v2.result+json": { "source": "iana", "compressible": true }, "application/vnd.ims.lti.v2.toolconsumerprofile+json": { "source": "iana", "compressible": true }, "application/vnd.ims.lti.v2.toolproxy+json": { "source": "iana", "compressible": true }, "application/vnd.ims.lti.v2.toolproxy.id+json": { "source": "iana", "compressible": true }, "application/vnd.ims.lti.v2.toolsettings+json": { "source": "iana", "compressible": true }, "application/vnd.ims.lti.v2.toolsettings.simple+json": { "source": "iana", "compressible": true }, "application/vnd.informedcontrol.rms+xml": { "source": "iana" }, "application/vnd.informix-visionary": { "source": "iana" }, "application/vnd.infotech.project": { "source": "iana" }, "application/vnd.infotech.project+xml": { "source": "iana" }, "application/vnd.innopath.wamp.notification": { "source": "iana" }, "application/vnd.insors.igm": { "source": "iana", "extensions": ["igm"] }, "application/vnd.intercon.formnet": { "source": "iana", "extensions": ["xpw","xpx"] }, "application/vnd.intergeo": { "source": "iana", "extensions": ["i2g"] }, "application/vnd.intertrust.digibox": { "source": "iana" }, "application/vnd.intertrust.nncp": { "source": "iana" }, "application/vnd.intu.qbo": { "source": "iana", "extensions": ["qbo"] }, "application/vnd.intu.qfx": { "source": "iana", "extensions": ["qfx"] }, "application/vnd.iptc.g2.catalogitem+xml": { "source": "iana" }, "application/vnd.iptc.g2.conceptitem+xml": { "source": "iana" }, "application/vnd.iptc.g2.knowledgeitem+xml": { "source": "iana" }, "application/vnd.iptc.g2.newsitem+xml": { "source": "iana" }, "application/vnd.iptc.g2.newsmessage+xml": { "source": "iana" }, "application/vnd.iptc.g2.packageitem+xml": { "source": "iana" }, "application/vnd.iptc.g2.planningitem+xml": { "source": "iana" }, "application/vnd.ipunplugged.rcprofile": { "source": "iana", "extensions": ["rcprofile"] }, "application/vnd.irepository.package+xml": { "source": "iana", "extensions": ["irp"] }, "application/vnd.is-xpr": { 
"source": "iana", "extensions": ["xpr"] }, "application/vnd.isac.fcs": { "source": "iana", "extensions": ["fcs"] }, "application/vnd.jam": { "source": "iana", "extensions": ["jam"] }, "application/vnd.japannet-directory-service": { "source": "iana" }, "application/vnd.japannet-jpnstore-wakeup": { "source": "iana" }, "application/vnd.japannet-payment-wakeup": { "source": "iana" }, "application/vnd.japannet-registration": { "source": "iana" }, "application/vnd.japannet-registration-wakeup": { "source": "iana" }, "application/vnd.japannet-setstore-wakeup": { "source": "iana" }, "application/vnd.japannet-verification": { "source": "iana" }, "application/vnd.japannet-verification-wakeup": { "source": "iana" }, "application/vnd.jcp.javame.midlet-rms": { "source": "iana", "extensions": ["rms"] }, "application/vnd.jisp": { "source": "iana", "extensions": ["jisp"] }, "application/vnd.joost.joda-archive": { "source": "iana", "extensions": ["joda"] }, "application/vnd.jsk.isdn-ngn": { "source": "iana" }, "application/vnd.kahootz": { "source": "iana", "extensions": ["ktz","ktr"] }, "application/vnd.kde.karbon": { "source": "iana", "extensions": ["karbon"] }, "application/vnd.kde.kchart": { "source": "iana", "extensions": ["chrt"] }, "application/vnd.kde.kformula": { "source": "iana", "extensions": ["kfo"] }, "application/vnd.kde.kivio": { "source": "iana", "extensions": ["flw"] }, "application/vnd.kde.kontour": { "source": "iana", "extensions": ["kon"] }, "application/vnd.kde.kpresenter": { "source": "iana", "extensions": ["kpr","kpt"] }, "application/vnd.kde.kspread": { "source": "iana", "extensions": ["ksp"] }, "application/vnd.kde.kword": { "source": "iana", "extensions": ["kwd","kwt"] }, "application/vnd.kenameaapp": { "source": "iana", "extensions": ["htke"] }, "application/vnd.kidspiration": { "source": "iana", "extensions": ["kia"] }, "application/vnd.kinar": { "source": "iana", "extensions": ["kne","knp"] }, "application/vnd.koan": { "source": "iana", "extensions": ["skp","skd","skt","skm"] }, "application/vnd.kodak-descriptor": { "source": "iana", "extensions": ["sse"] }, "application/vnd.las.las+xml": { "source": "iana", "extensions": ["lasxml"] }, "application/vnd.liberty-request+xml": { "source": "iana" }, "application/vnd.llamagraphics.life-balance.desktop": { "source": "iana", "extensions": ["lbd"] }, "application/vnd.llamagraphics.life-balance.exchange+xml": { "source": "iana", "extensions": ["lbe"] }, "application/vnd.lotus-1-2-3": { "source": "iana", "extensions": ["123"] }, "application/vnd.lotus-approach": { "source": "iana", "extensions": ["apr"] }, "application/vnd.lotus-freelance": { "source": "iana", "extensions": ["pre"] }, "application/vnd.lotus-notes": { "source": "iana", "extensions": ["nsf"] }, "application/vnd.lotus-organizer": { "source": "iana", "extensions": ["org"] }, "application/vnd.lotus-screencam": { "source": "iana", "extensions": ["scm"] }, "application/vnd.lotus-wordpro": { "source": "iana", "extensions": ["lwp"] }, "application/vnd.macports.portpkg": { "source": "iana", "extensions": ["portpkg"] }, "application/vnd.marlin.drm.actiontoken+xml": { "source": "iana" }, "application/vnd.marlin.drm.conftoken+xml": { "source": "iana" }, "application/vnd.marlin.drm.license+xml": { "source": "iana" }, "application/vnd.marlin.drm.mdcf": { "source": "iana" }, "application/vnd.mason+json": { "source": "iana", "compressible": true }, "application/vnd.maxmind.maxmind-db": { "source": "iana" }, "application/vnd.mcd": { "source": "iana", "extensions": ["mcd"] }, 
"application/vnd.medcalcdata": { "source": "iana", "extensions": ["mc1"] }, "application/vnd.mediastation.cdkey": { "source": "iana", "extensions": ["cdkey"] }, "application/vnd.meridian-slingshot": { "source": "iana" }, "application/vnd.mfer": { "source": "iana", "extensions": ["mwf"] }, "application/vnd.mfmp": { "source": "iana", "extensions": ["mfm"] }, "application/vnd.micrografx-igx": { "source": "iana" }, "application/vnd.micrografx.flo": { "source": "iana", "extensions": ["flo"] }, "application/vnd.micrografx.igx": { "source": "apache", "extensions": ["igx"] }, "application/vnd.miele+json": { "source": "iana", "compressible": true }, "application/vnd.mif": { "source": "apache", "extensions": ["mif"] }, "application/vnd.minisoft-hp3000-save": { "source": "iana" }, "application/vnd.mitsubishi.misty-guard.trustweb": { "source": "iana" }, "application/vnd.mobius.daf": { "source": "iana", "extensions": ["daf"] }, "application/vnd.mobius.dis": { "source": "iana", "extensions": ["dis"] }, "application/vnd.mobius.mbk": { "source": "iana", "extensions": ["mbk"] }, "application/vnd.mobius.mqy": { "source": "iana", "extensions": ["mqy"] }, "application/vnd.mobius.msl": { "source": "iana", "extensions": ["msl"] }, "application/vnd.mobius.plc": { "source": "iana", "extensions": ["plc"] }, "application/vnd.mobius.txf": { "source": "iana", "extensions": ["txf"] }, "application/vnd.mophun.application": { "source": "iana", "extensions": ["mpn"] }, "application/vnd.mophun.certificate": { "source": "iana", "extensions": ["mpc"] }, "application/vnd.motorola.flexsuite": { "source": "iana" }, "application/vnd.motorola.flexsuite.adsi": { "source": "iana" }, "application/vnd.motorola.flexsuite.fis": { "source": "iana" }, "application/vnd.motorola.flexsuite.gotap": { "source": "iana" }, "application/vnd.motorola.flexsuite.kmr": { "source": "iana" }, "application/vnd.motorola.flexsuite.ttc": { "source": "iana" }, "application/vnd.motorola.flexsuite.wem": { "source": "iana" }, "application/vnd.motorola.iprm": { "source": "iana" }, "application/vnd.mozilla.xul+xml": { "source": "iana", "compressible": true, "extensions": ["xul"] }, "application/vnd.ms-3mfdocument": { "source": "iana" }, "application/vnd.ms-artgalry": { "source": "iana", "extensions": ["cil"] }, "application/vnd.ms-asf": { "source": "iana" }, "application/vnd.ms-cab-compressed": { "source": "iana", "extensions": ["cab"] }, "application/vnd.ms-color.iccprofile": { "source": "apache" }, "application/vnd.ms-excel": { "source": "iana", "compressible": false, "extensions": ["xls","xlm","xla","xlc","xlt","xlw"] }, "application/vnd.ms-excel.addin.macroenabled.12": { "source": "iana", "extensions": ["xlam"] }, "application/vnd.ms-excel.sheet.binary.macroenabled.12": { "source": "iana", "extensions": ["xlsb"] }, "application/vnd.ms-excel.sheet.macroenabled.12": { "source": "iana", "extensions": ["xlsm"] }, "application/vnd.ms-excel.template.macroenabled.12": { "source": "iana", "extensions": ["xltm"] }, "application/vnd.ms-fontobject": { "source": "iana", "compressible": true, "extensions": ["eot"] }, "application/vnd.ms-htmlhelp": { "source": "iana", "extensions": ["chm"] }, "application/vnd.ms-ims": { "source": "iana", "extensions": ["ims"] }, "application/vnd.ms-lrm": { "source": "iana", "extensions": ["lrm"] }, "application/vnd.ms-office.activex+xml": { "source": "iana" }, "application/vnd.ms-officetheme": { "source": "iana", "extensions": ["thmx"] }, "application/vnd.ms-opentype": { "source": "apache", "compressible": true }, 
"application/vnd.ms-package.obfuscated-opentype": { "source": "apache" }, "application/vnd.ms-pki.seccat": { "source": "apache", "extensions": ["cat"] }, "application/vnd.ms-pki.stl": { "source": "apache", "extensions": ["stl"] }, "application/vnd.ms-playready.initiator+xml": { "source": "iana" }, "application/vnd.ms-powerpoint": { "source": "iana", "compressible": false, "extensions": ["ppt","pps","pot"] }, "application/vnd.ms-powerpoint.addin.macroenabled.12": { "source": "iana", "extensions": ["ppam"] }, "application/vnd.ms-powerpoint.presentation.macroenabled.12": { "source": "iana", "extensions": ["pptm"] }, "application/vnd.ms-powerpoint.slide.macroenabled.12": { "source": "iana", "extensions": ["sldm"] }, "application/vnd.ms-powerpoint.slideshow.macroenabled.12": { "source": "iana", "extensions": ["ppsm"] }, "application/vnd.ms-powerpoint.template.macroenabled.12": { "source": "iana", "extensions": ["potm"] }, "application/vnd.ms-printing.printticket+xml": { "source": "apache" }, "application/vnd.ms-project": { "source": "iana", "extensions": ["mpp","mpt"] }, "application/vnd.ms-tnef": { "source": "iana" }, "application/vnd.ms-windows.printerpairing": { "source": "iana" }, "application/vnd.ms-wmdrm.lic-chlg-req": { "source": "iana" }, "application/vnd.ms-wmdrm.lic-resp": { "source": "iana" }, "application/vnd.ms-wmdrm.meter-chlg-req": { "source": "iana" }, "application/vnd.ms-wmdrm.meter-resp": { "source": "iana" }, "application/vnd.ms-word.document.macroenabled.12": { "source": "iana", "extensions": ["docm"] }, "application/vnd.ms-word.template.macroenabled.12": { "source": "iana", "extensions": ["dotm"] }, "application/vnd.ms-works": { "source": "iana", "extensions": ["wps","wks","wcm","wdb"] }, "application/vnd.ms-wpl": { "source": "iana", "extensions": ["wpl"] }, "application/vnd.ms-xpsdocument": { "source": "iana", "compressible": false, "extensions": ["xps"] }, "application/vnd.mseq": { "source": "iana", "extensions": ["mseq"] }, "application/vnd.msign": { "source": "iana" }, "application/vnd.multiad.creator": { "source": "iana" }, "application/vnd.multiad.creator.cif": { "source": "iana" }, "application/vnd.music-niff": { "source": "iana" }, "application/vnd.musician": { "source": "iana", "extensions": ["mus"] }, "application/vnd.muvee.style": { "source": "iana", "extensions": ["msty"] }, "application/vnd.mynfc": { "source": "iana", "extensions": ["taglet"] }, "application/vnd.ncd.control": { "source": "iana" }, "application/vnd.ncd.reference": { "source": "iana" }, "application/vnd.nervana": { "source": "iana" }, "application/vnd.netfpx": { "source": "iana" }, "application/vnd.neurolanguage.nlu": { "source": "iana", "extensions": ["nlu"] }, "application/vnd.nintendo.nitro.rom": { "source": "iana" }, "application/vnd.nintendo.snes.rom": { "source": "iana" }, "application/vnd.nitf": { "source": "iana", "extensions": ["ntf","nitf"] }, "application/vnd.noblenet-directory": { "source": "iana", "extensions": ["nnd"] }, "application/vnd.noblenet-sealer": { "source": "iana", "extensions": ["nns"] }, "application/vnd.noblenet-web": { "source": "iana", "extensions": ["nnw"] }, "application/vnd.nokia.catalogs": { "source": "iana" }, "application/vnd.nokia.conml+wbxml": { "source": "iana" }, "application/vnd.nokia.conml+xml": { "source": "iana" }, "application/vnd.nokia.iptv.config+xml": { "source": "iana" }, "application/vnd.nokia.isds-radio-presets": { "source": "iana" }, "application/vnd.nokia.landmark+wbxml": { "source": "iana" }, "application/vnd.nokia.landmark+xml": { "source": 
"iana" }, "application/vnd.nokia.landmarkcollection+xml": { "source": "iana" }, "application/vnd.nokia.n-gage.ac+xml": { "source": "iana" }, "application/vnd.nokia.n-gage.data": { "source": "iana", "extensions": ["ngdat"] }, "application/vnd.nokia.n-gage.symbian.install": { "source": "iana" }, "application/vnd.nokia.ncd": { "source": "iana" }, "application/vnd.nokia.pcd+wbxml": { "source": "iana" }, "application/vnd.nokia.pcd+xml": { "source": "iana" }, "application/vnd.nokia.radio-preset": { "source": "iana", "extensions": ["rpst"] }, "application/vnd.nokia.radio-presets": { "source": "iana", "extensions": ["rpss"] }, "application/vnd.novadigm.edm": { "source": "iana", "extensions": ["edm"] }, "application/vnd.novadigm.edx": { "source": "iana", "extensions": ["edx"] }, "application/vnd.novadigm.ext": { "source": "iana", "extensions": ["ext"] }, "application/vnd.ntt-local.content-share": { "source": "iana" }, "application/vnd.ntt-local.file-transfer": { "source": "iana" }, "application/vnd.ntt-local.ogw_remote-access": { "source": "iana" }, "application/vnd.ntt-local.sip-ta_remote": { "source": "iana" }, "application/vnd.ntt-local.sip-ta_tcp_stream": { "source": "iana" }, "application/vnd.oasis.opendocument.chart": { "source": "iana", "extensions": ["odc"] }, "application/vnd.oasis.opendocument.chart-template": { "source": "iana", "extensions": ["otc"] }, "application/vnd.oasis.opendocument.database": { "source": "iana", "extensions": ["odb"] }, "application/vnd.oasis.opendocument.formula": { "source": "iana", "extensions": ["odf"] }, "application/vnd.oasis.opendocument.formula-template": { "source": "iana", "extensions": ["odft"] }, "application/vnd.oasis.opendocument.graphics": { "source": "iana", "compressible": false, "extensions": ["odg"] }, "application/vnd.oasis.opendocument.graphics-template": { "source": "iana", "extensions": ["otg"] }, "application/vnd.oasis.opendocument.image": { "source": "iana", "extensions": ["odi"] }, "application/vnd.oasis.opendocument.image-template": { "source": "iana", "extensions": ["oti"] }, "application/vnd.oasis.opendocument.presentation": { "source": "iana", "compressible": false, "extensions": ["odp"] }, "application/vnd.oasis.opendocument.presentation-template": { "source": "iana", "extensions": ["otp"] }, "application/vnd.oasis.opendocument.spreadsheet": { "source": "iana", "compressible": false, "extensions": ["ods"] }, "application/vnd.oasis.opendocument.spreadsheet-template": { "source": "iana", "extensions": ["ots"] }, "application/vnd.oasis.opendocument.text": { "source": "iana", "compressible": false, "extensions": ["odt"] }, "application/vnd.oasis.opendocument.text-master": { "source": "iana", "extensions": ["odm"] }, "application/vnd.oasis.opendocument.text-template": { "source": "iana", "extensions": ["ott"] }, "application/vnd.oasis.opendocument.text-web": { "source": "iana", "extensions": ["oth"] }, "application/vnd.obn": { "source": "iana" }, "application/vnd.oftn.l10n+json": { "source": "iana", "compressible": true }, "application/vnd.oipf.contentaccessdownload+xml": { "source": "iana" }, "application/vnd.oipf.contentaccessstreaming+xml": { "source": "iana" }, "application/vnd.oipf.cspg-hexbinary": { "source": "iana" }, "application/vnd.oipf.dae.svg+xml": { "source": "iana" }, "application/vnd.oipf.dae.xhtml+xml": { "source": "iana" }, "application/vnd.oipf.mippvcontrolmessage+xml": { "source": "iana" }, "application/vnd.oipf.pae.gem": { "source": "iana" }, "application/vnd.oipf.spdiscovery+xml": { "source": "iana" }, 
"application/vnd.oipf.spdlist+xml": { "source": "iana" }, "application/vnd.oipf.ueprofile+xml": { "source": "iana" }, "application/vnd.oipf.userprofile+xml": { "source": "iana" }, "application/vnd.olpc-sugar": { "source": "iana", "extensions": ["xo"] }, "application/vnd.oma-scws-config": { "source": "iana" }, "application/vnd.oma-scws-http-request": { "source": "iana" }, "application/vnd.oma-scws-http-response": { "source": "iana" }, "application/vnd.oma.bcast.associated-procedure-parameter+xml": { "source": "iana" }, "application/vnd.oma.bcast.drm-trigger+xml": { "source": "iana" }, "application/vnd.oma.bcast.imd+xml": { "source": "iana" }, "application/vnd.oma.bcast.ltkm": { "source": "iana" }, "application/vnd.oma.bcast.notification+xml": { "source": "iana" }, "application/vnd.oma.bcast.provisioningtrigger": { "source": "iana" }, "application/vnd.oma.bcast.sgboot": { "source": "iana" }, "application/vnd.oma.bcast.sgdd+xml": { "source": "iana" }, "application/vnd.oma.bcast.sgdu": { "source": "iana" }, "application/vnd.oma.bcast.simple-symbol-container": { "source": "iana" }, "application/vnd.oma.bcast.smartcard-trigger+xml": { "source": "iana" }, "application/vnd.oma.bcast.sprov+xml": { "source": "iana" }, "application/vnd.oma.bcast.stkm": { "source": "iana" }, "application/vnd.oma.cab-address-book+xml": { "source": "iana" }, "application/vnd.oma.cab-feature-handler+xml": { "source": "iana" }, "application/vnd.oma.cab-pcc+xml": { "source": "iana" }, "application/vnd.oma.cab-subs-invite+xml": { "source": "iana" }, "application/vnd.oma.cab-user-prefs+xml": { "source": "iana" }, "application/vnd.oma.dcd": { "source": "iana" }, "application/vnd.oma.dcdc": { "source": "iana" }, "application/vnd.oma.dd2+xml": { "source": "iana", "extensions": ["dd2"] }, "application/vnd.oma.drm.risd+xml": { "source": "iana" }, "application/vnd.oma.group-usage-list+xml": { "source": "iana" }, "application/vnd.oma.pal+xml": { "source": "iana" }, "application/vnd.oma.poc.detailed-progress-report+xml": { "source": "iana" }, "application/vnd.oma.poc.final-report+xml": { "source": "iana" }, "application/vnd.oma.poc.groups+xml": { "source": "iana" }, "application/vnd.oma.poc.invocation-descriptor+xml": { "source": "iana" }, "application/vnd.oma.poc.optimized-progress-report+xml": { "source": "iana" }, "application/vnd.oma.push": { "source": "iana" }, "application/vnd.oma.scidm.messages+xml": { "source": "iana" }, "application/vnd.oma.xcap-directory+xml": { "source": "iana" }, "application/vnd.omads-email+xml": { "source": "iana" }, "application/vnd.omads-file+xml": { "source": "iana" }, "application/vnd.omads-folder+xml": { "source": "iana" }, "application/vnd.omaloc-supl-init": { "source": "iana" }, "application/vnd.openeye.oeb": { "source": "iana" }, "application/vnd.openofficeorg.extension": { "source": "apache", "extensions": ["oxt"] }, "application/vnd.openxmlformats-officedocument.custom-properties+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.customxmlproperties+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawing+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawingml.chart+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawingml.chartshapes+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawingml.diagramcolors+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawingml.diagramdata+xml": { "source": "iana" }, 
"application/vnd.openxmlformats-officedocument.drawingml.diagramlayout+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.drawingml.diagramstyle+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.extended-properties+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml-template": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.commentauthors+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.comments+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.handoutmaster+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.notesmaster+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.notesslide+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.presentation": { "source": "iana", "compressible": false, "extensions": ["pptx"] }, "application/vnd.openxmlformats-officedocument.presentationml.presentation.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.presprops+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.slide": { "source": "iana", "extensions": ["sldx"] }, "application/vnd.openxmlformats-officedocument.presentationml.slide+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.slidelayout+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.slidemaster+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.slideshow": { "source": "iana", "extensions": ["ppsx"] }, "application/vnd.openxmlformats-officedocument.presentationml.slideshow.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.slideupdateinfo+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.tablestyles+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.tags+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.template": { "source": "apache", "extensions": ["potx"] }, "application/vnd.openxmlformats-officedocument.presentationml.template.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.presentationml.viewprops+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml-template": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.calcchain+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.chartsheet+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.comments+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.connections+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.dialogsheet+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.externallink+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcachedefinition+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcacherecords+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.pivottable+xml": { "source": "iana" }, 
"application/vnd.openxmlformats-officedocument.spreadsheetml.querytable+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionheaders+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionlog+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.sharedstrings+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet": { "source": "iana", "compressible": false, "extensions": ["xlsx"] }, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheetmetadata+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.styles+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.table+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.tablesinglecells+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.template": { "source": "apache", "extensions": ["xltx"] }, "application/vnd.openxmlformats-officedocument.spreadsheetml.template.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.usernames+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.volatiledependencies+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.theme+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.themeoverride+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.vmldrawing": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml-template": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.comments+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.document": { "source": "iana", "compressible": false, "extensions": ["docx"] }, "application/vnd.openxmlformats-officedocument.wordprocessingml.document.glossary+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.endnotes+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.fonttable+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.footnotes+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.numbering+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.settings+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.styles+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.template": { "source": "apache", "extensions": ["dotx"] }, "application/vnd.openxmlformats-officedocument.wordprocessingml.template.main+xml": { "source": "iana" }, "application/vnd.openxmlformats-officedocument.wordprocessingml.websettings+xml": { "source": "iana" }, "application/vnd.openxmlformats-package.core-properties+xml": { "source": "iana" }, 
"application/vnd.openxmlformats-package.digital-signature-xmlsignature+xml": { "source": "iana" }, "application/vnd.openxmlformats-package.relationships+xml": { "source": "iana" }, "application/vnd.orange.indata": { "source": "iana" }, "application/vnd.osa.netdeploy": { "source": "iana" }, "application/vnd.osgeo.mapguide.package": { "source": "iana", "extensions": ["mgp"] }, "application/vnd.osgi.bundle": { "source": "iana" }, "application/vnd.osgi.dp": { "source": "iana", "extensions": ["dp"] }, "application/vnd.osgi.subsystem": { "source": "iana", "extensions": ["esa"] }, "application/vnd.otps.ct-kip+xml": { "source": "iana" }, "application/vnd.palm": { "source": "iana", "extensions": ["pdb","pqa","oprc"] }, "application/vnd.panoply": { "source": "iana" }, "application/vnd.paos+xml": { "source": "iana" }, "application/vnd.paos.xml": { "source": "apache" }, "application/vnd.pawaafile": { "source": "iana", "extensions": ["paw"] }, "application/vnd.pcos": { "source": "iana" }, "application/vnd.pg.format": { "source": "iana", "extensions": ["str"] }, "application/vnd.pg.osasli": { "source": "iana", "extensions": ["ei6"] }, "application/vnd.piaccess.application-licence": { "source": "iana" }, "application/vnd.picsel": { "source": "iana", "extensions": ["efif"] }, "application/vnd.pmi.widget": { "source": "iana", "extensions": ["wg"] }, "application/vnd.poc.group-advertisement+xml": { "source": "iana" }, "application/vnd.pocketlearn": { "source": "iana", "extensions": ["plf"] }, "application/vnd.powerbuilder6": { "source": "iana", "extensions": ["pbd"] }, "application/vnd.powerbuilder6-s": { "source": "iana" }, "application/vnd.powerbuilder7": { "source": "iana" }, "application/vnd.powerbuilder7-s": { "source": "iana" }, "application/vnd.powerbuilder75": { "source": "iana" }, "application/vnd.powerbuilder75-s": { "source": "iana" }, "application/vnd.preminet": { "source": "iana" }, "application/vnd.previewsystems.box": { "source": "iana", "extensions": ["box"] }, "application/vnd.proteus.magazine": { "source": "iana", "extensions": ["mgz"] }, "application/vnd.publishare-delta-tree": { "source": "iana", "extensions": ["qps"] }, "application/vnd.pvi.ptid1": { "source": "iana", "extensions": ["ptid"] }, "application/vnd.pwg-multiplexed": { "source": "apache" }, "application/vnd.pwg-xhtml-print+xml": { "source": "iana" }, "application/vnd.qualcomm.brew-app-res": { "source": "iana" }, "application/vnd.quark.quarkxpress": { "source": "iana", "extensions": ["qxd","qxt","qwd","qwt","qxl","qxb"] }, "application/vnd.quobject-quoxdocument": { "source": "iana" }, "application/vnd.radisys.moml+xml": { "source": "iana" }, "application/vnd.radisys.msml+xml": { "source": "iana" }, "application/vnd.radisys.msml-audit+xml": { "source": "iana" }, "application/vnd.radisys.msml-audit-conf+xml": { "source": "iana" }, "application/vnd.radisys.msml-audit-conn+xml": { "source": "iana" }, "application/vnd.radisys.msml-audit-dialog+xml": { "source": "iana" }, "application/vnd.radisys.msml-audit-stream+xml": { "source": "iana" }, "application/vnd.radisys.msml-conf+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog-base+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog-fax-detect+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog-fax-sendrecv+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog-group+xml": { "source": "iana" }, "application/vnd.radisys.msml-dialog-speech+xml": { "source": "iana" }, 
"application/vnd.radisys.msml-dialog-transform+xml": { "source": "iana" }, "application/vnd.rainstor.data": { "source": "iana" }, "application/vnd.rapid": { "source": "iana" }, "application/vnd.realvnc.bed": { "source": "iana", "extensions": ["bed"] }, "application/vnd.recordare.musicxml": { "source": "iana", "extensions": ["mxl"] }, "application/vnd.recordare.musicxml+xml": { "source": "iana", "extensions": ["musicxml"] }, "application/vnd.renlearn.rlprint": { "source": "iana" }, "application/vnd.rig.cryptonote": { "source": "iana", "extensions": ["cryptonote"] }, "application/vnd.rim.cod": { "source": "apache", "extensions": ["cod"] }, "application/vnd.rn-realmedia": { "source": "apache", "extensions": ["rm"] }, "application/vnd.rn-realmedia-vbr": { "source": "apache", "extensions": ["rmvb"] }, "application/vnd.route66.link66+xml": { "source": "iana", "extensions": ["link66"] }, "application/vnd.rs-274x": { "source": "iana" }, "application/vnd.ruckus.download": { "source": "iana" }, "application/vnd.s3sms": { "source": "iana" }, "application/vnd.sailingtracker.track": { "source": "iana", "extensions": ["st"] }, "application/vnd.sbm.cid": { "source": "iana" }, "application/vnd.sbm.mid2": { "source": "iana" }, "application/vnd.scribus": { "source": "iana" }, "application/vnd.sealed-doc": { "source": "iana" }, "application/vnd.sealed-eml": { "source": "iana" }, "application/vnd.sealed-mht": { "source": "iana" }, "application/vnd.sealed-ppt": { "source": "iana" }, "application/vnd.sealed-tiff": { "source": "iana" }, "application/vnd.sealed-xls": { "source": "iana" }, "application/vnd.sealed.3df": { "source": "iana" }, "application/vnd.sealed.csf": { "source": "iana" }, "application/vnd.sealed.doc": { "source": "apache" }, "application/vnd.sealed.eml": { "source": "apache" }, "application/vnd.sealed.mht": { "source": "apache" }, "application/vnd.sealed.net": { "source": "iana" }, "application/vnd.sealed.ppt": { "source": "apache" }, "application/vnd.sealed.tiff": { "source": "apache" }, "application/vnd.sealed.xls": { "source": "apache" }, "application/vnd.sealedmedia.softseal-html": { "source": "iana" }, "application/vnd.sealedmedia.softseal-pdf": { "source": "iana" }, "application/vnd.sealedmedia.softseal.html": { "source": "apache" }, "application/vnd.sealedmedia.softseal.pdf": { "source": "apache" }, "application/vnd.seemail": { "source": "iana", "extensions": ["see"] }, "application/vnd.sema": { "source": "apache", "extensions": ["sema"] }, "application/vnd.semd": { "source": "iana", "extensions": ["semd"] }, "application/vnd.semf": { "source": "iana", "extensions": ["semf"] }, "application/vnd.shana.informed.formdata": { "source": "iana", "extensions": ["ifm"] }, "application/vnd.shana.informed.formtemplate": { "source": "iana", "extensions": ["itp"] }, "application/vnd.shana.informed.interchange": { "source": "iana", "extensions": ["iif"] }, "application/vnd.shana.informed.package": { "source": "iana", "extensions": ["ipk"] }, "application/vnd.simtech-mindmapper": { "source": "iana", "extensions": ["twd","twds"] }, "application/vnd.siren+json": { "source": "iana", "compressible": true }, "application/vnd.smaf": { "source": "iana", "extensions": ["mmf"] }, "application/vnd.smart.notebook": { "source": "iana" }, "application/vnd.smart.teacher": { "source": "iana", "extensions": ["teacher"] }, "application/vnd.software602.filler.form+xml": { "source": "iana" }, "application/vnd.software602.filler.form-xml-zip": { "source": "iana" }, "application/vnd.solent.sdkm+xml": { "source": "iana", 
"extensions": ["sdkm","sdkd"] }, "application/vnd.spotfire.dxp": { "source": "iana", "extensions": ["dxp"] }, "application/vnd.spotfire.sfs": { "source": "iana", "extensions": ["sfs"] }, "application/vnd.sss-cod": { "source": "iana" }, "application/vnd.sss-dtf": { "source": "iana" }, "application/vnd.sss-ntf": { "source": "iana" }, "application/vnd.stardivision.calc": { "source": "apache", "extensions": ["sdc"] }, "application/vnd.stardivision.draw": { "source": "apache", "extensions": ["sda"] }, "application/vnd.stardivision.impress": { "source": "apache", "extensions": ["sdd"] }, "application/vnd.stardivision.math": { "source": "apache", "extensions": ["smf"] }, "application/vnd.stardivision.writer": { "source": "apache", "extensions": ["sdw","vor"] }, "application/vnd.stardivision.writer-global": { "source": "apache", "extensions": ["sgl"] }, "application/vnd.stepmania.package": { "source": "iana", "extensions": ["smzip"] }, "application/vnd.stepmania.stepchart": { "source": "iana", "extensions": ["sm"] }, "application/vnd.street-stream": { "source": "iana" }, "application/vnd.sun.wadl+xml": { "source": "iana" }, "application/vnd.sun.xml.calc": { "source": "apache", "extensions": ["sxc"] }, "application/vnd.sun.xml.calc.template": { "source": "apache", "extensions": ["stc"] }, "application/vnd.sun.xml.draw": { "source": "apache", "extensions": ["sxd"] }, "application/vnd.sun.xml.draw.template": { "source": "apache", "extensions": ["std"] }, "application/vnd.sun.xml.impress": { "source": "apache", "extensions": ["sxi"] }, "application/vnd.sun.xml.impress.template": { "source": "apache", "extensions": ["sti"] }, "application/vnd.sun.xml.math": { "source": "apache", "extensions": ["sxm"] }, "application/vnd.sun.xml.writer": { "source": "apache", "extensions": ["sxw"] }, "application/vnd.sun.xml.writer.global": { "source": "apache", "extensions": ["sxg"] }, "application/vnd.sun.xml.writer.template": { "source": "apache", "extensions": ["stw"] }, "application/vnd.sus-calendar": { "source": "iana", "extensions": ["sus","susp"] }, "application/vnd.svd": { "source": "iana", "extensions": ["svd"] }, "application/vnd.swiftview-ics": { "source": "iana" }, "application/vnd.symbian.install": { "source": "apache", "extensions": ["sis","sisx"] }, "application/vnd.syncml+xml": { "source": "iana", "extensions": ["xsm"] }, "application/vnd.syncml.dm+wbxml": { "source": "iana", "extensions": ["bdm"] }, "application/vnd.syncml.dm+xml": { "source": "iana", "extensions": ["xdm"] }, "application/vnd.syncml.dm.notification": { "source": "iana" }, "application/vnd.syncml.dmddf+wbxml": { "source": "iana" }, "application/vnd.syncml.dmddf+xml": { "source": "iana" }, "application/vnd.syncml.dmtnds+wbxml": { "source": "iana" }, "application/vnd.syncml.dmtnds+xml": { "source": "iana" }, "application/vnd.syncml.ds.notification": { "source": "iana" }, "application/vnd.tao.intent-module-archive": { "source": "iana", "extensions": ["tao"] }, "application/vnd.tcpdump.pcap": { "source": "iana", "extensions": ["pcap","cap","dmp"] }, "application/vnd.tmd.mediaflex.api+xml": { "source": "iana" }, "application/vnd.tmobile-livetv": { "source": "iana", "extensions": ["tmo"] }, "application/vnd.trid.tpt": { "source": "iana", "extensions": ["tpt"] }, "application/vnd.triscape.mxs": { "source": "iana", "extensions": ["mxs"] }, "application/vnd.trueapp": { "source": "iana", "extensions": ["tra"] }, "application/vnd.truedoc": { "source": "iana" }, "application/vnd.ubisoft.webplayer": { "source": "iana" }, "application/vnd.ufdl": { 
"source": "iana", "extensions": ["ufd","ufdl"] }, "application/vnd.uiq.theme": { "source": "iana", "extensions": ["utz"] }, "application/vnd.umajin": { "source": "iana", "extensions": ["umj"] }, "application/vnd.unity": { "source": "iana", "extensions": ["unityweb"] }, "application/vnd.uoml+xml": { "source": "iana", "extensions": ["uoml"] }, "application/vnd.uplanet.alert": { "source": "iana" }, "application/vnd.uplanet.alert-wbxml": { "source": "iana" }, "application/vnd.uplanet.bearer-choice": { "source": "iana" }, "application/vnd.uplanet.bearer-choice-wbxml": { "source": "iana" }, "application/vnd.uplanet.cacheop": { "source": "iana" }, "application/vnd.uplanet.cacheop-wbxml": { "source": "iana" }, "application/vnd.uplanet.channel": { "source": "iana" }, "application/vnd.uplanet.channel-wbxml": { "source": "iana" }, "application/vnd.uplanet.list": { "source": "iana" }, "application/vnd.uplanet.list-wbxml": { "source": "iana" }, "application/vnd.uplanet.listcmd": { "source": "iana" }, "application/vnd.uplanet.listcmd-wbxml": { "source": "iana" }, "application/vnd.uplanet.signal": { "source": "iana" }, "application/vnd.valve.source.material": { "source": "iana" }, "application/vnd.vcx": { "source": "iana", "extensions": ["vcx"] }, "application/vnd.vd-study": { "source": "iana" }, "application/vnd.vectorworks": { "source": "iana" }, "application/vnd.verimatrix.vcas": { "source": "iana" }, "application/vnd.vidsoft.vidconference": { "source": "iana" }, "application/vnd.visio": { "source": "iana", "extensions": ["vsd","vst","vss","vsw"] }, "application/vnd.visionary": { "source": "iana", "extensions": ["vis"] }, "application/vnd.vividence.scriptfile": { "source": "iana" }, "application/vnd.vsf": { "source": "iana", "extensions": ["vsf"] }, "application/vnd.wap-slc": { "source": "iana" }, "application/vnd.wap-wbxml": { "source": "iana" }, "application/vnd.wap.sic": { "source": "iana" }, "application/vnd.wap.slc": { "source": "apache" }, "application/vnd.wap.wbxml": { "source": "apache", "extensions": ["wbxml"] }, "application/vnd.wap.wmlc": { "source": "apache", "extensions": ["wmlc"] }, "application/vnd.wap.wmlscriptc": { "source": "iana", "extensions": ["wmlsc"] }, "application/vnd.webturbo": { "source": "iana", "extensions": ["wtb"] }, "application/vnd.wfa.p2p": { "source": "iana" }, "application/vnd.wfa.wsc": { "source": "iana" }, "application/vnd.windows.devicepairing": { "source": "iana" }, "application/vnd.wmc": { "source": "iana" }, "application/vnd.wmf.bootstrap": { "source": "iana" }, "application/vnd.wolfram.mathematica": { "source": "iana" }, "application/vnd.wolfram.mathematica.package": { "source": "iana" }, "application/vnd.wolfram.player": { "source": "iana", "extensions": ["nbp"] }, "application/vnd.wordperfect": { "source": "iana", "extensions": ["wpd"] }, "application/vnd.wqd": { "source": "iana", "extensions": ["wqd"] }, "application/vnd.wrq-hp3000-labelled": { "source": "iana" }, "application/vnd.wt.stf": { "source": "iana", "extensions": ["stf"] }, "application/vnd.wv.csp+wbxml": { "source": "iana" }, "application/vnd.wv.csp+xml": { "source": "iana" }, "application/vnd.wv.ssp+xml": { "source": "iana" }, "application/vnd.xacml+json": { "source": "iana", "compressible": true }, "application/vnd.xara": { "source": "iana", "extensions": ["xar"] }, "application/vnd.xfdl": { "source": "iana", "extensions": ["xfdl"] }, "application/vnd.xfdl.webform": { "source": "iana" }, "application/vnd.xmi+xml": { "source": "iana" }, "application/vnd.xmpie.cpkg": { "source": "iana" }, 
"application/vnd.xmpie.dpkg": { "source": "iana" }, "application/vnd.xmpie.plan": { "source": "iana" }, "application/vnd.xmpie.ppkg": { "source": "iana" }, "application/vnd.xmpie.xlim": { "source": "iana" }, "application/vnd.yamaha.hv-dic": { "source": "iana", "extensions": ["hvd"] }, "application/vnd.yamaha.hv-script": { "source": "iana", "extensions": ["hvs"] }, "application/vnd.yamaha.hv-voice": { "source": "iana", "extensions": ["hvp"] }, "application/vnd.yamaha.openscoreformat": { "source": "iana", "extensions": ["osf"] }, "application/vnd.yamaha.openscoreformat.osfpvg+xml": { "source": "iana", "extensions": ["osfpvg"] }, "application/vnd.yamaha.remote-setup": { "source": "iana" }, "application/vnd.yamaha.smaf-audio": { "source": "iana", "extensions": ["saf"] }, "application/vnd.yamaha.smaf-phrase": { "source": "iana", "extensions": ["spf"] }, "application/vnd.yamaha.through-ngn": { "source": "iana" }, "application/vnd.yamaha.tunnel-udpencap": { "source": "iana" }, "application/vnd.yaoweme": { "source": "iana" }, "application/vnd.yellowriver-custom-menu": { "source": "iana", "extensions": ["cmp"] }, "application/vnd.zul": { "source": "iana", "extensions": ["zir","zirz"] }, "application/vnd.zzazz.deck+xml": { "source": "iana", "extensions": ["zaz"] }, "application/voicexml+xml": { "source": "iana", "extensions": ["vxml"] }, "application/vq-rtcpxr": { "source": "iana" }, "application/vwg-multiplexed": { "source": "iana" }, "application/watcherinfo+xml": { "source": "iana" }, "application/whoispp-query": { "source": "iana" }, "application/whoispp-response": { "source": "iana" }, "application/widget": { "source": "iana", "extensions": ["wgt"] }, "application/winhlp": { "source": "apache", "extensions": ["hlp"] }, "application/wita": { "source": "iana" }, "application/wordperfect5.1": { "source": "iana" }, "application/wsdl+xml": { "source": "iana", "extensions": ["wsdl"] }, "application/wspolicy+xml": { "source": "iana", "extensions": ["wspolicy"] }, "application/x-7z-compressed": { "source": "apache", "compressible": false, "extensions": ["7z"] }, "application/x-abiword": { "source": "apache", "extensions": ["abw"] }, "application/x-ace-compressed": { "source": "apache", "extensions": ["ace"] }, "application/x-amf": { "source": "apache" }, "application/x-apple-diskimage": { "source": "apache", "extensions": ["dmg"] }, "application/x-authorware-bin": { "source": "apache", "extensions": ["aab","x32","u32","vox"] }, "application/x-authorware-map": { "source": "apache", "extensions": ["aam"] }, "application/x-authorware-seg": { "source": "apache", "extensions": ["aas"] }, "application/x-bcpio": { "source": "apache", "extensions": ["bcpio"] }, "application/x-bittorrent": { "source": "apache", "extensions": ["torrent"] }, "application/x-blorb": { "source": "apache", "extensions": ["blb","blorb"] }, "application/x-bzip": { "source": "apache", "compressible": false, "extensions": ["bz"] }, "application/x-bzip2": { "source": "apache", "compressible": false, "extensions": ["bz2","boz"] }, "application/x-cbr": { "source": "apache", "extensions": ["cbr","cba","cbt","cbz","cb7"] }, "application/x-cdlink": { "source": "apache", "extensions": ["vcd"] }, "application/x-cfs-compressed": { "source": "apache", "extensions": ["cfs"] }, "application/x-chat": { "source": "apache", "extensions": ["chat"] }, "application/x-chess-pgn": { "source": "apache", "extensions": ["pgn"] }, "application/x-chrome-extension": { "extensions": ["crx"] }, "application/x-compress": { "source": "apache" }, 
"application/x-conference": { "source": "apache", "extensions": ["nsc"] }, "application/x-cpio": { "source": "apache", "extensions": ["cpio"] }, "application/x-csh": { "source": "apache", "extensions": ["csh"] }, "application/x-deb": { "compressible": false }, "application/x-debian-package": { "source": "apache", "extensions": ["deb","udeb"] }, "application/x-dgc-compressed": { "source": "apache", "extensions": ["dgc"] }, "application/x-director": { "source": "apache", "extensions": ["dir","dcr","dxr","cst","cct","cxt","w3d","fgd","swa"] }, "application/x-doom": { "source": "apache", "extensions": ["wad"] }, "application/x-dtbncx+xml": { "source": "apache", "extensions": ["ncx"] }, "application/x-dtbook+xml": { "source": "apache", "extensions": ["dtb"] }, "application/x-dtbresource+xml": { "source": "apache", "extensions": ["res"] }, "application/x-dvi": { "source": "apache", "compressible": false, "extensions": ["dvi"] }, "application/x-envoy": { "source": "apache", "extensions": ["evy"] }, "application/x-eva": { "source": "apache", "extensions": ["eva"] }, "application/x-font-bdf": { "source": "apache", "extensions": ["bdf"] }, "application/x-font-dos": { "source": "apache" }, "application/x-font-framemaker": { "source": "apache" }, "application/x-font-ghostscript": { "source": "apache", "extensions": ["gsf"] }, "application/x-font-libgrx": { "source": "apache" }, "application/x-font-linux-psf": { "source": "apache", "extensions": ["psf"] }, "application/x-font-otf": { "source": "apache", "compressible": true, "extensions": ["otf"] }, "application/x-font-pcf": { "source": "apache", "extensions": ["pcf"] }, "application/x-font-snf": { "source": "apache", "extensions": ["snf"] }, "application/x-font-speedo": { "source": "apache" }, "application/x-font-sunos-news": { "source": "apache" }, "application/x-font-ttf": { "source": "apache", "compressible": true, "extensions": ["ttf","ttc"] }, "application/x-font-type1": { "source": "apache", "extensions": ["pfa","pfb","pfm","afm"] }, "application/x-font-vfont": { "source": "apache" }, "application/x-freearc": { "source": "apache", "extensions": ["arc"] }, "application/x-futuresplash": { "source": "apache", "extensions": ["spl"] }, "application/x-gca-compressed": { "source": "apache", "extensions": ["gca"] }, "application/x-glulx": { "source": "apache", "extensions": ["ulx"] }, "application/x-gnumeric": { "source": "apache", "extensions": ["gnumeric"] }, "application/x-gramps-xml": { "source": "apache", "extensions": ["gramps"] }, "application/x-gtar": { "source": "apache", "extensions": ["gtar"] }, "application/x-gzip": { "source": "apache" }, "application/x-hdf": { "source": "apache", "extensions": ["hdf"] }, "application/x-install-instructions": { "source": "apache", "extensions": ["install"] }, "application/x-iso9660-image": { "source": "apache", "extensions": ["iso"] }, "application/x-java-jnlp-file": { "source": "apache", "compressible": false, "extensions": ["jnlp"] }, "application/x-javascript": { "compressible": true }, "application/x-latex": { "source": "apache", "compressible": false, "extensions": ["latex"] }, "application/x-lua-bytecode": { "extensions": ["luac"] }, "application/x-lzh-compressed": { "source": "apache", "extensions": ["lzh","lha"] }, "application/x-mie": { "source": "apache", "extensions": ["mie"] }, "application/x-mobipocket-ebook": { "source": "apache", "extensions": ["prc","mobi"] }, "application/x-mpegurl": { "compressible": false }, "application/x-ms-application": { "source": "apache", "extensions": 
["application"] }, "application/x-ms-shortcut": { "source": "apache", "extensions": ["lnk"] }, "application/x-ms-wmd": { "source": "apache", "extensions": ["wmd"] }, "application/x-ms-wmz": { "source": "apache", "extensions": ["wmz"] }, "application/x-ms-xbap": { "source": "apache", "extensions": ["xbap"] }, "application/x-msaccess": { "source": "apache", "extensions": ["mdb"] }, "application/x-msbinder": { "source": "apache", "extensions": ["obd"] }, "application/x-mscardfile": { "source": "apache", "extensions": ["crd"] }, "application/x-msclip": { "source": "apache", "extensions": ["clp"] }, "application/x-msdownload": { "source": "apache", "extensions": ["exe","dll","com","bat","msi"] }, "application/x-msmediaview": { "source": "apache", "extensions": ["mvb","m13","m14"] }, "application/x-msmetafile": { "source": "apache", "extensions": ["wmf","wmz","emf","emz"] }, "application/x-msmoney": { "source": "apache", "extensions": ["mny"] }, "application/x-mspublisher": { "source": "apache", "extensions": ["pub"] }, "application/x-msschedule": { "source": "apache", "extensions": ["scd"] }, "application/x-msterminal": { "source": "apache", "extensions": ["trm"] }, "application/x-mswrite": { "source": "apache", "extensions": ["wri"] }, "application/x-netcdf": { "source": "apache", "extensions": ["nc","cdf"] }, "application/x-nzb": { "source": "apache", "extensions": ["nzb"] }, "application/x-pkcs12": { "source": "apache", "compressible": false, "extensions": ["p12","pfx"] }, "application/x-pkcs7-certificates": { "source": "apache", "extensions": ["p7b","spc"] }, "application/x-pkcs7-certreqresp": { "source": "apache", "extensions": ["p7r"] }, "application/x-rar-compressed": { "source": "apache", "compressible": false, "extensions": ["rar"] }, "application/x-research-info-systems": { "source": "apache", "extensions": ["ris"] }, "application/x-sh": { "source": "apache", "compressible": true, "extensions": ["sh"] }, "application/x-shar": { "source": "apache", "extensions": ["shar"] }, "application/x-shockwave-flash": { "source": "apache", "compressible": false, "extensions": ["swf"] }, "application/x-silverlight-app": { "source": "apache", "extensions": ["xap"] }, "application/x-sql": { "source": "apache", "extensions": ["sql"] }, "application/x-stuffit": { "source": "apache", "compressible": false, "extensions": ["sit"] }, "application/x-stuffitx": { "source": "apache", "extensions": ["sitx"] }, "application/x-subrip": { "source": "apache", "extensions": ["srt"] }, "application/x-sv4cpio": { "source": "apache", "extensions": ["sv4cpio"] }, "application/x-sv4crc": { "source": "apache", "extensions": ["sv4crc"] }, "application/x-t3vm-image": { "source": "apache", "extensions": ["t3"] }, "application/x-tads": { "source": "apache", "extensions": ["gam"] }, "application/x-tar": { "source": "apache", "compressible": true, "extensions": ["tar"] }, "application/x-tcl": { "source": "apache", "extensions": ["tcl"] }, "application/x-tex": { "source": "apache", "extensions": ["tex"] }, "application/x-tex-tfm": { "source": "apache", "extensions": ["tfm"] }, "application/x-texinfo": { "source": "apache", "extensions": ["texinfo","texi"] }, "application/x-tgif": { "source": "apache", "extensions": ["obj"] }, "application/x-ustar": { "source": "apache", "extensions": ["ustar"] }, "application/x-wais-source": { "source": "apache", "extensions": ["src"] }, "application/x-web-app-manifest+json": { "compressible": true, "extensions": ["webapp"] }, "application/x-www-form-urlencoded": { "source": "iana", 
"compressible": true }, "application/x-x509-ca-cert": { "source": "apache", "extensions": ["der","crt"] }, "application/x-xfig": { "source": "apache", "extensions": ["fig"] }, "application/x-xliff+xml": { "source": "apache", "extensions": ["xlf"] }, "application/x-xpinstall": { "source": "apache", "compressible": false, "extensions": ["xpi"] }, "application/x-xz": { "source": "apache", "extensions": ["xz"] }, "application/x-zmachine": { "source": "apache", "extensions": ["z1","z2","z3","z4","z5","z6","z7","z8"] }, "application/x400-bp": { "source": "iana" }, "application/xacml+xml": { "source": "iana" }, "application/xaml+xml": { "source": "apache", "extensions": ["xaml"] }, "application/xcap-att+xml": { "source": "iana" }, "application/xcap-caps+xml": { "source": "iana" }, "application/xcap-diff+xml": { "source": "iana", "extensions": ["xdf"] }, "application/xcap-el+xml": { "source": "iana" }, "application/xcap-error+xml": { "source": "iana" }, "application/xcap-ns+xml": { "source": "iana" }, "application/xcon-conference-info+xml": { "source": "iana" }, "application/xcon-conference-info-diff+xml": { "source": "iana" }, "application/xenc+xml": { "source": "iana", "extensions": ["xenc"] }, "application/xhtml+xml": { "source": "iana", "compressible": true, "extensions": ["xhtml","xht"] }, "application/xhtml-voice+xml": { "source": "iana" }, "application/xml": { "source": "iana", "compressible": true, "extensions": ["xml","xsl","xsd"] }, "application/xml-dtd": { "source": "iana", "compressible": true, "extensions": ["dtd"] }, "application/xml-external-parsed-entity": { "source": "iana" }, "application/xml-patch+xml": { "source": "iana" }, "application/xmpp+xml": { "source": "iana" }, "application/xop+xml": { "source": "iana", "compressible": true, "extensions": ["xop"] }, "application/xproc+xml": { "source": "apache", "extensions": ["xpl"] }, "application/xslt+xml": { "source": "iana", "extensions": ["xslt"] }, "application/xspf+xml": { "source": "apache", "extensions": ["xspf"] }, "application/xv+xml": { "source": "iana", "extensions": ["mxml","xhvml","xvml","xvm"] }, "application/yang": { "source": "iana", "extensions": ["yang"] }, "application/yin+xml": { "source": "iana", "extensions": ["yin"] }, "application/zip": { "source": "iana", "compressible": false, "extensions": ["zip"] }, "application/zlib": { "source": "iana" }, "audio/1d-interleaved-parityfec": { "source": "iana" }, "audio/32kadpcm": { "source": "iana" }, "audio/3gpp": { "source": "iana" }, "audio/3gpp2": { "source": "iana" }, "audio/ac3": { "source": "iana" }, "audio/adpcm": { "source": "apache", "extensions": ["adp"] }, "audio/amr": { "source": "iana" }, "audio/amr-wb": { "source": "iana" }, "audio/amr-wb+": { "source": "iana" }, "audio/aptx": { "source": "iana" }, "audio/asc": { "source": "iana" }, "audio/atrac-advanced-lossless": { "source": "iana" }, "audio/atrac-x": { "source": "iana" }, "audio/atrac3": { "source": "iana" }, "audio/basic": { "source": "iana", "compressible": false, "extensions": ["au","snd"] }, "audio/bv16": { "source": "iana" }, "audio/bv32": { "source": "iana" }, "audio/clearmode": { "source": "iana" }, "audio/cn": { "source": "iana" }, "audio/dat12": { "source": "iana" }, "audio/dls": { "source": "iana" }, "audio/dsr-es201108": { "source": "iana" }, "audio/dsr-es202050": { "source": "iana" }, "audio/dsr-es202211": { "source": "iana" }, "audio/dsr-es202212": { "source": "iana" }, "audio/dv": { "source": "iana" }, "audio/dvi4": { "source": "iana" }, "audio/eac3": { "source": "iana" }, "audio/encaprtp": 
{ "source": "iana" }, "audio/evrc": { "source": "iana" }, "audio/evrc-qcp": { "source": "iana" }, "audio/evrc0": { "source": "iana" }, "audio/evrc1": { "source": "iana" }, "audio/evrcb": { "source": "iana" }, "audio/evrcb0": { "source": "iana" }, "audio/evrcb1": { "source": "iana" }, "audio/evrcnw": { "source": "iana" }, "audio/evrcnw0": { "source": "iana" }, "audio/evrcnw1": { "source": "iana" }, "audio/evrcwb": { "source": "iana" }, "audio/evrcwb0": { "source": "iana" }, "audio/evrcwb1": { "source": "iana" }, "audio/example": { "source": "iana" }, "audio/fwdred": { "source": "iana" }, "audio/g719": { "source": "iana" }, "audio/g721": { "source": "iana" }, "audio/g722": { "source": "iana" }, "audio/g7221": { "source": "apache" }, "audio/g723": { "source": "iana" }, "audio/g726-16": { "source": "iana" }, "audio/g726-24": { "source": "iana" }, "audio/g726-32": { "source": "iana" }, "audio/g726-40": { "source": "iana" }, "audio/g728": { "source": "iana" }, "audio/g729": { "source": "iana" }, "audio/g7291": { "source": "iana" }, "audio/g729d": { "source": "iana" }, "audio/g729e": { "source": "iana" }, "audio/gsm": { "source": "iana" }, "audio/gsm-efr": { "source": "iana" }, "audio/gsm-hr-08": { "source": "iana" }, "audio/ilbc": { "source": "iana" }, "audio/ip-mr_v2.5": { "source": "iana" }, "audio/isac": { "source": "apache" }, "audio/l16": { "source": "iana" }, "audio/l20": { "source": "iana" }, "audio/l24": { "source": "iana", "compressible": false }, "audio/l8": { "source": "iana" }, "audio/lpc": { "source": "iana" }, "audio/midi": { "source": "apache", "extensions": ["mid","midi","kar","rmi"] }, "audio/mobile-xmf": { "source": "iana" }, "audio/mp4": { "source": "iana", "compressible": false, "extensions": ["mp4a","m4a"] }, "audio/mp4a-latm": { "source": "iana" }, "audio/mpa": { "source": "iana" }, "audio/mpa-robust": { "source": "iana" }, "audio/mpeg": { "source": "iana", "compressible": false, "extensions": ["mpga","mp2","mp2a","mp3","m2a","m3a"] }, "audio/mpeg4-generic": { "source": "iana" }, "audio/musepack": { "source": "apache" }, "audio/ogg": { "source": "iana", "compressible": false, "extensions": ["oga","ogg","spx"] }, "audio/opus": { "source": "apache" }, "audio/parityfec": { "source": "iana" }, "audio/pcma": { "source": "iana" }, "audio/pcma-wb": { "source": "iana" }, "audio/pcmu": { "source": "iana" }, "audio/pcmu-wb": { "source": "iana" }, "audio/prs.sid": { "source": "iana" }, "audio/qcelp": { "source": "iana" }, "audio/raptorfec": { "source": "iana" }, "audio/red": { "source": "iana" }, "audio/rtp-enc-aescm128": { "source": "iana" }, "audio/rtp-midi": { "source": "iana" }, "audio/rtploopback": { "source": "iana" }, "audio/rtx": { "source": "iana" }, "audio/s3m": { "source": "apache", "extensions": ["s3m"] }, "audio/silk": { "source": "apache", "extensions": ["sil"] }, "audio/smv": { "source": "iana" }, "audio/smv-qcp": { "source": "iana" }, "audio/smv0": { "source": "iana" }, "audio/sp-midi": { "source": "iana" }, "audio/speex": { "source": "iana" }, "audio/t140c": { "source": "iana" }, "audio/t38": { "source": "iana" }, "audio/telephone-event": { "source": "iana" }, "audio/tone": { "source": "iana" }, "audio/uemclip": { "source": "iana" }, "audio/ulpfec": { "source": "iana" }, "audio/vdvi": { "source": "iana" }, "audio/vmr-wb": { "source": "iana" }, "audio/vnd.3gpp.iufp": { "source": "iana" }, "audio/vnd.4sb": { "source": "iana" }, "audio/vnd.audiokoz": { "source": "iana" }, "audio/vnd.celp": { "source": "iana" }, "audio/vnd.cisco.nse": { "source": "iana" }, 
"audio/vnd.cmles.radio-events": { "source": "iana" }, "audio/vnd.cns.anp1": { "source": "iana" }, "audio/vnd.cns.inf1": { "source": "iana" }, "audio/vnd.dece.audio": { "source": "iana", "extensions": ["uva","uvva"] }, "audio/vnd.digital-winds": { "source": "iana", "extensions": ["eol"] }, "audio/vnd.dlna.adts": { "source": "iana" }, "audio/vnd.dolby.heaac.1": { "source": "iana" }, "audio/vnd.dolby.heaac.2": { "source": "iana" }, "audio/vnd.dolby.mlp": { "source": "iana" }, "audio/vnd.dolby.mps": { "source": "iana" }, "audio/vnd.dolby.pl2": { "source": "iana" }, "audio/vnd.dolby.pl2x": { "source": "iana" }, "audio/vnd.dolby.pl2z": { "source": "iana" }, "audio/vnd.dolby.pulse.1": { "source": "iana" }, "audio/vnd.dra": { "source": "iana", "extensions": ["dra"] }, "audio/vnd.dts": { "source": "iana", "extensions": ["dts"] }, "audio/vnd.dts.hd": { "source": "iana", "extensions": ["dtshd"] }, "audio/vnd.dvb.file": { "source": "iana" }, "audio/vnd.everad.plj": { "source": "iana" }, "audio/vnd.hns.audio": { "source": "iana" }, "audio/vnd.lucent.voice": { "source": "iana", "extensions": ["lvp"] }, "audio/vnd.ms-playready.media.pya": { "source": "iana", "extensions": ["pya"] }, "audio/vnd.nokia.mobile-xmf": { "source": "iana" }, "audio/vnd.nortel.vbk": { "source": "iana" }, "audio/vnd.nuera.ecelp4800": { "source": "iana", "extensions": ["ecelp4800"] }, "audio/vnd.nuera.ecelp7470": { "source": "iana", "extensions": ["ecelp7470"] }, "audio/vnd.nuera.ecelp9600": { "source": "iana", "extensions": ["ecelp9600"] }, "audio/vnd.octel.sbc": { "source": "iana" }, "audio/vnd.qcelp": { "source": "iana" }, "audio/vnd.rhetorex.32kadpcm": { "source": "iana" }, "audio/vnd.rip": { "source": "iana", "extensions": ["rip"] }, "audio/vnd.rn-realaudio": { "compressible": false }, "audio/vnd.sealedmedia.softseal-mpeg": { "source": "iana" }, "audio/vnd.sealedmedia.softseal.mpeg": { "source": "apache" }, "audio/vnd.vmx.cvsd": { "source": "iana" }, "audio/vnd.wave": { "compressible": false }, "audio/vorbis": { "source": "iana", "compressible": false }, "audio/vorbis-config": { "source": "iana" }, "audio/webm": { "source": "apache", "compressible": false, "extensions": ["weba"] }, "audio/x-aac": { "source": "apache", "compressible": false, "extensions": ["aac"] }, "audio/x-aiff": { "source": "apache", "extensions": ["aif","aiff","aifc"] }, "audio/x-caf": { "source": "apache", "compressible": false, "extensions": ["caf"] }, "audio/x-flac": { "source": "apache", "extensions": ["flac"] }, "audio/x-matroska": { "source": "apache", "extensions": ["mka"] }, "audio/x-mpegurl": { "source": "apache", "extensions": ["m3u"] }, "audio/x-ms-wax": { "source": "apache", "extensions": ["wax"] }, "audio/x-ms-wma": { "source": "apache", "extensions": ["wma"] }, "audio/x-pn-realaudio": { "source": "apache", "extensions": ["ram","ra"] }, "audio/x-pn-realaudio-plugin": { "source": "apache", "extensions": ["rmp"] }, "audio/x-tta": { "source": "apache" }, "audio/x-wav": { "source": "apache", "extensions": ["wav"] }, "audio/xm": { "source": "apache", "extensions": ["xm"] }, "chemical/x-cdx": { "source": "apache", "extensions": ["cdx"] }, "chemical/x-cif": { "source": "apache", "extensions": ["cif"] }, "chemical/x-cmdf": { "source": "apache", "extensions": ["cmdf"] }, "chemical/x-cml": { "source": "apache", "extensions": ["cml"] }, "chemical/x-csml": { "source": "apache", "extensions": ["csml"] }, "chemical/x-pdb": { "source": "apache" }, "chemical/x-xyz": { "source": "apache", "extensions": ["xyz"] }, "font/opentype": { "compressible": true, 
"extensions": ["otf"] }, "image/bmp": { "source": "apache", "compressible": true, "extensions": ["bmp"] }, "image/cgm": { "source": "iana", "extensions": ["cgm"] }, "image/example": { "source": "iana" }, "image/fits": { "source": "iana" }, "image/g3fax": { "source": "iana", "extensions": ["g3"] }, "image/gif": { "source": "iana", "compressible": false, "extensions": ["gif"] }, "image/ief": { "source": "iana", "extensions": ["ief"] }, "image/jp2": { "source": "iana" }, "image/jpeg": { "source": "iana", "compressible": false, "extensions": ["jpeg","jpg","jpe"] }, "image/jpm": { "source": "iana" }, "image/jpx": { "source": "iana" }, "image/ktx": { "source": "iana", "extensions": ["ktx"] }, "image/naplps": { "source": "iana" }, "image/pjpeg": { "compressible": false }, "image/png": { "source": "iana", "compressible": false, "extensions": ["png"] }, "image/prs.btif": { "source": "iana", "extensions": ["btif"] }, "image/prs.pti": { "source": "iana" }, "image/pwg-raster": { "source": "iana" }, "image/sgi": { "source": "apache", "extensions": ["sgi"] }, "image/svg+xml": { "source": "iana", "compressible": true, "extensions": ["svg","svgz"] }, "image/t38": { "source": "iana" }, "image/tiff": { "source": "iana", "compressible": false, "extensions": ["tiff","tif"] }, "image/tiff-fx": { "source": "iana" }, "image/vnd-djvu": { "source": "iana" }, "image/vnd-svf": { "source": "iana" }, "image/vnd-wap-wbmp": { "source": "iana" }, "image/vnd.adobe.photoshop": { "source": "iana", "compressible": true, "extensions": ["psd"] }, "image/vnd.airzip.accelerator.azv": { "source": "iana" }, "image/vnd.cns.inf2": { "source": "iana" }, "image/vnd.dece.graphic": { "source": "iana", "extensions": ["uvi","uvvi","uvg","uvvg"] }, "image/vnd.djvu": { "source": "apache", "extensions": ["djvu","djv"] }, "image/vnd.dvb.subtitle": { "source": "iana", "extensions": ["sub"] }, "image/vnd.dwg": { "source": "iana", "extensions": ["dwg"] }, "image/vnd.dxf": { "source": "iana", "extensions": ["dxf"] }, "image/vnd.fastbidsheet": { "source": "iana", "extensions": ["fbs"] }, "image/vnd.fpx": { "source": "iana", "extensions": ["fpx"] }, "image/vnd.fst": { "source": "iana", "extensions": ["fst"] }, "image/vnd.fujixerox.edmics-mmr": { "source": "iana", "extensions": ["mmr"] }, "image/vnd.fujixerox.edmics-rlc": { "source": "iana", "extensions": ["rlc"] }, "image/vnd.globalgraphics.pgb": { "source": "iana" }, "image/vnd.microsoft.icon": { "source": "iana" }, "image/vnd.mix": { "source": "iana" }, "image/vnd.ms-modi": { "source": "iana", "extensions": ["mdi"] }, "image/vnd.ms-photo": { "source": "apache", "extensions": ["wdp"] }, "image/vnd.net-fpx": { "source": "iana", "extensions": ["npx"] }, "image/vnd.radiance": { "source": "iana" }, "image/vnd.sealed-png": { "source": "iana" }, "image/vnd.sealed.png": { "source": "apache" }, "image/vnd.sealedmedia.softseal-gif": { "source": "iana" }, "image/vnd.sealedmedia.softseal-jpg": { "source": "iana" }, "image/vnd.sealedmedia.softseal.gif": { "source": "apache" }, "image/vnd.sealedmedia.softseal.jpg": { "source": "apache" }, "image/vnd.svf": { "source": "apache" }, "image/vnd.tencent.tap": { "source": "iana" }, "image/vnd.valve.source.texture": { "source": "iana" }, "image/vnd.wap.wbmp": { "source": "apache", "extensions": ["wbmp"] }, "image/vnd.xiff": { "source": "iana", "extensions": ["xif"] }, "image/webp": { "source": "apache", "extensions": ["webp"] }, "image/x-3ds": { "source": "apache", "extensions": ["3ds"] }, "image/x-cmu-raster": { "source": "apache", "extensions": ["ras"] }, 
"image/x-cmx": { "source": "apache", "extensions": ["cmx"] }, "image/x-freehand": { "source": "apache", "extensions": ["fh","fhc","fh4","fh5","fh7"] }, "image/x-icon": { "source": "apache", "compressible": true, "extensions": ["ico"] }, "image/x-mrsid-image": { "source": "apache", "extensions": ["sid"] }, "image/x-pcx": { "source": "apache", "extensions": ["pcx"] }, "image/x-pict": { "source": "apache", "extensions": ["pic","pct"] }, "image/x-portable-anymap": { "source": "apache", "extensions": ["pnm"] }, "image/x-portable-bitmap": { "source": "apache", "extensions": ["pbm"] }, "image/x-portable-graymap": { "source": "apache", "extensions": ["pgm"] }, "image/x-portable-pixmap": { "source": "apache", "extensions": ["ppm"] }, "image/x-rgb": { "source": "apache", "extensions": ["rgb"] }, "image/x-tga": { "source": "apache", "extensions": ["tga"] }, "image/x-xbitmap": { "source": "apache", "extensions": ["xbm"] }, "image/x-xcf": { "compressible": false }, "image/x-xpixmap": { "source": "apache", "extensions": ["xpm"] }, "image/x-xwindowdump": { "source": "apache", "extensions": ["xwd"] }, "message/cpim": { "source": "iana" }, "message/delivery-status": { "source": "iana" }, "message/disposition-notification": { "source": "iana" }, "message/example": { "source": "iana" }, "message/external-body": { "source": "iana" }, "message/feedback-report": { "source": "iana" }, "message/global": { "source": "iana" }, "message/global-delivery-status": { "source": "iana" }, "message/global-disposition-notification": { "source": "iana" }, "message/global-headers": { "source": "iana" }, "message/http": { "source": "iana", "compressible": false }, "message/imdn+xml": { "source": "iana", "compressible": true }, "message/news": { "source": "iana" }, "message/partial": { "source": "iana", "compressible": false }, "message/rfc822": { "source": "iana", "compressible": true, "extensions": ["eml","mime"] }, "message/s-http": { "source": "iana" }, "message/sip": { "source": "iana" }, "message/sipfrag": { "source": "iana" }, "message/tracking-status": { "source": "iana" }, "message/vnd.si.simp": { "source": "iana" }, "message/vnd.wfa.wsc": { "source": "iana" }, "model/example": { "source": "iana", "compressible": false }, "model/iges": { "source": "iana", "compressible": false, "extensions": ["igs","iges"] }, "model/mesh": { "source": "iana", "compressible": false, "extensions": ["msh","mesh","silo"] }, "model/vnd-dwf": { "source": "iana" }, "model/vnd.collada+xml": { "source": "iana", "extensions": ["dae"] }, "model/vnd.dwf": { "source": "apache", "extensions": ["dwf"] }, "model/vnd.flatland.3dml": { "source": "iana" }, "model/vnd.gdl": { "source": "iana", "extensions": ["gdl"] }, "model/vnd.gs-gdl": { "source": "iana" }, "model/vnd.gs.gdl": { "source": "apache" }, "model/vnd.gtw": { "source": "iana", "extensions": ["gtw"] }, "model/vnd.moml+xml": { "source": "iana" }, "model/vnd.mts": { "source": "iana", "extensions": ["mts"] }, "model/vnd.opengex": { "source": "iana" }, "model/vnd.parasolid.transmit-binary": { "source": "iana" }, "model/vnd.parasolid.transmit-text": { "source": "iana" }, "model/vnd.parasolid.transmit.binary": { "source": "apache" }, "model/vnd.parasolid.transmit.text": { "source": "apache" }, "model/vnd.valve.source.compiled-map": { "source": "iana" }, "model/vnd.vtu": { "source": "iana", "extensions": ["vtu"] }, "model/vrml": { "source": "iana", "compressible": false, "extensions": ["wrl","vrml"] }, "model/x3d+binary": { "source": "apache", "compressible": false, "extensions": ["x3db","x3dbz"] }, 
"model/x3d+fastinfoset": { "source": "iana" }, "model/x3d+vrml": { "source": "apache", "compressible": false, "extensions": ["x3dv","x3dvz"] }, "model/x3d+xml": { "source": "iana", "compressible": true, "extensions": ["x3d","x3dz"] }, "model/x3d-vrml": { "source": "iana" }, "multipart/alternative": { "source": "iana", "compressible": false }, "multipart/appledouble": { "source": "iana" }, "multipart/byteranges": { "source": "iana" }, "multipart/digest": { "source": "iana" }, "multipart/encrypted": { "source": "iana", "compressible": false }, "multipart/example": { "source": "iana" }, "multipart/form-data": { "source": "iana", "compressible": false }, "multipart/header-set": { "source": "iana" }, "multipart/mixed": { "source": "iana", "compressible": false }, "multipart/parallel": { "source": "iana" }, "multipart/related": { "source": "iana", "compressible": false }, "multipart/report": { "source": "iana" }, "multipart/signed": { "source": "iana", "compressible": false }, "multipart/voice-message": { "source": "iana" }, "multipart/x-mixed-replace": { "source": "iana" }, "text/1d-interleaved-parityfec": { "source": "iana" }, "text/cache-manifest": { "source": "iana", "compressible": true, "extensions": ["appcache","manifest"] }, "text/calendar": { "source": "iana", "extensions": ["ics","ifb"] }, "text/calender": { "compressible": true }, "text/cmd": { "compressible": true }, "text/coffeescript": { "extensions": ["coffee"] }, "text/css": { "source": "iana", "compressible": true, "extensions": ["css"] }, "text/csv": { "source": "iana", "compressible": true, "extensions": ["csv"] }, "text/csv-schema": { "source": "iana" }, "text/directory": { "source": "iana" }, "text/dns": { "source": "iana" }, "text/ecmascript": { "source": "iana" }, "text/encaprtp": { "source": "iana" }, "text/enriched": { "source": "iana" }, "text/example": { "source": "iana" }, "text/fwdred": { "source": "iana" }, "text/grammar-ref-list": { "source": "iana" }, "text/html": { "source": "iana", "compressible": true, "extensions": ["html","htm"] }, "text/jade": { "extensions": ["jade"] }, "text/javascript": { "source": "iana", "compressible": true }, "text/jcr-cnd": { "source": "iana" }, "text/jsx": { "compressible": true, "extensions": ["jsx"] }, "text/less": { "extensions": ["less"] }, "text/markdown": { "source": "iana" }, "text/mizar": { "source": "iana" }, "text/n3": { "source": "iana", "compressible": true, "extensions": ["n3"] }, "text/parameters": { "source": "iana" }, "text/parityfec": { "source": "iana" }, "text/plain": { "source": "iana", "compressible": true, "extensions": ["txt","text","conf","def","list","log","in","ini"] }, "text/provenance-notation": { "source": "iana" }, "text/prs.fallenstein.rst": { "source": "iana" }, "text/prs.lines.tag": { "source": "iana", "extensions": ["dsc"] }, "text/raptorfec": { "source": "iana" }, "text/red": { "source": "iana" }, "text/rfc822-headers": { "source": "iana" }, "text/richtext": { "source": "iana", "compressible": true, "extensions": ["rtx"] }, "text/rtf": { "source": "iana" }, "text/rtp-enc-aescm128": { "source": "iana" }, "text/rtploopback": { "source": "iana" }, "text/rtx": { "source": "iana" }, "text/sgml": { "source": "iana", "extensions": ["sgml","sgm"] }, "text/stylus": { "extensions": ["stylus","styl"] }, "text/t140": { "source": "iana" }, "text/tab-separated-values": { "source": "iana", "compressible": true, "extensions": ["tsv"] }, "text/troff": { "source": "iana", "extensions": ["t","tr","roff","man","me","ms"] }, "text/turtle": { "source": "iana", 
"extensions": ["ttl"] }, "text/ulpfec": { "source": "iana" }, "text/uri-list": { "source": "iana", "compressible": true, "extensions": ["uri","uris","urls"] }, "text/vcard": { "source": "iana", "compressible": true, "extensions": ["vcard"] }, "text/vnd-a": { "source": "iana" }, "text/vnd-curl": { "source": "iana" }, "text/vnd.abc": { "source": "iana" }, "text/vnd.curl": { "source": "apache", "extensions": ["curl"] }, "text/vnd.curl.dcurl": { "source": "apache", "extensions": ["dcurl"] }, "text/vnd.curl.mcurl": { "source": "apache", "extensions": ["mcurl"] }, "text/vnd.curl.scurl": { "source": "apache", "extensions": ["scurl"] }, "text/vnd.debian.copyright": { "source": "iana" }, "text/vnd.dmclientscript": { "source": "iana" }, "text/vnd.dvb.subtitle": { "source": "iana", "extensions": ["sub"] }, "text/vnd.esmertec.theme-descriptor": { "source": "iana" }, "text/vnd.fly": { "source": "iana", "extensions": ["fly"] }, "text/vnd.fmi.flexstor": { "source": "iana", "extensions": ["flx"] }, "text/vnd.graphviz": { "source": "iana", "extensions": ["gv"] }, "text/vnd.in3d.3dml": { "source": "iana", "extensions": ["3dml"] }, "text/vnd.in3d.spot": { "source": "iana", "extensions": ["spot"] }, "text/vnd.iptc.newsml": { "source": "iana" }, "text/vnd.iptc.nitf": { "source": "iana" }, "text/vnd.latex-z": { "source": "iana" }, "text/vnd.motorola.reflex": { "source": "iana" }, "text/vnd.ms-mediapackage": { "source": "iana" }, "text/vnd.net2phone.commcenter.command": { "source": "iana" }, "text/vnd.radisys.msml-basic-layout": { "source": "iana" }, "text/vnd.si.uricatalogue": { "source": "iana" }, "text/vnd.sun.j2me.app-descriptor": { "source": "iana", "extensions": ["jad"] }, "text/vnd.trolltech.linguist": { "source": "iana" }, "text/vnd.wap-wml": { "source": "iana" }, "text/vnd.wap.si": { "source": "iana" }, "text/vnd.wap.sl": { "source": "iana" }, "text/vnd.wap.wml": { "source": "apache", "extensions": ["wml"] }, "text/vnd.wap.wmlscript": { "source": "iana", "extensions": ["wmls"] }, "text/vtt": { "charset": "UTF-8", "compressible": true, "extensions": ["vtt"] }, "text/x-asm": { "source": "apache", "extensions": ["s","asm"] }, "text/x-c": { "source": "apache", "extensions": ["c","cc","cxx","cpp","h","hh","dic"] }, "text/x-component": { "extensions": ["htc"] }, "text/x-fortran": { "source": "apache", "extensions": ["f","for","f77","f90"] }, "text/x-gwt-rpc": { "compressible": true }, "text/x-handlebars-template": { "extensions": ["hbs"] }, "text/x-java-source": { "source": "apache", "extensions": ["java"] }, "text/x-jquery-tmpl": { "compressible": true }, "text/x-lua": { "extensions": ["lua"] }, "text/x-markdown": { "compressible": true, "extensions": ["markdown","md","mkd"] }, "text/x-nfo": { "source": "apache", "extensions": ["nfo"] }, "text/x-opml": { "source": "apache", "extensions": ["opml"] }, "text/x-pascal": { "source": "apache", "extensions": ["p","pas"] }, "text/x-sass": { "extensions": ["sass"] }, "text/x-scss": { "extensions": ["scss"] }, "text/x-setext": { "source": "apache", "extensions": ["etx"] }, "text/x-sfv": { "source": "apache", "extensions": ["sfv"] }, "text/x-uuencode": { "source": "apache", "extensions": ["uu"] }, "text/x-vcalendar": { "source": "apache", "extensions": ["vcs"] }, "text/x-vcard": { "source": "apache", "extensions": ["vcf"] }, "text/xml": { "source": "iana", "compressible": true }, "text/xml-external-parsed-entity": { "source": "iana" }, "text/yaml": { "extensions": ["yaml","yml"] }, "video/1d-interleaved-parityfec": { "source": "apache" }, "video/3gpp": { "source": 
"apache", "extensions": ["3gp"] }, "video/3gpp-tt": { "source": "apache" }, "video/3gpp2": { "source": "apache", "extensions": ["3g2"] }, "video/bmpeg": { "source": "apache" }, "video/bt656": { "source": "apache" }, "video/celb": { "source": "apache" }, "video/dv": { "source": "apache" }, "video/example": { "source": "apache" }, "video/h261": { "source": "apache", "extensions": ["h261"] }, "video/h263": { "source": "apache", "extensions": ["h263"] }, "video/h263-1998": { "source": "apache" }, "video/h263-2000": { "source": "apache" }, "video/h264": { "source": "apache", "extensions": ["h264"] }, "video/h264-rcdo": { "source": "apache" }, "video/h264-svc": { "source": "apache" }, "video/jpeg": { "source": "apache", "extensions": ["jpgv"] }, "video/jpeg2000": { "source": "apache" }, "video/jpm": { "source": "apache", "extensions": ["jpm","jpgm"] }, "video/mj2": { "source": "apache", "extensions": ["mj2","mjp2"] }, "video/mp1s": { "source": "apache" }, "video/mp2p": { "source": "apache" }, "video/mp2t": { "source": "apache", "extensions": ["ts"] }, "video/mp4": { "source": "apache", "compressible": false, "extensions": ["mp4","mp4v","mpg4"] }, "video/mp4v-es": { "source": "apache" }, "video/mpeg": { "source": "apache", "compressible": false, "extensions": ["mpeg","mpg","mpe","m1v","m2v"] }, "video/mpeg4-generic": { "source": "apache" }, "video/mpv": { "source": "apache" }, "video/nv": { "source": "apache" }, "video/ogg": { "source": "apache", "compressible": false, "extensions": ["ogv"] }, "video/parityfec": { "source": "apache" }, "video/pointer": { "source": "apache" }, "video/quicktime": { "source": "apache", "compressible": false, "extensions": ["qt","mov"] }, "video/raw": { "source": "apache" }, "video/rtp-enc-aescm128": { "source": "apache" }, "video/rtx": { "source": "apache" }, "video/smpte292m": { "source": "apache" }, "video/ulpfec": { "source": "apache" }, "video/vc1": { "source": "apache" }, "video/vnd.cctv": { "source": "apache" }, "video/vnd.dece.hd": { "source": "apache", "extensions": ["uvh","uvvh"] }, "video/vnd.dece.mobile": { "source": "apache", "extensions": ["uvm","uvvm"] }, "video/vnd.dece.mp4": { "source": "apache" }, "video/vnd.dece.pd": { "source": "apache", "extensions": ["uvp","uvvp"] }, "video/vnd.dece.sd": { "source": "apache", "extensions": ["uvs","uvvs"] }, "video/vnd.dece.video": { "source": "apache", "extensions": ["uvv","uvvv"] }, "video/vnd.directv.mpeg": { "source": "apache" }, "video/vnd.directv.mpeg-tts": { "source": "apache" }, "video/vnd.dlna.mpeg-tts": { "source": "apache" }, "video/vnd.dvb.file": { "source": "apache", "extensions": ["dvb"] }, "video/vnd.fvt": { "source": "apache", "extensions": ["fvt"] }, "video/vnd.hns.video": { "source": "apache" }, "video/vnd.iptvforum.1dparityfec-1010": { "source": "apache" }, "video/vnd.iptvforum.1dparityfec-2005": { "source": "apache" }, "video/vnd.iptvforum.2dparityfec-1010": { "source": "apache" }, "video/vnd.iptvforum.2dparityfec-2005": { "source": "apache" }, "video/vnd.iptvforum.ttsavc": { "source": "apache" }, "video/vnd.iptvforum.ttsmpeg2": { "source": "apache" }, "video/vnd.motorola.video": { "source": "apache" }, "video/vnd.motorola.videop": { "source": "apache" }, "video/vnd.mpegurl": { "source": "apache", "extensions": ["mxu","m4u"] }, "video/vnd.ms-playready.media.pyv": { "source": "apache", "extensions": ["pyv"] }, "video/vnd.nokia.interleaved-multimedia": { "source": "apache" }, "video/vnd.nokia.videovoip": { "source": "apache" }, "video/vnd.objectvideo": { "source": "apache" }, 
"video/vnd.sealed.mpeg1": { "source": "apache" }, "video/vnd.sealed.mpeg4": { "source": "apache" }, "video/vnd.sealed.swf": { "source": "apache" }, "video/vnd.sealedmedia.softseal.mov": { "source": "apache" }, "video/vnd.uvvu.mp4": { "source": "apache", "extensions": ["uvu","uvvu"] }, "video/vnd.vivo": { "source": "apache", "extensions": ["viv"] }, "video/webm": { "source": "apache", "compressible": false, "extensions": ["webm"] }, "video/x-f4v": { "source": "apache", "extensions": ["f4v"] }, "video/x-fli": { "source": "apache", "extensions": ["fli"] }, "video/x-flv": { "source": "apache", "compressible": false, "extensions": ["flv"] }, "video/x-m4v": { "source": "apache", "extensions": ["m4v"] }, "video/x-matroska": { "source": "apache", "compressible": false, "extensions": ["mkv","mk3d","mks"] }, "video/x-mng": { "source": "apache", "extensions": ["mng"] }, "video/x-ms-asf": { "source": "apache", "extensions": ["asf","asx"] }, "video/x-ms-vob": { "source": "apache", "extensions": ["vob"] }, "video/x-ms-wm": { "source": "apache", "extensions": ["wm"] }, "video/x-ms-wmv": { "source": "apache", "compressible": false, "extensions": ["wmv"] }, "video/x-ms-wmx": { "source": "apache", "extensions": ["wmx"] }, "video/x-ms-wvx": { "source": "apache", "extensions": ["wvx"] }, "video/x-msvideo": { "source": "apache", "extensions": ["avi"] }, "video/x-sgi-movie": { "source": "apache", "extensions": ["movie"] }, "video/x-smv": { "source": "apache", "extensions": ["smv"] }, "x-conference/x-cooltalk": { "source": "apache", "extensions": ["ice"] }, "x-shader/x-fragment": { "compressible": true }, "x-shader/x-vertex": { "compressible": true } } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/index.js���000644 �000766 �000024 �00000000210 12455173731 043512� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������/*! * mime-db * Copyright(c) 2014 Jonathan Ong * MIT Licensed */ /** * Module exports. 
 */

module.exports = require('./db.json')
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/LICENSE000644 000766 000024 00000002113 12455173731 043056 0ustar00iojsstaff000000 000000 The MIT License (MIT)

Copyright (c) 2014 Jonathan Ong me@jongleberry.com

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/package.json������������000644 �000766 �000024 �00000003745 12455173731 044353� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������{ "name": "mime-db", "description": "Media Type Database", "version": "1.3.0", "author": { "name": "Jonathan Ong", "email": "me@jongleberry.com", "url": "http://jongleberry.com" }, "license": "MIT", "repository": { "type": "git", "url": "https://github.com/jshttp/mime-db" }, "devDependencies": { "co": "3", "cogent": "1", "csv-parse": "0", "gnode": "0.1.0", "istanbul": "0.3.4", "mocha": "~1.21.4", "stream-to-array": "2" }, "engines": { "node": ">= 0.6" }, "files": [ "LICENSE", "db.json", "index.js" ], "scripts": { "update": "gnode scripts/extensions && gnode scripts/types && node scripts/build", "clean": "rm src/*", "test": "mocha --reporter spec --bail --check-leaks test/", "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" }, "keywords": [ "mime", "db", "type", "types", "database", "charset", "charsets" ], "gitHead": "dc3a4d4948e9e6814404712d0f3560f1fffe7d73", "bugs": { "url": "https://github.com/jshttp/mime-db/issues" }, "homepage": "https://github.com/jshttp/mime-db", "_id": "mime-db@1.3.0", "_shasum": "5fefeb25dd9b097c5d45091c60f8149b98d749ec", "_from": "mime-db@>=1.3.0 <1.4.0", "_npmVersion": "1.4.21", "_npmUser": { "name": "dougwilson", "email": "doug@somethingdoug.com" }, "maintainers": [ { "name": "jongleberry", "email": "jonathanrichardong@gmail.com" }, { "name": "dougwilson", "email": "doug@somethingdoug.com" } ], "dist": { "shasum": "5fefeb25dd9b097c5d45091c60f8149b98d749ec", "tarball": "http://registry.npmjs.org/mime-db/-/mime-db-1.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.3.0.tgz", "readme": "ERROR: No README data found!" } ���������������������������node_modules/request/node_modules/form-data/node_modules/mime-types/node_modules/mime-db/README.md��000644 �000766 �000024 �00000005123 12455173731 043334� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������# mime-db [![NPM Version][npm-version-image]][npm-url] [![NPM Downloads][npm-downloads-image]][npm-url] [![Node.js Version][node-image]][node-url] [![Build Status][travis-image]][travis-url] [![Coverage Status][coveralls-image]][coveralls-url] This is a database of all mime types. 
It consists of a single, public JSON file and does not include any logic, allowing it to remain as unopinionated as possible with an API. It aggregates data from the following sources: - http://www.iana.org/assignments/media-types/media-types.xhtml - http://svn.apache.org/repos/asf/httpd/httpd/trunk/docs/conf/mime.types ## Usage ```bash npm i mime-db ``` ```js var db = require('mime-db'); // grab data on .js files var data = db['application/javascript']; ``` If you're crazy enough to use this in the browser, you can just grab the JSON file: ``` https://cdn.rawgit.com/jshttp/mime-db/master/db.json ``` ## Data Structure The JSON file is a map lookup for lowercased mime types. Each mime type has the following properties: - `.source` - where the mime type is defined. If not set, it's probably a custom media type. - `apache` - [Apache common media types](http://svn.apache.org/repos/asf/httpd/httpd/trunk/docs/conf/mime.types) - `iana` - [IANA-defined media types](http://www.iana.org/assignments/media-types/media-types.xhtml) - `.extensions[]` - known extensions associated with this mime type. - `.compressible` - whether a file of this type can be gzipped. - `.charset` - the default charset associated with this type, if any. If unknown, every property could be `undefined`. ## Repository Structure - `scripts` - these are scripts to run to build the database - `src/` - this is a folder of files created from remote sources like Apache and IANA - `lib/` - this is a folder of our own custom sources and db, which will be merged into `db.json` - `db.json` - the final built JSON file for end-user usage ## Contributing To edit the database, only make PRs against files in the `lib/` folder. To update the build, run `npm run update`. [npm-version-image]: https://img.shields.io/npm/v/mime-db.svg?style=flat [npm-downloads-image]: https://img.shields.io/npm/dm/mime-db.svg?style=flat [npm-url]: https://npmjs.org/package/mime-db [travis-image]: https://img.shields.io/travis/jshttp/mime-db.svg?style=flat [travis-url]: https://travis-ci.org/jshttp/mime-db [coveralls-image]: https://img.shields.io/coveralls/jshttp/mime-db.svg?style=flat [coveralls-url]: https://coveralls.io/r/jshttp/mime-db?branch=master [node-image]: https://img.shields.io/node/v/mime-db.svg?style=flat [node-url]: http://nodejs.org/download/
lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/.travis.yml
language: node_js node_js: - "0.10"
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/component.json��000644 �000766 �000024 �00000000424 12455173731 040006� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "async", "repo": "caolan/async", "description": "Higher-order functions and common patterns for asynchronous code", "version": "0.1.23", "keywords": [], "dependencies": {}, "development": {}, "main": "lib/async.js", "scripts": [ "lib/async.js" ] } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/lib/������������000755 �000766 �000024 �00000000000 12456115120 035644� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/LICENSE���������000644 �000766 �000024 �00000002047 12455173731 036121� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2010-2014 Caolan McMahon Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/package.json����000644 �000766 �000024 �00000002622 12455173731 037401� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "async", "description": "Higher-order functions and common patterns for asynchronous code", "main": "./lib/async", "author": { "name": "Caolan McMahon" }, "version": "0.9.0", "repository": { "type": "git", "url": "https://github.com/caolan/async.git" }, "bugs": { "url": "https://github.com/caolan/async/issues" }, "licenses": [ { "type": "MIT", "url": "https://github.com/caolan/async/raw/master/LICENSE" } ], "devDependencies": { "nodeunit": ">0.0.0", "uglify-js": "1.2.x", "nodelint": ">0.0.0" }, "jam": { "main": "lib/async.js", "include": [ "lib/async.js", "README.md", "LICENSE" ] }, "scripts": { "test": "nodeunit test/test-async.js" }, "homepage": "https://github.com/caolan/async", "_id": "async@0.9.0", "dist": { "shasum": "ac3613b1da9bed1b47510bb4651b8931e47146c7", "tarball": "http://registry.npmjs.org/async/-/async-0.9.0.tgz" }, "_from": "async@>=0.9.0 <0.10.0", "_npmVersion": "1.4.3", "_npmUser": { "name": "caolan", "email": "caolan.mcmahon@gmail.com" }, "maintainers": [ { "name": "caolan", "email": "caolan@caolanmcmahon.com" } ], "directories": {}, "_shasum": "ac3613b1da9bed1b47510bb4651b8931e47146c7", "_resolved": "https://registry.npmjs.org/async/-/async-0.9.0.tgz", "readme": "ERROR: No README data found!" } ��������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/README.md�������000644 �000766 �000024 �00000145160 12455173731 036377� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# Async.js [![Build Status via Travis CI](https://travis-ci.org/caolan/async.svg?branch=master)](https://travis-ci.org/caolan/async) Async is a utility module which provides straight-forward, powerful functions for working with asynchronous JavaScript. Although originally designed for use with [Node.js](http://nodejs.org), it can also be used directly in the browser. Also supports [component](https://github.com/component/component). 
Async provides around 20 functions that include the usual 'functional' suspects (`map`, `reduce`, `filter`, `each`…) as well as some common patterns for asynchronous control flow (`parallel`, `series`, `waterfall`…). All these functions assume you follow the Node.js convention of providing a single callback as the last argument of your `async` function. ## Quick Examples ```javascript async.map(['file1','file2','file3'], fs.stat, function(err, results){ // results is now an array of stats for each file }); async.filter(['file1','file2','file3'], fs.exists, function(results){ // results now equals an array of the existing files }); async.parallel([ function(){ ... }, function(){ ... } ], callback); async.series([ function(){ ... }, function(){ ... } ]); ``` There are many more functions available so take a look at the docs below for a full list. This module aims to be comprehensive, so if you feel anything is missing please create a GitHub issue for it. ## Common Pitfalls ### Binding a context to an iterator This section is really about `bind`, not about `async`. If you are wondering how to make `async` execute your iterators in a given context, or are confused as to why a method of another library isn't working as an iterator, study this example: ```js // Here is a simple object with an (unnecessarily roundabout) squaring method var AsyncSquaringLibrary = { squareExponent: 2, square: function(number, callback){ var result = Math.pow(number, this.squareExponent); setTimeout(function(){ callback(null, result); }, 200); } }; async.map([1, 2, 3], AsyncSquaringLibrary.square, function(err, result){ // result is [NaN, NaN, NaN] // This fails because the `this.squareExponent` expression in the square // function is not evaluated in the context of AsyncSquaringLibrary, and is // therefore undefined. }); async.map([1, 2, 3], AsyncSquaringLibrary.square.bind(AsyncSquaringLibrary), function(err, result){ // result is [1, 4, 9] // With the help of bind we can attach a context to the iterator before // passing it to async. Now the square function will be executed in its // 'home' AsyncSquaringLibrary context and the value of `this.squareExponent` // will be as expected. }); ``` ## Download The source is available for download from [GitHub](http://github.com/caolan/async). Alternatively, you can install using Node Package Manager (`npm`): npm install async __Development:__ [async.js](https://github.com/caolan/async/raw/master/lib/async.js) - 29.6kb Uncompressed ## In the Browser So far it's been tested in IE6, IE7, IE8, FF3.6 and Chrome 5. 
Usage: ```html <script type="text/javascript" src="async.js"></script> <script type="text/javascript"> async.map(data, asyncProcess, function(err, results){ alert(results); }); </script> ``` ## Documentation ### Collections * [`each`](#each) * [`eachSeries`](#eachSeries) * [`eachLimit`](#eachLimit) * [`map`](#map) * [`mapSeries`](#mapSeries) * [`mapLimit`](#mapLimit) * [`filter`](#filter) * [`filterSeries`](#filterSeries) * [`reject`](#reject) * [`rejectSeries`](#rejectSeries) * [`reduce`](#reduce) * [`reduceRight`](#reduceRight) * [`detect`](#detect) * [`detectSeries`](#detectSeries) * [`sortBy`](#sortBy) * [`some`](#some) * [`every`](#every) * [`concat`](#concat) * [`concatSeries`](#concatSeries) ### Control Flow * [`series`](#seriestasks-callback) * [`parallel`](#parallel) * [`parallelLimit`](#parallellimittasks-limit-callback) * [`whilst`](#whilst) * [`doWhilst`](#doWhilst) * [`until`](#until) * [`doUntil`](#doUntil) * [`forever`](#forever) * [`waterfall`](#waterfall) * [`compose`](#compose) * [`seq`](#seq) * [`applyEach`](#applyEach) * [`applyEachSeries`](#applyEachSeries) * [`queue`](#queue) * [`priorityQueue`](#priorityQueue) * [`cargo`](#cargo) * [`auto`](#auto) * [`retry`](#retry) * [`iterator`](#iterator) * [`apply`](#apply) * [`nextTick`](#nextTick) * [`times`](#times) * [`timesSeries`](#timesSeries) ### Utils * [`memoize`](#memoize) * [`unmemoize`](#unmemoize) * [`log`](#log) * [`dir`](#dir) * [`noConflict`](#noConflict) ## Collections <a name="forEach" /> <a name="each" /> ### each(arr, iterator, callback) Applies the function `iterator` to each item in `arr`, in parallel. The `iterator` is called with an item from the list, and a callback for when it has finished. If the `iterator` passes an error to its `callback`, the main `callback` (for the `each` function) is immediately called with the error. Note, that since this function applies `iterator` to each item in parallel, there is no guarantee that the iterator functions will complete in order. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err)` which must be called once it has completed. If no error has occured, the `callback` should be run without arguments or with an explicit `null` argument. * `callback(err)` - A callback which is called when all `iterator` functions have finished, or an error occurs. __Examples__ ```js // assuming openFiles is an array of file names and saveFile is a function // to save the modified contents of that file: async.each(openFiles, saveFile, function(err){ // if any of the saves produced an error, err would equal that error }); ``` ```js // assuming openFiles is an array of file names async.each(openFiles, function( file, callback) { // Perform operation on file here. console.log('Processing file ' + file); if( file.length > 32 ) { console.log('This file name is too long'); callback('File name too long'); } else { // Do work to process file here console.log('File processed'); callback(); } }, function(err){ // if any of the file processing produced an error, err would equal that error if( err ) { // One of the iterations produced an error. // All processing will now stop. 
console.log('A file failed to process'); } else { console.log('All files have been processed successfully'); } }); ``` --------------------------------------- <a name="forEachSeries" /> <a name="eachSeries" /> ### eachSeries(arr, iterator, callback) The same as [`each`](#each), only `iterator` is applied to each item in `arr` in series. The next `iterator` is only called once the current one has completed. This means the `iterator` functions will complete in order. --------------------------------------- <a name="forEachLimit" /> <a name="eachLimit" /> ### eachLimit(arr, limit, iterator, callback) The same as [`each`](#each), only no more than `limit` `iterator`s will be simultaneously running at any time. Note that the items in `arr` are not processed in batches, so there is no guarantee that the first `limit` `iterator` functions will complete before any others are started. __Arguments__ * `arr` - An array to iterate over. * `limit` - The maximum number of `iterator`s to run at any time. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err)` which must be called once it has completed. If no error has occured, the callback should be run without arguments or with an explicit `null` argument. * `callback(err)` - A callback which is called when all `iterator` functions have finished, or an error occurs. __Example__ ```js // Assume documents is an array of JSON objects and requestApi is a // function that interacts with a rate-limited REST api. async.eachLimit(documents, 20, requestApi, function(err){ // if any of the saves produced an error, err would equal that error }); ``` --------------------------------------- <a name="map" /> ### map(arr, iterator, callback) Produces a new array of values by mapping each value in `arr` through the `iterator` function. The `iterator` is called with an item from `arr` and a callback for when it has finished processing. Each of these callback takes 2 arguments: an `error`, and the transformed item from `arr`. If `iterator` passes an error to this callback, the main `callback` (for the `map` function) is immediately called with the error. Note, that since this function applies the `iterator` to each item in parallel, there is no guarantee that the `iterator` functions will complete in order. However, the results array will be in the same order as the original `arr`. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err, transformed)` which must be called once it has completed with an error (which can be `null`) and a transformed item. * `callback(err, results)` - A callback which is called when all `iterator` functions have finished, or an error occurs. Results is an array of the transformed items from the `arr`. __Example__ ```js async.map(['file1','file2','file3'], fs.stat, function(err, results){ // results is now an array of stats for each file }); ``` --------------------------------------- <a name="mapSeries" /> ### mapSeries(arr, iterator, callback) The same as [`map`](#map), only the `iterator` is applied to each item in `arr` in series. The next `iterator` is only called once the current one has completed. The results array will be in the same order as the original. --------------------------------------- <a name="mapLimit" /> ### mapLimit(arr, limit, iterator, callback) The same as [`map`](#map), only no more than `limit` `iterator`s will be simultaneously running at any time. 
Note that the items are not processed in batches, so there is no guarantee that the first `limit` `iterator` functions will complete before any others are started. __Arguments__ * `arr` - An array to iterate over. * `limit` - The maximum number of `iterator`s to run at any time. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err, transformed)` which must be called once it has completed with an error (which can be `null`) and a transformed item. * `callback(err, results)` - A callback which is called when all `iterator` calls have finished, or an error occurs. The result is an array of the transformed items from the original `arr`. __Example__ ```js async.mapLimit(['file1','file2','file3'], 1, fs.stat, function(err, results){ // results is now an array of stats for each file }); ``` --------------------------------------- <a name="select" /> <a name="filter" /> ### filter(arr, iterator, callback) __Alias:__ `select` Returns a new array of all the values in `arr` which pass an async truth test. _The callback for each `iterator` call only accepts a single argument of `true` or `false`; it does not accept an error argument first!_ This is in-line with the way node libraries work with truth tests like `fs.exists`. This operation is performed in parallel, but the results array will be in the same order as the original. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A truth test to apply to each item in `arr`. The `iterator` is passed a `callback(truthValue)`, which must be called with a boolean argument once it has completed. * `callback(results)` - A callback which is called after all the `iterator` functions have finished. __Example__ ```js async.filter(['file1','file2','file3'], fs.exists, function(results){ // results now equals an array of the existing files }); ``` --------------------------------------- <a name="selectSeries" /> <a name="filterSeries" /> ### filterSeries(arr, iterator, callback) __Alias:__ `selectSeries` The same as [`filter`](#filter) only the `iterator` is applied to each item in `arr` in series. The next `iterator` is only called once the current one has completed. The results array will be in the same order as the original. --------------------------------------- <a name="reject" /> ### reject(arr, iterator, callback) The opposite of [`filter`](#filter). Removes values that pass an `async` truth test. --------------------------------------- <a name="rejectSeries" /> ### rejectSeries(arr, iterator, callback) The same as [`reject`](#reject), only the `iterator` is applied to each item in `arr` in series. --------------------------------------- <a name="reduce" /> ### reduce(arr, memo, iterator, callback) __Aliases:__ `inject`, `foldl` Reduces `arr` into a single value using an async `iterator` to return each successive step. `memo` is the initial state of the reduction. This function only operates in series. For performance reasons, it may make sense to split a call to this function into a parallel map, and then use the normal `Array.prototype.reduce` on the results. This function is for situations where each step in the reduction needs to be async; if you can get the data before reducing it, then it's probably a good idea to do so. __Arguments__ * `arr` - An array to iterate over. * `memo` - The initial state of the reduction. * `iterator(memo, item, callback)` - A function applied to each item in the array to produce the next step in the reduction. 
The `iterator` is passed a `callback(err, reduction)` which accepts an optional error as its first argument, and the state of the reduction as the second. If an error is passed to the callback, the reduction is stopped and the main `callback` is immediately called with the error. * `callback(err, result)` - A callback which is called after all the `iterator` functions have finished. Result is the reduced value. __Example__ ```js async.reduce([1,2,3], 0, function(memo, item, callback){ // pointless async: process.nextTick(function(){ callback(null, memo + item) }); }, function(err, result){ // result is now equal to the last value of memo, which is 6 }); ``` --------------------------------------- <a name="reduceRight" /> ### reduceRight(arr, memo, iterator, callback) __Alias:__ `foldr` Same as [`reduce`](#reduce), only operates on `arr` in reverse order. --------------------------------------- <a name="detect" /> ### detect(arr, iterator, callback) Returns the first value in `arr` that passes an async truth test. The `iterator` is applied in parallel, meaning the first iterator to return `true` will fire the detect `callback` with that result. That means the result might not be the first item in the original `arr` (in terms of order) that passes the test. If order within the original `arr` is important, then look at [`detectSeries`](#detectSeries). __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A truth test to apply to each item in `arr`. The iterator is passed a `callback(truthValue)` which must be called with a boolean argument once it has completed. * `callback(result)` - A callback which is called as soon as any iterator returns `true`, or after all the `iterator` functions have finished. Result will be the first item in the array that passes the truth test (iterator) or the value `undefined` if none passed. __Example__ ```js async.detect(['file1','file2','file3'], fs.exists, function(result){ // result now equals the first file in the list that exists }); ``` --------------------------------------- <a name="detectSeries" /> ### detectSeries(arr, iterator, callback) The same as [`detect`](#detect), only the `iterator` is applied to each item in `arr` in series. This means the result is always the first in the original `arr` (in terms of array order) that passes the truth test. --------------------------------------- <a name="sortBy" /> ### sortBy(arr, iterator, callback) Sorts a list by the results of running each `arr` value through an async `iterator`. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err, sortValue)` which must be called once it has completed with an error (which can be `null`) and a value to use as the sort criteria. * `callback(err, results)` - A callback which is called after all the `iterator` functions have finished, or an error occurs. Results is the items from the original `arr` sorted by the values returned by the `iterator` calls. 
__Example__ ```js async.sortBy(['file1','file2','file3'], function(file, callback){ fs.stat(file, function(err, stats){ callback(err, stats.mtime); }); }, function(err, results){ // results is now the original array of files sorted by // modified date }); ``` __Sort Order__ By modifying the callback parameter the sorting order can be influenced: ```js //ascending order async.sortBy([1,9,3,5], function(x, callback){ callback(null, x); }, function(err,result){ //result callback } ); //descending order async.sortBy([1,9,3,5], function(x, callback){ callback(null, x*-1); //<- x*-1 instead of x, turns the order around }, function(err,result){ //result callback } ); ``` --------------------------------------- <a name="some" /> ### some(arr, iterator, callback) __Alias:__ `any` Returns `true` if at least one element in the `arr` satisfies an async test. _The callback for each iterator call only accepts a single argument of `true` or `false`; it does not accept an error argument first!_ This is in-line with the way node libraries work with truth tests like `fs.exists`. Once any iterator call returns `true`, the main `callback` is immediately called. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A truth test to apply to each item in the array in parallel. The iterator is passed a callback(truthValue) which must be called with a boolean argument once it has completed. * `callback(result)` - A callback which is called as soon as any iterator returns `true`, or after all the iterator functions have finished. Result will be either `true` or `false` depending on the values of the async tests. __Example__ ```js async.some(['file1','file2','file3'], fs.exists, function(result){ // if result is true then at least one of the files exists }); ``` --------------------------------------- <a name="every" /> ### every(arr, iterator, callback) __Alias:__ `all` Returns `true` if every element in `arr` satisfies an async test. _The callback for each `iterator` call only accepts a single argument of `true` or `false`; it does not accept an error argument first!_ This is in-line with the way node libraries work with truth tests like `fs.exists`. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A truth test to apply to each item in the array in parallel. The iterator is passed a callback(truthValue) which must be called with a boolean argument once it has completed. * `callback(result)` - A callback which is called after all the `iterator` functions have finished. Result will be either `true` or `false` depending on the values of the async tests. __Example__ ```js async.every(['file1','file2','file3'], fs.exists, function(result){ // if result is true then every file exists }); ``` --------------------------------------- <a name="concat" /> ### concat(arr, iterator, callback) Applies `iterator` to each item in `arr`, concatenating the results. Returns the concatenated list. The `iterator`s are called in parallel, and the results are concatenated as they return. There is no guarantee that the results array will be returned in the original order of `arr` passed to the `iterator` function. __Arguments__ * `arr` - An array to iterate over. * `iterator(item, callback)` - A function to apply to each item in `arr`. The iterator is passed a `callback(err, results)` which must be called once it has completed with an error (which can be `null`) and an array of results.
* `callback(err, results)` - A callback which is called after all the `iterator` functions have finished, or an error occurs. Results is an array containing the concatenated results of the `iterator` function. __Example__ ```js async.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files){ // files is now a list of filenames that exist in the 3 directories }); ``` --------------------------------------- <a name="concatSeries" /> ### concatSeries(arr, iterator, callback) Same as [`concat`](#concat), but executes in series instead of parallel. ## Control Flow <a name="series" /> ### series(tasks, [callback]) Run the functions in the `tasks` array in series, each one running once the previous function has completed. If any functions in the series pass an error to its callback, no more functions are run, and `callback` is immediately called with the value of the error. Otherwise, `callback` receives an array of results when `tasks` have completed. It is also possible to use an object instead of an array. Each property will be run as a function, and the results will be passed to the final `callback` as an object instead of an array. This can be a more readable way of handling results from [`series`](#series). **Note** that while many implementations preserve the order of object properties, the [ECMAScript Language Specifcation](http://www.ecma-international.org/ecma-262/5.1/#sec-8.6) explicitly states that > The mechanics and order of enumerating the properties is not specified. So if you rely on the order in which your series of functions are executed, and want this to work on all platforms, consider using an array. __Arguments__ * `tasks` - An array or object containing functions to run, each function is passed a `callback(err, result)` it must call on completion with an error `err` (which can be `null`) and an optional `result` value. * `callback(err, results)` - An optional callback to run once all the functions have completed. This function gets a results array (or object) containing all the result arguments passed to the `task` callbacks. __Example__ ```js async.series([ function(callback){ // do some stuff ... callback(null, 'one'); }, function(callback){ // do some more stuff ... callback(null, 'two'); } ], // optional callback function(err, results){ // results is now equal to ['one', 'two'] }); // an example using an object instead of an array async.series({ one: function(callback){ setTimeout(function(){ callback(null, 1); }, 200); }, two: function(callback){ setTimeout(function(){ callback(null, 2); }, 100); } }, function(err, results) { // results is now equal to: {one: 1, two: 2} }); ``` --------------------------------------- <a name="parallel" /> ### parallel(tasks, [callback]) Run the `tasks` array of functions in parallel, without waiting until the previous function has completed. If any of the functions pass an error to its callback, the main `callback` is immediately called with the value of the error. Once the `tasks` have completed, the results are passed to the final `callback` as an array. It is also possible to use an object instead of an array. Each property will be run as a function and the results will be passed to the final `callback` as an object instead of an array. This can be a more readable way of handling results from [`parallel`](#parallel). __Arguments__ * `tasks` - An array or object containing functions to run. 
Each function is passed a `callback(err, result)` which it must call on completion with an error `err` (which can be `null`) and an optional `result` value. * `callback(err, results)` - An optional callback to run once all the functions have completed. This function gets a results array (or object) containing all the result arguments passed to the task callbacks. __Example__ ```js async.parallel([ function(callback){ setTimeout(function(){ callback(null, 'one'); }, 200); }, function(callback){ setTimeout(function(){ callback(null, 'two'); }, 100); } ], // optional callback function(err, results){ // the results array will equal ['one','two'] even though // the second function had a shorter timeout. }); // an example using an object instead of an array async.parallel({ one: function(callback){ setTimeout(function(){ callback(null, 1); }, 200); }, two: function(callback){ setTimeout(function(){ callback(null, 2); }, 100); } }, function(err, results) { // results is now equals to: {one: 1, two: 2} }); ``` --------------------------------------- <a name="parallelLimit" /> ### parallelLimit(tasks, limit, [callback]) The same as [`parallel`](#parallel), only `tasks` are executed in parallel with a maximum of `limit` tasks executing at any time. Note that the `tasks` are not executed in batches, so there is no guarantee that the first `limit` tasks will complete before any others are started. __Arguments__ * `tasks` - An array or object containing functions to run, each function is passed a `callback(err, result)` it must call on completion with an error `err` (which can be `null`) and an optional `result` value. * `limit` - The maximum number of `tasks` to run at any time. * `callback(err, results)` - An optional callback to run once all the functions have completed. This function gets a results array (or object) containing all the result arguments passed to the `task` callbacks. --------------------------------------- <a name="whilst" /> ### whilst(test, fn, callback) Repeatedly call `fn`, while `test` returns `true`. Calls `callback` when stopped, or an error occurs. __Arguments__ * `test()` - synchronous truth test to perform before each execution of `fn`. * `fn(callback)` - A function which is called each time `test` passes. The function is passed a `callback(err)`, which must be called once it has completed with an optional `err` argument. * `callback(err)` - A callback which is called after the test fails and repeated execution of `fn` has stopped. __Example__ ```js var count = 0; async.whilst( function () { return count < 5; }, function (callback) { count++; setTimeout(callback, 1000); }, function (err) { // 5 seconds have passed } ); ``` --------------------------------------- <a name="doWhilst" /> ### doWhilst(fn, test, callback) The post-check version of [`whilst`](#whilst). To reflect the difference in the order of operations, the arguments `test` and `fn` are switched. `doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript. --------------------------------------- <a name="until" /> ### until(test, fn, callback) Repeatedly call `fn` until `test` returns `true`. Calls `callback` when stopped, or an error occurs. The inverse of [`whilst`](#whilst). --------------------------------------- <a name="doUntil" /> ### doUntil(fn, test, callback) Like [`doWhilst`](#doWhilst), except the `test` is inverted. Note the argument ordering differs from `until`. 
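Neither `until` nor `doUntil` has an example of its own in this README, so here is a small sketch under stated assumptions: `checkJob` is an invented asynchronous function with the usual `callback(err, status)` shape, and the one-second delay is arbitrary. It shows `doUntil` running its body at least once and stopping when the synchronous test returns `true`:

```js
// Sketch: poll an (invented) job-status API until it reports completion.
var finished = false;

async.doUntil(
    function (callback) {
        checkJob('job-42', function (err, status) {
            if (err) { return callback(err); }
            finished = (status === 'complete');
            setTimeout(callback, 1000); // pause between polls
        });
    },
    function () { return finished; }, // test runs after each pass of the body
    function (err) {
        // reached once the test returns true, or immediately on error
    }
);
```

With `until` the test would run before the first pass instead, so a job that is already complete would never be polled.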
--------------------------------------- <a name="forever" /> ### forever(fn, errback) Calls the asynchronous function `fn` with a callback parameter that allows it to call itself again, in series, indefinitely. If an error is passed to the callback then `errback` is called with the error, and execution stops, otherwise it will never be called. ```js async.forever( function(next) { // next is suitable for passing to things that need a callback(err [, whatever]); // it will result in this function being called again. }, function(err) { // if next is called with a value in its first parameter, it will appear // in here as 'err', and execution will stop. } ); ``` --------------------------------------- <a name="waterfall" /> ### waterfall(tasks, [callback]) Runs the `tasks` array of functions in series, each passing their results to the next in the array. However, if any of the `tasks` pass an error to their own callback, the next function is not executed, and the main `callback` is immediately called with the error. __Arguments__ * `tasks` - An array of functions to run, each function is passed a `callback(err, result1, result2, ...)` it must call on completion. The first argument is an error (which can be `null`) and any further arguments will be passed as arguments in order to the next task. * `callback(err, [results])` - An optional callback to run once all the functions have completed. This will be passed the results of the last task's callback. __Example__ ```js async.waterfall([ function(callback){ callback(null, 'one', 'two'); }, function(arg1, arg2, callback){ // arg1 now equals 'one' and arg2 now equals 'two' callback(null, 'three'); }, function(arg1, callback){ // arg1 now equals 'three' callback(null, 'done'); } ], function (err, result) { // result now equals 'done' }); ``` --------------------------------------- <a name="compose" /> ### compose(fn1, fn2...) Creates a function which is a composition of the passed asynchronous functions. Each function consumes the return value of the function that follows. Composing functions `f()`, `g()`, and `h()` would produce the result of `f(g(h()))`, only this version uses callbacks to obtain the return values. Each function is executed with the `this` binding of the composed function. __Arguments__ * `functions...` - the asynchronous functions to compose __Example__ ```js function add1(n, callback) { setTimeout(function () { callback(null, n + 1); }, 10); } function mul3(n, callback) { setTimeout(function () { callback(null, n * 3); }, 10); } var add1mul3 = async.compose(mul3, add1); add1mul3(4, function (err, result) { // result now equals 15 }); ``` --------------------------------------- <a name="seq" /> ### seq(fn1, fn2...) Version of the compose function that is more natural to read. Each following function consumes the return value of the latter function. Each function is executed with the `this` binding of the composed function. __Arguments__ * functions... - the asynchronous functions to compose __Example__ ```js // Requires lodash (or underscore), express3 and dresende's orm2. // Part of an app, that fetches cats of the logged user. // This example uses `seq` function to avoid overnesting and error // handling clutter. 
app.get('/cats', function(request, response) { function handleError(err, data, callback) { if (err) { console.error(err); response.json({ status: 'error', message: err.message }); } else { callback(data); } } var User = request.models.User; async.seq( _.bind(User.get, User), // 'User.get' has signature (id, callback(err, data)) handleError, function(user, fn) { user.getCats(fn); // 'getCats' has signature (callback(err, data)) }, handleError, function(cats) { response.json({ status: 'ok', message: 'Cats found', data: cats }); } )(request.session.user_id); }); ``` --------------------------------------- <a name="applyEach" /> ### applyEach(fns, args..., callback) Applies the provided arguments to each function in the array, calling `callback` after all functions have completed. If you only provide the first argument, then it will return a function which lets you pass in the arguments as if it were a single function call. __Arguments__ * `fns` - the asynchronous functions to all call with the same arguments * `args...` - any number of separate arguments to pass to the function * `callback` - the final argument should be the callback, called when all functions have completed processing __Example__ ```js async.applyEach([enableSearch, updateSchema], 'bucket', callback); // partial application example: async.each( buckets, async.applyEach([enableSearch, updateSchema]), callback ); ``` --------------------------------------- <a name="applyEachSeries" /> ### applyEachSeries(fns, args..., callback) The same as [`applyEach`](#applyEach) only the functions are applied in series. --------------------------------------- <a name="queue" /> ### queue(worker, concurrency) Creates a `queue` object with the specified `concurrency`. Tasks added to the `queue` are processed in parallel (up to the `concurrency` limit). If all `worker`s are in progress, the task is queued until one becomes available. Once a `worker` completes a `task`, that `task`'s callback is called. __Arguments__ * `worker(task, callback)` - An asynchronous function for processing a queued task, which must call its `callback(err)` argument when finished, with an optional `error` as an argument. * `concurrency` - An `integer` for determining how many `worker` functions should be run in parallel. __Queue objects__ The `queue` object returned by this function has the following properties and methods: * `length()` - a function returning the number of items waiting to be processed. * `started` - a boolean indicating whether any items have been pushed and processed by the queue * `running()` - a function returning the number of items currently being processed. * `idle()` - a function returning false if there are items waiting or being processed, or true if not. * `concurrency` - an integer for determining how many `worker` functions should be run in parallel. This property can be changed after a `queue` is created to alter the concurrency on-the-fly. * `push(task, [callback])` - add a new task to the `queue`. Calls `callback` once the `worker` has finished processing the task. Instead of a single task, a `tasks` array can be submitted. The respective callback is used for every task in the list. * `unshift(task, [callback])` - add a new task to the front of the `queue`. * `saturated` - a callback that is called when the `queue` length hits the `concurrency` limit, and further tasks will be queued. * `empty` - a callback that is called when the last item from the `queue` is given to a `worker`.
* `drain` - a callback that is called when the last item from the `queue` has returned from the `worker`. * `paused` - a boolean for determining whether the queue is in a paused state * `pause()` - a function that pauses the processing of tasks until `resume()` is called. * `resume()` - a function that resumes the processing of queued tasks when the queue is paused. * `kill()` - a function that empties remaining tasks from the queue forcing it to go idle. __Example__ ```js // create a queue object with concurrency 2 var q = async.queue(function (task, callback) { console.log('hello ' + task.name); callback(); }, 2); // assign a callback q.drain = function() { console.log('all items have been processed'); } // add some items to the queue q.push({name: 'foo'}, function (err) { console.log('finished processing foo'); }); q.push({name: 'bar'}, function (err) { console.log('finished processing bar'); }); // add some items to the queue (batch-wise) q.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function (err) { console.log('finished processing bar'); }); // add some items to the front of the queue q.unshift({name: 'bar'}, function (err) { console.log('finished processing bar'); }); ``` --------------------------------------- <a name="priorityQueue" /> ### priorityQueue(worker, concurrency) The same as [`queue`](#queue) only tasks are assigned a priority and completed in ascending priority order. There are two differences between `queue` and `priorityQueue` objects: * `push(task, priority, [callback])` - `priority` should be a number. If an array of `tasks` is given, all tasks will be assigned the same priority. * The `unshift` method was removed. --------------------------------------- <a name="cargo" /> ### cargo(worker, [payload]) Creates a `cargo` object with the specified payload. Tasks added to the cargo will be processed altogether (up to the `payload` limit). If the `worker` is in progress, the task is queued until it becomes available. Once the `worker` has completed some tasks, each callback of those tasks is called. Check out [this animation](https://camo.githubusercontent.com/6bbd36f4cf5b35a0f11a96dcd2e97711ffc2fb37/68747470733a2f2f662e636c6f75642e6769746875622e636f6d2f6173736574732f313637363837312f36383130382f62626330636662302d356632392d313165322d393734662d3333393763363464633835382e676966) for how `cargo` and `queue` work. While [queue](#queue) passes only one task to one of a group of workers at a time, cargo passes an array of tasks to a single worker, repeating when the worker is finished. __Arguments__ * `worker(tasks, callback)` - An asynchronous function for processing an array of queued tasks, which must call its `callback(err)` argument when finished, with an optional `err` argument. * `payload` - An optional `integer` for determining how many tasks should be processed per round; if omitted, the default is unlimited. __Cargo objects__ The `cargo` object returned by this function has the following properties and methods: * `length()` - A function returning the number of items waiting to be processed. * `payload` - An `integer` for determining how many tasks should be process per round. This property can be changed after a `cargo` is created to alter the payload on-the-fly. * `push(task, [callback])` - Adds `task` to the `queue`. The callback is called once the `worker` has finished processing the task. Instead of a single task, an array of `tasks` can be submitted. The respective callback is used for every task in the list. 
* `saturated` - A callback that is called when the `queue.length()` hits the concurrency and further tasks will be queued. * `empty` - A callback that is called when the last item from the `queue` is given to a `worker`. * `drain` - A callback that is called when the last item from the `queue` has returned from the `worker`. __Example__ ```js // create a cargo object with payload 2 var cargo = async.cargo(function (tasks, callback) { for(var i=0; i<tasks.length; i++){ console.log('hello ' + tasks[i].name); } callback(); }, 2); // add some items cargo.push({name: 'foo'}, function (err) { console.log('finished processing foo'); }); cargo.push({name: 'bar'}, function (err) { console.log('finished processing bar'); }); cargo.push({name: 'baz'}, function (err) { console.log('finished processing baz'); }); ``` --------------------------------------- <a name="auto" /> ### auto(tasks, [callback]) Determines the best order for running the functions in `tasks`, based on their requirements. Each function can optionally depend on other functions being completed first, and each function is run as soon as its requirements are satisfied. If any of the functions pass an error to their callback, it will not complete (so any other functions depending on it will not run), and the main `callback` is immediately called with the error. Functions also receive an object containing the results of functions which have completed so far. Note, all functions are called with a `results` object as a second argument, so it is unsafe to pass functions in the `tasks` object which cannot handle the extra argument. For example, this snippet of code: ```js async.auto({ readData: async.apply(fs.readFile, 'data.txt', 'utf-8') }, callback); ``` will have the effect of calling `readFile` with the results object as the last argument, which will fail: ```js fs.readFile('data.txt', 'utf-8', cb, {}); ``` Instead, wrap the call to `readFile` in a function which does not forward the `results` object: ```js async.auto({ readData: function(cb, results){ fs.readFile('data.txt', 'utf-8', cb); } }, callback); ``` __Arguments__ * `tasks` - An object. Each of its properties is either a function or an array of requirements, with the function itself the last item in the array. The object's key of a property serves as the name of the task defined by that property, i.e. can be used when specifying requirements for other tasks. The function receives two arguments: (1) a `callback(err, result)` which must be called when finished, passing an `error` (which can be `null`) and the result of the function's execution, and (2) a `results` object, containing the results of the previously executed functions. * `callback(err, results)` - An optional callback which is called when all the tasks have been completed. It receives the `err` argument if any `tasks` pass an error to their callback. Results are always returned; however, if an error occurs, no further `tasks` will be performed, and the results object will only contain partial results. 
__Example__ ```js async.auto({ get_data: function(callback){ console.log('in get_data'); // async code to get some data callback(null, 'data', 'converted to array'); }, make_folder: function(callback){ console.log('in make_folder'); // async code to create a directory to store a file in // this is run at the same time as getting the data callback(null, 'folder'); }, write_file: ['get_data', 'make_folder', function(callback, results){ console.log('in write_file', JSON.stringify(results)); // once there is some data and the directory exists, // write the data to a file in the directory callback(null, 'filename'); }], email_link: ['write_file', function(callback, results){ console.log('in email_link', JSON.stringify(results)); // once the file is written let's email a link to it... // results.write_file contains the filename returned by write_file. callback(null, {'file':results.write_file, 'email':'user@example.com'}); }] }, function(err, results) { console.log('err = ', err); console.log('results = ', results); }); ``` This is a fairly trivial example, but to do this using the basic parallel and series functions would look like this: ```js async.parallel([ function(callback){ console.log('in get_data'); // async code to get some data callback(null, 'data', 'converted to array'); }, function(callback){ console.log('in make_folder'); // async code to create a directory to store a file in // this is run at the same time as getting the data callback(null, 'folder'); } ], function(err, results){ async.series([ function(callback){ console.log('in write_file', JSON.stringify(results)); // once there is some data and the directory exists, // write the data to a file in the directory results.push('filename'); callback(null); }, function(callback){ console.log('in email_link', JSON.stringify(results)); // once the file is written let's email a link to it... callback(null, {'file':results.pop(), 'email':'user@example.com'}); } ]); }); ``` For a complicated series of `async` tasks, using the [`auto`](#auto) function makes adding new tasks much easier (and the code more readable). --------------------------------------- <a name="retry" /> ### retry([times = 5], task, [callback]) Attempts to get a successful response from `task` no more than `times` times before returning an error. If the task is successful, the `callback` will be passed the result of the successfull task. If all attemps fail, the callback will be passed the error and result (if any) of the final attempt. __Arguments__ * `times` - An integer indicating how many times to attempt the `task` before giving up. Defaults to 5. * `task(callback, results)` - A function which receives two arguments: (1) a `callback(err, result)` which must be called when finished, passing `err` (which can be `null`) and the `result` of the function's execution, and (2) a `results` object, containing the results of the previously executed functions (if nested inside another control flow). * `callback(err, results)` - An optional callback which is called when the task has succeeded, or after the final failed attempt. It receives the `err` and `result` arguments of the last attempt at completing the `task`. 
The [`retry`](#retry) function can be used as a stand-alone control flow by passing a callback, as shown below: ```js async.retry(3, apiMethod, function(err, result) { // do something with the result }); ``` It can also be embeded within other control flow functions to retry individual methods that are not as reliable, like this: ```js async.auto({ users: api.getUsers.bind(api), payments: async.retry(3, api.getPayments.bind(api)) }, function(err, results) { // do something with the results }); ``` --------------------------------------- <a name="iterator" /> ### iterator(tasks) Creates an iterator function which calls the next function in the `tasks` array, returning a continuation to call the next one after that. It's also possible to “peek” at the next iterator with `iterator.next()`. This function is used internally by the `async` module, but can be useful when you want to manually control the flow of functions in series. __Arguments__ * `tasks` - An array of functions to run. __Example__ ```js var iterator = async.iterator([ function(){ sys.p('one'); }, function(){ sys.p('two'); }, function(){ sys.p('three'); } ]); node> var iterator2 = iterator(); 'one' node> var iterator3 = iterator2(); 'two' node> iterator3(); 'three' node> var nextfn = iterator2.next(); node> nextfn(); 'three' ``` --------------------------------------- <a name="apply" /> ### apply(function, arguments..) Creates a continuation function with some arguments already applied. Useful as a shorthand when combined with other control flow functions. Any arguments passed to the returned function are added to the arguments originally passed to apply. __Arguments__ * `function` - The function you want to eventually apply all arguments to. * `arguments...` - Any number of arguments to automatically apply when the continuation is called. __Example__ ```js // using apply async.parallel([ async.apply(fs.writeFile, 'testfile1', 'test1'), async.apply(fs.writeFile, 'testfile2', 'test2'), ]); // the same process without using apply async.parallel([ function(callback){ fs.writeFile('testfile1', 'test1', callback); }, function(callback){ fs.writeFile('testfile2', 'test2', callback); } ]); ``` It's possible to pass any number of additional arguments when calling the continuation: ```js node> var fn = async.apply(sys.puts, 'one'); node> fn('two', 'three'); one two three ``` --------------------------------------- <a name="nextTick" /> ### nextTick(callback) Calls `callback` on a later loop around the event loop. In Node.js this just calls `process.nextTick`; in the browser it falls back to `setImmediate(callback)` if available, otherwise `setTimeout(callback, 0)`, which means other higher priority events may precede the execution of `callback`. This is used internally for browser-compatibility purposes. __Arguments__ * `callback` - The function to call on a later loop around the event loop. __Example__ ```js var call_order = []; async.nextTick(function(){ call_order.push('two'); // call_order now equals ['one','two'] }); call_order.push('one') ``` <a name="times" /> ### times(n, callback) Calls the `callback` function `n` times, and accumulates results in the same manner you would use with [`map`](#map). __Arguments__ * `n` - The number of times to run the function. * `callback` - The function to call `n` times. 
__Example__

```js
// Pretend this is some complicated async factory
var createUser = function(id, callback) {
    callback(null, {
        id: 'user' + id
    });
};

// generate 5 users
async.times(5, function(n, next){
    createUser(n, function(err, user) {
        next(err, user);
    });
}, function(err, users) {
    // we should now have 5 users
});
```

<a name="timesSeries" />
### timesSeries(n, callback)

The same as [`times`](#times), only the iterator is applied in series. The next call
is only made once the current one has completed. The results array will be in the same
order as the calls were made.


## Utils

<a name="memoize" />
### memoize(fn, [hasher])

Caches the results of an `async` function. When creating a hash to store function
results against, the callback is omitted from the hash and an optional hash function
can be used.

The cache of results is exposed as the `memo` property of the function returned by
`memoize`.

__Arguments__

* `fn` - The function to proxy and cache results from.
* `hasher` - An optional function for generating a custom hash for storing results.
  It has all the arguments applied to it apart from the callback, and must be
  synchronous.

__Example__

```js
var slow_fn = function (name, callback) {
    // do something slow and expensive here...
    var result = 'result for ' + name;
    callback(null, result);
};
var fn = async.memoize(slow_fn);

// fn can now be used as if it were slow_fn
fn('some name', function () {
    // callback
});
```

<a name="unmemoize" />
### unmemoize(fn)

Undoes a [`memoize`](#memoize)d function, reverting it to the original, unmemoized
form. Handy for testing.

__Arguments__

* `fn` - The memoized function.

<a name="log" />
### log(function, arguments)

Logs the result of an `async` function to the `console`. Only works in Node.js or in
browsers that support `console.log` and `console.error` (such as Firefox and Chrome).
If multiple arguments are returned from the async function, `console.log` is called
on each argument in order.

__Arguments__

* `function` - The async function you want to call with the given arguments.
* `arguments...` - Any number of arguments to apply to the function.

__Example__

```js
var hello = function(name, callback){
    setTimeout(function(){
        callback(null, 'hello ' + name);
    }, 1000);
};
```
```js
node> async.log(hello, 'world');
'hello world'
```

---------------------------------------

<a name="dir" />
### dir(function, arguments)

Logs the result of an `async` function to the `console` using `console.dir` to display
the properties of the resulting object. Only works in Node.js or in browsers that
support `console.dir` and `console.error` (such as Firefox and Chrome). If multiple
arguments are returned from the async function, `console.dir` is called on each
argument in order.

__Arguments__

* `function` - The async function you want to call with the given arguments.
* `arguments...` - Any number of arguments to apply to the function.

__Example__

```js
var hello = function(name, callback){
    setTimeout(function(){
        callback(null, {hello: name});
    }, 1000);
};
```
```js
node> async.dir(hello, 'world');
{hello: 'world'}
```

---------------------------------------

<a name="noConflict" />
### noConflict()

Changes the value of `async` back to its original value, returning a reference to the
`async` object.
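For example (a browser-only sketch; the `myAsync` name is just a placeholder),
`noConflict` hands the global `async` back to whatever defined it first while keeping
the library usable under another name:

```js
// Loading async.js in the browser overwrites any existing global `async`.
// noConflict() restores the previous value and returns the library.
var myAsync = async.noConflict();

myAsync.map(['a', 'b'], function(item, callback) {
    callback(null, item.toUpperCase());
}, function(err, results) {
    // results now equals ['A', 'B']; the global `async` is back to
    // whatever it was before the library was loaded.
});
```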
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js����000755 �000766 �000024 �00000103446 12455173731 037345� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/*! * async * https://github.com/caolan/async * * Copyright 2010-2014 Caolan McMahon * Released under the MIT license */ /*jshint onevar: false, indent:4 */ /*global setImmediate: false, setTimeout: false, console: false */ (function () { var async = {}; // global on the server, window in the browser var root, previous_async; root = this; if (root != null) { previous_async = root.async; } async.noConflict = function () { root.async = previous_async; return async; }; function only_once(fn) { var called = false; return function() { if (called) throw new Error("Callback was already called."); called = true; fn.apply(root, arguments); } } //// cross-browser compatiblity functions //// var _toString = Object.prototype.toString; var _isArray = Array.isArray || function (obj) { return _toString.call(obj) === '[object Array]'; }; var _each = function (arr, iterator) { if (arr.forEach) { return arr.forEach(iterator); } for (var i = 0; i < arr.length; i += 1) { iterator(arr[i], i, arr); } }; var _map = function (arr, iterator) { if (arr.map) { return arr.map(iterator); } var results = []; _each(arr, function (x, i, a) { results.push(iterator(x, i, a)); }); return results; }; var _reduce = function (arr, iterator, memo) { if (arr.reduce) { return arr.reduce(iterator, memo); } _each(arr, function (x, i, a) { memo = iterator(memo, x, i, a); }); return memo; }; var _keys = function (obj) { if (Object.keys) { return Object.keys(obj); } var keys = []; for (var k in obj) { if (obj.hasOwnProperty(k)) { keys.push(k); } } return keys; }; //// exported async module functions //// //// nextTick implementation with browser-compatible fallback //// if (typeof process === 'undefined' || !(process.nextTick)) { if (typeof setImmediate === 'function') { async.nextTick = function (fn) { // not a direct alias for IE10 compatibility setImmediate(fn); }; async.setImmediate = async.nextTick; } else { async.nextTick = function (fn) { setTimeout(fn, 0); }; async.setImmediate = async.nextTick; } } else { async.nextTick = process.nextTick; if (typeof setImmediate !== 'undefined') { async.setImmediate = function (fn) { // not a direct alias for IE10 compatibility setImmediate(fn); }; } else { async.setImmediate = async.nextTick; } } async.each = function (arr, iterator, callback) { callback = callback || function () {}; if (!arr.length) { return callback(); } var completed = 0; _each(arr, function (x) { iterator(x, only_once(done) ); }); function done(err) { if (err) { callback(err); callback = function () {}; } else { completed += 1; if (completed >= arr.length) { callback(); } } } }; async.forEach = async.each; async.eachSeries = 
function (arr, iterator, callback) { callback = callback || function () {}; if (!arr.length) { return callback(); } var completed = 0; var iterate = function () { iterator(arr[completed], function (err) { if (err) { callback(err); callback = function () {}; } else { completed += 1; if (completed >= arr.length) { callback(); } else { iterate(); } } }); }; iterate(); }; async.forEachSeries = async.eachSeries; async.eachLimit = function (arr, limit, iterator, callback) { var fn = _eachLimit(limit); fn.apply(null, [arr, iterator, callback]); }; async.forEachLimit = async.eachLimit; var _eachLimit = function (limit) { return function (arr, iterator, callback) { callback = callback || function () {}; if (!arr.length || limit <= 0) { return callback(); } var completed = 0; var started = 0; var running = 0; (function replenish () { if (completed >= arr.length) { return callback(); } while (running < limit && started < arr.length) { started += 1; running += 1; iterator(arr[started - 1], function (err) { if (err) { callback(err); callback = function () {}; } else { completed += 1; running -= 1; if (completed >= arr.length) { callback(); } else { replenish(); } } }); } })(); }; }; var doParallel = function (fn) { return function () { var args = Array.prototype.slice.call(arguments); return fn.apply(null, [async.each].concat(args)); }; }; var doParallelLimit = function(limit, fn) { return function () { var args = Array.prototype.slice.call(arguments); return fn.apply(null, [_eachLimit(limit)].concat(args)); }; }; var doSeries = function (fn) { return function () { var args = Array.prototype.slice.call(arguments); return fn.apply(null, [async.eachSeries].concat(args)); }; }; var _asyncMap = function (eachfn, arr, iterator, callback) { arr = _map(arr, function (x, i) { return {index: i, value: x}; }); if (!callback) { eachfn(arr, function (x, callback) { iterator(x.value, function (err) { callback(err); }); }); } else { var results = []; eachfn(arr, function (x, callback) { iterator(x.value, function (err, v) { results[x.index] = v; callback(err); }); }, function (err) { callback(err, results); }); } }; async.map = doParallel(_asyncMap); async.mapSeries = doSeries(_asyncMap); async.mapLimit = function (arr, limit, iterator, callback) { return _mapLimit(limit)(arr, iterator, callback); }; var _mapLimit = function(limit) { return doParallelLimit(limit, _asyncMap); }; // reduce only has a series version, as doing reduce in parallel won't // work in many situations. 
async.reduce = function (arr, memo, iterator, callback) { async.eachSeries(arr, function (x, callback) { iterator(memo, x, function (err, v) { memo = v; callback(err); }); }, function (err) { callback(err, memo); }); }; // inject alias async.inject = async.reduce; // foldl alias async.foldl = async.reduce; async.reduceRight = function (arr, memo, iterator, callback) { var reversed = _map(arr, function (x) { return x; }).reverse(); async.reduce(reversed, memo, iterator, callback); }; // foldr alias async.foldr = async.reduceRight; var _filter = function (eachfn, arr, iterator, callback) { var results = []; arr = _map(arr, function (x, i) { return {index: i, value: x}; }); eachfn(arr, function (x, callback) { iterator(x.value, function (v) { if (v) { results.push(x); } callback(); }); }, function (err) { callback(_map(results.sort(function (a, b) { return a.index - b.index; }), function (x) { return x.value; })); }); }; async.filter = doParallel(_filter); async.filterSeries = doSeries(_filter); // select alias async.select = async.filter; async.selectSeries = async.filterSeries; var _reject = function (eachfn, arr, iterator, callback) { var results = []; arr = _map(arr, function (x, i) { return {index: i, value: x}; }); eachfn(arr, function (x, callback) { iterator(x.value, function (v) { if (!v) { results.push(x); } callback(); }); }, function (err) { callback(_map(results.sort(function (a, b) { return a.index - b.index; }), function (x) { return x.value; })); }); }; async.reject = doParallel(_reject); async.rejectSeries = doSeries(_reject); var _detect = function (eachfn, arr, iterator, main_callback) { eachfn(arr, function (x, callback) { iterator(x, function (result) { if (result) { main_callback(x); main_callback = function () {}; } else { callback(); } }); }, function (err) { main_callback(); }); }; async.detect = doParallel(_detect); async.detectSeries = doSeries(_detect); async.some = function (arr, iterator, main_callback) { async.each(arr, function (x, callback) { iterator(x, function (v) { if (v) { main_callback(true); main_callback = function () {}; } callback(); }); }, function (err) { main_callback(false); }); }; // any alias async.any = async.some; async.every = function (arr, iterator, main_callback) { async.each(arr, function (x, callback) { iterator(x, function (v) { if (!v) { main_callback(false); main_callback = function () {}; } callback(); }); }, function (err) { main_callback(true); }); }; // all alias async.all = async.every; async.sortBy = function (arr, iterator, callback) { async.map(arr, function (x, callback) { iterator(x, function (err, criteria) { if (err) { callback(err); } else { callback(null, {value: x, criteria: criteria}); } }); }, function (err, results) { if (err) { return callback(err); } else { var fn = function (left, right) { var a = left.criteria, b = right.criteria; return a < b ? -1 : a > b ? 
1 : 0; }; callback(null, _map(results.sort(fn), function (x) { return x.value; })); } }); }; async.auto = function (tasks, callback) { callback = callback || function () {}; var keys = _keys(tasks); var remainingTasks = keys.length if (!remainingTasks) { return callback(); } var results = {}; var listeners = []; var addListener = function (fn) { listeners.unshift(fn); }; var removeListener = function (fn) { for (var i = 0; i < listeners.length; i += 1) { if (listeners[i] === fn) { listeners.splice(i, 1); return; } } }; var taskComplete = function () { remainingTasks-- _each(listeners.slice(0), function (fn) { fn(); }); }; addListener(function () { if (!remainingTasks) { var theCallback = callback; // prevent final callback from calling itself if it errors callback = function () {}; theCallback(null, results); } }); _each(keys, function (k) { var task = _isArray(tasks[k]) ? tasks[k]: [tasks[k]]; var taskCallback = function (err) { var args = Array.prototype.slice.call(arguments, 1); if (args.length <= 1) { args = args[0]; } if (err) { var safeResults = {}; _each(_keys(results), function(rkey) { safeResults[rkey] = results[rkey]; }); safeResults[k] = args; callback(err, safeResults); // stop subsequent errors hitting callback multiple times callback = function () {}; } else { results[k] = args; async.setImmediate(taskComplete); } }; var requires = task.slice(0, Math.abs(task.length - 1)) || []; var ready = function () { return _reduce(requires, function (a, x) { return (a && results.hasOwnProperty(x)); }, true) && !results.hasOwnProperty(k); }; if (ready()) { task[task.length - 1](taskCallback, results); } else { var listener = function () { if (ready()) { removeListener(listener); task[task.length - 1](taskCallback, results); } }; addListener(listener); } }); }; async.retry = function(times, task, callback) { var DEFAULT_TIMES = 5; var attempts = []; // Use defaults if times not passed if (typeof times === 'function') { callback = task; task = times; times = DEFAULT_TIMES; } // Make sure times is a number times = parseInt(times, 10) || DEFAULT_TIMES; var wrappedTask = function(wrappedCallback, wrappedResults) { var retryAttempt = function(task, finalAttempt) { return function(seriesCallback) { task(function(err, result){ seriesCallback(!err || finalAttempt, {err: err, result: result}); }, wrappedResults); }; }; while (times) { attempts.push(retryAttempt(task, !(times-=1))); } async.series(attempts, function(done, data){ data = data[data.length - 1]; (wrappedCallback || callback)(data.err, data.result); }); } // If a callback is passed, run this as a controll flow return callback ? 
wrappedTask() : wrappedTask }; async.waterfall = function (tasks, callback) { callback = callback || function () {}; if (!_isArray(tasks)) { var err = new Error('First argument to waterfall must be an array of functions'); return callback(err); } if (!tasks.length) { return callback(); } var wrapIterator = function (iterator) { return function (err) { if (err) { callback.apply(null, arguments); callback = function () {}; } else { var args = Array.prototype.slice.call(arguments, 1); var next = iterator.next(); if (next) { args.push(wrapIterator(next)); } else { args.push(callback); } async.setImmediate(function () { iterator.apply(null, args); }); } }; }; wrapIterator(async.iterator(tasks))(); }; var _parallel = function(eachfn, tasks, callback) { callback = callback || function () {}; if (_isArray(tasks)) { eachfn.map(tasks, function (fn, callback) { if (fn) { fn(function (err) { var args = Array.prototype.slice.call(arguments, 1); if (args.length <= 1) { args = args[0]; } callback.call(null, err, args); }); } }, callback); } else { var results = {}; eachfn.each(_keys(tasks), function (k, callback) { tasks[k](function (err) { var args = Array.prototype.slice.call(arguments, 1); if (args.length <= 1) { args = args[0]; } results[k] = args; callback(err); }); }, function (err) { callback(err, results); }); } }; async.parallel = function (tasks, callback) { _parallel({ map: async.map, each: async.each }, tasks, callback); }; async.parallelLimit = function(tasks, limit, callback) { _parallel({ map: _mapLimit(limit), each: _eachLimit(limit) }, tasks, callback); }; async.series = function (tasks, callback) { callback = callback || function () {}; if (_isArray(tasks)) { async.mapSeries(tasks, function (fn, callback) { if (fn) { fn(function (err) { var args = Array.prototype.slice.call(arguments, 1); if (args.length <= 1) { args = args[0]; } callback.call(null, err, args); }); } }, callback); } else { var results = {}; async.eachSeries(_keys(tasks), function (k, callback) { tasks[k](function (err) { var args = Array.prototype.slice.call(arguments, 1); if (args.length <= 1) { args = args[0]; } results[k] = args; callback(err); }); }, function (err) { callback(err, results); }); } }; async.iterator = function (tasks) { var makeCallback = function (index) { var fn = function () { if (tasks.length) { tasks[index].apply(null, arguments); } return fn.next(); }; fn.next = function () { return (index < tasks.length - 1) ? 
makeCallback(index + 1): null; }; return fn; }; return makeCallback(0); }; async.apply = function (fn) { var args = Array.prototype.slice.call(arguments, 1); return function () { return fn.apply( null, args.concat(Array.prototype.slice.call(arguments)) ); }; }; var _concat = function (eachfn, arr, fn, callback) { var r = []; eachfn(arr, function (x, cb) { fn(x, function (err, y) { r = r.concat(y || []); cb(err); }); }, function (err) { callback(err, r); }); }; async.concat = doParallel(_concat); async.concatSeries = doSeries(_concat); async.whilst = function (test, iterator, callback) { if (test()) { iterator(function (err) { if (err) { return callback(err); } async.whilst(test, iterator, callback); }); } else { callback(); } }; async.doWhilst = function (iterator, test, callback) { iterator(function (err) { if (err) { return callback(err); } var args = Array.prototype.slice.call(arguments, 1); if (test.apply(null, args)) { async.doWhilst(iterator, test, callback); } else { callback(); } }); }; async.until = function (test, iterator, callback) { if (!test()) { iterator(function (err) { if (err) { return callback(err); } async.until(test, iterator, callback); }); } else { callback(); } }; async.doUntil = function (iterator, test, callback) { iterator(function (err) { if (err) { return callback(err); } var args = Array.prototype.slice.call(arguments, 1); if (!test.apply(null, args)) { async.doUntil(iterator, test, callback); } else { callback(); } }); }; async.queue = function (worker, concurrency) { if (concurrency === undefined) { concurrency = 1; } function _insert(q, data, pos, callback) { if (!q.started){ q.started = true; } if (!_isArray(data)) { data = [data]; } if(data.length == 0) { // call drain immediately if there are no tasks return async.setImmediate(function() { if (q.drain) { q.drain(); } }); } _each(data, function(task) { var item = { data: task, callback: typeof callback === 'function' ? 
callback : null }; if (pos) { q.tasks.unshift(item); } else { q.tasks.push(item); } if (q.saturated && q.tasks.length === q.concurrency) { q.saturated(); } async.setImmediate(q.process); }); } var workers = 0; var q = { tasks: [], concurrency: concurrency, saturated: null, empty: null, drain: null, started: false, paused: false, push: function (data, callback) { _insert(q, data, false, callback); }, kill: function () { q.drain = null; q.tasks = []; }, unshift: function (data, callback) { _insert(q, data, true, callback); }, process: function () { if (!q.paused && workers < q.concurrency && q.tasks.length) { var task = q.tasks.shift(); if (q.empty && q.tasks.length === 0) { q.empty(); } workers += 1; var next = function () { workers -= 1; if (task.callback) { task.callback.apply(task, arguments); } if (q.drain && q.tasks.length + workers === 0) { q.drain(); } q.process(); }; var cb = only_once(next); worker(task.data, cb); } }, length: function () { return q.tasks.length; }, running: function () { return workers; }, idle: function() { return q.tasks.length + workers === 0; }, pause: function () { if (q.paused === true) { return; } q.paused = true; q.process(); }, resume: function () { if (q.paused === false) { return; } q.paused = false; q.process(); } }; return q; }; async.priorityQueue = function (worker, concurrency) { function _compareTasks(a, b){ return a.priority - b.priority; }; function _binarySearch(sequence, item, compare) { var beg = -1, end = sequence.length - 1; while (beg < end) { var mid = beg + ((end - beg + 1) >>> 1); if (compare(item, sequence[mid]) >= 0) { beg = mid; } else { end = mid - 1; } } return beg; } function _insert(q, data, priority, callback) { if (!q.started){ q.started = true; } if (!_isArray(data)) { data = [data]; } if(data.length == 0) { // call drain immediately if there are no tasks return async.setImmediate(function() { if (q.drain) { q.drain(); } }); } _each(data, function(task) { var item = { data: task, priority: priority, callback: typeof callback === 'function' ? callback : null }; q.tasks.splice(_binarySearch(q.tasks, item, _compareTasks) + 1, 0, item); if (q.saturated && q.tasks.length === q.concurrency) { q.saturated(); } async.setImmediate(q.process); }); } // Start with a normal queue var q = async.queue(worker, concurrency); // Override push to accept second parameter representing priority q.push = function (data, priority, callback) { _insert(q, data, priority, callback); }; // Remove unshift function delete q.unshift; return q; }; async.cargo = function (worker, payload) { var working = false, tasks = []; var cargo = { tasks: tasks, payload: payload, saturated: null, empty: null, drain: null, drained: true, push: function (data, callback) { if (!_isArray(data)) { data = [data]; } _each(data, function(task) { tasks.push({ data: task, callback: typeof callback === 'function' ? callback : null }); cargo.drained = false; if (cargo.saturated && tasks.length === payload) { cargo.saturated(); } }); async.setImmediate(cargo.process); }, process: function process() { if (working) return; if (tasks.length === 0) { if(cargo.drain && !cargo.drained) cargo.drain(); cargo.drained = true; return; } var ts = typeof payload === 'number' ? 
tasks.splice(0, payload) : tasks.splice(0, tasks.length); var ds = _map(ts, function (task) { return task.data; }); if(cargo.empty) cargo.empty(); working = true; worker(ds, function () { working = false; var args = arguments; _each(ts, function (data) { if (data.callback) { data.callback.apply(null, args); } }); process(); }); }, length: function () { return tasks.length; }, running: function () { return working; } }; return cargo; }; var _console_fn = function (name) { return function (fn) { var args = Array.prototype.slice.call(arguments, 1); fn.apply(null, args.concat([function (err) { var args = Array.prototype.slice.call(arguments, 1); if (typeof console !== 'undefined') { if (err) { if (console.error) { console.error(err); } } else if (console[name]) { _each(args, function (x) { console[name](x); }); } } }])); }; }; async.log = _console_fn('log'); async.dir = _console_fn('dir'); /*async.info = _console_fn('info'); async.warn = _console_fn('warn'); async.error = _console_fn('error');*/ async.memoize = function (fn, hasher) { var memo = {}; var queues = {}; hasher = hasher || function (x) { return x; }; var memoized = function () { var args = Array.prototype.slice.call(arguments); var callback = args.pop(); var key = hasher.apply(null, args); if (key in memo) { async.nextTick(function () { callback.apply(null, memo[key]); }); } else if (key in queues) { queues[key].push(callback); } else { queues[key] = [callback]; fn.apply(null, args.concat([function () { memo[key] = arguments; var q = queues[key]; delete queues[key]; for (var i = 0, l = q.length; i < l; i++) { q[i].apply(null, arguments); } }])); } }; memoized.memo = memo; memoized.unmemoized = fn; return memoized; }; async.unmemoize = function (fn) { return function () { return (fn.unmemoized || fn).apply(null, arguments); }; }; async.times = function (count, iterator, callback) { var counter = []; for (var i = 0; i < count; i++) { counter.push(i); } return async.map(counter, iterator, callback); }; async.timesSeries = function (count, iterator, callback) { var counter = []; for (var i = 0; i < count; i++) { counter.push(i); } return async.mapSeries(counter, iterator, callback); }; async.seq = function (/* functions... */) { var fns = arguments; return function () { var that = this; var args = Array.prototype.slice.call(arguments); var callback = args.pop(); async.reduce(fns, args, function (newargs, fn, cb) { fn.apply(that, newargs.concat([function () { var err = arguments[0]; var nextargs = Array.prototype.slice.call(arguments, 1); cb(err, nextargs); }])) }, function (err, results) { callback.apply(that, [err].concat(results)); }); }; }; async.compose = function (/* functions... 
*/) { return async.seq.apply(null, Array.prototype.reverse.call(arguments)); }; var _applyEach = function (eachfn, fns /*args...*/) { var go = function () { var that = this; var args = Array.prototype.slice.call(arguments); var callback = args.pop(); return eachfn(fns, function (fn, cb) { fn.apply(that, args.concat([cb])); }, callback); }; if (arguments.length > 2) { var args = Array.prototype.slice.call(arguments, 2); return go.apply(this, args); } else { return go; } }; async.applyEach = doParallel(_applyEach); async.applyEachSeries = doSeries(_applyEach); async.forever = function (fn, callback) { function next(err) { if (err) { if (callback) { return callback(err); } throw err; } fn(next); } next(); }; // Node.js if (typeof module !== 'undefined' && module.exports) { module.exports = async; } // AMD / RequireJS else if (typeof define !== 'undefined' && define.amd) { define([], function () { return async; }); } // included directly via <script> tag else { root.async = async; } }()); ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/form-data/lib/form_data.js�������������������000644 �000766 �000024 �00000022565 12455173731 034371� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var CombinedStream = require('combined-stream'); var util = require('util'); var path = require('path'); var http = require('http'); var https = require('https'); var parseUrl = require('url').parse; var fs = require('fs'); var mime = require('mime-types'); var async = require('async'); module.exports = FormData; function FormData() { this._overheadLength = 0; this._valueLength = 0; this._lengthRetrievers = []; CombinedStream.call(this); } util.inherits(FormData, CombinedStream); FormData.LINE_BREAK = '\r\n'; FormData.prototype.append = function(field, value, options) { options = options || {}; var append = CombinedStream.prototype.append.bind(this); // all that streamy business can't handle numbers if (typeof value == 'number') value = ''+value; // https://github.com/felixge/node-form-data/issues/38 if (util.isArray(value)) { // Please convert your array into string // the way web server expects it this._error(new Error('Arrays are not supported.')); return; } var header = this._multiPartHeader(field, value, options); var footer = this._multiPartFooter(field, value, options); append(header); append(value); append(footer); // pass along options.knownLength this._trackLength(header, value, options); }; FormData.prototype._trackLength = function(header, value, options) { var valueLength = 0; // used w/ getLengthSync(), when length is known. // e.g. for streaming directly from a remote server, // w/ a known file a size, and not wanting to wait for // incoming file to finish to get its size. if (options.knownLength != null) { valueLength += +options.knownLength; } else if (Buffer.isBuffer(value)) { valueLength = value.length; } else if (typeof value === 'string') { valueLength = Buffer.byteLength(value); } this._valueLength += valueLength; // @check why add CRLF? 
does this account for custom/multiple CRLFs? this._overheadLength += Buffer.byteLength(header) + + FormData.LINE_BREAK.length; // empty or either doesn't have path or not an http response if (!value || ( !value.path && !(value.readable && value.hasOwnProperty('httpVersion')) )) { return; } // no need to bother with the length if (!options.knownLength) this._lengthRetrievers.push(function(next) { if (value.hasOwnProperty('fd')) { // take read range into a account // `end` = Infinity –> read file till the end // // TODO: Looks like there is bug in Node fs.createReadStream // it doesn't respect `end` options without `start` options // Fix it when node fixes it. // https://github.com/joyent/node/issues/7819 if (value.end != undefined && value.end != Infinity && value.start != undefined) { // when end specified // no need to calculate range // inclusive, starts with 0 next(null, value.end+1 - (value.start ? value.start : 0)); // not that fast snoopy } else { // still need to fetch file size from fs fs.stat(value.path, function(err, stat) { var fileSize; if (err) { next(err); return; } // update final size based on the range options fileSize = stat.size - (value.start ? value.start : 0); next(null, fileSize); }); } // or http response } else if (value.hasOwnProperty('httpVersion')) { next(null, +value.headers['content-length']); // or request stream http://github.com/mikeal/request } else if (value.hasOwnProperty('httpModule')) { // wait till response come back value.on('response', function(response) { value.pause(); next(null, +response.headers['content-length']); }); value.resume(); // something else } else { next('Unknown stream'); } }); }; FormData.prototype._multiPartHeader = function(field, value, options) { var boundary = this.getBoundary(); var header = ''; // custom header specified (as string)? // it becomes responsible for boundary // (e.g. to handle extra CRLFs on .NET servers) if (options.header != null) { header = options.header; } else { header += '--' + boundary + FormData.LINE_BREAK + 'Content-Disposition: form-data; name="' + field + '"'; // fs- and request- streams have path property // or use custom filename and/or contentType // TODO: Use request's response mime-type if (options.filename || value.path) { header += '; filename="' + path.basename(options.filename || value.path) + '"' + FormData.LINE_BREAK + 'Content-Type: ' + (options.contentType || mime.lookup(options.filename || value.path)); // http response has not } else if (value.readable && value.hasOwnProperty('httpVersion')) { header += '; filename="' + path.basename(value.client._httpMessage.path) + '"' + FormData.LINE_BREAK + 'Content-Type: ' + value.headers['content-type']; } header += FormData.LINE_BREAK + FormData.LINE_BREAK; } return header; }; FormData.prototype._multiPartFooter = function(field, value, options) { return function(next) { var footer = FormData.LINE_BREAK; var lastPart = (this._streams.length === 0); if (lastPart) { footer += this._lastBoundary(); } next(footer); }.bind(this); }; FormData.prototype._lastBoundary = function() { return '--' + this.getBoundary() + '--'; }; FormData.prototype.getHeaders = function(userHeaders) { var formHeaders = { 'content-type': 'multipart/form-data; boundary=' + this.getBoundary() }; for (var header in userHeaders) { formHeaders[header.toLowerCase()] = userHeaders[header]; } return formHeaders; } FormData.prototype.getCustomHeaders = function(contentType) { contentType = contentType ? 
contentType : 'multipart/form-data'; var formHeaders = { 'content-type': contentType + '; boundary=' + this.getBoundary(), 'content-length': this.getLengthSync() }; return formHeaders; } FormData.prototype.getBoundary = function() { if (!this._boundary) { this._generateBoundary(); } return this._boundary; }; FormData.prototype._generateBoundary = function() { // This generates a 50 character boundary similar to those used by Firefox. // They are optimized for boyer-moore parsing. var boundary = '--------------------------'; for (var i = 0; i < 24; i++) { boundary += Math.floor(Math.random() * 10).toString(16); } this._boundary = boundary; }; // Note: getLengthSync DOESN'T calculate streams length // As workaround one can calculate file size manually // and add it as knownLength option FormData.prototype.getLengthSync = function(debug) { var knownLength = this._overheadLength + this._valueLength; // Don't get confused, there are 3 "internal" streams for each keyval pair // so it basically checks if there is any value added to the form if (this._streams.length) { knownLength += this._lastBoundary().length; } // https://github.com/felixge/node-form-data/issues/40 if (this._lengthRetrievers.length) { // Some async length retrivers are present // therefore synchronous length calculation is false. // Please use getLength(callback) to get proper length this._error(new Error('Cannot calculate proper length in synchronous way.')); } return knownLength; }; FormData.prototype.getLength = function(cb) { var knownLength = this._overheadLength + this._valueLength; if (this._streams.length) { knownLength += this._lastBoundary().length; } if (!this._lengthRetrievers.length) { process.nextTick(cb.bind(this, null, knownLength)); return; } async.parallel(this._lengthRetrievers, function(err, values) { if (err) { cb(err); return; } values.forEach(function(length) { knownLength += length; }); cb(null, knownLength); }); }; FormData.prototype.submit = function(params, cb) { var request , options , defaults = { method : 'post' }; // parse provided url if it's string // or treat it as options object if (typeof params == 'string') { params = parseUrl(params); options = populate({ port: params.port, path: params.pathname, host: params.hostname }, defaults); } else // use custom params { options = populate(params, defaults); // if no port provided use default one if (!options.port) { options.port = options.protocol == 'https:' ? 
443 : 80; } } // put that good code in getHeaders to some use options.headers = this.getHeaders(params.headers); // https if specified, fallback to http in any other case if (params.protocol == 'https:') { request = https.request(options); } else { request = http.request(options); } // get content length and fire away this.getLength(function(err, length) { // TODO: Add chunked encoding when no length (if err) // add content length request.setHeader('Content-Length', length); this.pipe(request); if (cb) { request.on('error', cb); request.on('response', cb.bind(this, null)); } }.bind(this)); return request; }; FormData.prototype._error = function(err) { if (this.error) return; this.error = err; this.pause(); this.emit('error', err); }; /* * Santa's little helpers */ // populates missing values function populate(dst, src) { for (var prop in src) { if (!dst[prop]) dst[prop] = src[prop]; } return dst; } �������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/index.js000644 �000766 �000024 �00000007105 12455173731 033740� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = ForeverAgent ForeverAgent.SSL = ForeverAgentSSL var util = require('util') , Agent = require('http').Agent , net = require('net') , tls = require('tls') , AgentSSL = require('https').Agent function ForeverAgent(options) { var self = this self.options = options || {} self.requests = {} self.sockets = {} self.freeSockets = {} self.maxSockets = self.options.maxSockets || Agent.defaultMaxSockets self.minSockets = self.options.minSockets || ForeverAgent.defaultMinSockets self.on('free', function(socket, host, port) { var name = host + ':' + port if (self.requests[name] && self.requests[name].length) { self.requests[name].shift().onSocket(socket) } else if (self.sockets[name].length < self.minSockets) { if (!self.freeSockets[name]) self.freeSockets[name] = [] self.freeSockets[name].push(socket) // if an error happens while we don't use the socket anyway, meh, throw the socket away var onIdleError = function() { socket.destroy() } socket._onIdleError = onIdleError socket.on('error', onIdleError) } else { // If there are no pending requests just destroy the // socket and it will get removed from the pool. This // gets us out of timeout issues and allows us to // default to Connection:keep-alive. 
socket.destroy() } }) } util.inherits(ForeverAgent, Agent) ForeverAgent.defaultMinSockets = 5 ForeverAgent.prototype.createConnection = net.createConnection ForeverAgent.prototype.addRequestNoreuse = Agent.prototype.addRequest ForeverAgent.prototype.addRequest = function(req, host, port) { var name = host + ':' + port if (this.freeSockets[name] && this.freeSockets[name].length > 0 && !req.useChunkedEncodingByDefault) { var idleSocket = this.freeSockets[name].pop() idleSocket.removeListener('error', idleSocket._onIdleError) delete idleSocket._onIdleError req._reusedSocket = true req.onSocket(idleSocket) } else { this.addRequestNoreuse(req, host, port) } } ForeverAgent.prototype.removeSocket = function(s, name, host, port) { if (this.sockets[name]) { var index = this.sockets[name].indexOf(s) if (index !== -1) { this.sockets[name].splice(index, 1) } } else if (this.sockets[name] && this.sockets[name].length === 0) { // don't leak delete this.sockets[name] delete this.requests[name] } if (this.freeSockets[name]) { var index = this.freeSockets[name].indexOf(s) if (index !== -1) { this.freeSockets[name].splice(index, 1) if (this.freeSockets[name].length === 0) { delete this.freeSockets[name] } } } if (this.requests[name] && this.requests[name].length) { // If we have pending requests and a socket gets closed a new one // needs to be created to take over in the pool for the one that closed. this.createSocket(name, host, port).emit('free') } } function ForeverAgentSSL (options) { ForeverAgent.call(this, options) } util.inherits(ForeverAgentSSL, ForeverAgent) ForeverAgentSSL.prototype.createConnection = createConnectionSSL ForeverAgentSSL.prototype.addRequestNoreuse = AgentSSL.prototype.addRequest function createConnectionSSL (port, host, options) { if (typeof port === 'object') { options = port; } else if (typeof host === 'object') { options = host; } else if (typeof options === 'object') { options = options; } else { options = {}; } if (typeof port === 'number') { options.port = port; } if (typeof host === 'string') { options.host = host; } return tls.connect(options); } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/forever-agent/LICENSE�000644 �000766 �000024 �00000021664 12455173731 033306� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. 
"Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. 
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS����������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/forever-agent/package.json�������������������000644 �000766 �000024 �00000002502 12455173731 034476� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "forever-agent", "description": "HTTP Agent that keeps socket connections alive between keep-alive requests. 
Formerly part of mikeal/request, now a standalone module.", "version": "0.5.2", "repository": { "url": "https://github.com/mikeal/forever-agent" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "bugs": { "url": "https://github.com/mikeal/forever-agent/issues" }, "homepage": "https://github.com/mikeal/forever-agent", "_id": "forever-agent@0.5.2", "dist": { "shasum": "6d0e09c4921f94a27f63d3b49c5feff1ea4c5130", "tarball": "http://registry.npmjs.org/forever-agent/-/forever-agent-0.5.2.tgz" }, "_from": "forever-agent@>=0.5.0 <0.6.0", "_npmVersion": "1.3.21", "_npmUser": { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" } ], "directories": {}, "_shasum": "6d0e09c4921f94a27f63d3b49c5feff1ea4c5130", "_resolved": "https://registry.npmjs.org/forever-agent/-/forever-agent-0.5.2.tgz", "readme": "ERROR: No README data found!", "scripts": {} } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/forever-agent/README.md����������������������000644 �000766 �000024 �00000000243 12455173731 033467� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������forever-agent ============= HTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module. 
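A minimal usage sketch based on the `index.js` source shown above; it assumes the
module is reachable via `require('forever-agent')`, and the host and `minSockets`
value below are placeholders rather than part of the original README:

```js
var http = require('http');
var ForeverAgent = require('forever-agent');

// Keep up to 5 idle sockets per host:port around for reuse
// (minSockets mirrors ForeverAgent.defaultMinSockets in index.js above).
var agent = new ForeverAgent({ minSockets: 5 });

http.get({ host: 'example.com', port: 80, path: '/', agent: agent }, function (res) {
    // Drain the response so the socket can be handed back to the agent
    // and reused by the next request to the same host:port.
    res.resume();
});
```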
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/lib/��000755 �000766 �000024 �00000000000 12456115120 033330� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/request/node_modules/combined-stream/License����������������������000644 �000766 �000024 �00000002075 12455173731 034027� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2011 Debuggable Limited <felix@debuggable.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/package.json

{
  "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" },
  "name": "combined-stream",
  "description": "A stream that emits multiple other streams one after another.",
  "version": "0.0.7",
  "homepage": "https://github.com/felixge/node-combined-stream",
  "repository": { "type": "git", "url": "git://github.com/felixge/node-combined-stream.git" },
  "main": "./lib/combined_stream",
  "scripts": { "test": "node test/run.js" },
  "engines": { "node": ">= 0.8" },
  "dependencies": { "delayed-stream": "0.0.5" },
  "devDependencies": { "far": "~0.0.7" },
  "bugs": { "url": "https://github.com/felixge/node-combined-stream/issues" },
  "_id": "combined-stream@0.0.7",
  "dist": { "shasum": "0137e657baa5a7541c57ac37ac5fc07d73b4dc1f", "tarball": "http://registry.npmjs.org/combined-stream/-/combined-stream-0.0.7.tgz" },
  "_from": "combined-stream@>=0.0.5 <0.1.0",
  "_npmVersion": "1.4.3",
  "_npmUser": { "name": "felixge", "email": "felix@debuggable.com" },
  "maintainers": [
    { "name": "felixge", "email": "felix@debuggable.com" },
    { "name": "celer", "email": "celer@scrypt.net" },
    { "name": "alexindigo", "email": "iam@alexindigo.com" }
  ],
  "directories": {},
  "_shasum": "0137e657baa5a7541c57ac37ac5fc07d73b4dc1f",
  "_resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-0.0.7.tgz",
  "readme": "ERROR: No README data found!"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/Readme.md

# combined-stream

[![Build Status](https://travis-ci.org/felixge/node-combined-stream.svg?branch=master)](https://travis-ci.org/felixge/node-combined-stream)

A stream that emits multiple other streams one after another.

## Installation

``` bash
npm install combined-stream
```

## Usage

Here is a simple example that shows how you can use combined-stream to combine two files into one:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```

While the example above works great, it will pause all source streams until they are needed. If you don't want that to happen, you can set `pauseStreams` to `false`:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create({pauseStreams: false});
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```

However, what if you don't have all the source streams yet, or you don't want to allocate the resources (file descriptors, memory, etc.) for them right away? Well, in that case you can simply provide a callback that supplies the stream by calling a `next()` function:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```

## API

### CombinedStream.create([options])

Returns a new combined stream object. Available options are:

* `maxDataSize`
* `pauseStreams`

The effect of those options is described below.

### combinedStream.pauseStreams = `true`

Whether to apply back pressure to the underlying streams. If set to `false`, the underlying streams will never be paused. If set to `true`, the underlying streams will be paused right after being appended, as well as when `delayedStream.pipe()` wants to throttle.

### combinedStream.maxDataSize = `2 * 1024 * 1024`

The maximum number of bytes (or characters) to buffer for all source streams. If this value is exceeded, `combinedStream` emits an `'error'` event.
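For illustration, a minimal sketch of reacting to that `'error'` event; the 1 MiB limit and the file names below are made up for the example:

``` javascript
var CombinedStream = require('combined-stream');
var fs = require('fs');

// Illustrative limit: emit an error once more than 1 MiB has been buffered.
var combinedStream = CombinedStream.create({maxDataSize: 1024 * 1024});
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.on('error', function(err) {
  // Fired when the buffered data exceeds maxDataSize (or a source errors).
  console.error('combined-stream error:', err.message);
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));
```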
### combinedStream.dataSize = `0`

The number of bytes (or characters) currently buffered by `combinedStream`.

### combinedStream.append(stream)

Appends the given `stream` to the combinedStream object. If `pauseStreams` is set to `true`, this stream will also be paused right away.

`stream` can also be a function that takes one parameter called `next`. `next` is a function that must be invoked in order to provide the next stream, see example above.

Regardless of how the `stream` is appended, combined-stream always attaches an `'error'` listener to it, so you don't have to do that manually.

Special case: `stream` can also be a String or Buffer.

### combinedStream.write(data)

You should not call this, `combinedStream` takes care of piping the appended streams into itself for you.

### combinedStream.resume()

Causes `combinedStream` to start draining the streams it manages. The function is idempotent, and also emits a `'resume'` event each time, which usually goes to the stream that is currently being drained.

### combinedStream.pause()

If `combinedStream.pauseStreams` is set to `false`, this does nothing. Otherwise a `'pause'` event is emitted; this goes to the stream that is currently being drained, so you can use it to apply back pressure.

### combinedStream.end()

Sets `combinedStream.writable` to false, emits an `'end'` event, and removes all streams from the queue.

### combinedStream.destroy()

Same as `combinedStream.end()`, except it emits a `'close'` event instead of `'end'`.

## License

combined-stream is licensed under the MIT license.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/.npmignore

*.un~
/node_modules/*
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/License

Copyright (c) 2011 Debuggable Limited <felix@debuggable.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Makefile

SHELL := /bin/bash

test:
	@./test/run.js

.PHONY: test

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/package.json

{
  "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" },
  "name": "delayed-stream",
  "description": "Buffers events from a stream until you are ready to handle them.",
  "version": "0.0.5",
  "homepage": "https://github.com/felixge/node-delayed-stream",
  "repository": { "type": "git", "url": "git://github.com/felixge/node-delayed-stream.git" },
  "main": "./lib/delayed_stream",
  "engines": { "node": ">=0.4.0" },
  "dependencies": {},
  "devDependencies": { "fake": "0.2.0", "far": "0.0.1" },
  "_id": "delayed-stream@0.0.5",
  "_engineSupported": true,
  "_npmVersion": "1.0.3",
  "_nodeVersion": "v0.4.9-pre",
  "_defaultsLoaded": true,
  "dist": { "shasum": "d4b1f43a93e8296dfe02694f4680bc37a313c73f", "tarball": "http://registry.npmjs.org/delayed-stream/-/delayed-stream-0.0.5.tgz" },
  "scripts": {},
  "directories": {},
  "_shasum": "d4b1f43a93e8296dfe02694f4680bc37a313c73f",
  "_resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-0.0.5.tgz",
  "_from": "delayed-stream@0.0.5",
  "bugs": { "url": "https://github.com/felixge/node-delayed-stream/issues" },
  "readme": "ERROR: No README data found!"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Readme.md

# delayed-stream

Buffers events from a stream until you are ready to handle them.

## Installation

``` bash
npm install delayed-stream
```

## Usage

The following example shows how to write an HTTP echo server that delays its response by 1000 ms.

``` javascript
var DelayedStream = require('delayed-stream');
var http = require('http');

http.createServer(function(req, res) {
  var delayed = DelayedStream.create(req);

  setTimeout(function() {
    res.writeHead(200);
    delayed.pipe(res);
  }, 1000);
});
```

If you are not using `Stream#pipe`, you can also manually release the buffered events by calling `delayedStream.resume()`:

``` javascript
var delayed = DelayedStream.create(req);

setTimeout(function() {
  // Emit all buffered events and resume underlying source
  delayed.resume();
}, 1000);
```

## Implementation

In order to use this meta stream properly, here are a few things you should know about the implementation.

### Event Buffering / Proxying

All events of the `source` stream are hijacked by overwriting the `source.emit` method. Until node implements a catch-all event listener, this is the only way.

However, delayed-stream still continues to emit all events it captures on the `source`, regardless of whether you have released the delayed stream yet or not.

Upon creation, delayed-stream captures all `source` events and stores them in an internal event buffer. Once `delayedStream.release()` is called, all buffered events are emitted on the `delayedStream`, and the event buffer is cleared. After that, delayed-stream merely acts as a proxy for the underlying source.

### Error handling

Error events on `source` are buffered / proxied just like any other events. However, `delayedStream.create` attaches a no-op `'error'` listener to the `source`. This way you only have to handle errors on the `delayedStream` object, rather than in two places.

### Buffer limits

delayed-stream provides a `maxDataSize` property that can be used to limit the amount of data being buffered. In order to protect you from bad `source` streams that don't react to `source.pause()`, this feature is enabled by default.

## API

### DelayedStream.create(source, [options])

Returns a new `delayedStream`. Available options are:

* `pauseStream`
* `maxDataSize`

The description for those properties can be found below.

### delayedStream.source

The `source` stream managed by this object. This is useful if you are passing your `delayedStream` around, and you still want to access properties on the `source` object.

### delayedStream.pauseStream = true

Whether to pause the underlying `source` when calling `DelayedStream.create()`. Modifying this property afterwards has no effect.
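For illustration only, a minimal sketch of passing those options to `DelayedStream.create()`; `someSourceStream` and the values shown are placeholders, not defaults:

``` javascript
var DelayedStream = require('delayed-stream');

// Keep the source flowing and cap buffering at 64 KiB (illustrative values only).
var delayed = DelayedStream.create(someSourceStream, {
  pauseStream: false,
  maxDataSize: 64 * 1024
});

delayed.on('error', function(err) {
  // Emitted once more than maxDataSize bytes (or characters) have been buffered.
  console.error(err.message);
});
```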
### delayedStream.maxDataSize = 1024 * 1024

The amount of data to buffer before emitting an `error`.

If the underlying source is emitting `Buffer` objects, the `maxDataSize` refers to bytes.

If the underlying source is emitting JavaScript strings, the size refers to characters.

If you know what you are doing, you can set this property to `Infinity` to disable this feature. You can also modify this property during runtime.

### delayedStream.dataSize = 0

The amount of data buffered so far.

### delayedStream.readable

An ECMA5 getter that returns the value of `source.readable`.

### delayedStream.resume()

If the `delayedStream` has not been released so far, `delayedStream.release()` is called.

In either case, `source.resume()` is called.

### delayedStream.pause()

Calls `source.pause()`.

### delayedStream.pipe(dest)

Calls `delayedStream.resume()` and then proxies the arguments to `source.pipe`.

### delayedStream.release()

Emits and clears all events that have been buffered up so far. This does not resume the underlying source, use `delayedStream.resume()` instead.

## License

delayed-stream is licensed under the MIT license.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.js

var Stream = require('stream').Stream;
var util = require('util');

module.exports = DelayedStream;
function DelayedStream() {
  this.source = null;
  this.dataSize = 0;
  this.maxDataSize = 1024 * 1024;
  this.pauseStream = true;

  this._maxDataSizeExceeded = false;
  this._released = false;
  this._bufferedEvents = [];
}
util.inherits(DelayedStream, Stream);

DelayedStream.create = function(source, options) {
  var delayedStream = new this();

  options = options || {};
  for (var option in options) {
    delayedStream[option] = options[option];
  }

  delayedStream.source = source;

  // Hijack source.emit so every event can be captured before it is re-emitted.
  var realEmit = source.emit;
  source.emit = function() {
    delayedStream._handleEmit(arguments);
    return realEmit.apply(source, arguments);
  };

  source.on('error', function() {});
  if (delayedStream.pauseStream) {
    source.pause();
  }

  return delayedStream;
};

DelayedStream.prototype.__defineGetter__('readable', function() {
  return this.source.readable;
});

DelayedStream.prototype.resume = function() {
  if (!this._released) {
    this.release();
  }

  this.source.resume();
};

DelayedStream.prototype.pause = function() {
  this.source.pause();
};

DelayedStream.prototype.release = function() {
  this._released = true;
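  // Replay every event captured while the stream was delayed, then clear the buffer.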
  this._bufferedEvents.forEach(function(args) {
    this.emit.apply(this, args);
  }.bind(this));
  this._bufferedEvents = [];
};

DelayedStream.prototype.pipe = function() {
  var r = Stream.prototype.pipe.apply(this, arguments);
  this.resume();
  return r;
};

DelayedStream.prototype._handleEmit = function(args) {
  if (this._released) {
    this.emit.apply(this, args);
    return;
  }

  if (args[0] === 'data') {
    this.dataSize += args[1].length;
    this._checkIfMaxDataSizeExceeded();
  }

  this._bufferedEvents.push(args);
};

DelayedStream.prototype._checkIfMaxDataSizeExceeded = function() {
  if (this._maxDataSizeExceeded) {
    return;
  }

  if (this.dataSize <= this.maxDataSize) {
    return;
  }

  this._maxDataSizeExceeded = true;
  var message = 'DelayedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.';
  this.emit('error', new Error(message));
};

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/combined-stream/lib/combined_stream.js

var util = require('util');
var Stream = require('stream').Stream;
var DelayedStream = require('delayed-stream');

module.exports = CombinedStream;
function CombinedStream() {
  this.writable = false;
  this.readable = true;
  this.dataSize = 0;
  this.maxDataSize = 2 * 1024 * 1024;
  this.pauseStreams = true;

  this._released = false;
  this._streams = [];
  this._currentStream = null;
}
util.inherits(CombinedStream, Stream);

CombinedStream.create = function(options) {
  var combinedStream = new this();

  options = options || {};
  for (var option in options) {
    combinedStream[option] = options[option];
  }

  return combinedStream;
};

CombinedStream.isStreamLike = function(stream) {
  return (typeof stream !== 'function')
    && (typeof stream !== 'string')
    && (typeof stream !== 'boolean')
    && (typeof stream !== 'number')
    && (!Buffer.isBuffer(stream));
};

CombinedStream.prototype.append = function(stream) {
  var isStreamLike = CombinedStream.isStreamLike(stream);

  if (isStreamLike) {
    if (!(stream instanceof DelayedStream)) {
      var newStream = DelayedStream.create(stream, {
        maxDataSize: Infinity,
        pauseStream: this.pauseStreams,
      });
      stream.on('data', this._checkDataSize.bind(this));
      stream = newStream;
    }

    this._handleErrors(stream);

    if (this.pauseStreams) {
      stream.pause();
    }
  }

  this._streams.push(stream);
  return this;
};

CombinedStream.prototype.pipe = function(dest, options) {
  Stream.prototype.pipe.call(this, dest, options);
  this.resume();
  return dest;
};

CombinedStream.prototype._getNext = function() {
  this._currentStream = null;
  var stream = this._streams.shift();

  if (typeof stream == 'undefined') {
    this.end();
    return;
  }

  if (typeof stream !== 'function') {
    this._pipeNext(stream);
    return;
  }

  var getStream = stream;
  getStream(function(stream) {
    var isStreamLike = CombinedStream.isStreamLike(stream);
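    // A stream supplied lazily via next() is wired up the same way as one appended directly.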
    if (isStreamLike) {
      stream.on('data', this._checkDataSize.bind(this));
      this._handleErrors(stream);
    }

    this._pipeNext(stream);
  }.bind(this));
};

CombinedStream.prototype._pipeNext = function(stream) {
  this._currentStream = stream;

  var isStreamLike = CombinedStream.isStreamLike(stream);
  if (isStreamLike) {
    stream.on('end', this._getNext.bind(this));
    stream.pipe(this, {end: false});
    return;
  }

  var value = stream;
  this.write(value);
  this._getNext();
};

CombinedStream.prototype._handleErrors = function(stream) {
  var self = this;
  stream.on('error', function(err) {
    self._emitError(err);
  });
};

CombinedStream.prototype.write = function(data) {
  this.emit('data', data);
};

CombinedStream.prototype.pause = function() {
  if (!this.pauseStreams) {
    return;
  }

  if (this.pauseStreams && this._currentStream && typeof(this._currentStream.pause) == 'function') this._currentStream.pause();
  this.emit('pause');
};

CombinedStream.prototype.resume = function() {
  if (!this._released) {
    this._released = true;
    this.writable = true;
    this._getNext();
  }

  if (this.pauseStreams && this._currentStream && typeof(this._currentStream.resume) == 'function') this._currentStream.resume();
  this.emit('resume');
};

CombinedStream.prototype.end = function() {
  this._reset();
  this.emit('end');
};

CombinedStream.prototype.destroy = function() {
  this._reset();
  this.emit('close');
};

CombinedStream.prototype._reset = function() {
  this.writable = false;
  this._streams = [];
  this._currentStream = null;
};

CombinedStream.prototype._checkDataSize = function() {
  this._updateDataSize();
  if (this.dataSize <= this.maxDataSize) {
    return;
  }

  var message = 'DelayedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.';
  this._emitError(new Error(message));
};

CombinedStream.prototype._updateDataSize = function() {
  this.dataSize = 0;

  var self = this;
  this._streams.forEach(function(stream) {
    if (!stream.dataSize) {
      return;
    }

    self.dataSize += stream.dataSize;
  });

  if (this._currentStream && this._currentStream.dataSize) {
    this.dataSize += this._currentStream.dataSize;
  }
};

CombinedStream.prototype._emitError = function(err) {
  this._reset();
  this.emit('error', err);
};

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/caseless/index.js

function Caseless (dict) {
  this.dict = dict
}
Caseless.prototype.set = function (name, value, clobber) {
  if (typeof name === 'object') {
    for (var i in name) {
      this.set(i, name[i], value)
    }
  } else {
    if (typeof clobber === 'undefined') clobber = true
    var has = this.has(name)

    if (!clobber && has) this.dict[has] = this.dict[has] + ',' + value
    else this.dict[has || name] = value
    return has
  }
}
Caseless.prototype.has = function (name) {
  var keys = Object.keys(this.dict)
    , name = name.toLowerCase()
    ;
  for (var i=0;i<keys.length;i++) {
    if (keys[i].toLowerCase() === name) return keys[i]
  }
  return false
}
Caseless.prototype.get = function (name) {
  name = name.toLowerCase()
  var result, _key
  var headers = this.dict
  Object.keys(headers).forEach(function (key) {
    _key = key.toLowerCase()
    if (name === _key) result = headers[key]
  })
  return result
}
Caseless.prototype.swap = function (name) {
  var has = this.has(name)
  if (!has) throw new Error('There is no header that matches "'+name+'"')
  this.dict[name] = this.dict[has]
  delete this.dict[has]
}
Caseless.prototype.del = function (name) {
  var has = this.has(name)
  return delete this.dict[has || name]
}

module.exports = function (dict) {return new Caseless(dict)}
module.exports.httpify = function (resp, headers) {
  var c = new Caseless(headers)
  resp.setHeader = function (key, value, clobber) {
    return c.set(key, value, clobber)
  }
  resp.hasHeader = function (key) {
    return c.has(key)
  }
  resp.getHeader = function (key) {
    return c.get(key)
  }
  resp.removeHeader = function (key) {
    return c.del(key)
  }
  resp.headers = c.dict
  return c
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/caseless/package.json

{
  "name": "caseless",
  "version": "0.8.0",
  "description": "Caseless object set/get/has, very useful when working with HTTP headers.",
  "main": "index.js",
  "scripts": { "test": "node test.js" },
  "repository": { "type": "git", "url": "https://github.com/mikeal/caseless" },
  "keywords": [ "headers", "http", "caseless" ],
  "test": "node test.js",
  "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com" },
  "license": "BSD",
  "bugs": { "url": "https://github.com/mikeal/caseless/issues" },
  "devDependencies": { "tape": "^2.10.2" },
  "gitHead": "1bfbf01d4481c057738a64ba284749222a944176",
  "homepage": "https://github.com/mikeal/caseless",
  "_id": "caseless@0.8.0",
  "_shasum": "5bca2881d41437f54b2407ebe34888c7b9ad4f7d",
  "_from": "caseless@>=0.8.0 <0.9.0",
  "_npmVersion": "2.0.0",
  "_npmUser": { "name": "mikeal", "email": "mikeal.rogers@gmail.com" },
  "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" } ],
  "dist": { "shasum": "5bca2881d41437f54b2407ebe34888c7b9ad4f7d", "tarball": "http://registry.npmjs.org/caseless/-/caseless-0.8.0.tgz" },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/caseless/-/caseless-0.8.0.tgz",
  "readme": "ERROR: No README data found!"
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/caseless/README.md

## Caseless -- wrap an object to set and get properties with caseless semantics but also preserve casing.

This library is incredibly useful when working with HTTP headers. It allows you to get/set/check for headers in a caseless manner while also preserving the casing of headers the first time they are set.

## Usage

```javascript
var headers = {}
  , c = caseless(headers)
  ;
c.set('a-Header', 'asdf')
c.get('a-header') === 'asdf'
```

## has(key)

Has takes a name and if it finds a matching header will return that header name with the preserved casing it was set with.

```javascript
c.has('a-header') === 'a-Header'
```

## set(key, value[, clobber=true])

Set is fairly straightforward except that if the header exists and clobber is disabled it will add `','+value` to the existing header.

```javascript
c.set('a-Header', 'fdas')
c.set('a-HEADER', 'more', false)
c.get('a-header') === 'fdas,more'
```

## swap(key)

Swaps the casing of a header with the new one that is passed in.

```javascript
var headers = {}
  , c = caseless(headers)
  ;
c.set('a-Header', 'fdas')
c.swap('a-HEADER')
c.has('a-header') === 'a-HEADER'
headers === {'a-HEADER': 'fdas'}
```
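The `httpify` helper exported from `index.js` (shown earlier) is not documented in this README; a minimal sketch of what it does, using a bare stand-in object in place of a real response:

```javascript
var caseless = require('caseless')

// A bare response-like object; httpify only needs something to hang the helper methods on.
var resp = {}
var c = caseless.httpify(resp, {})

resp.setHeader('Content-Type', 'text/plain')
console.log(resp.hasHeader('content-type')) // 'Content-Type'
console.log(resp.getHeader('CONTENT-TYPE')) // 'text/plain'
console.log(resp.headers)                   // { 'Content-Type': 'text/plain' }
```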
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/caseless/test.js

var tape = require('tape')
  , caseless = require('./')
  ;

tape('set get has', function (t) {
  var headers = {}
    , c = caseless(headers)
    ;
  t.plan(17)
  c.set('a-Header', 'asdf')
  t.equal(c.get('a-header'), 'asdf')
  t.equal(c.has('a-header'), 'a-Header')
  t.ok(!c.has('nothing'))
  // old bug where we used the wrong regex
  t.ok(!c.has('a-hea'))
  c.set('a-header', 'fdsa')
  t.equal(c.get('a-header'), 'fdsa')
  t.equal(c.get('a-Header'), 'fdsa')
  c.set('a-HEADER', 'more', false)
  t.equal(c.get('a-header'), 'fdsa,more')
  t.deepEqual(headers, {'a-Header': 'fdsa,more'})
  c.swap('a-HEADER')
  t.deepEqual(headers, {'a-HEADER': 'fdsa,more'})
  c.set('deleteme', 'foobar')
  t.ok(c.has('deleteme'))
  t.ok(c.del('deleteme'))
  t.notOk(c.has('deleteme'))
  t.notOk(c.has('idonotexist'))
  t.ok(c.del('idonotexist'))
  c.set('tva', 'test1')
  c.set('tva-header', 'test2')
  t.equal(c.has('tva'), 'tva')
  t.notOk(c.has('header'))
  t.equal(c.get('tva'), 'test1')
})

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/.jshintrc

{
  "predef": [ ],
  "bitwise": false, "camelcase": false, "curly": false, "eqeqeq": false,
  "forin": false, "immed": false, "latedef": false, "noarg": true,
  "noempty": true, "nonew": true, "plusplus": false, "quotmark": true,
  "regexp": false, "undef": true, "unused": true, "strict": false,
  "trailing": true, "maxlen": 120,
  "asi": true, "boss": true, "debug": true, "eqnull": true, "esnext": true,
  "evil": true, "expr": true, "funcscope": false, "globalstrict": false,
  "iterator": false, "lastsemic": true, "laxbreak": true, "laxcomma": true,
  "loopfunc": true, "multistr": false, "onecase": false, "proto": false,
  "regexdash": false, "scripturl": true, "smarttabs": false, "shadow": false,
  "sub": true, "supernew": false, "validthis": true,
  "browser": true, "couch": false, "devel": false, "dojo": false,
  "mootools": false, "node": true, "nonstandard": true, "prototypejs": false,
  "rhino": false, "worker": true, "wsh": false,
  "nomen": false, "onevar": false, "passfail": false
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/.npmignore

node_modules/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/.travis.yml

language: node_js
node_js:
  - 0.8
  - "0.10"
branches:
  only:
    - master
notifications:
  email:
    - rod@vagg.org
script: npm test

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/bl.js

var DuplexStream = require('readable-stream').Duplex
  , util         = require('util')

function BufferList (callback) {
  if (!(this instanceof BufferList))
    return new BufferList(callback)

  this._bufs  = []
  this.length = 0

  if (typeof callback == 'function') {
    this._callback = callback

    var piper = function (err) {
      if (this._callback) {
        this._callback(err)
        this._callback = null
      }
    }.bind(this)

    this.on('pipe', function (src) {
      src.on('error', piper)
    })
    this.on('unpipe', function (src) {
      src.removeListener('error', piper)
    })
  }
  else if (Buffer.isBuffer(callback))
    this.append(callback)
  else if (Array.isArray(callback)) {
    callback.forEach(function (b) {
      Buffer.isBuffer(b) && this.append(b)
    }.bind(this))
  }

  DuplexStream.call(this)
}

util.inherits(BufferList, DuplexStream)

BufferList.prototype._offset = function (offset) {
  var tot = 0, i = 0, _t
  for (; i < this._bufs.length; i++) {
    _t = tot + this._bufs[i].length
    if (offset < _t)
      return [ i, offset - tot ]
    tot = _t
  }
}

BufferList.prototype.append = function (buf) {
  var isBuffer = Buffer.isBuffer(buf) || buf instanceof BufferList
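  // Anything that is not already a Buffer (e.g. a string) is converted before being stored.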
  this._bufs.push(isBuffer ? buf : new Buffer(buf))
  this.length += buf.length
  return this
}

BufferList.prototype._write = function (buf, encoding, callback) {
  this.append(buf)
  if (callback)
    callback()
}

BufferList.prototype._read = function (size) {
  if (!this.length)
    return this.push(null)
  size = Math.min(size, this.length)
  this.push(this.slice(0, size))
  this.consume(size)
}

BufferList.prototype.end = function (chunk) {
  DuplexStream.prototype.end.call(this, chunk)

  if (this._callback) {
    this._callback(null, this.slice())
    this._callback = null
  }
}

BufferList.prototype.get = function (index) {
  return this.slice(index, index + 1)[0]
}

BufferList.prototype.slice = function (start, end) {
  return this.copy(null, 0, start, end)
}

BufferList.prototype.copy = function (dst, dstStart, srcStart, srcEnd) {
  if (typeof srcStart != 'number' || srcStart < 0)
    srcStart = 0
  if (typeof srcEnd != 'number' || srcEnd > this.length)
    srcEnd = this.length
  if (srcStart >= this.length)
    return dst || new Buffer(0)
  if (srcEnd <= 0)
    return dst || new Buffer(0)

  var copy   = !!dst
    , off    = this._offset(srcStart)
    , len    = srcEnd - srcStart
    , bytes  = len
    , bufoff = (copy && dstStart) || 0
    , start  = off[1]
    , l
    , i

  // copy/slice everything
  if (srcStart === 0 && srcEnd == this.length) {
    if (!copy) // slice, just return a full concat
      return Buffer.concat(this._bufs)

    // copy, need to copy individual buffers
    for (i = 0; i < this._bufs.length; i++) {
      this._bufs[i].copy(dst, bufoff)
      bufoff += this._bufs[i].length
    }

    return dst
  }

  // easy, cheap case where it's a subset of one of the buffers
  if (bytes <= this._bufs[off[0]].length - start) {
    return copy
      ? this._bufs[off[0]].copy(dst, dstStart, start, start + bytes)
      : this._bufs[off[0]].slice(start, start + bytes)
  }

  if (!copy) // a slice, we need something to copy in to
    dst = new Buffer(len)

  for (i = off[0]; i < this._bufs.length; i++) {
    l = this._bufs[i].length - start

    if (bytes > l) {
      this._bufs[i].copy(dst, bufoff, start)
    } else {
      this._bufs[i].copy(dst, bufoff, start, start + bytes)
      break
    }

    bufoff += l
    bytes -= l

    if (start)
      start = 0
  }

  return dst
}

BufferList.prototype.toString = function (encoding, start, end) {
  return this.slice(start, end).toString(encoding)
}

BufferList.prototype.consume = function (bytes) {
  while (this._bufs.length) {
    if (bytes > this._bufs[0].length) {
      bytes -= this._bufs[0].length
      this.length -= this._bufs[0].length
      this._bufs.shift()
    } else {
      this._bufs[0] = this._bufs[0].slice(bytes)
      this.length -= bytes
      break
    }
  }
  return this
}

BufferList.prototype.duplicate = function () {
  var i = 0
    , copy = new BufferList()

  for (; i < this._bufs.length; i++)
    copy.append(this._bufs[i])

  return copy
}

BufferList.prototype.destroy = function () {
  this._bufs.length = 0;
  this.length = 0;
  this.push(null);
}

;(function () {
  var methods = {
      'readDoubleBE' : 8, 'readDoubleLE' : 8, 'readFloatBE' : 4, 'readFloatLE' : 4
    , 'readInt32BE' : 4, 'readInt32LE' : 4, 'readUInt32BE' : 4, 'readUInt32LE' : 4
    , 'readInt16BE' : 2, 'readInt16LE' : 2, 'readUInt16BE' : 2, 'readUInt16LE' : 2
    , 'readInt8' : 1, 'readUInt8' : 1
  }

  for (var m in methods) {
    (function (m) {
      BufferList.prototype[m] = function (offset) {
        return this.slice(offset, offset + methods[m])[m](0)
      }
    }(m))
  }
}())

module.exports = BufferList
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/LICENSE.md

The MIT License (MIT)
=====================

Copyright (c) 2014 bl contributors
----------------------------------

*bl contributors listed at <https://github.com/rvagg/bl#contributors>*

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/package.json

{
  "name": "bl",
  "version": "0.9.3",
  "description": "Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!",
  "main": "bl.js",
  "scripts": { "test": "node test/test.js | faucet", "test-local": "brtapsauce-local test/basic-test.js" },
  "repository": { "type": "git", "url": "https://github.com/rvagg/bl.git" },
  "homepage": "https://github.com/rvagg/bl",
  "authors": [
    "Rod Vagg <rod@vagg.org> (https://github.com/rvagg)",
    "Matteo Collina <matteo.collina@gmail.com> (https://github.com/mcollina)",
    "Jarett Cruger <jcrugzz@gmail.com> (https://github.com/jcrugzz)"
  ],
  "keywords": [ "buffer", "buffers", "stream", "awesomesauce" ],
  "license": "MIT",
  "dependencies": { "readable-stream": "~1.0.26" },
  "devDependencies": { "tape": "~2.12.3", "hash_file": "~0.1.1", "faucet": "~0.0.1", "brtapsauce": "~0.3.0" },
  "gitHead": "4987a76bf6bafd7616e62c7023c955e62f3a9461",
  "bugs": { "url": "https://github.com/rvagg/bl/issues" },
  "_id": "bl@0.9.3",
  "_shasum": "c41eff3e7cb31bde107c8f10076d274eff7f7d44",
  "_from": "bl@>=0.9.0 <0.10.0",
  "_npmVersion": "1.4.27",
  "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" },
  "maintainers": [ { "name": "rvagg", "email": "rod@vagg.org" } ],
  "dist": { "shasum": "c41eff3e7cb31bde107c8f10076d274eff7f7d44", "tarball": "http://registry.npmjs.org/bl/-/bl-0.9.3.tgz" },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/bl/-/bl-0.9.3.tgz",
  "readme": "ERROR: No README data found!"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/bl/README.md

# bl *(BufferList)*

**A Node.js Buffer list collector, reader and streamer thingy.**

[![NPM](https://nodei.co/npm/bl.png?downloads=true&downloadRank=true)](https://nodei.co/npm/bl/)
[![NPM](https://nodei.co/npm-dl/bl.png?months=6&height=3)](https://nodei.co/npm/bl/)

**bl** is a storage object for collections of Node Buffers, exposing them with the main Buffer readable API. Also works as a duplex stream so you can collect buffers from a stream that emits them and emit buffers to a stream that consumes them!

The original buffers are kept intact and copies are only done as necessary. Any reads that require the use of a single original buffer will return a slice of that buffer only (which references the same memory as the original buffer). Reads that span buffers perform concatenation as required and return the results transparently.

```js
const BufferList = require('bl')

var bl = new BufferList()
bl.append(new Buffer('abcd'))
bl.append(new Buffer('efg'))
bl.append('hi')                     // bl will also accept & convert Strings
bl.append(new Buffer('j'))
bl.append(new Buffer([ 0x3, 0x4 ]))

console.log(bl.length) // 12

console.log(bl.slice(0, 10).toString('ascii')) // 'abcdefghij'
console.log(bl.slice(3, 10).toString('ascii')) // 'defghij'
console.log(bl.slice(3, 6).toString('ascii'))  // 'def'
console.log(bl.slice(3, 8).toString('ascii'))  // 'defgh'
console.log(bl.slice(5, 10).toString('ascii')) // 'fghij'

// or just use toString!
console.log(bl.toString())               // 'abcdefghij\u0003\u0004'
console.log(bl.toString('ascii', 3, 8))  // 'defgh'
console.log(bl.toString('ascii', 5, 10)) // 'fghij'

// other standard Buffer readables
console.log(bl.readUInt16BE(10)) // 0x0304
console.log(bl.readUInt16LE(10)) // 0x0403
```

Give it a callback in the constructor and use it just like **[concat-stream](https://github.com/maxogden/node-concat-stream)**:

```js
const bl = require('bl')
    , fs = require('fs')

fs.createReadStream('README.md')
  .pipe(bl(function (err, data) { // note 'new' isn't strictly required
    // `data` is a complete Buffer object containing the full data
    console.log(data.toString())
  }))
```

Note that when you use the *callback* method like this, the resulting `data` parameter is a concatenation of all `Buffer` objects in the list.
If you want to avoid the overhead of this concatenation (in cases of extreme performance consciousness), then avoid the *callback* method and just listen to `'end'` instead, like a standard Stream.

Or to fetch a URL using [hyperquest](https://github.com/substack/hyperquest) (should work with [request](http://github.com/mikeal/request) and even plain Node http too!):

```js
const hyperquest = require('hyperquest')
    , bl         = require('bl')
    , url        = 'https://raw.github.com/rvagg/bl/master/README.md'

hyperquest(url).pipe(bl(function (err, data) {
  console.log(data.toString())
}))
```

Or, use it as a readable stream to recompose a list of Buffers to an output source:

```js
const BufferList = require('bl')
    , fs         = require('fs')

var bl = new BufferList()
bl.append(new Buffer('abcd'))
bl.append(new Buffer('efg'))
bl.append(new Buffer('hi'))
bl.append(new Buffer('j'))

bl.pipe(fs.createWriteStream('gibberish.txt'))
```

## API

  * <a href="#ctor"><code><b>new BufferList([ callback ])</b></code></a>
  * <a href="#length"><code>bl.<b>length</b></code></a>
  * <a href="#append"><code>bl.<b>append(buffer)</b></code></a>
  * <a href="#get"><code>bl.<b>get(index)</b></code></a>
  * <a href="#slice"><code>bl.<b>slice([ start[, end ] ])</b></code></a>
  * <a href="#copy"><code>bl.<b>copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ])</b></code></a>
  * <a href="#duplicate"><code>bl.<b>duplicate()</b></code></a>
  * <a href="#consume"><code>bl.<b>consume(bytes)</b></code></a>
  * <a href="#toString"><code>bl.<b>toString([encoding, [ start, [ end ]]])</b></code></a>
  * <a href="#readXX"><code>bl.<b>readDoubleBE()</b></code>, <code>bl.<b>readDoubleLE()</b></code>, <code>bl.<b>readFloatBE()</b></code>, <code>bl.<b>readFloatLE()</b></code>, <code>bl.<b>readInt32BE()</b></code>, <code>bl.<b>readInt32LE()</b></code>, <code>bl.<b>readUInt32BE()</b></code>, <code>bl.<b>readUInt32LE()</b></code>, <code>bl.<b>readInt16BE()</b></code>, <code>bl.<b>readInt16LE()</b></code>, <code>bl.<b>readUInt16BE()</b></code>, <code>bl.<b>readUInt16LE()</b></code>, <code>bl.<b>readInt8()</b></code>, <code>bl.<b>readUInt8()</b></code></a>
  * <a href="#streams">Streams</a>

--------------------------------------------------------
<a name="ctor"></a>
### new BufferList([ callback | buffer | buffer array ])

The constructor takes an optional callback. If supplied, the callback will be called with an error argument followed by a reference to the **bl** instance, when `bl.end()` is called (i.e. from a piped stream). This is a convenient method of collecting the entire contents of a stream, particularly when the stream is *chunky*, such as a network stream.

Normally, no arguments are required for the constructor, but you can initialise the list by passing in a single `Buffer` object or an array of `Buffer` objects.

`new` is not strictly required; if you don't instantiate a new object, it will be done automatically for you so you can create a new instance simply with:

```js
var bl = require('bl')
var myinstance = bl()

// equivalent to:

var BufferList = require('bl')
var myinstance = new BufferList()
```

--------------------------------------------------------
<a name="length"></a>
### bl.length

Get the length of the list in bytes. This is the sum of the lengths of all of the buffers contained in the list, minus any initial offset for a semi-consumed buffer at the beginning. Should accurately represent the total number of bytes that can be read from the list.
--------------------------------------------------------
<a name="append"></a>
### bl.append(buffer)

`append(buffer)` adds an additional buffer or BufferList to the internal list.

--------------------------------------------------------
<a name="get"></a>
### bl.get(index)

`get()` will return the byte at the specified index.

--------------------------------------------------------
<a name="slice"></a>
### bl.slice([ start, [ end ] ])

`slice()` returns a new `Buffer` object containing the bytes within the range specified. Both `start` and `end` are optional and will default to the beginning and end of the list respectively.

If the requested range spans a single internal buffer then a slice of that buffer will be returned which shares the original memory range of that Buffer. If the range spans multiple buffers then copy operations will likely occur to give you a uniform Buffer.

--------------------------------------------------------
<a name="copy"></a>
### bl.copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ])

`copy()` copies the content of the list in the `dest` buffer, starting from `destStart` and containing the bytes within the range specified with `srcStart` to `srcEnd`. `destStart`, `srcStart` and `srcEnd` are optional and will default to the beginning of the `dest` buffer, and the beginning and end of the list respectively.

--------------------------------------------------------
<a name="duplicate"></a>
### bl.duplicate()

`duplicate()` performs a **shallow-copy** of the list. The internal Buffers remain the same, so if you change the underlying Buffers, the change will be reflected in both the original and the duplicate. This method is needed if you want to call `consume()` or `pipe()` and still keep the original list. Example:

```js
var bl = new BufferList()

bl.append('hello')
bl.append(' world')
bl.append('\n')

bl.duplicate().pipe(process.stdout, { end: false })

console.log(bl.toString())
```

--------------------------------------------------------
<a name="consume"></a>
### bl.consume(bytes)

`consume()` will shift bytes *off the start of the list*. The number of bytes consumed doesn't need to line up with the sizes of the internal Buffers; initial offsets will be calculated accordingly in order to give you a consistent view of the data.

--------------------------------------------------------
<a name="toString"></a>
### bl.toString([encoding, [ start, [ end ]]])

`toString()` will return a string representation of the buffer. The optional `start` and `end` arguments are passed on to `slice()`, while the `encoding` is passed on to `toString()` of the resulting Buffer. See the [Buffer#toString()](http://nodejs.org/docs/latest/api/buffer.html#buffer_buf_tostring_encoding_start_end) documentation for more information.

--------------------------------------------------------
<a name="readXX"></a>
### bl.readDoubleBE(), bl.readDoubleLE(), bl.readFloatBE(), bl.readFloatLE(), bl.readInt32BE(), bl.readInt32LE(), bl.readUInt32BE(), bl.readUInt32LE(), bl.readInt16BE(), bl.readInt16LE(), bl.readUInt16BE(), bl.readUInt16LE(), bl.readInt8(), bl.readUInt8()

All of the standard byte-reading methods of the `Buffer` interface are implemented and will operate across internal Buffer boundaries transparently. See the <b><code>[Buffer](http://nodejs.org/docs/latest/api/buffer.html)</code></b> documentation for how these work.
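As a minimal sketch of that transparency (the byte values are arbitrary):

```js
const BufferList = require('bl')

var bl = new BufferList()
bl.append(new Buffer([ 0x01, 0x02 ]))
bl.append(new Buffer([ 0x03, 0x04 ]))

// Offsets 1 and 2 sit in different internal Buffers, but the read is seamless.
console.log(bl.readUInt16BE(1) === 0x0203) // true
```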
--------------------------------------------------------
<a name="streams"></a>
### Streams

**bl** is a Node **[Duplex Stream](http://nodejs.org/docs/latest/api/stream.html#stream_class_stream_duplex)**, so it can be read from and written to like a standard Node stream. You can also `pipe()` to and from a **bl** instance.

--------------------------------------------------------

## Contributors

**bl** is brought to you by the following hackers:

 * [Rod Vagg](https://github.com/rvagg)
 * [Matteo Collina](https://github.com/mcollina)
 * [Jarett Cruger](https://github.com/jcrugzz)

## License

**bl** is Copyright (c) 2013 Rod Vagg [@rvagg](https://twitter.com/rvagg) and licenced under the MIT licence. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE.md file for more details.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/aws-sign2/index.js

/*!
 * knox - auth
 * Copyright(c) 2010 LearnBoost <dev@learnboost.com>
 * MIT Licensed
 */

/**
 * Module dependencies.
 */

var crypto = require('crypto')
  , parse = require('url').parse
  ;

/**
 * Valid keys.
 */

var keys =
  [ 'acl'
  , 'location'
  , 'logging'
  , 'notification'
  , 'partNumber'
  , 'policy'
  , 'requestPayment'
  , 'torrent'
  , 'uploadId'
  , 'uploads'
  , 'versionId'
  , 'versioning'
  , 'versions'
  , 'website'
  ]

/**
 * Return an "Authorization" header value with the given `options`
 * in the form of "AWS <key>:<signature>"
 *
 * @param {Object} options
 * @return {String}
 * @api private
 */

function authorization (options) {
  return 'AWS ' + options.key + ':' + sign(options)
}

module.exports = authorization
module.exports.authorization = authorization

/**
 * Simple HMAC-SHA1 Wrapper
 *
 * @param {Object} options
 * @return {String}
 * @api private
 */

function hmacSha1 (options) {
  return crypto.createHmac('sha1', options.secret).update(options.message).digest('base64')
}

module.exports.hmacSha1 = hmacSha1

/**
 * Create a base64 sha1 HMAC for `options`.
 *
 * @param {Object} options
 * @return {String}
 * @api private
 */

function sign (options) {
  options.message = stringToSign(options)
  return hmacSha1(options)
}
module.exports.sign = sign

/**
 * Create a base64 sha1 HMAC for `options`.
 *
 * Specifically to be used with S3 presigned URLs
 *
 * @param {Object} options
 * @return {String}
 * @api private
 */

function signQuery (options) {
  options.message = queryStringToSign(options)
  return hmacSha1(options)
}
module.exports.signQuery = signQuery

/**
 * Return a string for sign() with the given `options`.
 *
 * Spec:
 *
 *    <verb>\n
 *    <md5>\n
 *    <content-type>\n
 *    <date>\n
 *    [headers\n]
 *    <resource>
 *
 * @param {Object} options
 * @return {String}
 * @api private
 */

function stringToSign (options) {
  var headers = options.amazonHeaders || ''
  if (headers) headers += '\n'
  var r =
    [ options.verb
    , options.md5
    , options.contentType
options.date.toUTCString() : '' , headers + options.resource ] return r.join('\n') } module.exports.queryStringToSign = stringToSign /** * Return a string for sign() with the given `options`, but is meant exclusively * for S3 presigned URLs * * Spec: * * <date>\n * <resource> * * @param {Object} options * @return {String} * @api private */ function queryStringToSign (options){ return 'GET\n\n\n' + options.date + '\n' + options.resource } module.exports.queryStringToSign = queryStringToSign /** * Perform the following: * * - ignore non-amazon headers * - lowercase fields * - sort lexicographically * - trim whitespace between ":" * - join with newline * * @param {Object} headers * @return {String} * @api private */ function canonicalizeHeaders (headers) { var buf = [] , fields = Object.keys(headers) ; for (var i = 0, len = fields.length; i < len; ++i) { var field = fields[i] , val = headers[field] , field = field.toLowerCase() ; if (0 !== field.indexOf('x-amz')) continue buf.push(field + ':' + val) } return buf.sort().join('\n') } module.exports.canonicalizeHeaders = canonicalizeHeaders /** * Perform the following: * * - ignore non sub-resources * - sort lexicographically * * @param {String} resource * @return {String} * @api private */ function canonicalizeResource (resource) { var url = parse(resource, true) , path = url.pathname , buf = [] ; Object.keys(url.query).forEach(function(key){ if (!~keys.indexOf(key)) return var val = '' == url.query[key] ? '' : '=' + encodeURIComponent(url.query[key]) buf.push(key + val) }) return path + (buf.length ? '?' + buf.sort().join('&') : '') } module.exports.canonicalizeResource = canonicalizeResource ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/aws-sign2/LICENSE�����000644 �000766 �000024 �00000021664 12455173731 032354� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. 
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS����������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/aws-sign2/package.json000644 �000766 �000024 �00000002612 12455173731 033625� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "aws-sign2", "description": "AWS signing. Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module.", "version": "0.5.0", "repository": { "url": "https://github.com/mikeal/aws-sign" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "readme": "aws-sign\n========\n\nAWS signing. 
Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/mikeal/aws-sign/issues" }, "_id": "aws-sign2@0.5.0", "dist": { "shasum": "c57103f7a17fc037f02d7c2e64b602ea223f7d63", "tarball": "http://registry.npmjs.org/aws-sign2/-/aws-sign2-0.5.0.tgz" }, "_from": "aws-sign2@>=0.5.0 <0.6.0", "_npmVersion": "1.3.2", "_npmUser": { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" } ], "directories": {}, "_shasum": "c57103f7a17fc037f02d7c2e64b602ea223f7d63", "_resolved": "https://registry.npmjs.org/aws-sign2/-/aws-sign2-0.5.0.tgz", "homepage": "https://github.com/mikeal/aws-sign", "scripts": {} } ����������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/node_modules/aws-sign2/README.md���000644 �000766 �000024 �00000000202 12455173731 032607� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������aws-sign ======== AWS signing. Originally pulled from LearnBoost/knox, maintained as vendor in request, now a standalone module. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/lib/cookies.js���������������������000644 �000766 �000024 �00000001647 12455173731 027617� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var tough = require('tough-cookie') var Cookie = tough.Cookie , CookieJar = tough.CookieJar exports.parse = function(str) { if (str && str.uri) { str = str.uri } if (typeof str !== 'string') { throw new Error('The cookie function only accepts STRING as param') } return Cookie.parse(str) } // Adapt the sometimes-Async api of tough.CookieJar to our requirements function RequestJar(store) { var self = this self._jar = new CookieJar(store) } RequestJar.prototype.setCookie = function(cookieOrStr, uri, options) { var self = this return self._jar.setCookieSync(cookieOrStr, uri, options || {}) } RequestJar.prototype.getCookieString = function(uri) { var self = this return self._jar.getCookieStringSync(uri) } RequestJar.prototype.getCookies = function(uri) { var self = this return self._jar.getCookiesSync(uri) } exports.jar = function(store) { return new RequestJar(store) } �����������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/lib/copy.js������������������������000644 �000766 �000024 �00000000220 
12455173731 027117� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' module.exports = function copy (obj) { var o = {} Object.keys(obj).forEach(function (i) { o[i] = obj[i] }) return o } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/lib/debug.js�����������������������000644 �000766 �000024 �00000000322 12455173731 027236� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var util = require('util') , request = require('../index') module.exports = function debug() { if (request.debug) { console.error('REQUEST %s', util.format.apply(util, arguments)) } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/lib/helpers.js���������������������000644 �000766 �000024 �00000004040 12455173731 027613� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var extend = require('util')._extend , jsonSafeStringify = require('json-stringify-safe') , crypto = require('crypto') function deferMethod() { if(typeof setImmediate === 'undefined') { return process.nextTick } return setImmediate } function constructObject(initialObject) { initialObject = initialObject || {} return { extend: function (object) { return constructObject(extend(initialObject, object)) }, done: function () { return initialObject } } } function constructOptionsFrom(uri, options) { var params = constructObject() if (typeof options === 'object') { params.extend(options).extend({uri: uri}) } else if (typeof uri === 'string') { params.extend({uri: uri}) } else { params.extend(uri) } return params.done() } function isFunction(value) { return typeof value === 'function' } function filterForCallback(values) { var callbacks = values.filter(isFunction) return callbacks[0] } function paramsHaveRequestBody(params) { return ( params.options.body || params.options.requestBodyStream || (params.options.json && typeof params.options.json !== 'boolean') || params.options.multipart ) } function safeStringify (obj) { var ret try { 
ret = JSON.stringify(obj) } catch (e) { ret = jsonSafeStringify(obj) } return ret } function md5 (str) { return crypto.createHash('md5').update(str).digest('hex') } function isReadStream (rs) { return rs.readable && rs.path && rs.mode } function toBase64 (str) { return (new Buffer(str || '', 'ascii')).toString('base64') } exports.isFunction = isFunction exports.constructObject = constructObject exports.constructOptionsFrom = constructOptionsFrom exports.filterForCallback = filterForCallback exports.paramsHaveRequestBody = paramsHaveRequestBody exports.safeStringify = safeStringify exports.md5 = md5 exports.isReadStream = isReadStream exports.toBase64 = toBase64 exports.defer = deferMethod() ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/request/examples/README.md�����������������000644 �000766 �000024 �00000004637 12455173731 030156� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ # Authentication ## OAuth ### OAuth1.0 Refresh Token - http://oauth.googlecode.com/svn/spec/ext/session/1.0/drafts/1/spec.html#anchor4 - https://developer.yahoo.com/oauth/guide/oauth-refreshaccesstoken.html ```js request.post('https://api.login.yahoo.com/oauth/v2/get_token', { oauth: { consumer_key: '...', consumer_secret: '...', token: '...', token_secret: '...', session_handle: '...' } }, function (err, res, body) { var result = require('querystring').parse(body) // assert.equal(typeof result, 'object') }) ``` ### OAuth2 Refresh Token - https://tools.ietf.org/html/draft-ietf-oauth-v2-31#section-6 ```js request.post('https://accounts.google.com/o/oauth2/token', { form: { grant_type: 'refresh_token', client_id: '...', client_secret: '...', refresh_token: '...' }, json: true }, function (err, res, body) { // assert.equal(typeof body, 'object') }) ``` # Multipart ## multipart/form-data ### Flickr Image Upload - https://www.flickr.com/services/api/upload.api.html ```js request.post('https://up.flickr.com/services/upload', { oauth: { consumer_key: '...', consumer_secret: '...', token: '...', token_secret: '...' }, // all meta data should be included here for proper signing qs: { title: 'My cat is awesome', description: 'Sent on ' + new Date(), is_public: 1 }, // again the same meta data + the actual photo formData: { title: 'My cat is awesome', description: 'Sent on ' + new Date(), is_public: 1, photo:fs.createReadStream('cat.png') }, json: true }, function (err, res, body) { // assert.equal(typeof body, 'object') }) ``` # Streams ## `POST` data Use Request as a Writable stream to easily `POST` Readable streams (like files, other HTTP requests, or otherwise). 
TL;DR: Pipe a Readable Stream onto Request via: ``` READABLE.pipe(request.post(URL)); ``` A more detailed example: ```js var fs = require('fs') , path = require('path') , http = require('http') , request = require('request') , TMP_FILE_PATH = path.join(path.sep, 'tmp', 'foo') ; // write a temporary file: fs.writeFileSync(TMP_FILE_PATH, 'foo bar baz quk\n'); http.createServer(function(req, res) { console.log('the server is receiving data!\n'); req .on('end', res.end.bind(res)) .pipe(process.stdout) ; }).listen(3000).unref(); fs.createReadStream(TMP_FILE_PATH) .pipe(request.post('http://127.0.0.1:3000')) ; ``` �������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/realize-package-specifier/.npmignore�������000644 �000766 �000024 �00000000024 12455173731 032345� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������*~ .#* node_modules ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/realize-package-specifier/index.js���������000644 �000766 �000024 �00000002152 12455173731 032017� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict" var fs = require("fs") var path = require("path") var dz = require("dezalgo") var npa = require("npm-package-arg") module.exports = function (spec, where, cb) { if (where instanceof Function) { cb = where; where = null } if (where == null) where = "." cb = dz(cb) try { var dep = npa(spec) } catch (e) { return cb(e) } if ((dep.type == "range" || dep.type == "version") && dep.name != dep.raw) return cb(null, dep) var specpath = dep.type == "local" ? path.resolve(where, dep.spec) : path.resolve(dep.rawSpec? dep.rawSpec: dep.name) fs.stat(specpath, function (er, s) { if (er) return finalize() if (!s.isDirectory()) return finalize("local") fs.stat(path.join(specpath, "package.json"), function (er) { finalize(er ? null : "directory") }) }) function finalize(type) { if (type != null && type != dep.type) { dep.type = type if (! 
dep.rawSpec) { dep.rawSpec = dep.name dep.name = null } } if (dep.type == "local" || dep.type == "directory") dep.spec = specpath cb(null, dep) } } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/realize-package-specifier/package.json�����000644 �000766 �000024 �00000006734 12455173731 032652� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "realize-package-specifier", "version": "1.3.0", "description": "Like npm-package-arg, but more so, producing full file paths and differentiating local tar and directory sources.", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "license": "ISC", "repository": { "type": "git", "url": "https://github.com/npm/realize-package-specifier.git" }, "author": { "name": "Rebecca Turner", "email": "me@re-becca.org", "url": "http://re-becca.org" }, "homepage": "https://github.com/npm/realize-package-specifier", "dependencies": { "dezalgo": "^1.0.1", "npm-package-arg": "^2.1.3" }, "devDependencies": { "require-inject": "^1.1.0", "tap": "^0.4.12" }, "gitHead": "d05d49409e28404473a292cf99df05642a24c08f", "readme": "realize-package-specifier\n-------------------------\n\nParse a package specifier, peeking at the disk to differentiate between\nlocal tarballs, directories and named modules. This implements the logic\nused by `npm install` and `npm cache` to determine where to get packages\nfrom.\n\n```javascript\nvar realizePackageSpecifier = require(\"realize-package-specifier\")\nrealizePackageSpecifier(\"foo.tar.gz\", \".\", function (err, package) {\n …\n})\n```\n\n* realizePackageSpecifier(*spec*, [*where*,] *callback*)\n\nParses *spec* using `npm-package-arg` and then uses stat to check to see if\nit refers to a local tarball or package directory. Stats are done relative\nto *where*. If it does then the local module is loaded. If it doesn't then\ntarget is left as a remote package specifier. Package directories are\nrecognized by the presence of a package.json in them.\n\n*spec* -- a package specifier, like: `foo@1.2`, or `foo@user/foo`, or\n`http://x.com/foo.tgz`, or `git+https://github.com/user/foo`\n\n*where* (optional, default: .) -- The directory in which we should look for\nlocal tarballs or package directories.\n\n*callback* function(*err*, *result*) -- Called once we've determined what\nkind of specifier this is. The *result* object will be very like the one\nreturned by `npm-package-arg` except with three differences: 1) There's a\nnew type of `directory`. 2) The `local` type only refers to tarballs. 
2)\nFor all `local` and `directory` type results spec will contain the full path of\nthe local package.\n\n## Result Objects\n\nThe full definition of the result object is:\n\n* `name` - If known, the `name` field expected in the resulting pkg.\n* `type` - One of the following strings:\n * `git` - A git repo\n * `github` - A github shorthand, like `user/project`\n * `tag` - A tagged version, like `\"foo@latest\"`\n * `version` - A specific version number, like `\"foo@1.2.3\"`\n * `range` - A version range, like `\"foo@2.x\"`\n * `local` - A local file path\n * `directory` - A local package directory\n * `remote` - An http url (presumably to a tgz)\n* `spec` - The \"thing\". URL, the range, git repo, etc.\n* `raw` - The original un-modified string that was provided.\n* `rawSpec` - The part after the `name@...`, as it was originally\n provided.\n* `scope` - If a name is something like `@org/module` then the `scope`\n field will be set to `org`. If it doesn't have a scoped name, then\n scope is `null`.\n\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/npm/realize-package-specifier/issues" }, "_id": "realize-package-specifier@1.3.0", "_shasum": "23374a84e6a9188483f346cc939eb58eec85efa5", "_from": "realize-package-specifier@~1.3.0" } ������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/realize-package-specifier/README.md��������000644 �000766 �000024 �00000004535 12455173731 031640� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������realize-package-specifier ------------------------- Parse a package specifier, peeking at the disk to differentiate between local tarballs, directories and named modules. This implements the logic used by `npm install` and `npm cache` to determine where to get packages from. ```javascript var realizePackageSpecifier = require("realize-package-specifier") realizePackageSpecifier("foo.tar.gz", ".", function (err, package) { … }) ``` * realizePackageSpecifier(*spec*, [*where*,] *callback*) Parses *spec* using `npm-package-arg` and then uses stat to check to see if it refers to a local tarball or package directory. Stats are done relative to *where*. If it does then the local module is loaded. If it doesn't then target is left as a remote package specifier. Package directories are recognized by the presence of a package.json in them. *spec* -- a package specifier, like: `foo@1.2`, or `foo@user/foo`, or `http://x.com/foo.tgz`, or `git+https://github.com/user/foo` *where* (optional, default: .) -- The directory in which we should look for local tarballs or package directories. *callback* function(*err*, *result*) -- Called once we've determined what kind of specifier this is. The *result* object will be very like the one returned by `npm-package-arg` except with three differences: 1) There's a new type of `directory`. 2) The `local` type only refers to tarballs. 2) For all `local` and `directory` type results spec will contain the full path of the local package. ## Result Objects The full definition of the result object is: * `name` - If known, the `name` field expected in the resulting pkg. 
* `type` - One of the following strings: * `git` - A git repo * `github` - A github shorthand, like `user/project` * `tag` - A tagged version, like `"foo@latest"` * `version` - A specific version number, like `"foo@1.2.3"` * `range` - A version range, like `"foo@2.x"` * `local` - A local file path * `directory` - A local package directory * `remote` - An http url (presumably to a tgz) * `spec` - The "thing". URL, the range, git repo, etc. * `raw` - The original un-modified string that was provided. * `rawSpec` - The part after the `name@...`, as it was originally provided. * `scope` - If a name is something like `@org/module` then the `scope` field will be set to `org`. If it doesn't have a scoped name, then scope is `null`. �������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/.npmignore�����������������000644 �000766 �000024 �00000000044 12455173731 030404� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������build/ test/ examples/ fs.js zlib.js��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/duplex.js������������������000644 �000766 �000024 �00000000064 12455173731 030246� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_duplex.js") ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/�����������������������000755 �000766 �000024 �00000000000 12456115117 027150� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/LICENSE��������������������000644 �000766 �000024 �00000002110 12455173731 027406� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/node_modules/��������������000755 �000766 �000024 �00000000000 12456115117 031057� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/package.json���������������000644 �000766 �000024 �00000003266 12455173731 030704� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "readable-stream", "version": "1.0.33", "description": "Streams2, a user-land copy of the stream library from Node.js v0.10.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" 
}, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "0bf97a117c5646556548966409ebc57a6dda2638", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.0.33", "_shasum": "3a360dd66c1b1d7fd4705389860eda1d0f61126c", "_from": "readable-stream@>=1.0.33 <1.1.0", "_npmVersion": "1.4.28", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "3a360dd66c1b1d7fd4705389860eda1d0f61126c", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.0.33.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.0.33.tgz", "readme": "ERROR: No README data found!" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/passthrough.js�������������000644 �000766 �000024 �00000000071 12455173731 031312� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_passthrough.js") �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/readable.js����������������000644 �000766 �000024 �00000000703 12455173731 030504� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var Stream = require('stream'); // hack to fix a circular dependency issue when used with browserify exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = Stream; exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); 
�������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/README.md������������������000644 �000766 �000024 �00000002431 12455173731 027666� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# readable-stream ***Node-core streams for userland*** [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/) This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core. **readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12. **readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/transform.js���������������000644 �000766 �000024 �00000000067 12455173731 030763� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_transform.js") �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/writable.js����������������000644 �000766 �000024 �00000000066 12455173731 030560� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_writable.js") ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/�000755 �000766 �000024 �00000000000 12456115117 033373� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/������000755 �000766 �000024 �00000000000 12456115117 032531� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/����������������������000755 �000766 �000024 �00000000000 12456115117 033773� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/.npmignore������������000644 �000766 �000024 �00000000013 12455173731 035771� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������build test ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/index.js��������������000644 �000766 �000024 �00000015006 12455173731 035447� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/LICENSE���������������000644 �000766 �000024 �00000002064 12455173731 035007� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
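The `string_decoder/index.js` above buffers incomplete multi-byte sequences between `write()` calls so that callers never see a broken character. A minimal sketch of that behaviour, assuming the module is required directly (the byte values are just an illustrative UTF-8 sequence for '€'):

```js
var StringDecoder = require('string_decoder').StringDecoder
var decoder = new StringDecoder('utf8')

// '€' is the three UTF-8 bytes 0xe2 0x82 0xac; split it across two writes
console.log(decoder.write(new Buffer([ 0xe2, 0x82 ]))) // '' (partial character is buffered)
console.log(decoder.write(new Buffer([ 0xac ])))       // '€' (completed on the next write)
```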
lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/package.json

{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_from": "string_decoder@>=0.10.0 <0.11.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "readme": "ERROR: No README data found!" }

lib/node_modules/npm/node_modules/readable-stream/node_modules/string_decoder/README.md

**string_decoder.js** (`require('string_decoder')`) from Node.js core

Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details.

Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10.
**Prefer the stable version over the unstable.**

The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.

lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/component.json

{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" }

lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/index.js

module.exports = Array.isArray || function (arr) {
  return Object.prototype.toString.call(arr) == '[object Array]';
};

lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/package.json

{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" },
"dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "_id": "isarray@0.0.1", "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "_from": "isarray@0.0.1", "_npmVersion": "1.2.18", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/README.md��������������������000644 �000766 �000024 �00000003025 12455173731 033736� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). 
With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/isarray/build/build.js���������������000644 �000766 �000024 �00000007771 12455173731 035227� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. 
* * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. 
*/ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); �������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/float.patch�������������000644 �000766 �000024 �00000037626 12455173731 035465� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. 
- */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? 
': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? ': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}����������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/lib/��������������������000755 �000766 �000024 �00000000000 12456115117 034062� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/package.json������������000644 �000766 �000024 �00000002523 12455173731 035611� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": 
"git://github.com/isaacs/core-util-is" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is", "_id": "core-util-is@1.0.1", "dist": { "shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" }, "_from": "core-util-is@>=1.0.0 <1.1.0", "_npmVersion": "1.3.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz", "scripts": {} } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/README.md���������������000644 �000766 �000024 �00000000103 12455173731 034572� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# core-util-is The `util.is*` functions introduced in Node v0.12. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/util.js�����������������000644 �000766 �000024 �00000003526 12455173731 034642� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/readable-stream/node_modules/core-util-is/lib/util.js�������������000644 �000766 �000024 �00000003562 12455173731 035410� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/_stream_duplex.js������000644 �000766 �000024 �00000003215 12455173731 032527� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. 
module.exports = Duplex; /*<replacement>*/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /*</replacement>*/ /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/_stream_passthrough.js�000644 �000766 �000024 �00000001121 12455173731 033567� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 
module.exports = PassThrough; var Transform = require('./_stream_transform'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/_stream_readable.js����000644 �000766 �000024 �00000062602 12455173731 032772� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = Readable; /*<replacement>*/ var isArray = require('isarray'); /*</replacement>*/ /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /*<replacement>*/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /*</replacement>*/ var Stream = require('stream'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var StringDecoder; util.inherits(Readable, Stream); function ReadableState(options, stream) { options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = false; this.ended = false; this.endEmitted = false; this.reading = false; // In streams that never have any data, and do push(null) right away, // the consumer can miss the 'end' event if they do some I/O before // consuming the stream. So, we don't emit('end') until some reading // happens. this.calledRead = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, becuase any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. 
// Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (typeof chunk === 'string' && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (chunk === null || chunk === undefined) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) { state.buffer.unshift(chunk); } else { state.reading = false; state.buffer.push(chunk); } if (state.needReadable) emitReadable(stream); maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. 
Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (n === null || isNaN(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { var state = this._readableState; state.calledRead = true; var nOrig = n; var ret; if (typeof n !== 'number' || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { ret = null; // In cases where the decoder did not receive enough data // to produce a full chunk, then immediately received an // EOF, state.buffer will contain [<Buffer >, <Buffer 00 ...>]. // howMuchToRead will see this and coerce the amount to // read to zero (because it's looking at the length of the // first <Buffer > in state.buffer), and we'll end up here. // // This can only happen via state.decoder -- no other venue // exists for pushing a zero-length chunk into state.buffer // and triggering this behavior. In this case, we return our // remaining data and end the stream, if appropriate. if (state.length > 0 && state.decoder) { ret = fromList(n, state); state.length -= ret.length; } if (state.length === 0) endReadable(this); return ret; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. 
// Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; // if we currently have less than the highWaterMark, then also read some if (state.length - n <= state.highWaterMark) doRead = true; // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. if (state.ended || state.reading) doRead = false; if (doRead) { state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read called its callback synchronously, then `reading` // will be false, and we need to re-evaluate how much data we // can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); if (n > 0) ret = fromList(n, state); else ret = null; if (ret === null) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we happened to read() exactly the remaining amount in the // buffer, and the EOF has been seen at this point, then make sure // that we emit 'end' on the very next tick. if (state.ended && !state.endEmitted && state.length === 0) endReadable(this); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!Buffer.isBuffer(chunk) && 'string' !== typeof chunk && chunk !== null && chunk !== undefined && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // if we've ended and we have some data left, then emit // 'readable' now to make sure it gets picked up. if (state.length > 0) emitReadable(stream); else endReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (state.emittedReadable) return; state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } function emitReadable_(stream) { stream.emit('readable'); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. 
function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { if (readable !== src) return; cleanup(); } function onend() { dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (!dest._writableState || dest._writableState.needDrain) ondrain(); } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. if (!state.flowing) { // the handler that waits for readable events after all // the data gets sucked out in flow. 
// This would be easier to follow with a .once() handler // in flow(), but that is too slow. this.on('readable', pipeOnReadable); state.flowing = true; process.nextTick(function() { flow(src); }); } return dest; }; function pipeOnDrain(src) { return function() { var dest = this; var state = src._readableState; state.awaitDrain--; if (state.awaitDrain === 0) flow(src); }; } function flow(src) { var state = src._readableState; var chunk; state.awaitDrain = 0; function write(dest, i, list) { var written = dest.write(chunk); if (false === written) { state.awaitDrain++; } } while (state.pipesCount && null !== (chunk = src.read())) { if (state.pipesCount === 1) write(state.pipes, 0, null); else forEach(state.pipes, write); src.emit('data', chunk); // if anyone needs a drain, then we have to wait for that. if (state.awaitDrain > 0) return; } // if every destination was unpiped, either before entering this // function, or in the while loop, then stop flowing. // // NB: This is a pretty rare edge case. if (state.pipesCount === 0) { state.flowing = false; // if there were data event listeners added, then switch to old mode. if (EE.listenerCount(src, 'data') > 0) emitDataEvents(src); return; } // at this point, no one needed a drain, so we just ran out of data // on the next readable event, start it over again. state.ranOut = true; } function pipeOnReadable() { if (this._readableState.ranOut) { this._readableState.ranOut = false; flow(this); } } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; this.removeListener('readable', pipeOnReadable); state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; this.removeListener('readable', pipeOnReadable); state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); if (ev === 'data' && !this._readableState.flowing) emitDataEvents(this); if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { this.read(0); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. 
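// ---------------------------------------------------------------------------
// Illustrative sketch of the pipe()/flow()/awaitDrain behaviour implemented
// above, written against the public API only. The file names are
// hypothetical and examplePipeUsage() is never invoked by this module.

function examplePipeUsage() {
  var fs = require('fs');
  var zlib = require('zlib');

  var src = fs.createReadStream('input.log');      // Readable source
  var gzip = zlib.createGzip();                    // Transform in the middle
  var dst = fs.createWriteStream('input.log.gz');  // Writable destination

  // Each pipe() call registers its destination in state.pipes and kicks off
  // flow(); whenever a destination's write() returns false, awaitDrain is
  // incremented and flow() stops until that destination emits 'drain'
  // (see pipeOnDrain() above).
  src.pipe(gzip).pipe(dst);

  dst.on('finish', function() {
    console.log('finished writing input.log.gz');
  });

  // Calling src.unpipe(gzip) would detach the destination again; once no
  // destinations remain, state.flowing is cleared.
}
// ---------------------------------------------------------------------------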
Readable.prototype.resume = function() { emitDataEvents(this); this.read(0); this.emit('resume'); }; Readable.prototype.pause = function() { emitDataEvents(this, true); this.emit('pause'); }; function emitDataEvents(stream, startPaused) { var state = stream._readableState; if (state.flowing) { // https://github.com/isaacs/readable-stream/issues/16 throw new Error('Cannot switch to old mode now.'); } var paused = startPaused || false; var readable = false; // convert to an old-style stream. stream.readable = true; stream.pipe = Stream.prototype.pipe; stream.on = stream.addListener = Stream.prototype.on; stream.on('readable', function() { readable = true; var c; while (!paused && (null !== (c = stream.read()))) stream.emit('data', c); if (c === null) { readable = false; stream._readableState.needReadable = true; } }); stream.pause = function() { paused = true; this.emit('pause'); }; stream.resume = function() { paused = false; if (readable) process.nextTick(function() { stream.emit('readable'); }); else this.read(0); this.emit('resume'); }; // now make it start, just in case it hadn't already. stream.emit('readable'); } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode //if (state.objectMode && util.isNullOrUndefined(chunk)) if (state.objectMode && (chunk === null || chunk === undefined)) return; else if (!state.objectMode && (!chunk || !chunk.length)) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (typeof stream[i] === 'function' && typeof this[i] === 'undefined') { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. 
var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted && state.calledRead) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } }); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/_stream_transform.js���000644 �000766 �000024 �00000014165 12455173731 033247� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. 
For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. module.exports = Transform; var Duplex = require('./_stream_duplex'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (data !== null && data !== undefined) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); var ts = this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('finish', function() { if ('function' === typeof this._flush) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. 
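// ---------------------------------------------------------------------------
// Illustrative sketch of the _transform() contract described above: push()
// zero or more output chunks for each written chunk, then call cb() so the
// next buffered write can be consumed. The Upper class and the stdin/stdout
// plumbing in exampleUpperUsage() are made up for illustration only and are
// never invoked by this module.

function Upper(options) {
  if (!(this instanceof Upper)) return new Upper(options);
  Transform.call(this, options);
}
util.inherits(Upper, Transform);

Upper.prototype._transform = function(chunk, encoding, cb) {
  // Calling cb() runs afterTransform(), which clears writechunk/writecb and
  // lets the writable side hand over the next chunk.
  this.push(chunk.toString().toUpperCase());
  cb();
};

function exampleUpperUsage() {
  process.stdin.pipe(new Upper()).pipe(process.stdout);
}
// ---------------------------------------------------------------------------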
Transform.prototype._read = function(n) { var ts = this._transformState; if (ts.writechunk !== null && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var rs = stream._readableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/readable-stream/lib/_stream_writable.js����000644 �000766 �000024 �00000023050 12455173731 033036� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Writable.WritableState = WritableState; /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; this.highWaterMark = (hwm || hwm === 0) ? hwm : 16 * 1024; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. 
this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, becuase any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. function validChunk(stream, state, chunk, cb) { var valid = true; if (!Buffer.isBuffer(chunk) && 'string' !== typeof chunk && chunk !== null && chunk !== undefined && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (Buffer.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (typeof cb !== 'function') cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) ret = writeOrBuffer(this, state, chunk, encoding, cb); return ret; }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. 
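// ---------------------------------------------------------------------------
// Illustrative sketch of the minimal Writable described at the top of this
// file: implement an async _write(chunk, encoding, cb) and the buffering,
// 'drain' and 'finish' handling above is done for you. The ByteCounter class
// and exampleByteCounterUsage() are made up for illustration only and are
// never invoked by this module.

function ByteCounter(options) {
  if (!(this instanceof ByteCounter)) return new ByteCounter(options);
  Writable.call(this, options);
  this.bytes = 0;
}
util.inherits(ByteCounter, Writable);

ByteCounter.prototype._write = function(chunk, encoding, cb) {
  // state.sync is true while _write() is on the stack; completing the write
  // on a later tick exercises the nextTick path in onwrite()/afterWrite().
  this.bytes += chunk.length;
  setImmediate(cb);
};

function exampleByteCounterUsage() {
  var sink = new ByteCounter({ highWaterMark: 16 });
  sink.on('finish', function() {
    console.log('saw %d bytes', sink.bytes);
  });
  sink.write('hello ');
  sink.end('world');
}
// ---------------------------------------------------------------------------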
function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (Buffer.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. if (!ret) state.needDrain = true; if (state.writing) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { cb(er); }); else cb(er); stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.bufferProcessing && state.buffer.length) clearBuffer(stream, state); if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); cb(); if (finished) finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. if (state.writing) { c++; break; } } state.bufferProcessing = false; if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (typeof chunk === 'function') { cb = chunk; chunk = null; encoding = null; } else if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (typeof chunk !== 'undefined' && chunk !== null) this.write(chunk, encoding); // ignore unnecessary end() calls. 
  if (!state.ending && !state.finished)
    endWritable(this, state, cb);
};

function needFinish(stream, state) {
  return (state.ending &&
          state.length === 0 &&
          !state.finished &&
          !state.writing);
}

function finishMaybe(stream, state) {
  var need = needFinish(stream, state);
  if (need) {
    state.finished = true;
    stream.emit('finish');
  }
  return need;
}

function endWritable(stream, state, cb) {
  state.ending = true;
  finishMaybe(stream, state);
  if (cb) {
    if (state.finished)
      process.nextTick(cb);
    else
      stream.once('finish', cb);
  }
  state.ended = true;
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/.npmignore

*.swp
.*.swp
.DS_Store
*~
.project
.settings
npm-debug.log
coverage.html
.idea
lib-cov
node_modules

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/LICENSE

The ISC License

Copyright (c) Isaac Z. Schlueter

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/package.json�������������000644 �000766 �000024 �00000003241 12455173731 031120� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "read-package-json", "version": "1.2.7", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "description": "The thing npm uses to read package.json files with semantics and defaults and validation", "repository": { "type": "git", "url": "git://github.com/isaacs/read-package-json.git" }, "main": "read-json.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "github-url-from-git": "^1.3.0", "github-url-from-username-repo": "~1.0.0", "glob": "^4.0.2", "lru-cache": "2", "normalize-package-data": "^1.0.0", "graceful-fs": "2 || 3" }, "devDependencies": { "tap": "~0.2.5" }, "optionalDependencies": { "graceful-fs": "2 || 3" }, "license": "ISC", "gitHead": "41d6696c527e32a1cb38ebf0b6fc91b489b0499c", "bugs": { "url": "https://github.com/isaacs/read-package-json/issues" }, "homepage": "https://github.com/isaacs/read-package-json", "_id": "read-package-json@1.2.7", "_shasum": "f0b440c461a218f4dbf48b094e80fc65c5248502", "_from": "read-package-json@>=1.2.7-0 <1.3.0-0", "_npmVersion": "2.0.0-beta.0", "_npmUser": { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "dist": { "shasum": "f0b440c461a218f4dbf48b094e80fc65c5248502", "tarball": "http://registry.npmjs.org/read-package-json/-/read-package-json-1.2.7.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/read-package-json/-/read-package-json-1.2.7.tgz" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/read-json.js�������������000644 �000766 �000024 �00000035456 12455173731 031067� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// vim: set softtabstop=16 shiftwidth=16: try { var fs = require("graceful-fs") } catch (er) { var fs = require("fs") } module.exports = readJson var LRU = require("lru-cache") readJson.cache = new LRU({max: 1000}) var path = require("path") var glob = require("glob") var normalizeData = require("normalize-package-data") // put more stuff on here to customize. 
readJson.extraSet = [ gypfile, serverjs, scriptpath, authors, readme, mans, bins, githead ] var typoWarned = {} function readJson (file, log_, strict_, cb_) { var log, strict, cb for (var i = 1; i < arguments.length - 1; i++) { if (typeof arguments[i] === 'boolean') strict = arguments[i] else if (typeof arguments[i] === 'function') log = arguments[i] } if (!log) log = function () {}; cb = arguments[ arguments.length - 1 ] var c = readJson.cache.get(file) if (c) { cb = cb.bind(null, null, c) return process.nextTick(cb); } cb = (function (orig) { return function (er, data) { if (data) readJson.cache.set(file, data); return orig(er, data) } })(cb) readJson_(file, log, strict, cb) } function readJson_ (file, log, strict, cb) { fs.readFile(file, "utf8", function (er, d) { parseJson(file, er, d, log, strict, cb) }) } function stripBOM(content) { // Remove byte order marker. This catches EF BB BF (the UTF-8 BOM) // because the buffer-to-string conversion in `fs.readFileSync()` // translates it to FEFF, the UTF-16 BOM. if (content.charCodeAt(0) === 0xFEFF) { content = content.slice(1); } return content; } function parseJson (file, er, d, log, strict, cb) { if (er && er.code === "ENOENT") { indexjs(file, er, log, strict, cb) return } if (er) return cb(er); try { d = JSON.parse(stripBOM(d)) } catch (er) { d = parseIndex(d) if (!d) return cb(parseError(er, file)); } extras(file, d, log, strict, cb) } function indexjs (file, er, log, strict, cb) { if (path.basename(file) === "index.js") { return cb(er); } var index = path.resolve(path.dirname(file), "index.js") fs.readFile(index, "utf8", function (er2, d) { if (er2) return cb(er); d = parseIndex(d) if (!d) return cb(er); extras(file, d, log, strict, cb) }) } readJson.extras = extras function extras (file, data, log_, strict_, cb_) { var log, strict, cb for (var i = 2; i < arguments.length - 1; i++) { if (typeof arguments[i] === 'boolean') strict = arguments[i] else if (typeof arguments[i] === 'function') log = arguments[i] } if (!log) log = function () {}; cb = arguments[i] var set = readJson.extraSet var n = set.length var errState = null set.forEach(function (fn) { fn(file, data, then) }) function then(er) { if (errState) return; if (er) return cb(errState = er); if (--n > 0) return; final(file, data, log, strict, cb); } } function scriptpath (file, data, cb) { if (!data.scripts) return cb(null, data); var k = Object.keys(data.scripts) k.forEach(scriptpath_, data.scripts) cb(null, data); } function scriptpath_(key) { s = this[key] // This is never allowed, and only causes problems if (typeof s !== 'string') return delete this[key] var spre = /^(\.[\/\\])?node_modules[\/\\].bin[\\\/]/ if (s.match(spre)) this[key] = this[key].replace(spre, '') } function gypfile (file, data, cb) { var dir = path.dirname(file) var s = data.scripts || {} if (s.install || s.preinstall) return cb(null, data); glob("*.gyp", { cwd: dir }, function (er, files) { if (er) return cb(er); gypfile_(file, data, files, cb) }) } function gypfile_ (file, data, files, cb) { if (!files.length) return cb(null, data); var s = data.scripts || {} s.install = "node-gyp rebuild" data.scripts = s data.gypfile = true return cb(null, data); } function serverjs (file, data, cb) { var dir = path.dirname(file) var s = data.scripts || {} if (s.start) return cb(null, data) glob("server.js", { cwd: dir }, function (er, files) { if (er) return cb(er); serverjs_(file, data, files, cb) }) } function serverjs_ (file, data, files, cb) { if (!files.length) return cb(null, data); var s = data.scripts || {} 
s.start = "node server.js" data.scripts = s return cb(null, data) } function authors (file, data, cb) { if (data.contributors) return cb(null, data); var af = path.resolve(path.dirname(file), "AUTHORS") fs.readFile(af, "utf8", function (er, ad) { // ignore error. just checking it. if (er) return cb(null, data); authors_(file, data, ad, cb) }) } function authors_ (file, data, ad, cb) { ad = ad.split(/\r?\n/g).map(function (line) { return line.replace(/^\s*#.*$/, '').trim() }).filter(function (line) { return line }) data.contributors = ad return cb(null, data) } var defDesc = "Unnamed repository; edit this file " + "'description' to name the repository." function gitDescription (file, data, cb) { if (data.description) return cb(null, data); var dir = path.dirname(file) // just cuz it'd be nice if this file mattered... var gitDesc = path.resolve(dir, '.git/description') fs.readFile(gitDesc, 'utf8', function (er, desc) { if (desc) desc = desc.trim() if (!er && desc !== defDesc) data.description = desc return cb(null, data) }) } function readmeDescription (file, data) { if (data.description) return cb(null, data); var d = data.readme if (!d) return; // the first block of text before the first heading // that isn't the first line heading d = d.trim().split('\n') for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++); var l = d.length for (var e = s + 1; e < l && d[e].trim(); e ++); data.description = d.slice(s, e).join(' ').trim() } function readme (file, data, cb) { if (data.readme) return cb(null, data); var dir = path.dirname(file) var globOpts = { cwd: dir, nocase: true, mark: true } glob("{README,README.*}", globOpts, function (er, files) { if (er) return cb(er); // don't accept directories. files = files.filter(function (file) { return !file.match(/\/$/) }) if (!files.length) return cb(); var fn = preferMarkdownReadme(files) var rm = path.resolve(dir, fn) readme_(file, data, rm, cb) }) } function preferMarkdownReadme(files) { var fallback = 0; var re = /\.m?a?r?k?d?o?w?n?$/i for (var i = 0; i < files.length; i++) { if (files[i].match(re)) return files[i] else if (files[i].match(/README$/)) fallback = i } // prefer README.md, followed by README; otherwise, return // the first filename (which could be README) return files[fallback]; } function readme_(file, data, rm, cb) { var rmfn = path.basename(rm); fs.readFile(rm, "utf8", function (er, rm) { // maybe not readable, or something. if (er) return cb() data.readme = rm data.readmeFilename = rmfn return cb(er, data) }) } function mans (file, data, cb) { var m = data.directories && data.directories.man if (data.man || !m) return cb(null, data); m = path.resolve(path.dirname(file), m) glob("**/*.[0-9]", { cwd: m }, function (er, mans) { if (er) return cb(er); mans_(file, data, mans, cb) }) } function mans_ (file, data, mans, cb) { var m = data.directories && data.directories.man data.man = mans.map(function (mf) { return path.resolve(path.dirname(file), m, mf) }) return cb(null, data) } function bins (file, data, cb) { if (Array.isArray(data.bin)) { return bins_(file, data, data.bin, cb) } var m = data.directories && data.directories.bin if (data.bin || !m) return cb(null, data); m = path.resolve(path.dirname(file), m) glob("**", { cwd: m }, function (er, bins) { if (er) return cb(er); bins_(file, data, bins, cb) }) } function bins_ (file, data, bins, cb) { var m = data.directories && data.directories.bin || '.' 
data.bin = bins.reduce(function (acc, mf) { if (mf && mf.charAt(0) !== '.') { var f = path.basename(mf) acc[f] = path.join(m, mf) } return acc }, {}) return cb(null, data) } function githead (file, data, cb) { if (data.gitHead) return cb(null, data); var dir = path.dirname(file) var head = path.resolve(dir, '.git/HEAD') fs.readFile(head, 'utf8', function (er, head) { if (er) return cb(null, data); githead_(file, data, dir, head, cb) }) } function githead_ (file, data, dir, head, cb) { if (!head.match(/^ref: /)) { data.gitHead = head.trim() return cb(null, data) } var headFile = head.replace(/^ref: /, '').trim() headFile = path.resolve(dir, '.git', headFile) fs.readFile(headFile, 'utf8', function (er, head) { if (er || !head) return cb(null, data) head = head.replace(/^ref: /, '').trim() data.gitHead = head return cb(null, data) }) } function final (file, data, log, strict, cb) { var pId = makePackageId(data) function warn(msg) { if (typoWarned[pId]) return; if (log) log("package.json", pId, msg); } try { normalizeData(data, warn, strict) } catch (error) { return cb(error) } typoWarned[pId] = true readJson.cache.set(file, data) cb(null, data) } function makePackageId (data) { var name = cleanString(data.name) var ver = cleanString(data.version) return name + "@" + ver } function cleanString(str) { return (!str || typeof(str) !== "string") ? "" : str.trim() } // /**package { "name": "foo", "version": "1.2.3", ... } **/ function parseIndex (data) { data = data.split(/^\/\*\*package(?:\s|$)/m) if (data.length < 2) return null data = data[1] data = data.split(/\*\*\/$/m) if (data.length < 2) return null data = data[0] data = data.replace(/^\s*\*/mg, "") try { return JSON.parse(data) } catch (er) { return null } } function parseError (ex, file) { var e = new Error("Failed to parse json\n"+ex.message) e.code = "EJSONPARSE" e.file = file return e } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-package-json/README.md����������������000644 �000766 �000024 �00000011662 12455173731 030117� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# read-package-json This is the thing that npm uses to read package.json files. It validates some stuff, and loads some default things. It keeps a cache of the files you've read, so that you don't end up reading the same package.json file multiple times. Note that if you just want to see what's literally in the package.json file, you can usually do `var data = require('some-module/package.json')`. This module is basically only needed by npm, but it's handy to see what npm will see when it looks at your package. 
## Usage ```javascript var readJson = require('read-package-json') // readJson(filename, [logFunction=noop], [strict=false], cb) readJson('/path/to/package.json', console.error, false, function (er, data) { if (er) { console.error("There was an error reading the file") return } console.error('the package data is', data) }); ``` ## readJson(file, [logFn = noop], [strict = false], cb) * `file` {String} The path to the package.json file * `logFn` {Function} Function to handle logging. Defaults to a noop. * `strict` {Boolean} True to enforce SemVer 2.0 version strings, and other strict requirements. * `cb` {Function} Gets called with `(er, data)`, as is The Node Way. Reads the JSON file and does the things. ## `package.json` Fields See `man 5 package.json` or `npm help json`. ## readJson.log By default this is a reference to the `npmlog` module. But if that module can't be found, then it'll be set to just a dummy thing that does nothing. Replace with your own `{log,warn,error}` object for fun loggy time. ## readJson.extras(file, data, cb) Run all the extra stuff relative to the file, with the parsed data. Modifies the data as it does stuff. Calls the cb when it's done. ## readJson.extraSet = [fn, fn, ...] Array of functions that are called by `extras`. Each one receives the arguments `fn(file, data, cb)` and is expected to call `cb(er, data)` when done or when an error occurs. Order is indeterminate, so each function should be completely independent. Mix and match! ## readJson.cache The `lru-cache` object that readJson uses to not read the same file over and over again. See [lru-cache](https://github.com/isaacs/node-lru-cache) for details. ## Other Relevant Files Besides `package.json` Some other files have an effect on the resulting data object, in the following ways: ### `README?(.*)` If there is a `README` or `README.*` file present, then npm will attach a `readme` field to the data with the contents of this file. Owing to the fact that roughly 100% of existing node modules have Markdown README files, it will generally be assumed to be Markdown, regardless of the extension. Please plan accordingly. ### `server.js` If there is a `server.js` file, and there is not already a `scripts.start` field, then `scripts.start` will be set to `node server.js`. ### `AUTHORS` If there is not already a `contributors` field, then the `contributors` field will be set to the contents of the `AUTHORS` file, split by lines, and parsed. ### `bindings.gyp` If a bindings.gyp file exists, and there is not already a `scripts.install` field, then the `scripts.install` field will be set to `node-gyp rebuild`. ### `wscript` If a wscript file exists, and there is not already a `scripts.install` field, then the `scripts.install` field will be set to `node-waf clean ; node-waf configure build`. Note that the `bindings.gyp` file supercedes this, since node-waf has been deprecated in favor of node-gyp. ### `index.js` If the json file does not exist, but there is a `index.js` file present instead, and that file has a package comment, then it will try to parse the package comment, and use that as the data instead. A package comment looks like this: ```javascript /**package * { "name": "my-bare-module" * , "version": "1.2.3" * , "description": "etc...." } **/ // or... /**package { "name": "my-bare-module" , "version": "1.2.3" , "description": "etc...." } **/ ``` The important thing is that it starts with `/**package`, and ends with `**/`. If the package.json file exists, then the index.js is not parsed. 
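As a quick sketch of how that fallback gets used (the module name and path below are hypothetical, and this assumes the `readJson(file, cb)` call form documented above):

```javascript
// /path/to/my-bare-module has an index.js containing a /**package ... **/
// comment like the one shown above, but no package.json on disk.
var readJson = require('read-package-json')

readJson('/path/to/my-bare-module/package.json', function (er, data) {
  if (er) return console.error(er)
  // data was recovered from the package comment in index.js
  console.log(data.name, data.version) // my-bare-module 1.2.3
})
```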
### `{directories.man}/*.[0-9]` If there is not already a `man` field defined as an array of files or a single file, and there is a `directories.man` field defined, then that directory will be searched for manpages. Any valid manpages found in that directory will be assigned to the `man` array, and installed in the appropriate man directory at package install time, when installed globally on a Unix system. ### `{directories.bin}/*` If there is not already a `bin` field defined as a string filename or a hash of `<name> : <filename>` pairs, then the `directories.bin` directory will be searched and all the files within it will be linked as executables at install time. When installing locally, npm links bins into `node_modules/.bin`, which is in the `PATH` environ when npm runs scripts. When installing globally, they are linked into `{prefix}/bin`, which is presumably in the `PATH` environment variable. ������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/.npmignore������������������000644 �000766 �000024 �00000000146 12455173731 030247� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������*.swp .*.swp .DS_Store *~ .project .settings npm-debug.log coverage.html .idea lib-cov node_modules ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/LICENSE���������������������000644 �000766 �000024 �00000001355 12455173731 027260� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/node_modules/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/package.json

{ "name": "read-installed", "description": "Read all the installed packages in a folder, and return a tree structure with all the data.", "version": "3.1.5", "repository": { "type": "git", "url": "git://github.com/isaacs/read-installed" }, "main": "read-installed.js", "scripts": { "test": "tap ./test/*.js" }, "dependencies": { "debuglog": "^1.0.1", "read-package-json": "1", "readdir-scoped-modules": "^1.0.0", "semver": "2 || 3 || 4", "slide": "~1.1.3", "util-extend": "^1.0.1", "graceful-fs": "2 || 3" }, "optionalDependencies": { "graceful-fs": "2 || 3" }, "author": { "name": "Isaac Z.
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "devDependencies": { "mkdirp": "^0.5.0", "rimraf": "^2.2.8", "tap": "~0.4.8" }, "readme": "# read-installed\n\nRead all the installed packages in a folder, and return a tree\nstructure with all the data.\n\nnpm uses this.\n\n## 2.0.0\n\nBreaking changes in `2.0.0`:\n\nThe second argument is now an `Object` that contains the following keys:\n\n * `depth` optional, defaults to Infinity\n * `log` optional log Function\n * `dev` optional, default false, set to true to include devDependencies\n\n## Usage\n\n```javascript\nvar readInstalled = require(\"read-installed\")\n// optional options\nvar options = { dev: false, log: fn, depth: 2 }\nreadInstalled(folder, options, function (er, data) {\n ...\n})\n```\n", "readmeFilename": "README.md", "gitHead": "577c3f3f4f1e435f9bd944b8f99ce3f7552709ef", "bugs": { "url": "https://github.com/isaacs/read-installed/issues" }, "homepage": "https://github.com/isaacs/read-installed", "_id": "read-installed@3.1.5", "_shasum": "4ae36081afd3e2204dc2e279807aaa52c30c8c0c", "_from": "read-installed@>=3.1.5 <3.2.0" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/read-installed.js�����������000644 �000766 �000024 �00000025733 12455173731 031507� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // Walk through the file-system "database" of installed // packages, and create a data object related to the // installed versions of each package. /* This will traverse through all node_modules folders, resolving the dependencies object to the object corresponding to the package that meets that dep, or just the version/range if unmet. Assuming that you had this folder structure: /path/to +-- package.json { name = "root" } `-- node_modules +-- foo {bar, baz, asdf} | +-- node_modules | +-- bar { baz } | `-- baz `-- asdf where "foo" depends on bar, baz, and asdf, bar depends on baz, and bar and baz are bundled with foo, whereas "asdf" is at the higher level (sibling to foo), you'd get this object structure: { <package.json data> , path: "/path/to" , parent: null , dependencies: { foo : { version: "1.2.3" , path: "/path/to/node_modules/foo" , parent: <Circular: root> , dependencies: { bar: { parent: <Circular: foo> , path: "/path/to/node_modules/foo/node_modules/bar" , version: "2.3.4" , dependencies: { baz: <Circular: foo.dependencies.baz> } } , baz: { ... } , asdf: <Circular: asdf> } } , asdf: { ... } } } Unmet deps are left as strings. 
Extraneous deps are marked with extraneous:true deps that don't meet a requirement are marked with invalid:true deps that don't meet a peer requirement are marked with peerInvalid:true to READ(packagefolder, parentobj, name, reqver) obj = read package.json installed = ./node_modules/* if parentobj is null, and no package.json obj = {dependencies:{<installed>:"*"}} deps = Object.keys(obj.dependencies) obj.path = packagefolder obj.parent = parentobj if name, && obj.name !== name, obj.invalid = true if reqver, && obj.version !satisfies reqver, obj.invalid = true if !reqver && parentobj, obj.extraneous = true for each folder in installed obj.dependencies[folder] = READ(packagefolder+node_modules+folder, obj, folder, obj.dependencies[folder]) # walk tree to find unmet deps for each dep in obj.dependencies not in installed r = obj.parent while r if r.dependencies[dep] if r.dependencies[dep].verion !satisfies obj.dependencies[dep] WARN r.dependencies[dep].invalid = true obj.dependencies[dep] = r.dependencies[dep] r = null else r = r.parent return obj TODO: 1. Find unmet deps in parent directories, searching as node does up as far as the left-most node_modules folder. 2. Ignore anything in node_modules that isn't a package folder. */ try { var fs = require("graceful-fs") } catch (er) { var fs = require("fs") } var path = require("path") var asyncMap = require("slide").asyncMap var semver = require("semver") var readJson = require("read-package-json") var url = require("url") var util = require("util") var extend = require("util-extend") var debug = require("debuglog")("read-installed") var readdir = require("readdir-scoped-modules") module.exports = readInstalled function readInstalled (folder, opts, cb) { if (typeof opts === 'function') { cb = opts opts = {} } else { opts = extend({}, opts) } if (typeof opts.depth !== 'number') opts.depth = Infinity opts.depth = Math.max(0, opts.depth) if (typeof opts.log !== 'function') opts.log = function () {} opts.dev = !!opts.dev opts.realpathSeen = {} opts.findUnmetSeen = [] readInstalled_(folder, null, null, null, 0, opts, function (er, obj) { if (er) return cb(er) // now obj has all the installed things, where they're installed // figure out the inheritance links, now that the object is built. resolveInheritance(obj, opts) obj.root = true unmarkExtraneous(obj, opts) cb(null, obj) }) } function readInstalled_ (folder, parent, name, reqver, depth, opts, cb) { var installed , obj , real , link , realpathSeen = opts.realpathSeen readdir(path.resolve(folder, "node_modules"), function (er, i) { // error indicates that nothing is installed here if (er) i = [] installed = i.filter(function (f) { return f.charAt(0) !== "." 
}) next() }) readJson(path.resolve(folder, "package.json"), function (er, data) { obj = copy(data) if (!parent) { obj = obj || true er = null } return next(er) }) fs.lstat(folder, function (er, st) { if (er) { if (!parent) real = true return next(er) } fs.realpath(folder, function (er, rp) { debug("realpath(%j) = %j", folder, rp) real = rp if (st.isSymbolicLink()) link = rp next(er) }) }) var errState = null , called = false function next (er) { if (errState) return if (er) { errState = er return cb(null, []) } debug('next', installed, obj && typeof obj, name, real) if (!installed || !obj || !real || called) return called = true if (realpathSeen[real]) return cb(null, realpathSeen[real]) if (obj === true) { obj = {dependencies:{}, path:folder} installed.forEach(function (i) { obj.dependencies[i] = "*" }) } if (name && obj.name !== name) obj.invalid = true obj.realName = name || obj.name obj.dependencies = obj.dependencies || {} // At this point, figure out what dependencies we NEED to get met obj._dependencies = copy(obj.dependencies) // "foo":"http://blah" and "foo":"latest" are always presumed valid if (reqver && semver.validRange(reqver, true) && !semver.satisfies(obj.version, reqver, true)) { obj.invalid = true } // Mark as extraneous at this point. // This will be un-marked in unmarkExtraneous, where we mark as // not-extraneous everything that is required in some way from // the root object. obj.extraneous = true obj.path = obj.path || folder obj.realPath = real obj.link = link if (parent && !obj.link) obj.parent = parent realpathSeen[real] = obj obj.depth = depth //if (depth >= opts.depth) return cb(null, obj) asyncMap(installed, function (pkg, cb) { var rv = obj.dependencies[pkg] if (!rv && obj.devDependencies && opts.dev) rv = obj.devDependencies[pkg] if (depth > opts.depth) { obj.dependencies = {} return cb(null, obj) } readInstalled_( path.resolve(folder, "node_modules/"+pkg) , obj, pkg, obj.dependencies[pkg], depth + 1, opts , cb ) }, function (er, installedData) { if (er) return cb(er) installedData.forEach(function (dep) { obj.dependencies[dep.realName] = dep }) // any strings here are unmet things. however, if it's // optional, then that's fine, so just delete it. if (obj.optionalDependencies) { Object.keys(obj.optionalDependencies).forEach(function (dep) { if (typeof obj.dependencies[dep] === "string") { delete obj.dependencies[dep] } }) } return cb(null, obj) }) } } // starting from a root object, call findUnmet on each layer of children var riSeen = [] function resolveInheritance (obj, opts) { if (typeof obj !== "object") return if (riSeen.indexOf(obj) !== -1) return riSeen.push(obj) if (typeof obj.dependencies !== "object") { obj.dependencies = {} } Object.keys(obj.dependencies).forEach(function (dep) { findUnmet(obj.dependencies[dep], opts) }) Object.keys(obj.dependencies).forEach(function (dep) { if (typeof obj.dependencies[dep] === "object") { resolveInheritance(obj.dependencies[dep], opts) } else { debug("unmet dep! %s %s@%s", obj.name, dep, obj.dependencies[dep]) } }) findUnmet(obj, opts) } // find unmet deps by walking up the tree object. 
// No I/O function findUnmet (obj, opts) { var findUnmetSeen = opts.findUnmetSeen if (findUnmetSeen.indexOf(obj) !== -1) return findUnmetSeen.push(obj) debug("find unmet parent=%s obj=", obj.parent && obj.parent.name, obj.name || obj) var deps = obj.dependencies = obj.dependencies || {} debug(deps) Object.keys(deps) .filter(function (d) { return typeof deps[d] === "string" }) .forEach(function (d) { var found = findDep(obj, d) debug("finding dep %j", d, found && found.name || found) // "foo":"http://blah" and "foo":"latest" are always presumed valid if (typeof deps[d] === "string" && semver.validRange(deps[d], true) && found && !semver.satisfies(found.version, deps[d], true)) { // the bad thing will happen opts.log( "unmet dependency" , obj.path + " requires "+d+"@'"+deps[d] + "' but will load\n" + found.path+",\nwhich is version "+found.version ) found.invalid = true } if (found) { deps[d] = found } }) var peerDeps = obj.peerDependencies = obj.peerDependencies || {} Object.keys(peerDeps).forEach(function (d) { var dependency if (!obj.parent) { dependency = obj.dependencies[d] // read it as a missing dep if (!dependency) { obj.dependencies[d] = peerDeps[d] } } else { var r = obj.parent while (r && !dependency) { dependency = r.dependencies && r.dependencies[d] r = r.link ? null : r.parent } } if (!dependency) { // mark as a missing dep! obj.dependencies[d] = peerDeps[d] } else if (!semver.satisfies(dependency.version, peerDeps[d], true)) { dependency.peerInvalid = true } }) return obj } function unmarkExtraneous (obj, opts) { // Mark all non-required deps as extraneous. // start from the root object and mark as non-extraneous all modules // that haven't been previously flagged as extraneous then propagate // to all their dependencies obj.extraneous = false var deps = obj._dependencies || [] if (opts.dev && obj.devDependencies && (obj.root || obj.link)) { Object.keys(obj.devDependencies).forEach(function (k) { deps[k] = obj.devDependencies[k] }) } if (obj.peerDependencies) { Object.keys(obj.peerDependencies).forEach(function (k) { deps[k] = obj.peerDependencies[k] }) } debug("not extraneous", obj._id, deps) Object.keys(deps).forEach(function (d) { var dep = findDep(obj, d) if (dep && dep.extraneous) { unmarkExtraneous(dep, opts) } }) } // Find the one that will actually be loaded by require() // so we can make sure it's valid etc. function findDep (obj, d) { var r = obj , found = null while (r && !found) { // if r is a valid choice, then use that. // kinda weird if a pkg depends on itself, but after the first // iteration of this loop, it indicates a dep cycle. if (typeof r.dependencies[d] === "object") { found = r.dependencies[d] } if (!found && r.realName === d) found = r r = r.link ? 
null : r.parent } return found } function copy (obj) { if (!obj || typeof obj !== 'object') return obj if (Array.isArray(obj)) return obj.map(copy) var o = {} for (var i in obj) o[i] = copy(obj[i]) return o } �������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/README.md�������������������000644 �000766 �000024 �00000001120 12455173731 027520� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# read-installed Read all the installed packages in a folder, and return a tree structure with all the data. npm uses this. ## 2.0.0 Breaking changes in `2.0.0`: The second argument is now an `Object` that contains the following keys: * `depth` optional, defaults to Infinity * `log` optional log Function * `dev` optional, default false, set to true to include devDependencies ## Usage ```javascript var readInstalled = require("read-installed") // optional options var options = { dev: false, log: fn, depth: 2 } readInstalled(folder, options, function (er, data) { ... }) ``` ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/node_modules/debuglog/������000755 �000766 �000024 �00000000000 12456115117 032507� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/���������������000755 �000766 �000024 �00000000000 12456115117 035173� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read-installed/node_modules/util-extend/���000755 �000766 �000024 �00000000000 12456115117 033161� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/util-extend/extend.js�����������������000644 �000766 �000024 �00000000436 12455173731 034737� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = extend; function extend(origin, add) { // Don't do anything if add isn't an object if (!add || typeof add !== 'object') return origin; var keys = Object.keys(add); var i = keys.length; while (i--) { origin[keys[i]] = add[keys[i]]; } return origin; } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/util-extend/package.json��������������000644 �000766 �000024 �00000002473 12455173731 035403� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "util-extend", "version": "1.0.1", "description": "Node's internal object extension function", "main": "extend.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/util-extend" }, "author": "", "license": "MIT", "readmeFilename": "README.md", "readme": "# util-extend\n\nThe Node object extending function that Node uses for Node!\n\n## Usage\n\n```js\nvar extend = require('util-extend');\nfunction functionThatTakesOptions(options) {\n var options = extend(defaults, options);\n // now any unset options are set to the defaults.\n}\n```\n", "bugs": { "url": "https://github.com/isaacs/util-extend/issues" }, "_id": "util-extend@1.0.1", "dist": { "shasum": "bb703b79480293ddcdcfb3c6a9fea20f483415bc", "tarball": "http://registry.npmjs.org/util-extend/-/util-extend-1.0.1.tgz" }, "_from": "util-extend@>=1.0.1 <2.0.0", "_npmVersion": "1.3.4", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "bb703b79480293ddcdcfb3c6a9fea20f483415bc", "_resolved": "https://registry.npmjs.org/util-extend/-/util-extend-1.0.1.tgz", "homepage": "https://github.com/isaacs/util-extend" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/util-extend/README.md�����������������000644 �000766 �000024 �00000000423 12455173731 034365� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# util-extend The Node object extending function that Node uses for Node! 
## Usage ```js var extend = require('util-extend'); function functionThatTakesOptions(options) { var options = extend(defaults, options); // now any unset options are set to the defaults. } ``` ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/util-extend/test.js�������������������000644 �000766 �000024 �00000000705 12455173731 034426� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var assert = require('assert'); var extend = require('./'); assert.deepEqual(extend({a:1}), {a:1}); assert.deepEqual(extend({a:1}, []), {a:1}); assert.deepEqual(extend({a:1}, null), {a:1}); assert.deepEqual(extend({a:1}, true), {a:1}); assert.deepEqual(extend({a:1}, false), {a:1}); assert.deepEqual(extend({a:1}, {b:2}), {a:1, b:2}); assert.deepEqual(extend({a:1, b:2}, {b:3}), {a:1, b:3}); console.log('ok'); �����������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/.eslintrc������000644 �000766 �000024 �00000000557 12455173731 037033� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "env" : { "node" : true }, "rules" : { "semi": [2, "never"], "strict": 0, "quotes": [1, "double", "avoid-escape"], "no-use-before-define": 0, "curly": 0, "no-underscore-dangle": 0, "no-lonely-if": 1, "no-unused-vars": [2, {"vars" : "all", "args" : "after-used"}], "no-mixed-requires": 0, "space-infix-ops": 0 } } �������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/LICENSE��������000644 �000766 �000024 �00000001375 12455173731 036213� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/package.json���000644 �000766 �000024 �00000002617 12455173731 037474� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "readdir-scoped-modules", "version": "1.0.1", "description": "Like `fs.readdir` but handling `@org/module` dirs as if they were a single entry.", "main": "readdir.js", "directories": { "test": "test" }, "dependencies": { "debuglog": "^1.0.1", "dezalgo": "^1.0.0", "graceful-fs": "^3.0.4", "once": "^1.3.0" }, "devDependencies": { "tap": "0.4" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/readdir-scoped-modules" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/readdir-scoped-modules/issues" }, "homepage": "https://github.com/npm/readdir-scoped-modules", "readme": "# readdir-scoped-modules\n\nLike `fs.readdir` but handling `@org/module` dirs as if they were\na single entry.\n\nUsed by npm.\n\n## USAGE\n\n```javascript\nvar readdir = require('readdir-scoped-modules')\n\nreaddir('node_modules', function (er, entries) {\n // entries will be something like\n // ['a', '@org/foo', '@org/bar']\n})\n```\n", "readmeFilename": "README.md", "gitHead": "451d38946c5b6b6c0db33a890f33536a11ed79f7", "_id": "readdir-scoped-modules@1.0.1", "_shasum": "5c2a77f3e08250a8fddf53fa58cdc17900b808b9", "_from": "readdir-scoped-modules@>=1.0.0 <2.0.0" } �����������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/readdir.js�����000644 �000766 �000024 �00000003335 12455173731 037154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var fs = require ('graceful-fs') var dz = require ('dezalgo') var once = require ('once') var path = require ('path') var debug = require ('debuglog') ('rds') module . exports = readdir function readdir (dir, cb) { fs . readdir (dir, function (er, kids) { if (er) return cb (er) debug ('dir=%j, kids=%j', dir, kids) readScopes (dir, kids, function (er, data) { if (er) return cb (er) // Sort for bonus consistency points data = data . sort (function (a, b) { return a > b ? 
1 : -1 }) return cb (null, data) }) }) } // Turn [ 'a', '@scope' ] into // ['a', '@scope/foo', '@scope/bar'] function readScopes (root, kids, cb) { var scopes = kids . filter (function (kid) { return kid . charAt (0) === '@' }) kids = kids . filter (function (kid) { return kid . charAt (0) !== '@' }) debug ('scopes=%j', scopes) if (scopes . length === 0) dz (cb) (null, kids) // prevent maybe-sync zalgo release cb = once (cb) var l = scopes . length scopes . forEach (function (scope) { var scopedir = path . resolve (root, scope) debug ('root=%j scope=%j scopedir=%j', root, scope, scopedir) fs . readdir (scopedir, then . bind (null, scope)) }) function then (scope, er, scopekids) { if (er) return cb (er) // XXX: Not sure how old this node bug is. Maybe superstition? scopekids = scopekids . filter (function (scopekid) { return !(scopekid === '.' || scopekid === '..' || !scopekid) }) kids . push . apply (kids, scopekids . map (function (scopekid) { return scope + '/' + scopekid })) debug ('scope=%j scopekids=%j kids=%j', scope, scopekids, kids) if (--l === 0) cb (null, kids) } } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/readdir-scoped-modules/README.md������000644 �000766 �000024 �00000000503 12455173731 036455� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# readdir-scoped-modules Like `fs.readdir` but handling `@org/module` dirs as if they were a single entry. Used by npm. 
## USAGE ```javascript var readdir = require('readdir-scoped-modules') readdir('node_modules', function (er, entries) { // entries will be something like // ['a', '@org/foo', '@org/bar'] }) ``` ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/debuglog/debuglog.js������������������000644 �000766 �000024 �00000001052 12455173731 034561� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var util = require('util'); module.exports = (util && util.debuglog) || debuglog; var debugs = {}; var debugEnviron = process.env.NODE_DEBUG || ''; function debuglog(set) { set = set.toUpperCase(); if (!debugs[set]) { if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { var pid = process.pid; debugs[set] = function() { var msg = util.format.apply(exports, arguments); console.error('%s %d: %s', set, pid, msg); }; } else { debugs[set] = function() {}; } } return debugs[set]; }; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/debuglog/LICENSE����������������������000644 �000766 �000024 �00000002111 12455173731 033435� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/debuglog/package.json�����������������000644 �000766 �000024 �00000002210 12455173731 034716� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "debuglog", "version": "1.0.1", "description": "backport of util.debuglog from node v0.11", "license": "MIT", "main": "debuglog.js", "repository": { "type": "git", "url": "https://github.com/sam-github/node-debuglog.git" }, "author": { "name": "Sam Roberts", "email": "sam@strongloop.com" }, "engines": { "node": "*" }, "browser": { "util": false }, "bugs": { "url": "https://github.com/sam-github/node-debuglog/issues" }, "homepage": "https://github.com/sam-github/node-debuglog", "_id": "debuglog@1.0.1", "dist": { "shasum": "aa24ffb9ac3df9a2351837cfb2d279360cd78492", "tarball": "http://registry.npmjs.org/debuglog/-/debuglog-1.0.1.tgz" }, "_from": "debuglog@>=1.0.1 <2.0.0", "_npmVersion": "1.4.3", "_npmUser": { "name": "octet", "email": "sam@strongloop.com" }, "maintainers": [ { "name": "octet", "email": "sam@strongloop.com" } ], "directories": {}, "_shasum": "aa24ffb9ac3df9a2351837cfb2d279360cd78492", "_resolved": "https://registry.npmjs.org/debuglog/-/debuglog-1.0.1.tgz", "readme": "ERROR: No README data found!" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/read-installed/node_modules/debuglog/README.md��������������������000644 �000766 �000024 �00000002406 12455173731 033716� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# debuglog - backport of util.debuglog() from node v0.11 To facilitate using the `util.debuglog()` function that will be available when node v0.12 is released now, this is a copy extracted from the source. ## require('debuglog') Return `util.debuglog`, if it exists, otherwise it will return an internal copy of the implementation from node v0.11. ## debuglog(section) * `section` {String} The section of the program to be debugged * Returns: {Function} The logging function This is used to create a function which conditionally writes to stderr based on the existence of a `NODE_DEBUG` environment variable. 
If the `section` name appears in that environment variable, then the returned function will be similar to `console.error()`. If not, then the returned function is a no-op. For example: ```javascript var debuglog = util.debuglog('foo'); var bar = 123; debuglog('hello from foo [%d]', bar); ``` If this program is run with `NODE_DEBUG=foo` in the environment, then it will output something like: FOO 3245: hello from foo [123] where `3245` is the process id. If it is not run with that environment variable set, then it will not print anything. You may separate multiple `NODE_DEBUG` environment variables with a comma. For example, `NODE_DEBUG=fs,net,tls`. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/.npmignore����������������������������000644 �000766 �000024 �00000000033 12455173731 026265� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-debug.log node_modules �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/example/������������������������������000755 �000766 �000024 �00000000000 12456115117 025720� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/lib/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025033� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/LICENCE�������������������������������000644 �000766 �000024 �00000002446 12455173731 025265� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter All rights reserved. 
The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/�������������������������000755 �000766 �000024 �00000000000 12456115117 026742� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/package.json��������������������������000644 �000766 �000024 �00000004633 12455173731 026566� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "read", "version": "1.0.5", "main": "lib/read.js", "dependencies": { "mute-stream": "~0.0.4" }, "devDependencies": { "tap": "*" }, "engines": { "node": ">=0.8" }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "description": "read(1) for node programs", "repository": { "type": "git", "url": "git://github.com/isaacs/read.git" }, "license": "BSD", "scripts": { "test": "tap test/*.js" }, "readme": "## read\n\nFor reading user input from stdin.\n\nSimilar to the `readline` builtin's `question()` method, but with a\nfew more features.\n\n## USAGE\n\n```javascript\nvar read = require(\"read\")\nread(options, callback)\n```\n\nThe callback gets called with either the user input, or the default\nspecified, or an error, as `callback(error, result, isDefault)`\nnode style.\n\n## OPTIONS\n\nEvery option is optional.\n\n* `prompt` What to write to stdout before reading input.\n* `silent` Don't echo the output as the user types it.\n* `replace` Replace silenced characters with the supplied character value.\n* `timeout` Number of ms to wait for user input before giving up.\n* `default` The default value if the user enters nothing.\n* `edit` Allow the user to edit the default value.\n* `terminal` Treat the output as a TTY, whether it is or not.\n* `input` Readable stream to get input data from. (default `process.stdin`)\n* `output` Writeable stream to write prompts to. (default: `process.stdout`)\n\nIf silent is true, and the input is a TTY, then read will set raw\nmode, and read character by character.\n\n## COMPATIBILITY\n\nThis module works sort of with node 0.6. It does not work with node\nversions less than 0.6. It is best on node 0.8.\n\nOn node version 0.6, it will remove all listeners on the input\nstream's `data` and `keypress` events, because the readline module did\nnot fully clean up after itself in that version of node, and did not\nmake it possible to clean up after it in a way that has no potential\nfor side effects.\n\nAdditionally, some of the readline options (like `terminal`) will not\nfunction in versions of node before 0.8, because they were not\nimplemented in the builtin readline module.\n\n## CONTRIBUTING\n\nPatches welcome.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/read/issues" }, "_id": "read@1.0.5", "_from": "read@latest" } �����������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/README.md�����������������������������000644 �000766 �000024 �00000003271 12455173731 025554� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������## read For reading user input from stdin. Similar to the `readline` builtin's `question()` method, but with a few more features. ## USAGE ```javascript var read = require("read") read(options, callback) ``` The callback gets called with either the user input, or the default specified, or an error, as `callback(error, result, isDefault)` node style. ## OPTIONS Every option is optional. * `prompt` What to write to stdout before reading input. * `silent` Don't echo the output as the user types it. * `replace` Replace silenced characters with the supplied character value. * `timeout` Number of ms to wait for user input before giving up. * `default` The default value if the user enters nothing. * `edit` Allow the user to edit the default value. 
* `terminal` Treat the output as a TTY, whether it is or not. * `input` Readable stream to get input data from. (default `process.stdin`) * `output` Writeable stream to write prompts to. (default: `process.stdout`) If silent is true, and the input is a TTY, then read will set raw mode, and read character by character. ## COMPATIBILITY This module works sort of with node 0.6. It does not work with node versions less than 0.6. It is best on node 0.8. On node version 0.6, it will remove all listeners on the input stream's `data` and `keypress` events, because the readline module did not fully clean up after itself in that version of node, and did not make it possible to clean up after it in a way that has no potential for side effects. Additionally, some of the readline options (like `terminal`) will not function in versions of node before 0.8, because they were not implemented in the builtin readline module. ## CONTRIBUTING Patches welcome. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/rs.js���������������������������������000644 �000766 �000024 �00000000166 12455173731 025257� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var read = require('read'); read({ silent: true, prompt: 'stars: ' }, function(er, data) { console.log(er, data) }) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/�������������000755 �000766 �000024 �00000000000 12456115117 031205� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/LICENSE������000644 �000766 �000024 �00000002436 12455173731 032224� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. 
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/mute.js������000644 �000766 �000024 �00000006430 12455173731 032525� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var Stream = require('stream') module.exports = MuteStream // var out = new MuteStream(process.stdout) // argument auto-pipes function MuteStream (opts) { Stream.apply(this) opts = opts || {} this.writable = this.readable = true this.muted = false this.on('pipe', this._onpipe) this.replace = opts.replace // For readline-type situations // This much at the start of a line being redrawn after a ctrl char // is seen (such as backspace) won't be redrawn as the replacement this._prompt = opts.prompt || null this._hadControl = false } MuteStream.prototype = Object.create(Stream.prototype) Object.defineProperty(MuteStream.prototype, 'constructor', { value: MuteStream, enumerable: false }) MuteStream.prototype.mute = function () { this.muted = true } MuteStream.prototype.unmute = function () { this.muted = false } Object.defineProperty(MuteStream.prototype, '_onpipe', { value: onPipe, enumerable: false, writable: true, configurable: true }) function onPipe (src) { this._src = src } Object.defineProperty(MuteStream.prototype, 'isTTY', { get: getIsTTY, set: setIsTTY, enumerable: true, configurable: true }) function getIsTTY () { return( (this._dest) ? this._dest.isTTY : (this._src) ? this._src.isTTY : false ) } // basically just get replace the getter/setter with a regular value function setIsTTY (isTTY) { Object.defineProperty(this, 'isTTY', { value: isTTY, enumerable: true, writable: true, configurable: true }) } Object.defineProperty(MuteStream.prototype, 'rows', { get: function () { return( this._dest ? this._dest.rows : this._src ? this._src.rows : undefined ) }, enumerable: true, configurable: true }) Object.defineProperty(MuteStream.prototype, 'columns', { get: function () { return( this._dest ? 
this._dest.columns : this._src ? this._src.columns : undefined ) }, enumerable: true, configurable: true }) MuteStream.prototype.pipe = function (dest) { this._dest = dest return Stream.prototype.pipe.call(this, dest) } MuteStream.prototype.pause = function () { if (this._src) return this._src.pause() } MuteStream.prototype.resume = function () { if (this._src) return this._src.resume() } MuteStream.prototype.write = function (c) { if (this.muted) { if (!this.replace) return true if (c.match(/^\u001b/)) { this._hadControl = true return this.emit('data', c) } else { if (this._prompt && this._hadControl && c.indexOf(this._prompt) === 0) { this._hadControl = false this.emit('data', this._prompt) c = c.substr(this._prompt.length) } c = c.toString().replace(/./g, this.replace) } } this.emit('data', c) } MuteStream.prototype.end = function (c) { if (this.muted) { if (c && this.replace) { c = c.toString().replace(/./g, this.replace) } else { c = null } } if (c) this.emit('data', c) this.emit('end') } function proxy (fn) { return function () { var d = this._dest var s = this._src if (d && d[fn]) d[fn].apply(d, arguments) if (s && s[fn]) s[fn].apply(s, arguments) }} MuteStream.prototype.destroy = proxy('destroy') MuteStream.prototype.destroySoon = proxy('destroySoon') MuteStream.prototype.close = proxy('close') ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/package.json�000644 �000766 �000024 �00000004645 12455173731 033511� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "mute-stream", "version": "0.0.4", "main": "mute.js", "directories": { "test": "test" }, "devDependencies": { "tap": "~0.2.5" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/mute-stream" }, "keywords": [ "mute", "stream", "pipe" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "BSD", "description": "Bytes go in, but they don't come out (when muted).", "readme": "# mute-stream\n\nBytes go in, but they don't come out (when muted).\n\nThis is a basic pass-through stream, but when muted, the bytes are\nsilently dropped, rather than being passed through.\n\n## Usage\n\n```javascript\nvar MuteStream = require('mute-stream')\n\nvar ms = new MuteStream(options)\n\nms.pipe(process.stdout)\nms.write('foo') // writes 'foo' to stdout\nms.mute()\nms.write('bar') // does not write 'bar'\nms.unmute()\nms.write('baz') // writes 'baz' to stdout\n\n// can also be used to mute incoming data\nvar ms = new MuteStream\ninput.pipe(ms)\n\nms.on('data', function (c) {\n console.log('data: ' + c)\n})\n\ninput.emit('data', 'foo') // logs 'foo'\nms.mute()\ninput.emit('data', 'bar') // does not log 'bar'\nms.unmute()\ninput.emit('data', 'baz') // logs 'baz'\n```\n\n## Options\n\nAll options are optional.\n\n* `replace` Set to a string to replace each character with the\n specified string when muted. 
(So you can show `****` instead of the\n password, for example.)\n\n* `prompt` If you are using a replacement char, and also using a\n prompt with a readline stream (as for a `Password: *****` input),\n then specify what the prompt is so that backspace will work\n properly. Otherwise, pressing backspace will overwrite the prompt\n with the replacement character, which is weird.\n\n## ms.mute()\n\nSet `muted` to `true`. Turns `.write()` into a no-op.\n\n## ms.unmute()\n\nSet `muted` to `false`\n\n## ms.isTTY\n\nTrue if the pipe destination is a TTY, or if the incoming pipe source is\na TTY.\n\n## Other stream methods...\n\nThe other standard readable and writable stream methods are all\navailable. The MuteStream object acts as a facade to its pipe source\nand destination.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/mute-stream/issues" }, "_id": "mute-stream@0.0.4", "_from": "mute-stream@~0.0.4" } �������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/node_modules/mute-stream/README.md����000644 �000766 �000024 �00000003165 12455173731 032476� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# mute-stream Bytes go in, but they don't come out (when muted). This is a basic pass-through stream, but when muted, the bytes are silently dropped, rather than being passed through. ## Usage ```javascript var MuteStream = require('mute-stream') var ms = new MuteStream(options) ms.pipe(process.stdout) ms.write('foo') // writes 'foo' to stdout ms.mute() ms.write('bar') // does not write 'bar' ms.unmute() ms.write('baz') // writes 'baz' to stdout // can also be used to mute incoming data var ms = new MuteStream input.pipe(ms) ms.on('data', function (c) { console.log('data: ' + c) }) input.emit('data', 'foo') // logs 'foo' ms.mute() input.emit('data', 'bar') // does not log 'bar' ms.unmute() input.emit('data', 'baz') // logs 'baz' ``` ## Options All options are optional. * `replace` Set to a string to replace each character with the specified string when muted. (So you can show `****` instead of the password, for example.) * `prompt` If you are using a replacement char, and also using a prompt with a readline stream (as for a `Password: *****` input), then specify what the prompt is so that backspace will work properly. Otherwise, pressing backspace will overwrite the prompt with the replacement character, which is weird. ## ms.mute() Set `muted` to `true`. Turns `.write()` into a no-op. ## ms.unmute() Set `muted` to `false` ## ms.isTTY True if the pipe destination is a TTY, or if the incoming pipe source is a TTY. ## Other stream methods... The other standard readable and writable stream methods are all available. The MuteStream object acts as a facade to its pipe source and destination. 
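The password-masking case that the `replace` and `prompt` options describe is easiest to see in a short sketch. This is not part of the package itself; it is a minimal example that only uses the documented API above:

```javascript
var MuteStream = require('mute-stream')

// Echo '*' for every character while muted, and tell the stream what the
// prompt looks like so a readline redraw doesn't get masked along with it.
var ms = new MuteStream({ replace: '*', prompt: 'Password: ' })
ms.pipe(process.stdout)

ms.write('Password: ') // unmuted: written through as-is
ms.mute()
ms.write('hunter2')    // muted: shows ******* instead of the input
ms.unmute()
ms.write('\n')
```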
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/lib/read.js���������������������������000644 �000766 �000024 �00000004577 12455173731 026326� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = read var readline = require('readline') var Mute = require('mute-stream') function read (opts, cb) { if (opts.num) { throw new Error('read() no longer accepts a char number limit') } if (typeof opts.default !== 'undefined' && typeof opts.default !== 'string' && typeof opts.default !== 'number') { throw new Error('default value must be string or number') } var input = opts.input || process.stdin var output = opts.output || process.stdout var prompt = (opts.prompt || '').trim() + ' ' var silent = opts.silent var editDef = false var timeout = opts.timeout var def = opts.default || '' if (def) { if (silent) { prompt += '(<default hidden>) ' } else if (opts.edit) { editDef = true } else { prompt += '(' + def + ') ' } } var terminal = !!(opts.terminal || output.isTTY) var m = new Mute({ replace: opts.replace, prompt: prompt }) m.pipe(output, {end: false}) output = m var rlOpts = { input: input, output: output, terminal: terminal } if (process.version.match(/^v0\.6/)) { var rl = readline.createInterface(rlOpts.input, rlOpts.output) } else { var rl = readline.createInterface(rlOpts) } output.unmute() rl.setPrompt(prompt) rl.prompt() if (silent) { output.mute() } else if (editDef) { rl.line = def rl.cursor = def.length rl._refreshLine() } var called = false rl.on('line', onLine) rl.on('error', onError) rl.on('SIGINT', function () { rl.close() onError(new Error('canceled')) }) var timer if (timeout) { timer = setTimeout(function () { onError(new Error('timed out')) }, timeout) } function done () { called = true rl.close() if (process.version.match(/^v0\.6/)) { rl.input.removeAllListeners('data') rl.input.removeAllListeners('keypress') rl.input.pause() } clearTimeout(timer) output.mute() output.end() } function onError (er) { if (called) return done() return cb(er) } function onLine (line) { if (called) return if (silent && terminal) { output.unmute() output.write('\r\n') } done() // truncate the \n at the end. 
line = line.replace(/\r?\n$/, '') var isDefault = !!(editDef && line === def) if (def && !line) { isDefault = true line = def } cb(null, line, isDefault) } } ���������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/read/example/example.js��������������������000644 �000766 �000024 �00000001010 12455173731 027706� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var read = require("../lib/read.js") read({prompt: "Username: ", default: "test-user" }, function (er, user) { read({prompt: "Password: ", default: "test-pass", silent: true }, function (er, pass) { read({prompt: "Password again: ", default: "test-pass", silent: true }, function (er, pass2) { console.error({user: user, pass: pass, verify: pass2, passMatch: (pass === pass2)}) console.error("the program should exit now") }) }) }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/lib/������������������������000755 �000766 �000024 �00000000000 12456115117 026736� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/LICENSE.txt�����������������000644 �000766 �000024 �00000001355 12455173731 030024� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright © 2013–2014 Domenic Denicola <domenic@domenicdenicola.com> This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar <sam@hocevar.net> Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/package.json����������������000644 �000766 �000024 �00000005070 12455173731 030465� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "path-is-inside", "description": "Tests whether one path is inside another path", "keywords": [ "path", "directory", "folder", "inside", "relative" ], "version": "1.0.1", "author": { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com", "url": "http://domenic.me" }, "license": "WTFPL", "repository": { "type": "git", "url": "git://github.com/domenic/path-is-inside.git" }, "bugs": { "url": "http://github.com/domenic/path-is-inside/issues" }, "main": "lib/path-is-inside.js", "scripts": { "test": "mocha", "lint": "jshint lib" }, "devDependencies": { "jshint": "~2.3.0", "mocha": "~1.15.1" }, "readme": "# Is This Path Inside This Other Path?\n\nIt turns out this question isn't trivial to answer using Node's built-in path APIs. A naive `indexOf`-based solution will fail sometimes on Windows, which is case-insensitive (see e.g. [isaacs/npm#4214][]). You might then think to be clever with `path.resolve`, but you have to be careful to account for situations whether the paths have different drive letters, or else you'll cause bugs like [isaacs/npm#4313][]. And let's not even get started on trailing slashes.\n\nThe **path-is-inside** package will give you a robust, cross-platform way of detecting whether a given path is inside another path.\n\n## Usage\n\nPretty simple. First the path being tested; then the potential parent. Like so:\n\n```js\nvar pathIsInside = require(\"path-is-inside\");\n\npathIsInside(\"/x/y/z\", \"/x/y\") // true\npathIsInside(\"/x/y\", \"/x/y/z\") // false\n```\n\n## OS-Specific Behavior\n\nLike Node's built-in path module, path-is-inside treats all file paths on Windows as case-insensitive, whereas it treats all file paths on *-nix operating systems as case-sensitive. 
Keep this in mind especially when working on a Mac, where, despite Node's defaults, the OS usually treats paths case-insensitively.\n\nIn practice, this means:\n\n```js\n// On Windows\n\npathIsInside(\"C:\\\\X\\\\Y\\\\Z\", \"C:\\\\x\\\\y\") // true\n\n// On *-nix, including Mac OS X\n\npathIsInside(\"/X/Y/Z\", \"/x/y\") // false\n```\n\n[isaacs/npm#4214]: https://github.com/isaacs/npm/pull/4214\n[isaacs/npm#4313]: https://github.com/isaacs/npm/issues/4313\n", "readmeFilename": "README.md", "homepage": "https://github.com/domenic/path-is-inside", "_id": "path-is-inside@1.0.1", "dist": { "shasum": "c5e6c4764c4cd41f2ac839c53be5621d085726b3" }, "_from": "path-is-inside@1.0.1", "_resolved": "https://registry.npmjs.org/path-is-inside/-/path-is-inside-1.0.1.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/README.md�������������������000644 �000766 �000024 �00000002737 12455173731 027465� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Is This Path Inside This Other Path? It turns out this question isn't trivial to answer using Node's built-in path APIs. A naive `indexOf`-based solution will fail sometimes on Windows, which is case-insensitive (see e.g. [isaacs/npm#4214][]). You might then think to be clever with `path.resolve`, but you have to be careful to account for situations whether the paths have different drive letters, or else you'll cause bugs like [isaacs/npm#4313][]. And let's not even get started on trailing slashes. The **path-is-inside** package will give you a robust, cross-platform way of detecting whether a given path is inside another path. ## Usage Pretty simple. First the path being tested; then the potential parent. Like so: ```js var pathIsInside = require("path-is-inside"); pathIsInside("/x/y/z", "/x/y") // true pathIsInside("/x/y", "/x/y/z") // false ``` ## OS-Specific Behavior Like Node's built-in path module, path-is-inside treats all file paths on Windows as case-insensitive, whereas it treats all file paths on *-nix operating systems as case-sensitive. Keep this in mind especially when working on a Mac, where, despite Node's defaults, the OS usually treats paths case-insensitively. 
In practice, this means: ```js // On Windows pathIsInside("C:\\X\\Y\\Z", "C:\\x\\y") // true // On *-nix, including Mac OS X pathIsInside("/X/Y/Z", "/x/y") // false ``` [isaacs/npm#4214]: https://github.com/isaacs/npm/pull/4214 [isaacs/npm#4313]: https://github.com/isaacs/npm/issues/4313 ���������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/path-is-inside/lib/path-is-inside.js�������000644 �000766 �000024 �00000001532 12455173731 032120� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict"; var path = require("path"); module.exports = function (thePath, potentialParent) { // For inside-directory checking, we want to allow trailing slashes, so normalize. thePath = stripTrailingSep(thePath); potentialParent = stripTrailingSep(potentialParent); // Node treats only Windows as case-insensitive in its path module; we follow those conventions. if (process.platform === "win32") { thePath = thePath.toLowerCase(); potentialParent = potentialParent.toLowerCase(); } return thePath.lastIndexOf(potentialParent, 0) === 0 && ( thePath[potentialParent.length] === path.sep || thePath[potentialParent.length] === undefined ); }; function stripTrailingSep(thePath) { if (thePath[thePath.length - 1] === path.sep) { return thePath.slice(0, -1); } return thePath; } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/osenv/LICENSE������������������������������000644 �000766 �000024 �00000002436 12455173731 025523� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/osenv/osenv.js�����������������������������000644 �000766 �000024 �00000003527 12455173731 026210� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var isWindows = process.platform === 'win32' var path = require('path') var exec = require('child_process').exec var os = require('os') // looking up envs is a bit costly. // Also, sometimes we want to have a fallback // Pass in a callback to wait for the fallback on failures // After the first lookup, always returns the same thing. function memo (key, lookup, fallback) { var fell = false var falling = false exports[key] = function (cb) { var val = lookup() if (!val && !fell && !falling && fallback) { fell = true falling = true exec(fallback, function (er, output, stderr) { falling = false if (er) return // oh well, we tried val = output.trim() }) } exports[key] = function (cb) { if (cb) process.nextTick(cb.bind(null, null, val)) return val } if (cb && !falling) process.nextTick(cb.bind(null, null, val)) return val } } memo('user', function () { return ( isWindows ? process.env.USERDOMAIN + '\\' + process.env.USERNAME : process.env.USER ) }, 'whoami') memo('prompt', function () { return isWindows ? process.env.PROMPT : process.env.PS1 }) memo('hostname', function () { return isWindows ? process.env.COMPUTERNAME : process.env.HOSTNAME }, 'hostname') memo('tmpdir', function () { return os.tmpDir() }) memo('home', function () { return ( isWindows ? process.env.USERPROFILE : process.env.HOME ) }) memo('path', function () { return (process.env.PATH || process.env.Path || process.env.path).split(isWindows ? ';' : ':') }) memo('editor', function () { return process.env.EDITOR || process.env.VISUAL || (isWindows ? 'notepad.exe' : 'vi') }) memo('shell', function () { return isWindows ? 
process.env.ComSpec || 'cmd' : process.env.SHELL || 'bash' }) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/osenv/package.json�������������������������000644 �000766 �000024 �00000004630 12455173731 027002� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "osenv", "version": "0.1.0", "main": "osenv.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.4.9" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/osenv" }, "keywords": [ "environment", "variable", "home", "tmpdir", "path", "prompt", "ps1" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "BSD", "description": "Look up environment settings specific to different operating systems", "readme": "# osenv\n\nLook up environment settings specific to different operating systems.\n\n## Usage\n\n```javascript\nvar osenv = require('osenv')\nvar path = osenv.path()\nvar user = osenv.user()\n// etc.\n\n// Some things are not reliably in the env, and have a fallback command:\nvar h = osenv.hostname(function (er, hostname) {\n h = hostname\n})\n// This will still cause it to be memoized, so calling osenv.hostname()\n// is now an immediate operation.\n\n// You can always send a cb, which will get called in the nextTick\n// if it's been memoized, or wait for the fallback data if it wasn't\n// found in the environment.\nosenv.hostname(function (er, hostname) {\n if (er) console.error('error looking up hostname')\n else console.log('this machine calls itself %s', hostname)\n})\n```\n\n## osenv.hostname()\n\nThe machine name. Calls `hostname` if not found.\n\n## osenv.user()\n\nThe currently logged-in user. Calls `whoami` if not found.\n\n## osenv.prompt()\n\nEither PS1 on unix, or PROMPT on Windows.\n\n## osenv.tmpdir()\n\nThe place where temporary files should be created.\n\n## osenv.home()\n\nNo place like it.\n\n## osenv.path()\n\nAn array of the places that the operating system will search for\nexecutables.\n\n## osenv.editor() \n\nReturn the executable name of the editor program. This uses the EDITOR\nand VISUAL environment variables, and falls back to `vi` on Unix, or\n`notepad.exe` on Windows.\n\n## osenv.shell()\n\nThe SHELL on Unix, which Windows calls the ComSpec. 
Defaults to 'bash'\nor 'cmd'.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/osenv/issues" }, "homepage": "https://github.com/isaacs/osenv", "_id": "osenv@0.1.0", "_shasum": "61668121eec584955030b9f470b1d2309504bfcb", "_from": "osenv@~0.1.0" } ��������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/osenv/README.md����������������������������000644 �000766 �000024 �00000002674 12455173731 026001� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# osenv Look up environment settings specific to different operating systems. ## Usage ```javascript var osenv = require('osenv') var path = osenv.path() var user = osenv.user() // etc. // Some things are not reliably in the env, and have a fallback command: var h = osenv.hostname(function (er, hostname) { h = hostname }) // This will still cause it to be memoized, so calling osenv.hostname() // is now an immediate operation. // You can always send a cb, which will get called in the nextTick // if it's been memoized, or wait for the fallback data if it wasn't // found in the environment. osenv.hostname(function (er, hostname) { if (er) console.error('error looking up hostname') else console.log('this machine calls itself %s', hostname) }) ``` ## osenv.hostname() The machine name. Calls `hostname` if not found. ## osenv.user() The currently logged-in user. Calls `whoami` if not found. ## osenv.prompt() Either PS1 on unix, or PROMPT on Windows. ## osenv.tmpdir() The place where temporary files should be created. ## osenv.home() No place like it. ## osenv.path() An array of the places that the operating system will search for executables. ## osenv.editor() Return the executable name of the editor program. This uses the EDITOR and VISUAL environment variables, and falls back to `vi` on Unix, or `notepad.exe` on Windows. ## osenv.shell() The SHELL on Unix, which Windows calls the ComSpec. Defaults to 'bash' or 'cmd'. ��������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/opener/LICENSE.txt�������������������������000644 �000766 �000024 �00000001355 12455173731 026476� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright © 2012–2014 Domenic Denicola <domenic@domenicdenicola.com> This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar <sam@hocevar.net> Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. 
DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/opener/opener.js���������������������������000755 �000766 �000024 �00000003572 12455173731 026507� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node "use strict"; var childProcess = require("child_process"); function opener(args, options, callback) { // http://stackoverflow.com/q/1480971/3191, but see below for Windows. var command = process.platform === "win32" ? "cmd" : process.platform === "darwin" ? "open" : "xdg-open"; if (typeof args === "string") { args = [args]; } if (typeof options === "function") { callback = options; options = {}; } if (options && typeof options === "object" && options.command) { if (process.platform === "win32") { // *always* use cmd on windows args = [options.command].concat(args); } else { command = options.command; } } if (process.platform === "win32") { // On Windows, we really want to use the "start" command. But, the rules regarding arguments with spaces, and // escaping them with quotes, can get really arcane. So the easiest way to deal with this is to pass off the // responsibility to "cmd /c", which has that logic built in. // // Furthermore, if "cmd /c" double-quoted the first parameter, then "start" will interpret it as a window title, // so we need to add a dummy empty-string window title: http://stackoverflow.com/a/154090/3191 args = ["/c", "start", '""'].concat(args); } return childProcess.execFile(command, args, options, callback); } // Export `opener` for programmatic access. // You might use this to e.g. open a website: `opener("http://google.com")` module.exports = opener; // If we're being called from the command line, just execute, using the command-line arguments. 
if (require.main && require.main.id === module.id) { opener(process.argv.slice(2), function (error) { if (error) { throw error; } }); } ��������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/opener/package.json������������������������000644 �000766 �000024 �00000002415 12455173731 027137� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "opener", "description": "Opens stuff, like webpages and files and executables, cross-platform", "version": "1.4.0", "author": { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com", "url": "http://domenic.me/" }, "license": "WTFPL", "repository": { "type": "git", "url": "git://github.com/domenic/opener.git" }, "bugs": { "url": "http://github.com/domenic/opener/issues" }, "main": "opener.js", "bin": { "opener": "opener.js" }, "scripts": { "lint": "jshint opener.js" }, "devDependencies": { "jshint": "^2.5.4" }, "gitHead": "b9d36d4f82c26560acdadbabbb10ddba46a30dc5", "homepage": "https://github.com/domenic/opener", "_id": "opener@1.4.0", "_shasum": "d11f86eeeb076883735c9d509f538fe82d10b941", "_from": "opener@>=1.4.0 <1.5.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "domenic", "email": "domenic@domenicdenicola.com" }, "maintainers": [ { "name": "domenic", "email": "domenic@domenicdenicola.com" } ], "dist": { "shasum": "d11f86eeeb076883735c9d509f538fe82d10b941", "tarball": "http://registry.npmjs.org/opener/-/opener-1.4.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/opener/-/opener-1.4.0.tgz" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/opener/README.md���������������������������000644 �000766 �000024 �00000002425 12455173731 026131� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# It Opens Stuff That is, in your desktop environment. This will make *actual windows pop up*, with stuff in them: ```bash npm install opener -g opener http://google.com opener ./my-file.txt opener firefox opener npm run lint ``` Also if you want to use it programmatically you can do that too: ```js var opener = require("opener"); opener("http://google.com"); opener("./my-file.txt"); opener("firefox"); opener("npm run lint"); ``` Plus, it returns the child process created, so you can do things like let your script exit while the window stays open: ```js var editor = opener("documentation.odt"); editor.unref(); // These other unrefs may be necessary if your OS's opener process // exits before the process it started is complete. 
editor.stdin.unref(); editor.stdout.unref(); editor.stderr.unref(); ``` ## Use It for Good Like opening the user's browser with a test harness in your package's test script: ```json { "scripts": { "test": "opener ./test/runner.html" }, "devDependencies": { "opener": "*" } } ``` ## Why Because Windows has `start`, Macs have `open`, and *nix has `xdg-open`. At least [according to some guy on StackOverflow](http://stackoverflow.com/q/1480971/3191). And I like things that work on all three. Like Node.js. And Opener. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/once/LICENSE�������������������������������000644 �000766 �000024 �00000002436 12455173731 025315� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
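The `opener.js` source shown a little earlier also accepts an options object whose `command` property replaces the platform default opener (on Windows the custom command is instead prepended to the arguments and still run through `cmd /c start`). A short sketch of that programmatic form, assuming a `firefox` executable is on the PATH (the URL and command are illustrative):

```js
var opener = require("opener");

// Open a URL with an explicit command instead of the platform default
// ("open" on OS X, "xdg-open" on other *nix, "cmd /c start" on Windows).
var child = opener("https://iojs.org", { command: "firefox" }, function (error) {
    if (error) {
        console.error("could not open:", error.message);
    }
});

// As the opener README notes, the child process is returned, so this
// script can exit without waiting for the opened program to finish.
child.unref();
```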
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/once/once.js�������������������������������000644 �000766 �000024 �00000000641 12455173731 025566� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var wrappy = require('wrappy') module.exports = wrappy(once) once.proto = once(function () { Object.defineProperty(Function.prototype, 'once', { value: function () { return once(this) }, configurable: true }) }) function once (fn) { var f = function () { if (f.called) return f.value f.called = true return f.value = fn.apply(this, arguments) } f.called = false return f } �����������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/once/package.json��������������������������000644 �000766 �000024 �00000002464 12455173731 026577� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "once", "version": "1.3.1", "description": "Run a function exactly one time", "main": "once.js", "directories": { "test": "test" }, "dependencies": { "wrappy": "1" }, "devDependencies": { "tap": "~0.3.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/once" }, "keywords": [ "once", "function", "one", "single" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "BSD", "gitHead": "c90ac02a74f433ce47f6938869e68dd6196ffc2c", "bugs": { "url": "https://github.com/isaacs/once/issues" }, "homepage": "https://github.com/isaacs/once", "_id": "once@1.3.1", "_shasum": "f3f3e4da5b7d27b5c732969ee3e67e729457b31f", "_from": "once@>=1.3.1 <2.0.0", "_npmVersion": "2.0.0", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "f3f3e4da5b7d27b5c732969ee3e67e729457b31f", "tarball": "http://registry.npmjs.org/once/-/once-1.3.1.tgz" }, "_resolved": "https://registry.npmjs.org/once/-/once-1.3.1.tgz", "readme": "ERROR: No README data found!" 
} ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/once/README.md�����������������������������000644 �000766 �000024 �00000001764 12455173731 025572� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# once Only call a function once. ## usage ```javascript var once = require('once') function load (file, cb) { cb = once(cb) loader.load('file') loader.once('load', cb) loader.once('error', cb) } ``` Or add to the Function.prototype in a responsible way: ```javascript // only has to be done once require('once').proto() function load (file, cb) { cb = cb.once() loader.load('file') loader.once('load', cb) loader.once('error', cb) } ``` Ironically, the prototype feature makes this module twice as complicated as necessary. To check whether you function has been called, use `fn.called`. Once the function is called for the first time the return value of the original function is saved in `fn.value` and subsequent calls will continue to return this value. ```javascript var once = require('once') function load (cb) { cb = once(cb) var stream = createStream() stream.once('data', cb) stream.once('end', function () { if (!cb.called) cb(new Error('not found')) }) } ``` ������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/.npmrc������������������������������000644 �000766 �000024 �00000000054 12455173731 025772� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������save-prefix = ~ proprietary-attribs = false ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/example.js��������������������������000644 �000766 �000024 �00000003125 12455173731 026645� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var log = require('./log.js') log.heading = 'npm' console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', 
{foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n'+ new Error('a 404 error').stack) log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/LICENSE�����������������������������000644 �000766 �000024 �00000002436 12455173731 025665� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
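The npmlog example above finishes by registering a custom level with `log.addLevel`. A slightly fuller sketch of that call (the `notice` level name, its number, and the styling are illustrative, not npm defaults):

```js
var log = require('./log.js')

// Register a level between info (2000) and warn (4000), with a colour
// and a short display name; a log.notice() shorthand is created automatically.
log.addLevel('notice', 3500, { fg: 'cyan' }, 'note')

log.level = 'info'
log.notice('demo', 'custom levels get their own shorthand method')
```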
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/log.js������������������������������000644 �000766 �000024 �00000007557 12455173731 026010� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var EE = require('events').EventEmitter var log = exports = module.exports = new EE var util = require('util') var ansi = require('ansi') log.cursor = ansi(process.stderr) log.stream = process.stderr // by default, let ansi decide based on tty-ness. var colorEnabled = undefined log.enableColor = function () { colorEnabled = true this.cursor.enabled = true } log.disableColor = function () { colorEnabled = false this.cursor.enabled = false } // default level log.level = 'info' // temporarily stop emitting, but don't drop log.pause = function () { this._paused = true } log.resume = function () { if (!this._paused) return this._paused = false var b = this._buffer this._buffer = [] b.forEach(function (m) { this.emitLog(m) }, this) } log._buffer = [] var id = 0 log.record = [] log.maxRecordSize = 10000 log.log = function (lvl, prefix, message) { var l = this.levels[lvl] if (l === undefined) { return this.emit('error', new Error(util.format( 'Undefined log level: %j', lvl))) } var a = new Array(arguments.length - 2) var stack = null for (var i = 2; i < arguments.length; i ++) { var arg = a[i-2] = arguments[i] // resolve stack traces to a plain string. if (typeof arg === 'object' && arg && (arg instanceof Error) && arg.stack) { arg.stack = stack = arg.stack + '' } } if (stack) a.unshift(stack + '\n') message = util.format.apply(util, a) var m = { id: id++, level: lvl, prefix: String(prefix || ''), message: message, messageRaw: a } this.emit('log', m) this.emit('log.' 
+ lvl, m) if (m.prefix) this.emit(m.prefix, m) this.record.push(m) var mrs = this.maxRecordSize var n = this.record.length - mrs if (n > mrs / 10) { var newSize = Math.floor(mrs * 0.9) this.record = this.record.slice(-1 * newSize) } this.emitLog(m) }.bind(log) log.emitLog = function (m) { if (this._paused) { this._buffer.push(m) return } var l = this.levels[m.level] if (l === undefined) return if (l < this.levels[this.level]) return if (l > 0 && !isFinite(l)) return var style = log.style[m.level] var disp = log.disp[m.level] || m.level m.message.split(/\r?\n/).forEach(function (line) { if (this.heading) { this.write(this.heading, this.headingStyle) this.write(' ') } this.write(disp, log.style[m.level]) var p = m.prefix || '' if (p) this.write(' ') this.write(p, this.prefixStyle) this.write(' ' + line + '\n') }, this) } log.write = function (msg, style) { if (!this.cursor) return if (this.stream !== this.cursor.stream) { this.cursor = ansi(this.stream, { enabled: colorEnabled }) } style = style || {} if (style.fg) this.cursor.fg[style.fg]() if (style.bg) this.cursor.bg[style.bg]() if (style.bold) this.cursor.bold() if (style.underline) this.cursor.underline() if (style.inverse) this.cursor.inverse() if (style.beep) this.cursor.beep() this.cursor.write(msg).reset() } log.addLevel = function (lvl, n, style, disp) { if (!disp) disp = lvl this.levels[lvl] = n this.style[lvl] = style if (!this[lvl]) this[lvl] = function () { var a = new Array(arguments.length + 1) a[0] = lvl for (var i = 0; i < arguments.length; i ++) { a[i + 1] = arguments[i] } return this.log.apply(this, a) }.bind(this) this.disp[lvl] = disp } log.prefixStyle = { fg: 'magenta' } log.headingStyle = { fg: 'white', bg: 'black' } log.style = {} log.levels = {} log.disp = {} log.addLevel('silly', -Infinity, { inverse: true }, 'sill') log.addLevel('verbose', 1000, { fg: 'blue', bg: 'black' }, 'verb') log.addLevel('info', 2000, { fg: 'green' }) log.addLevel('http', 3000, { fg: 'green', bg: 'black' }) log.addLevel('warn', 4000, { fg: 'black', bg: 'yellow' }, 'WARN') log.addLevel('error', 5000, { fg: 'red', bg: 'black' }, 'ERR!') log.addLevel('silent', Infinity) �������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/package.json������������������������000644 �000766 �000024 �00000011467 12455173731 027152� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "npmlog", "description": "logger for npm", "version": "0.1.1", "repository": { "type": "git", "url": "git://github.com/isaacs/npmlog.git" }, "main": "log.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "ansi": "~0.3.0" }, "devDependencies": { "tap": "" }, "license": "BSD", "readme": "# npmlog\n\nThe logger util that npm uses.\n\nThis logger is very basic. It does the logging for npm. It supports\ncustom levels and colored output.\n\nBy default, logs are written to stderr. 
If you want to send log messages\nto outputs other than streams, then you can change the `log.stream`\nmember, or you can just listen to the events that it emits, and do\nwhatever you want with them.\n\n# Basic Usage\n\n```\nvar log = require('npmlog')\n\n// additional stuff ---------------------------+\n// message ----------+ |\n// prefix ----+ | |\n// level -+ | | |\n// v v v v\n log.info('fyi', 'I have a kitty cat: %j', myKittyCat)\n```\n\n## log.level\n\n* {String}\n\nThe level to display logs at. Any logs at or above this level will be\ndisplayed. The special level `silent` will prevent anything from being\ndisplayed ever.\n\n## log.record\n\n* {Array}\n\nAn array of all the log messages that have been entered.\n\n## log.maxRecordSize\n\n* {Number}\n\nThe maximum number of records to keep. If log.record gets bigger than\n10% over this value, then it is sliced down to 90% of this value.\n\nThe reason for the 10% window is so that it doesn't have to resize a\nlarge array on every log entry.\n\n## log.prefixStyle\n\n* {Object}\n\nA style object that specifies how prefixes are styled. (See below)\n\n## log.headingStyle\n\n* {Object}\n\nA style object that specifies how the heading is styled. (See below)\n\n## log.heading\n\n* {String} Default: \"\"\n\nIf set, a heading that is printed at the start of every line.\n\n## log.stream\n\n* {Stream} Default: `process.stderr`\n\nThe stream where output is written.\n\n## log.enableColor()\n\nForce colors to be used on all messages, regardless of the output\nstream.\n\n## log.disableColor()\n\nDisable colors on all messages.\n\n## log.pause()\n\nStop emitting messages to the stream, but do not drop them.\n\n## log.resume()\n\nEmit all buffered messages that were written while paused.\n\n## log.log(level, prefix, message, ...)\n\n* `level` {String} The level to emit the message at\n* `prefix` {String} A string prefix. Set to \"\" to skip.\n* `message...` Arguments to `util.format`\n\nEmit a log message at the specified level.\n\n## log\\[level](prefix, message, ...)\n\nFor example,\n\n* log.silly(prefix, message, ...)\n* log.verbose(prefix, message, ...)\n* log.info(prefix, message, ...)\n* log.http(prefix, message, ...)\n* log.warn(prefix, message, ...)\n* log.error(prefix, message, ...)\n\nLike `log.log(level, prefix, message, ...)`. In this way, each level is\ngiven a shorthand, so you can do `log.info(prefix, message)`.\n\n## log.addLevel(level, n, style, disp)\n\n* `level` {String} Level indicator\n* `n` {Number} The numeric level\n* `style` {Object} Object with fg, bg, inverse, etc.\n* `disp` {String} Optional replacement for `level` in the output.\n\nSets up a new level with a shorthand function and so forth.\n\nNote that if the number is `Infinity`, then setting the level to that\nwill cause all log messages to be suppressed. 
If the number is\n`-Infinity`, then the only way to show it is to enable all log messages.\n\n# Events\n\nEvents are all emitted with the message object.\n\n* `log` Emitted for all messages\n* `log.<level>` Emitted for all messages with the `<level>` level.\n* `<prefix>` Messages with prefixes also emit their prefix as an event.\n\n# Style Objects\n\nStyle objects can have the following fields:\n\n* `fg` {String} Color for the foreground text\n* `bg` {String} Color for the background\n* `bold`, `inverse`, `underline` {Boolean} Set the associated property\n* `bell` {Boolean} Make a noise (This is pretty annoying, probably.)\n\n# Message Objects\n\nEvery log event is emitted with a message object, and the `log.record`\nlist contains all of them that have been created. They have the\nfollowing fields:\n\n* `id` {Number}\n* `level` {String}\n* `prefix` {String}\n* `message` {String} Result of `util.format()`\n* `messageRaw` {Array} Arguments to `util.format()`\n", "readmeFilename": "README.md", "gitHead": "b58e360cd99db707d1191ce6125ae53d79f075a1", "bugs": { "url": "https://github.com/isaacs/npmlog/issues" }, "homepage": "https://github.com/isaacs/npmlog", "_id": "npmlog@0.1.1", "_shasum": "8b9b9e4405d7ec48c31c2346965aadc7abaecaa5", "_from": "npmlog@latest" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npmlog/README.md���������������������������000644 �000766 �000024 �00000007600 12455173731 026135� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# npmlog The logger util that npm uses. This logger is very basic. It does the logging for npm. It supports custom levels and colored output. By default, logs are written to stderr. If you want to send log messages to outputs other than streams, then you can change the `log.stream` member, or you can just listen to the events that it emits, and do whatever you want with them. # Basic Usage ``` var log = require('npmlog') // additional stuff ---------------------------+ // message ----------+ | // prefix ----+ | | // level -+ | | | // v v v v log.info('fyi', 'I have a kitty cat: %j', myKittyCat) ``` ## log.level * {String} The level to display logs at. Any logs at or above this level will be displayed. The special level `silent` will prevent anything from being displayed ever. ## log.record * {Array} An array of all the log messages that have been entered. ## log.maxRecordSize * {Number} The maximum number of records to keep. If log.record gets bigger than 10% over this value, then it is sliced down to 90% of this value. The reason for the 10% window is so that it doesn't have to resize a large array on every log entry. ## log.prefixStyle * {Object} A style object that specifies how prefixes are styled. (See below) ## log.headingStyle * {Object} A style object that specifies how the heading is styled. (See below) ## log.heading * {String} Default: "" If set, a heading that is printed at the start of every line. ## log.stream * {Stream} Default: `process.stderr` The stream where output is written. 
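As a quick sketch tying the fields above together (the log file name here is made up for illustration), the stream, level, and heading can all be changed before any messages are emitted:

```
var fs = require('fs')
var log = require('npmlog')

// Write somewhere other than stderr, raise the threshold, and add a heading.
log.stream = fs.createWriteStream('./npm-debug.example.log')  // illustrative path
log.level = 'warn'
log.heading = 'demo'

log.info('config', 'below the warn threshold, so this is not written')
log.warn('config', 'this line goes to the file instead of stderr')
```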
## log.enableColor() Force colors to be used on all messages, regardless of the output stream. ## log.disableColor() Disable colors on all messages. ## log.pause() Stop emitting messages to the stream, but do not drop them. ## log.resume() Emit all buffered messages that were written while paused. ## log.log(level, prefix, message, ...) * `level` {String} The level to emit the message at * `prefix` {String} A string prefix. Set to "" to skip. * `message...` Arguments to `util.format` Emit a log message at the specified level. ## log\[level](prefix, message, ...) For example, * log.silly(prefix, message, ...) * log.verbose(prefix, message, ...) * log.info(prefix, message, ...) * log.http(prefix, message, ...) * log.warn(prefix, message, ...) * log.error(prefix, message, ...) Like `log.log(level, prefix, message, ...)`. In this way, each level is given a shorthand, so you can do `log.info(prefix, message)`. ## log.addLevel(level, n, style, disp) * `level` {String} Level indicator * `n` {Number} The numeric level * `style` {Object} Object with fg, bg, inverse, etc. * `disp` {String} Optional replacement for `level` in the output. Sets up a new level with a shorthand function and so forth. Note that if the number is `Infinity`, then setting the level to that will cause all log messages to be suppressed. If the number is `-Infinity`, then the only way to show it is to enable all log messages. # Events Events are all emitted with the message object. * `log` Emitted for all messages * `log.<level>` Emitted for all messages with the `<level>` level. * `<prefix>` Messages with prefixes also emit their prefix as an event. # Style Objects Style objects can have the following fields: * `fg` {String} Color for the foreground text * `bg` {String} Color for the background * `bold`, `inverse`, `underline` {Boolean} Set the associated property * `bell` {Boolean} Make a noise (This is pretty annoying, probably.) # Message Objects Every log event is emitted with a message object, and the `log.record` list contains all of them that have been created. 
They have the following fields: * `id` {Number} * `level` {String} * `prefix` {String} * `message` {String} Result of `util.format()` * `messageRaw` {Array} Arguments to `util.format()` ��������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/.npmignore���������������000644 �000766 �000024 �00000000145 12455173731 030713� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������*.swp .*.swp .DS_Store *~ .project .settings npm-debug.log coverage.html .idea lib-cov node_modules���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/.travis.yml��������������000644 �000766 �000024 �00000000057 12455173731 031027� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.8" - "0.10"���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/LICENSE������������������000644 �000766 �000024 �00000002417 12455173731 027725� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Robert Kowalski All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/npm-user-validate.js�����000644 �000766 �000024 �00000001476 12455173731 032617� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������exports.email = email exports.pw = pw exports.username = username var requirements = exports.requirements = { username: { lowerCase: 'Username must be lowercase', urlSafe: 'Username may not contain non-url-safe chars', dot: 'Username may not start with "."' }, password: {}, email: { valid: 'Email must be an email address' } }; function username (un) { if (un !== un.toLowerCase()) { return new Error(requirements.username.lowerCase) } if (un !== encodeURIComponent(un)) { return new Error(requirements.username.urlSafe) } if (un.charAt(0) === '.') { return new Error(requirements.username.dot) } return null } function email (em) { if (!em.match(/^.+@.+\..+$/)) { return new Error(requirements.email.valid) } return null } function pw (pw) { return null } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/package.json�������������000644 �000766 �000024 �00000002561 12455173731 031206� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-user-validate", "version": "0.1.1", "description": "User validations for npm", "main": "npm-user-validate.js", "devDependencies": { "tap": "0.4.3" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-user-validate.git" }, "keywords": [ "npm", "validation", "registry" ], "author": { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, "license": "BSD", "gitHead": "64c9bd4ded742c41afdb3a8414fbbfdbfdcdf6b7", "bugs": { "url": "https://github.com/npm/npm-user-validate/issues" }, "homepage": 
"https://github.com/npm/npm-user-validate", "_id": "npm-user-validate@0.1.1", "_shasum": "ea7774636c3c8fe6d01e174bd9f2ee0e22eeed57", "_from": "npm-user-validate@>=0.1.1 <0.2.0", "_npmVersion": "2.1.3", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "robertkowalski", "email": "rok@kowalski.gd" }, { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "ea7774636c3c8fe6d01e174bd9f2ee0e22eeed57", "tarball": "http://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.1.tgz" } �����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-user-validate/README.md����������������000644 �000766 �000024 �00000000566 12455173731 030202� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������[![Build Status](https://travis-ci.org/npm/npm-user-validate.png?branch=master)](https://travis-ci.org/npm/npm-user-validate) [![devDependency Status](https://david-dm.org/npm/npm-user-validate/dev-status.png)](https://david-dm.org/npm/npm-user-validate#info=devDependencies) # npm-user-validate Validation for the npm client and npm-www (and probably other npm projects) ������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/.npmignore�������������000644 �000766 �000024 �00000000103 12455173731 031264� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������test/fixtures/cache node_modules npm-debug.log .eslintrc .jshintrc �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/index.js���������������000644 �000766 �000024 �00000003155 12455173731 030744� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// utilities for working with the js-registry site. 
module.exports = RegClient

var join = require("path").join
  , fs = require("graceful-fs")

// npmlog is optional; fall back to a no-op logger when it is not installed.
var npmlog
try {
  npmlog = require("npmlog")
} catch (er) {
  npmlog = { error: noop, warn: noop, info: noop, verbose: noop,
             silly: noop, http: noop, pause: noop, resume: noop }
}

function noop () {}

function RegClient (config) {
  this.config = Object.create(config || {})

  // Reuse the HTTP proxy for HTTPS when no HTTPS proxy is configured.
  this.config.proxy = this.config.proxy || {}
  if (!this.config.proxy.https && this.config.proxy.http) {
    this.config.proxy.https = this.config.proxy.http
  }

  this.config.ssl = this.config.ssl || {}
  if (this.config.ssl.strict === undefined) this.config.ssl.strict = true

  // Defaults for node-retry: 2 retries, factor 10, timeouts between 10s and 60s.
  this.config.retry = this.config.retry || {}
  if (typeof this.config.retry.retries !== "number") this.config.retry.retries = 2
  if (typeof this.config.retry.factor !== "number") this.config.retry.factor = 10
  if (typeof this.config.retry.minTimeout !== "number") this.config.retry.minTimeout = 10000
  if (typeof this.config.retry.maxTimeout !== "number") this.config.retry.maxTimeout = 60000

  this.config.userAgent = this.config.userAgent || "node/" + process.version
  this.config.defaultTag = this.config.defaultTag || "latest"

  this.log = this.config.log || npmlog
  delete this.config.log
}

// Attach every module in lib/ as a prototype method, converting
// dash-separated file names to camelCase method names.
fs.readdirSync(join(__dirname, "lib")).forEach(function (f) {
  if (!f.match(/\.js$/)) return

  var name = f.replace(/\.js$/, "")
              .replace(/-([a-z])/, function (_, l) { return l.toUpperCase() })
  RegClient.prototype[name] = require(join(__dirname, "lib", f))
})
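As an illustration of the loader above, a small sketch of the dash-to-camelCase mapping it applies (the file names here are hypothetical examples, not a listing of `lib/`):

```js
// Illustrative only: how lib/ file names would become RegClient prototype methods.
// The replace() calls mirror the loader above; the file names are examples.
function methodName (file) {
  return file.replace(/\.js$/, "")
             .replace(/-([a-z])/, function (_, l) { return l.toUpperCase() })
}

console.log(methodName("adduser.js"))   // "adduser"  -> client.adduser(...)
console.log(methodName("dist-tags.js")) // "distTags" -> client.distTags(...)
```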
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/LICENSE

The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/package.json

{
  "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" },
  "name": "npm-registry-client",
  "description": "Client for the npm registry",
  "version": "4.0.5",
  "repository": { "url": "git://github.com/isaacs/npm-registry-client" },
  "main": "index.js",
  "scripts": { "test": "tap test/*.js" },
  "dependencies": {
    "chownr": "0",
    "concat-stream": "^1.4.6",
    "graceful-fs": "^3.0.0",
    "mkdirp": "^0.5.0",
    "normalize-package-data": "~1.0.1",
    "once": "^1.3.0",
    "request": "^2.47.0",
    "retry": "^0.6.1",
    "rimraf": "2",
    "semver": "2 >=2.2.1 || 3.x || 4",
    "slide": "^1.1.3",
    "npmlog": ""
  },
  "devDependencies": { "negotiator": "^0.4.9", "tap": "" },
  "optionalDependencies": { "npmlog": "" },
  "license": "ISC",
  "readmeFilename": "README.md",
  "gitHead": "33bd08aa65bb26ba1b956d2f119b5e952f4d3141",
  "bugs": { "url": "https://github.com/isaacs/npm-registry-client/issues" },
  "homepage": "https://github.com/isaacs/npm-registry-client",
  "_id": "npm-registry-client@4.0.5",
  "_shasum": "27d37ca0c7bbd5df14f4ae35223a4d588dd4fea6",
  "_from": "npm-registry-client@>=4.0.5 <4.1.0"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/README.md

# npm-registry-client

The code that npm uses to talk to the registry.

It handles all the caching and HTTP calls.

## Usage

```javascript
var RegClient = require('npm-registry-client')
var client = new RegClient(config)
var uri = "npm://registry.npmjs.org/npm"
var params = {timeout: 1000}

client.get(uri, params, function (error, data, raw, res) {
  // error is an error if there was a problem.
  // data is the parsed data object
  // raw is the json string
  // res is the response from couch
})
```

# Registry URLs

The registry calls take either a full URL pointing to a resource in the
registry, or a base URL for the registry as a whole (including the registry
path – but be sure to terminate the path with `/`). `http` and `https` URLs are
the only ones supported.

## Using the client

Every call to the client follows the same pattern (a short sketch follows this
list):

* `uri` {String} The *fully-qualified* URI of the registry API method being
  invoked.
* `params` {Object} Per-request parameters.
* `callback` {Function} Callback to be invoked when the call is complete.
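Building on the `client` created in the Usage example above, a hedged sketch of that pattern for `client.star`, using the parameters documented below (the URI and token are placeholder values):

```js
// Sketch only: the general (uri, params, cb) calling pattern with token auth.
// The URI and token here are placeholders, not real values.
var params = {
  starred: true,
  auth: { token: "xxxx-xxxx", alwaysAuth: true }
}

client.star("https://registry.npmjs.org/some-package", params, function (error) {
  if (error) return console.error("star failed:", error.message)
  console.log("starred!")
})
```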
### Credentials

Many requests to the registry can be authenticated, and require credentials
for authorization. These credentials always look the same:

* `username` {String}
* `password` {String}
* `email` {String}
* `alwaysAuth` {Boolean} Whether calls to the target registry are always
  authed.

**or**

* `token` {String}
* `alwaysAuth` {Boolean} Whether calls to the target registry are always
  authed.

## API

### client.adduser(uri, params, cb)

* `uri` {String} Base registry URL.
* `params` {Object} Object containing per-request properties.
  * `auth` {Credentials}
* `cb` {Function}
  * `error` {Error | null}
  * `data` {Object} the parsed data object
  * `raw` {String} the json
  * `res` {Response Object} response from couch

Add a user account to the registry, or verify the credentials.

### client.deprecate(uri, params, cb)

* `uri` {String} Full registry URI for the deprecated package.
* `params` {Object} Object containing per-request properties.
  * `version` {String} Semver version range.
  * `message` {String} The message to use as a deprecation warning.
  * `auth` {Credentials}
* `cb` {Function}

Deprecate a version of a package in the registry.

### client.get(uri, params, cb)

* `uri` {String} The complete registry URI to fetch
* `params` {Object} Object containing per-request properties.
  * `timeout` {Number} Duration before the request times out. Optional
    (default: never).
  * `follow` {Boolean} Follow 302/301 responses. Optional (default: true).
  * `staleOk` {Boolean} If there's cached data available, then return that to
    the callback quickly, and update the cache in the background. Optional
    (default: false).
  * `auth` {Credentials} Optional.
* `cb` {Function}

Fetches data from the registry via a GET request, saving it in the cache folder
with the ETag.

### client.publish(uri, params, cb)

* `uri` {String} The registry URI for the package to publish.
* `params` {Object} Object containing per-request properties.
  * `metadata` {Object} Package metadata.
  * `body` {Stream} Stream of the package body / tarball.
  * `auth` {Credentials}
* `cb` {Function}

Publish a package to the registry.

Note that this does not create the tarball from a folder.

### client.star(uri, params, cb)

* `uri` {String} The complete registry URI for the package to star.
* `params` {Object} Object containing per-request properties.
  * `starred` {Boolean} True to star the package, false to unstar it. Optional
    (default: false).
  * `auth` {Credentials}
* `cb` {Function}

Star or unstar a package.

Note that the user does not have to be the package owner to star or unstar a
package, though other writes do require that the user be the package owner.

### client.stars(uri, params, cb)

* `uri` {String} The base URL for the registry.
* `params` {Object} Object containing per-request properties.
  * `username` {String} Name of user to fetch starred packages for. Optional
    (default: user in `auth`).
  * `auth` {Credentials} Optional (required if `username` is omitted).
* `cb` {Function}

View your own or another user's starred packages.

### client.tag(uri, params, cb)

* `uri` {String} The complete registry URI to tag
* `params` {Object} Object containing per-request properties.
  * `version` {String} Version to tag.
  * `tag` {String} Tag name to apply.
  * `auth` {Credentials}
* `cb` {Function}

Mark a version in the `dist-tags` hash, so that `pkg@tag` will fetch the
specified version.

### client.unpublish(uri, params, cb)

* `uri` {String} The complete registry URI of the package to unpublish.
* `params` {Object} Object containing per-request properties.
  * `version` {String} version to unpublish. Optional – omit to unpublish all
    versions.
  * `auth` {Credentials}
* `cb` {Function}

Remove a version of a package (or all versions) from the registry. When the
last version is unpublished, the entire document is removed from the database.

### client.whoami(uri, params, cb)

* `uri` {String} The base URL for the registry.
* `params` {Object} Object containing per-request properties.
  * `auth` {Credentials}
* `cb` {Function}

Simple call to see who the registry thinks you are. Especially useful with
token-based auth.

## PLUMBING

The below are primarily intended for use by the rest of the API, or by the npm
caching logic directly.

### client.request(uri, params, cb)

* `uri` {String} URI pointing to the resource to request.
* `params` {Object} Object containing per-request properties.
  * `method` {String} HTTP method. Optional (default: "GET").
  * `body` {Stream | Buffer | String | Object} The request body. Objects
    that are not Buffers or Streams are encoded as JSON. Optional – body
    only used for write operations.
  * `etag` {String} The cached ETag. Optional.
  * `follow` {Boolean} Follow 302/301 responses. Optional (default: true).
  * `auth` {Credentials} Optional.
* `cb` {Function}
  * `error` {Error | null}
  * `data` {Object} the parsed data object
  * `raw` {String} the json
  * `res` {Response Object} response from couch

Make a generic request to the registry. All the other methods are wrappers
around `client.request`.

### client.fetch(uri, params, cb)

* `uri` {String} The complete registry URI to upload to
* `params` {Object} Object containing per-request properties.
  * `headers` {Stream} HTTP headers to be included with the request. Optional.
  * `auth` {Credentials} Optional.
* `cb` {Function}

Fetch a package from a URL, with auth set appropriately if included. Used to
cache remote tarballs as well as request package tarballs from the registry.

# Configuration

The client uses its own configuration, which is just passed in as a simple
nested object. The following are the supported values (with their defaults, if
any):

* `proxy.http` {URL} The URL to proxy HTTP requests through.
* `proxy.https` {URL} The URL to proxy HTTPS requests through. Defaults to be
  the same as `proxy.http` if unset.
* `proxy.localAddress` {IP} The local address to use on multi-homed systems.
* `ssl.ca` {String} Certificate signing authority certificates to trust.
* `ssl.certificate` {String} Client certificate (PEM encoded). Enable access
  to servers that require client certificates.
* `ssl.key` {String} Private key (PEM encoded) for client certificate.
* `ssl.strict` {Boolean} Whether or not to be strict with SSL certificates.
  Default = `true`
* `retry.count` {Number} Number of times to retry on GET failures. Default = 2.
* `retry.factor` {Number} `factor` setting for `node-retry`. Default = 10.
* `retry.minTimeout` {Number} `minTimeout` setting for `node-retry`.
  Default = 10000 (10 seconds)
* `retry.maxTimeout` {Number} `maxTimeout` setting for `node-retry`.
  Default = 60000 (60 seconds)
* `userAgent` {String} User agent header to send. Default =
  `"node/{process.version}"`
* `log` {Object} The logger to use. Defaults to `require("npmlog")` if
  that works, otherwise logs are disabled.
* `defaultTag` {String} The default tag to use when publishing new packages.
  Default = `"latest"`
* `couchToken` {Object} A token for use with
  [couch-login](https://npmjs.org/package/couch-login).
* `sessionToken` {string} A random identifier for this set of client requests.
  Default = 8 random hexadecimal bytes.
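Putting a few of the configuration values above together, a minimal sketch (the registry URI is the public npm registry; the logged field is illustrative of a parsed registry document):

```js
// Sketch: build a client with a subset of the documented configuration values,
// then fetch package metadata with client.get.
var RegClient = require('npm-registry-client')

var client = new RegClient({
  ssl: { strict: true },
  userAgent: "node/" + process.version,
  defaultTag: "latest"
})

client.get("https://registry.npmjs.org/npm", { timeout: 1000 }, function (error, data) {
  if (error) return console.error(error)
  console.log(data.name) // parsed registry document for the requested package
})
```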
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/.npmignore

node_modules

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/index.js

var Writable = require('readable-stream').Writable
var inherits = require('inherits')
var TA = require('typedarray')
var U8 = typeof Uint8Array !== 'undefined' ? Uint8Array : TA.Uint8Array

function ConcatStream(opts, cb) {
  if (!(this instanceof ConcatStream)) return new ConcatStream(opts, cb)

  if (typeof opts === 'function') {
    cb = opts
    opts = {}
  }
  if (!opts) opts = {}

  var encoding = opts.encoding
  var shouldInferEncoding = false

  if (!encoding) {
    shouldInferEncoding = true
  } else {
    encoding = String(encoding).toLowerCase()
    if (encoding === 'u8' || encoding === 'uint8') {
      encoding = 'uint8array'
    }
  }

  Writable.call(this, { objectMode: true })

  this.encoding = encoding
  this.shouldInferEncoding = shouldInferEncoding

  if (cb) this.on('finish', function () { cb(this.getBody()) })
  this.body = []
}

module.exports = ConcatStream
inherits(ConcatStream, Writable)

ConcatStream.prototype._write = function(chunk, enc, next) {
  this.body.push(chunk)
  next()
}

// Guess the output encoding from the first chunk written (or from `buff` if given).
ConcatStream.prototype.inferEncoding = function (buff) {
  var firstBuffer = buff === undefined ?
      this.body[0] : buff;

  if (Buffer.isBuffer(firstBuffer)) return 'buffer'
  if (typeof Uint8Array !== 'undefined' && firstBuffer instanceof Uint8Array) return 'uint8array'
  if (Array.isArray(firstBuffer)) return 'array'
  if (typeof firstBuffer === 'string') return 'string'
  if (Object.prototype.toString.call(firstBuffer) === "[object Object]") return 'object'
  return 'buffer'
}

// Concatenate the collected chunks according to the selected (or inferred) encoding.
ConcatStream.prototype.getBody = function () {
  if (!this.encoding && this.body.length === 0) return []
  if (this.shouldInferEncoding) this.encoding = this.inferEncoding()
  if (this.encoding === 'array') return arrayConcat(this.body)
  if (this.encoding === 'string') return stringConcat(this.body)
  if (this.encoding === 'buffer') return bufferConcat(this.body)
  if (this.encoding === 'uint8array') return u8Concat(this.body)
  return this.body
}

var isArray = Array.isArray || function (arr) {
  return Object.prototype.toString.call(arr) == '[object Array]'
}

function isArrayish (arr) {
  return /Array\]$/.test(Object.prototype.toString.call(arr))
}

function stringConcat (parts) {
  var strings = []
  var needsToString = false
  for (var i = 0; i < parts.length; i++) {
    var p = parts[i]
    if (typeof p === 'string') {
      strings.push(p)
    } else if (Buffer.isBuffer(p)) {
      strings.push(p)
    } else {
      strings.push(Buffer(p))
    }
  }
  if (Buffer.isBuffer(parts[0])) {
    strings = Buffer.concat(strings)
    strings = strings.toString('utf8')
  } else {
    strings = strings.join('')
  }
  return strings
}

function bufferConcat (parts) {
  var bufs = []
  for (var i = 0; i < parts.length; i++) {
    var p = parts[i]
    if (Buffer.isBuffer(p)) {
      bufs.push(p)
    } else if (typeof p === 'string' || isArrayish(p)
           || (p && typeof p.subarray === 'function')) {
      bufs.push(Buffer(p))
    } else bufs.push(Buffer(String(p)))
  }
  return Buffer.concat(bufs)
}

function arrayConcat (parts) {
  var res = []
  for (var i = 0; i < parts.length; i++) {
    res.push.apply(res, parts[i])
  }
  return res
}

function u8Concat (parts) {
  var len = 0
  for (var i = 0; i < parts.length; i++) {
    if (typeof parts[i] === 'string') {
      parts[i] = Buffer(parts[i])
    }
    len += parts[i].length
  }
  var u8 = new U8(len)
  for (var i = 0, offset = 0; i < parts.length; i++) {
    var part = parts[i]
    for (var j = 0; j < part.length; j++) {
      u8[offset++] = part[j]
    }
  }
  return u8
}
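A short usage sketch of the behaviour implemented above — the callback receives the concatenated body, with its type either inferred from the first chunk or forced via `opts.encoding`:

```js
var concat = require('concat-stream')

// Encoding inferred from the first chunk: strings in, a single string out.
var inferred = concat(function (body) {
  console.log(typeof body) // "string"
})
inferred.write('hello ')
inferred.end('world')

// Encoding forced via opts.encoding: everything is converted into one Buffer.
var buffered = concat({ encoding: 'buffer' }, function (body) {
  console.log(Buffer.isBuffer(body)) // true
})
buffered.write('hello ')
buffered.end('world')
```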
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/������000755 �000766 �000024 �00000000000 12456115117 037101� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/package.json�������000644 �000766 �000024 �00000003625 12455173731 036725� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "concat-stream", "version": "1.4.7", "description": "writable stream that concatenates strings or binary data and calls a callback with the result", "tags": [ "stream", "simple", "util", "utility" ], "author": { "name": "Max Ogden", "email": "max@maxogden.com" }, "repository": { "type": "git", "url": "http://github.com/maxogden/concat-stream.git" }, "bugs": { "url": "http://github.com/maxogden/concat-stream/issues" }, "engines": [ "node >= 0.8" ], "main": "index.js", "scripts": { "test": "tape test/*.js test/server/*.js" }, "license": "MIT", "dependencies": { "inherits": "~2.0.1", "typedarray": "~0.0.5", "readable-stream": "~1.1.9" }, "devDependencies": { "tape": "~2.3.2" }, "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/17..latest", "firefox/nightly", "chrome/22..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "41edc57536490dce9f015131c29a6470c9412b27", "homepage": "https://github.com/maxogden/concat-stream", "_id": "concat-stream@1.4.7", "_shasum": "0ceaa47b87a581d2a7a782b92b81d5020c3f9925", "_from": "concat-stream@>=1.4.6 <2.0.0", "_npmVersion": "2.1.8", "_nodeVersion": "0.10.28", "_npmUser": { "name": "maxogden", "email": "max@maxogden.com" }, "maintainers": [ { "name": "maxogden", "email": "max@maxogden.com" } ], "dist": { "shasum": "0ceaa47b87a581d2a7a782b92b81d5020c3f9925", "tarball": "http://registry.npmjs.org/concat-stream/-/concat-stream-1.4.7.tgz" }, "directories": {}, "_resolved": 
"https://registry.npmjs.org/concat-stream/-/concat-stream-1.4.7.tgz", "readme": "ERROR: No README data found!" } �����������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/readme.md����������000644 �000766 �000024 �00000004646 12455173731 036222� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# concat-stream Writable stream that concatenates strings or binary data and calls a callback with the result. Not a transform stream -- more of a stream sink. [![NPM](https://nodei.co/npm/concat-stream.png)](https://nodei.co/npm/concat-stream/) ### examples #### Buffers ```js var fs = require('fs') var concat = require('concat-stream') var readStream = fs.createReadStream('cat.png') var concatStream = concat(gotPicture) readStream.on('error', handleError) readStream.pipe(concatStream) function gotPicture(imageBuffer) { // imageBuffer is all of `cat.png` as a node.js Buffer } function handleError(err) { // handle your error appropriately here, e.g.: console.error(err) // print the error to STDERR process.exit(1) // exit program with non-zero exit code } ``` #### Arrays ```js var write = concat(function(data) {}) write.write([1,2,3]) write.write([4,5,6]) write.end() // data will be [1,2,3,4,5,6] in the above callback ``` #### Uint8Arrays ```js var write = concat(function(data) {}) var a = new Uint8Array(3) a[0] = 97; a[1] = 98; a[2] = 99 write.write(a) write.write('!') write.end(Buffer('!!1')) ``` See `test/` for more examples # methods ```js var concat = require('concat-stream') ``` ## var writable = concat(opts={}, cb) Return a `writable` stream that will fire `cb(data)` with all of the data that was written to the stream. Data can be written to `writable` as strings, Buffers, arrays of byte integers, and Uint8Arrays. By default `concat-stream` will give you back the same data type as the type of the first buffer written to the stream. Use `opts.encoding` to set what format `data` should be returned as, e.g. if you if you don't want to rely on the built-in type checking or for some other reason. * `string` - get a string * `buffer` - get back a Buffer * `array` - get an array of byte integers * `uint8array`, `u8`, `uint8` - get back a Uint8Array * `object`, get back an array of Objects If you don't specify an encoding, and the types can't be inferred (e.g. you write things that aren't int he list above), it will try to convert concat them into a `Buffer`. # error handling `concat-stream` does not handle errors for you, so you must handle errors on whatever streams you pipe into `concat-stream`. This is a general rule when programming with node.js streams: always handle errors on each and every stream. Since `concat-stream` is not itself a stream it does not emit errors. 
# license

MIT LICENSE

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/.travis.yml

language: node_js
node_js:
  - "0.8"
  - "0.10"

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/index.js

var undefined = (void 0); // Paranoia

// Beyond this value, index getters/setters (i.e. array[0], array[1]) are so slow to
// create, and consume so much memory, that the browser appears frozen.
var MAX_ARRAY_LENGTH = 1e5; // Approximations of internal ECMAScript conversion functions var ECMAScript = (function() { // Stash a copy in case other scripts modify these var opts = Object.prototype.toString, ophop = Object.prototype.hasOwnProperty; return { // Class returns internal [[Class]] property, used to avoid cross-frame instanceof issues: Class: function(v) { return opts.call(v).replace(/^\[object *|\]$/g, ''); }, HasProperty: function(o, p) { return p in o; }, HasOwnProperty: function(o, p) { return ophop.call(o, p); }, IsCallable: function(o) { return typeof o === 'function'; }, ToInt32: function(v) { return v >> 0; }, ToUint32: function(v) { return v >>> 0; } }; }()); // Snapshot intrinsics var LN2 = Math.LN2, abs = Math.abs, floor = Math.floor, log = Math.log, min = Math.min, pow = Math.pow, round = Math.round; // ES5: lock down object properties function configureProperties(obj) { if (getOwnPropNames && defineProp) { var props = getOwnPropNames(obj), i; for (i = 0; i < props.length; i += 1) { defineProp(obj, props[i], { value: obj[props[i]], writable: false, enumerable: false, configurable: false }); } } } // emulate ES5 getter/setter API using legacy APIs // http://blogs.msdn.com/b/ie/archive/2010/09/07/transitioning-existing-code-to-the-es5-getter-setter-apis.aspx // (second clause tests for Object.defineProperty() in IE<9 that only supports extending DOM prototypes, but // note that IE<9 does not support __defineGetter__ or __defineSetter__ so it just renders the method harmless) var defineProp if (Object.defineProperty && (function() { try { Object.defineProperty({}, 'x', {}); return true; } catch (e) { return false; } })()) { defineProp = Object.defineProperty; } else { defineProp = function(o, p, desc) { if (!o === Object(o)) throw new TypeError("Object.defineProperty called on non-object"); if (ECMAScript.HasProperty(desc, 'get') && Object.prototype.__defineGetter__) { Object.prototype.__defineGetter__.call(o, p, desc.get); } if (ECMAScript.HasProperty(desc, 'set') && Object.prototype.__defineSetter__) { Object.prototype.__defineSetter__.call(o, p, desc.set); } if (ECMAScript.HasProperty(desc, 'value')) { o[p] = desc.value; } return o; }; } var getOwnPropNames = Object.getOwnPropertyNames || function (o) { if (o !== Object(o)) throw new TypeError("Object.getOwnPropertyNames called on non-object"); var props = [], p; for (p in o) { if (ECMAScript.HasOwnProperty(o, p)) { props.push(p); } } return props; }; // ES5: Make obj[index] an alias for obj._getter(index)/obj._setter(index, value) // for index in 0 ... 
obj.length function makeArrayAccessors(obj) { if (!defineProp) { return; } if (obj.length > MAX_ARRAY_LENGTH) throw new RangeError("Array too large for polyfill"); function makeArrayAccessor(index) { defineProp(obj, index, { 'get': function() { return obj._getter(index); }, 'set': function(v) { obj._setter(index, v); }, enumerable: true, configurable: false }); } var i; for (i = 0; i < obj.length; i += 1) { makeArrayAccessor(i); } } // Internal conversion functions: // pack<Type>() - take a number (interpreted as Type), output a byte array // unpack<Type>() - take a byte array, output a Type-like number function as_signed(value, bits) { var s = 32 - bits; return (value << s) >> s; } function as_unsigned(value, bits) { var s = 32 - bits; return (value << s) >>> s; } function packI8(n) { return [n & 0xff]; } function unpackI8(bytes) { return as_signed(bytes[0], 8); } function packU8(n) { return [n & 0xff]; } function unpackU8(bytes) { return as_unsigned(bytes[0], 8); } function packU8Clamped(n) { n = round(Number(n)); return [n < 0 ? 0 : n > 0xff ? 0xff : n & 0xff]; } function packI16(n) { return [(n >> 8) & 0xff, n & 0xff]; } function unpackI16(bytes) { return as_signed(bytes[0] << 8 | bytes[1], 16); } function packU16(n) { return [(n >> 8) & 0xff, n & 0xff]; } function unpackU16(bytes) { return as_unsigned(bytes[0] << 8 | bytes[1], 16); } function packI32(n) { return [(n >> 24) & 0xff, (n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff]; } function unpackI32(bytes) { return as_signed(bytes[0] << 24 | bytes[1] << 16 | bytes[2] << 8 | bytes[3], 32); } function packU32(n) { return [(n >> 24) & 0xff, (n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff]; } function unpackU32(bytes) { return as_unsigned(bytes[0] << 24 | bytes[1] << 16 | bytes[2] << 8 | bytes[3], 32); } function packIEEE754(v, ebits, fbits) { var bias = (1 << (ebits - 1)) - 1, s, e, f, ln, i, bits, str, bytes; function roundToEven(n) { var w = floor(n), f = n - w; if (f < 0.5) return w; if (f > 0.5) return w + 1; return w % 2 ? w + 1 : w; } // Compute sign, exponent, fraction if (v !== v) { // NaN // http://dev.w3.org/2006/webapi/WebIDL/#es-type-mapping e = (1 << ebits) - 1; f = pow(2, fbits - 1); s = 0; } else if (v === Infinity || v === -Infinity) { e = (1 << ebits) - 1; f = 0; s = (v < 0) ? 1 : 0; } else if (v === 0) { e = 0; f = 0; s = (1 / v === -Infinity) ? 1 : 0; } else { s = v < 0; v = abs(v); if (v >= pow(2, 1 - bias)) { e = min(floor(log(v) / LN2), 1023); f = roundToEven(v / pow(2, e) * pow(2, fbits)); if (f / pow(2, fbits) >= 2) { e = e + 1; f = 1; } if (e > bias) { // Overflow e = (1 << ebits) - 1; f = 0; } else { // Normalized e = e + bias; f = f - pow(2, fbits); } } else { // Denormalized e = 0; f = roundToEven(v / pow(2, 1 - bias - fbits)); } } // Pack sign, exponent, fraction bits = []; for (i = fbits; i; i -= 1) { bits.push(f % 2 ? 1 : 0); f = floor(f / 2); } for (i = ebits; i; i -= 1) { bits.push(e % 2 ? 1 : 0); e = floor(e / 2); } bits.push(s ? 1 : 0); bits.reverse(); str = bits.join(''); // Bits to bytes bytes = []; while (str.length) { bytes.push(parseInt(str.substring(0, 8), 2)); str = str.substring(8); } return bytes; } function unpackIEEE754(bytes, ebits, fbits) { // Bytes to bits var bits = [], i, j, b, str, bias, s, e, f; for (i = bytes.length; i; i -= 1) { b = bytes[i - 1]; for (j = 8; j; j -= 1) { bits.push(b % 2 ? 1 : 0); b = b >> 1; } } bits.reverse(); str = bits.join(''); // Unpack sign, exponent, fraction bias = (1 << (ebits - 1)) - 1; s = parseInt(str.substring(0, 1), 2) ? 
-1 : 1; e = parseInt(str.substring(1, 1 + ebits), 2); f = parseInt(str.substring(1 + ebits), 2); // Produce number if (e === (1 << ebits) - 1) { return f !== 0 ? NaN : s * Infinity; } else if (e > 0) { // Normalized return s * pow(2, e - bias) * (1 + f / pow(2, fbits)); } else if (f !== 0) { // Denormalized return s * pow(2, -(bias - 1)) * (f / pow(2, fbits)); } else { return s < 0 ? -0 : 0; } } function unpackF64(b) { return unpackIEEE754(b, 11, 52); } function packF64(v) { return packIEEE754(v, 11, 52); } function unpackF32(b) { return unpackIEEE754(b, 8, 23); } function packF32(v) { return packIEEE754(v, 8, 23); } // // 3 The ArrayBuffer Type // (function() { /** @constructor */ var ArrayBuffer = function ArrayBuffer(length) { length = ECMAScript.ToInt32(length); if (length < 0) throw new RangeError('ArrayBuffer size is not a small enough positive integer'); this.byteLength = length; this._bytes = []; this._bytes.length = length; var i; for (i = 0; i < this.byteLength; i += 1) { this._bytes[i] = 0; } configureProperties(this); }; exports.ArrayBuffer = exports.ArrayBuffer || ArrayBuffer; // // 4 The ArrayBufferView Type // // NOTE: this constructor is not exported /** @constructor */ var ArrayBufferView = function ArrayBufferView() { //this.buffer = null; //this.byteOffset = 0; //this.byteLength = 0; }; // // 5 The Typed Array View Types // function makeConstructor(bytesPerElement, pack, unpack) { // Each TypedArray type requires a distinct constructor instance with // identical logic, which this produces. var ctor; ctor = function(buffer, byteOffset, length) { var array, sequence, i, s; if (!arguments.length || typeof arguments[0] === 'number') { // Constructor(unsigned long length) this.length = ECMAScript.ToInt32(arguments[0]); if (length < 0) throw new RangeError('ArrayBufferView size is not a small enough positive integer'); this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; } else if (typeof arguments[0] === 'object' && arguments[0].constructor === ctor) { // Constructor(TypedArray array) array = arguments[0]; this.length = array.length; this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; for (i = 0; i < this.length; i += 1) { this._setter(i, array._getter(i)); } } else if (typeof arguments[0] === 'object' && !(arguments[0] instanceof ArrayBuffer || ECMAScript.Class(arguments[0]) === 'ArrayBuffer')) { // Constructor(sequence<type> array) sequence = arguments[0]; this.length = ECMAScript.ToUint32(sequence.length); this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; for (i = 0; i < this.length; i += 1) { s = sequence[i]; this._setter(i, Number(s)); } } else if (typeof arguments[0] === 'object' && (arguments[0] instanceof ArrayBuffer || ECMAScript.Class(arguments[0]) === 'ArrayBuffer')) { // Constructor(ArrayBuffer buffer, // optional unsigned long byteOffset, optional unsigned long length) this.buffer = buffer; this.byteOffset = ECMAScript.ToUint32(byteOffset); if (this.byteOffset > this.buffer.byteLength) { throw new RangeError("byteOffset out of range"); } if (this.byteOffset % this.BYTES_PER_ELEMENT) { // The given byteOffset must be a multiple of the element // size of the specific type, otherwise an exception is raised. 
throw new RangeError("ArrayBuffer length minus the byteOffset is not a multiple of the element size."); } if (arguments.length < 3) { this.byteLength = this.buffer.byteLength - this.byteOffset; if (this.byteLength % this.BYTES_PER_ELEMENT) { throw new RangeError("length of buffer minus byteOffset not a multiple of the element size"); } this.length = this.byteLength / this.BYTES_PER_ELEMENT; } else { this.length = ECMAScript.ToUint32(length); this.byteLength = this.length * this.BYTES_PER_ELEMENT; } if ((this.byteOffset + this.byteLength) > this.buffer.byteLength) { throw new RangeError("byteOffset and length reference an area beyond the end of the buffer"); } } else { throw new TypeError("Unexpected argument type(s)"); } this.constructor = ctor; configureProperties(this); makeArrayAccessors(this); }; ctor.prototype = new ArrayBufferView(); ctor.prototype.BYTES_PER_ELEMENT = bytesPerElement; ctor.prototype._pack = pack; ctor.prototype._unpack = unpack; ctor.BYTES_PER_ELEMENT = bytesPerElement; // getter type (unsigned long index); ctor.prototype._getter = function(index) { if (arguments.length < 1) throw new SyntaxError("Not enough arguments"); index = ECMAScript.ToUint32(index); if (index >= this.length) { return undefined; } var bytes = [], i, o; for (i = 0, o = this.byteOffset + index * this.BYTES_PER_ELEMENT; i < this.BYTES_PER_ELEMENT; i += 1, o += 1) { bytes.push(this.buffer._bytes[o]); } return this._unpack(bytes); }; // NONSTANDARD: convenience alias for getter: type get(unsigned long index); ctor.prototype.get = ctor.prototype._getter; // setter void (unsigned long index, type value); ctor.prototype._setter = function(index, value) { if (arguments.length < 2) throw new SyntaxError("Not enough arguments"); index = ECMAScript.ToUint32(index); if (index >= this.length) { return undefined; } var bytes = this._pack(value), i, o; for (i = 0, o = this.byteOffset + index * this.BYTES_PER_ELEMENT; i < this.BYTES_PER_ELEMENT; i += 1, o += 1) { this.buffer._bytes[o] = bytes[i]; } }; // void set(TypedArray array, optional unsigned long offset); // void set(sequence<type> array, optional unsigned long offset); ctor.prototype.set = function(index, value) { if (arguments.length < 1) throw new SyntaxError("Not enough arguments"); var array, sequence, offset, len, i, s, d, byteOffset, byteLength, tmp; if (typeof arguments[0] === 'object' && arguments[0].constructor === this.constructor) { // void set(TypedArray array, optional unsigned long offset); array = arguments[0]; offset = ECMAScript.ToUint32(arguments[1]); if (offset + array.length > this.length) { throw new RangeError("Offset plus length of array is out of range"); } byteOffset = this.byteOffset + offset * this.BYTES_PER_ELEMENT; byteLength = array.length * this.BYTES_PER_ELEMENT; if (array.buffer === this.buffer) { tmp = []; for (i = 0, s = array.byteOffset; i < byteLength; i += 1, s += 1) { tmp[i] = array.buffer._bytes[s]; } for (i = 0, d = byteOffset; i < byteLength; i += 1, d += 1) { this.buffer._bytes[d] = tmp[i]; } } else { for (i = 0, s = array.byteOffset, d = byteOffset; i < byteLength; i += 1, s += 1, d += 1) { this.buffer._bytes[d] = array.buffer._bytes[s]; } } } else if (typeof arguments[0] === 'object' && typeof arguments[0].length !== 'undefined') { // void set(sequence<type> array, optional unsigned long offset); sequence = arguments[0]; len = ECMAScript.ToUint32(sequence.length); offset = ECMAScript.ToUint32(arguments[1]); if (offset + len > this.length) { throw new RangeError("Offset plus length of array is out of range"); 
} for (i = 0; i < len; i += 1) { s = sequence[i]; this._setter(offset + i, Number(s)); } } else { throw new TypeError("Unexpected argument type(s)"); } }; // TypedArray subarray(long begin, optional long end); ctor.prototype.subarray = function(start, end) { function clamp(v, min, max) { return v < min ? min : v > max ? max : v; } start = ECMAScript.ToInt32(start); end = ECMAScript.ToInt32(end); if (arguments.length < 1) { start = 0; } if (arguments.length < 2) { end = this.length; } if (start < 0) { start = this.length + start; } if (end < 0) { end = this.length + end; } start = clamp(start, 0, this.length); end = clamp(end, 0, this.length); var len = end - start; if (len < 0) { len = 0; } return new this.constructor( this.buffer, this.byteOffset + start * this.BYTES_PER_ELEMENT, len); }; return ctor; } var Int8Array = makeConstructor(1, packI8, unpackI8); var Uint8Array = makeConstructor(1, packU8, unpackU8); var Uint8ClampedArray = makeConstructor(1, packU8Clamped, unpackU8); var Int16Array = makeConstructor(2, packI16, unpackI16); var Uint16Array = makeConstructor(2, packU16, unpackU16); var Int32Array = makeConstructor(4, packI32, unpackI32); var Uint32Array = makeConstructor(4, packU32, unpackU32); var Float32Array = makeConstructor(4, packF32, unpackF32); var Float64Array = makeConstructor(8, packF64, unpackF64); exports.Int8Array = exports.Int8Array || Int8Array; exports.Uint8Array = exports.Uint8Array || Uint8Array; exports.Uint8ClampedArray = exports.Uint8ClampedArray || Uint8ClampedArray; exports.Int16Array = exports.Int16Array || Int16Array; exports.Uint16Array = exports.Uint16Array || Uint16Array; exports.Int32Array = exports.Int32Array || Int32Array; exports.Uint32Array = exports.Uint32Array || Uint32Array; exports.Float32Array = exports.Float32Array || Float32Array; exports.Float64Array = exports.Float64Array || Float64Array; }()); // // 6 The DataView View Type // (function() { function r(array, index) { return ECMAScript.IsCallable(array.get) ? 
array.get(index) : array[index]; } var IS_BIG_ENDIAN = (function() { var u16array = new(exports.Uint16Array)([0x1234]), u8array = new(exports.Uint8Array)(u16array.buffer); return r(u8array, 0) === 0x12; }()); // Constructor(ArrayBuffer buffer, // optional unsigned long byteOffset, // optional unsigned long byteLength) /** @constructor */ var DataView = function DataView(buffer, byteOffset, byteLength) { if (arguments.length === 0) { buffer = new exports.ArrayBuffer(0); } else if (!(buffer instanceof exports.ArrayBuffer || ECMAScript.Class(buffer) === 'ArrayBuffer')) { throw new TypeError("TypeError"); } this.buffer = buffer || new exports.ArrayBuffer(0); this.byteOffset = ECMAScript.ToUint32(byteOffset); if (this.byteOffset > this.buffer.byteLength) { throw new RangeError("byteOffset out of range"); } if (arguments.length < 3) { this.byteLength = this.buffer.byteLength - this.byteOffset; } else { this.byteLength = ECMAScript.ToUint32(byteLength); } if ((this.byteOffset + this.byteLength) > this.buffer.byteLength) { throw new RangeError("byteOffset and length reference an area beyond the end of the buffer"); } configureProperties(this); }; function makeGetter(arrayType) { return function(byteOffset, littleEndian) { byteOffset = ECMAScript.ToUint32(byteOffset); if (byteOffset + arrayType.BYTES_PER_ELEMENT > this.byteLength) { throw new RangeError("Array index out of range"); } byteOffset += this.byteOffset; var uint8Array = new exports.Uint8Array(this.buffer, byteOffset, arrayType.BYTES_PER_ELEMENT), bytes = [], i; for (i = 0; i < arrayType.BYTES_PER_ELEMENT; i += 1) { bytes.push(r(uint8Array, i)); } if (Boolean(littleEndian) === Boolean(IS_BIG_ENDIAN)) { bytes.reverse(); } return r(new arrayType(new exports.Uint8Array(bytes).buffer), 0); }; } DataView.prototype.getUint8 = makeGetter(exports.Uint8Array); DataView.prototype.getInt8 = makeGetter(exports.Int8Array); DataView.prototype.getUint16 = makeGetter(exports.Uint16Array); DataView.prototype.getInt16 = makeGetter(exports.Int16Array); DataView.prototype.getUint32 = makeGetter(exports.Uint32Array); DataView.prototype.getInt32 = makeGetter(exports.Int32Array); DataView.prototype.getFloat32 = makeGetter(exports.Float32Array); DataView.prototype.getFloat64 = makeGetter(exports.Float64Array); function makeSetter(arrayType) { return function(byteOffset, value, littleEndian) { byteOffset = ECMAScript.ToUint32(byteOffset); if (byteOffset + arrayType.BYTES_PER_ELEMENT > this.byteLength) { throw new RangeError("Array index out of range"); } // Get bytes var typeArray = new arrayType([value]), byteArray = new exports.Uint8Array(typeArray.buffer), bytes = [], i, byteView; for (i = 0; i < arrayType.BYTES_PER_ELEMENT; i += 1) { bytes.push(r(byteArray, i)); } // Flip if necessary if (Boolean(littleEndian) === Boolean(IS_BIG_ENDIAN)) { bytes.reverse(); } // Write them byteView = new exports.Uint8Array(this.buffer, byteOffset, arrayType.BYTES_PER_ELEMENT); byteView.set(bytes); }; } DataView.prototype.setUint8 = makeSetter(exports.Uint8Array); DataView.prototype.setInt8 = makeSetter(exports.Int8Array); DataView.prototype.setUint16 = makeSetter(exports.Uint16Array); DataView.prototype.setInt16 = makeSetter(exports.Int16Array); DataView.prototype.setUint32 = makeSetter(exports.Uint32Array); DataView.prototype.setInt32 = makeSetter(exports.Int32Array); DataView.prototype.setFloat32 = makeSetter(exports.Float32Array); DataView.prototype.setFloat64 = makeSetter(exports.Float64Array); exports.DataView = exports.DataView || DataView; }()); 
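A hedged sketch of exercising the exported polyfill constructors directly (mirroring how `concat-stream` requires this module); in a Node module context the exports are always the polyfill implementations shown above:

```js
var TA = require('typedarray')

// Pack a float into 4 bytes and read it back through the polyfilled DataView.
var buf = new TA.ArrayBuffer(4)
var view = new TA.DataView(buf)
view.setFloat32(0, 1.5)         // big-endian by default (littleEndian flag omitted)
console.log(view.getFloat32(0)) // 1.5

// A typed array view constructed from a plain sequence.
var u8 = new TA.Uint8Array([1, 2, 3])
console.log(u8.get(2)) // 3 -- .get()/.set() are the polyfill's subscript aliases
```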
npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/LICENSE000644 000766 000024 00000003035 12455173731 042300 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
/*
 Copyright (c) 2010, Linden Research, Inc.
 Copyright (c) 2012, Joshua Bell

 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
 in the Software without restriction, including without limitation the rights
 to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 copies of the Software, and to permit persons to whom the Software is
 furnished to do so, subject to the following conditions:

 The above copyright notice and this permission notice shall be included in
 all copies or substantial portions of the Software.

 THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 THE SOFTWARE.
 $/LicenseInfo$ */

// Original can be found at:
//   https://bitbucket.org/lindenlab/llsd
// Modifications by Joshua Bell inexorabletash@gmail.com
//   https://github.com/inexorabletash/polyfill

// ES3/ES5 implementation of the Khronos Typed Array Specification
//   Ref: http://www.khronos.org/registry/typedarray/specs/latest/
//   Date: 2011-02-01
//
// Variations:
//  * Allows typed_array.get/set() as alias for subscripts (typed_array[])
npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/package.json000644 000766 000024 00000003505 12455173731 043563 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules
{ "name": "typedarray", "version": "0.0.6", "description": "TypedArray polyfill for old browsers", "main": "index.js", "devDependencies": { "tape": "~2.3.2" }, "scripts": { "test": "tape test/*.js test/server/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/typedarray.git" }, "homepage": "https://github.com/substack/typedarray", "keywords": [ "ArrayBuffer", "DataView", "Float32Array", "Float64Array", "Int8Array", "Int16Array", "Int32Array", "Uint8Array", "Uint8ClampedArray", "Uint16Array", "Uint32Array", "typed", "array", "polyfill" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "firefox/16..latest", "firefox/nightly", "chrome/22..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "bugs": { "url": "https://github.com/substack/typedarray/issues" }, "_id": "typedarray@0.0.6", "dist": { "shasum": "867ac74e3864187b1d3d47d996a78ec5c8830777", "tarball": "http://registry.npmjs.org/typedarray/-/typedarray-0.0.6.tgz" }, "_from": "typedarray@>=0.0.5 <0.1.0", "_npmVersion": "1.4.3", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "directories": {}, "_shasum": "867ac74e3864187b1d3d47d996a78ec5c8830777", "_resolved": "https://registry.npmjs.org/typedarray/-/typedarray-0.0.6.tgz", "readme": "ERROR: No README data found!"
}
node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/readme.markdown000644 000766 000024 00000002044 12455173731 044273 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm
# typedarray

TypedArray polyfill ripped from [this module](https://raw.github.com/inexorabletash/polyfill).

[![build status](https://secure.travis-ci.org/substack/typedarray.png)](http://travis-ci.org/substack/typedarray)

[![testling badge](https://ci.testling.com/substack/typedarray.png)](https://ci.testling.com/substack/typedarray)

# example

``` js
var Uint8Array = require('typedarray').Uint8Array;
var ua = new Uint8Array(5);
ua[1] = 256 + 55;
console.log(ua[1]);
```

output:

```
55
```

# methods

``` js
var TA = require('typedarray')
```

The `TA` object has the following constructors:

* TA.ArrayBuffer
* TA.DataView
* TA.Float32Array
* TA.Float64Array
* TA.Int8Array
* TA.Int16Array
* TA.Int32Array
* TA.Uint8Array
* TA.Uint8ClampedArray
* TA.Uint16Array
* TA.Uint32Array

# install

With [npm](https://npmjs.org) do:

```
npm install typedarray
```

To use this module in the browser, compile with [browserify](http://browserify.org) or download a UMD build from browserify CDN:

http://wzrd.in/standalone/typedarray@latest

# license

MIT
npm-registry-client/node_modules/concat-stream/node_modules/typedarray/example/tarray.js000644 000766 000024 00000000156 12455173731 044567 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules
var Uint8Array = require('../').Uint8Array;
var ua = new Uint8Array(5);
ua[1] = 256 + 55;
console.log(ua[1]);
node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/.npmignore000644 000766 000024 00000000044 12455173731 044133
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������build/ test/ examples/ fs.js zlib.js��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/duplex.js��000644 �000766 �000024 �00000000064 12455173731 043775� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_duplex.js") ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/float.patch000644 �000766 �000024 �00000072424 12455173731 044275� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm���������������������������������������������������������������������������������������������������������������������������� module.exports = PassThrough; -var Transform = require('_stream_transform'); +var Transform = require('./_stream_transform'); var util = require('util'); util.inherits(PassThrough, Transform); diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index 0c3fe3e..90a8298 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -23,10 +23,34 @@ module.exports = Readable; Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; +if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { + return emitter.listeners(type).length; +}; + +if (!global.setImmediate) global.setImmediate = function setImmediate(fn) { + return setTimeout(fn, 0); +}; +if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) { + return clearTimeout(i); +}; + var Stream = require('stream'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var StringDecoder; -var debug = util.debuglog('stream'); +var debug; +if (util.debuglog) + debug = util.debuglog('stream'); +else try { + debug = 
require('debuglog')('stream'); +} catch (er) { + debug = function() {}; +} util.inherits(Readable, Stream); @@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) { function onEofChunk(stream, state) { - if (state.decoder && !state.ended) { + if (state.decoder && !state.ended && state.decoder.end) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js index b1f9fcc..b0caf57 100644 --- a/lib/_stream_transform.js +++ b/lib/_stream_transform.js @@ -64,8 +64,14 @@ module.exports = Transform; -var Duplex = require('_stream_duplex'); +var Duplex = require('./_stream_duplex'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} util.inherits(Transform, Duplex); diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index ba2e920..f49288b 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -27,6 +27,12 @@ module.exports = Writable; Writable.WritableState = WritableState; var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var Stream = require('stream'); util.inherits(Writable, Stream); @@ -119,7 +125,7 @@ function WritableState(options, stream) { function Writable(options) { // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. - if (!(this instanceof Writable) && !(this instanceof Stream.Duplex)) + if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex'))) return new Writable(options); this._writableState = new WritableState(options, this); diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js index e3787e4..8cd2127 100644 --- a/test/simple/test-stream-big-push.js +++ b/test/simple/test-stream-big-push.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var str = 'asdfasdfasdfasdfasdf'; var r = new stream.Readable({ diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js index bb73777..d40efc7 100644 --- a/test/simple/test-stream-end-paused.js +++ b/test/simple/test-stream-end-paused.js @@ -25,7 +25,7 @@ var gotEnd = false; // Make sure we don't miss the end event for paused 0-length streams -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var stream = new Readable(); var calledRead = false; stream._read = function() { diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js index b46ee90..0be8366 100644 --- a/test/simple/test-stream-pipe-after-end.js +++ b/test/simple/test-stream-pipe-after-end.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var util = require('util'); util.inherits(TestReadable, Readable); diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js deleted file mode 100644 index f689358..0000000 --- a/test/simple/test-stream-pipe-cleanup.js +++ /dev/null @@ -1,122 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. 
-// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -// This test asserts that Stream.prototype.pipe does not leave listeners -// hanging on the source or dest. - -var common = require('../common'); -var stream = require('stream'); -var assert = require('assert'); -var util = require('util'); - -function Writable() { - this.writable = true; - this.endCalls = 0; - stream.Stream.call(this); -} -util.inherits(Writable, stream.Stream); -Writable.prototype.end = function() { - this.endCalls++; -}; - -Writable.prototype.destroy = function() { - this.endCalls++; -}; - -function Readable() { - this.readable = true; - stream.Stream.call(this); -} -util.inherits(Readable, stream.Stream); - -function Duplex() { - this.readable = true; - Writable.call(this); -} -util.inherits(Duplex, Writable); - -var i = 0; -var limit = 100; - -var w = new Writable(); - -var r; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('end'); -} -assert.equal(0, r.listeners('end').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('close'); -} -assert.equal(0, r.listeners('close').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -r = new Readable(); - -for (i = 0; i < limit; i++) { - w = new Writable(); - r.pipe(w); - w.emit('close'); -} -assert.equal(0, w.listeners('close').length); - -r = new Readable(); -w = new Writable(); -var d = new Duplex(); -r.pipe(d); // pipeline A -d.pipe(w); // pipeline B -assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup -assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -r.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 0); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -d.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 1); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); 
-assert.equal(d.listeners('end').length, 0); -assert.equal(d.listeners('close').length, 0); -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 0); diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js index c5d724b..c7d6b7d 100644 --- a/test/simple/test-stream-pipe-error-handling.js +++ b/test/simple/test-stream-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream').Stream; +var Stream = require('../../').Stream; (function testErrorListenerCatches() { var source = new Stream(); diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js index cb9d5fe..56f8d61 100644 --- a/test/simple/test-stream-pipe-event.js +++ b/test/simple/test-stream-pipe-event.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common'); -var stream = require('stream'); +var stream = require('../../'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js index f2e6ec2..a5c9bf9 100644 --- a/test/simple/test-stream-push-order.js +++ b/test/simple/test-stream-push-order.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var assert = require('assert'); var s = new Readable({ diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js index 06f43dc..1701a9a 100644 --- a/test/simple/test-stream-push-strings.js +++ b/test/simple/test-stream-push-strings.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var util = require('util'); util.inherits(MyStream, Readable); diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js index ba6a577..a8e6f7b 100644 --- a/test/simple/test-stream-readable-event.js +++ b/test/simple/test-stream-readable-event.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; (function first() { // First test, not reading when the readable is added. diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js index 2891ad6..11689ba 100644 --- a/test/simple/test-stream-readable-flow-recursion.js +++ b/test/simple/test-stream-readable-flow-recursion.js @@ -27,7 +27,7 @@ var assert = require('assert'); // more data continuously, but without triggering a nextTick // warning or RangeError. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; // throw an error if we trigger a nextTick warning. process.throwDeprecation = true; diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js index 0c96476..7827538 100644 --- a/test/simple/test-stream-unshift-empty-chunk.js +++ b/test/simple/test-stream-unshift-empty-chunk.js @@ -24,7 +24,7 @@ var assert = require('assert'); // This test verifies that stream.unshift(Buffer(0)) or // stream.unshift('') does not set state.reading=false. 
-var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var nChunks = 10; diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js index 83fd9fa..17c18aa 100644 --- a/test/simple/test-stream-unshift-read-race.js +++ b/test/simple/test-stream-unshift-read-race.js @@ -29,7 +29,7 @@ var assert = require('assert'); // 3. push() after the EOF signaling null is an error. // 4. _read() is not called after pushing the EOF null chunk. -var stream = require('stream'); +var stream = require('../../'); var hwm = 10; var r = stream.Readable({ highWaterMark: hwm }); var chunks = 10; @@ -51,7 +51,14 @@ r._read = function(n) { function push(fast) { assert(!pushedNull, 'push() after null push'); - var c = pos >= data.length ? null : data.slice(pos, pos + n); + var c; + if (pos >= data.length) + c = null; + else { + if (n + pos > data.length) + n = data.length - pos; + c = data.slice(pos, pos + n); + } pushedNull = c === null; if (fast) { pos += n; diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js index 5b49e6e..b5321f3 100644 --- a/test/simple/test-stream-writev.js +++ b/test/simple/test-stream-writev.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var queue = []; for (var decode = 0; decode < 2; decode++) { diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js index 3814bf0..248c1be 100644 --- a/test/simple/test-stream2-basic.js +++ b/test/simple/test-stream2-basic.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js index 6cdd4e9..f0fa84b 100644 --- a/test/simple/test-stream2-compatibility.js +++ b/test/simple/test-stream2-compatibility.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js index 39b274f..006a19b 100644 --- a/test/simple/test-stream2-finish-pipe.js +++ b/test/simple/test-stream2-finish-pipe.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Buffer = require('buffer').Buffer; var r = new stream.Readable(); diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js deleted file mode 100644 index e162406..0000000 --- a/test/simple/test-stream2-fs.js +++ /dev/null @@ -1,72 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. 
-// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - - -var common = require('../common.js'); -var R = require('_stream_readable'); -var assert = require('assert'); - -var fs = require('fs'); -var FSReadable = fs.ReadStream; - -var path = require('path'); -var file = path.resolve(common.fixturesDir, 'x1024.txt'); - -var size = fs.statSync(file).size; - -var expectLengths = [1024]; - -var util = require('util'); -var Stream = require('stream'); - -util.inherits(TestWriter, Stream); - -function TestWriter() { - Stream.apply(this); - this.buffer = []; - this.length = 0; -} - -TestWriter.prototype.write = function(c) { - this.buffer.push(c.toString()); - this.length += c.length; - return true; -}; - -TestWriter.prototype.end = function(c) { - if (c) this.buffer.push(c.toString()); - this.emit('results', this.buffer); -} - -var r = new FSReadable(file); -var w = new TestWriter(); - -w.on('results', function(res) { - console.error(res, w.length); - assert.equal(w.length, size); - var l = 0; - assert.deepEqual(res.map(function (c) { - return c.length; - }), expectLengths); - console.log('ok'); -}); - -r.pipe(w); diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js deleted file mode 100644 index 15cffc2..0000000 --- a/test/simple/test-stream2-httpclient-response-end.js +++ /dev/null @@ -1,52 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -var common = require('../common.js'); -var assert = require('assert'); -var http = require('http'); -var msg = 'Hello'; -var readable_event = false; -var end_event = false; -var server = http.createServer(function(req, res) { - res.writeHead(200, {'Content-Type': 'text/plain'}); - res.end(msg); -}).listen(common.PORT, function() { - http.get({port: common.PORT}, function(res) { - var data = ''; - res.on('readable', function() { - console.log('readable event'); - readable_event = true; - data += res.read(); - }); - res.on('end', function() { - console.log('end event'); - end_event = true; - assert.strictEqual(msg, data); - server.close(); - }); - }); -}); - -process.on('exit', function() { - assert(readable_event); - assert(end_event); -}); - diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js index 2fbfbca..667985b 100644 --- a/test/simple/test-stream2-large-read-stall.js +++ b/test/simple/test-stream2-large-read-stall.js @@ -30,7 +30,7 @@ var PUSHSIZE = 20; var PUSHCOUNT = 1000; var HWM = 50; -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable({ highWaterMark: HWM }); @@ -39,23 +39,23 @@ var rs = r._readableState; r._read = push; r.on('readable', function() { - console.error('>> readable'); + //console.error('>> readable'); do { - console.error(' > read(%d)', READSIZE); + //console.error(' > read(%d)', READSIZE); var ret = r.read(READSIZE); - console.error(' < %j (%d remain)', ret && ret.length, rs.length); + //console.error(' < %j (%d remain)', ret && ret.length, rs.length); } while (ret && ret.length === READSIZE); - console.error('<< after read()', - ret && ret.length, - rs.needReadable, - rs.length); + //console.error('<< after read()', + // ret && ret.length, + // rs.needReadable, + // rs.length); }); var endEmitted = false; r.on('end', function() { endEmitted = true; - console.error('end'); + //console.error('end'); }); var pushes = 0; @@ -64,11 +64,11 @@ function push() { return; if (pushes++ === PUSHCOUNT) { - console.error(' push(EOF)'); + //console.error(' push(EOF)'); return r.push(null); } - console.error(' push #%d', pushes); + //console.error(' push #%d', pushes); if (r.push(new Buffer(PUSHSIZE))) setTimeout(push); } diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js index 3e6931d..ff47d89 100644 --- a/test/simple/test-stream2-objects.js +++ b/test/simple/test-stream2-objects.js @@ -21,8 +21,8 @@ var common = require('../common.js'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var assert = require('assert'); // tiny node-tap lookalike. 
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js index cf7531c..e3f3e4e 100644 --- a/test/simple/test-stream2-pipe-error-handling.js +++ b/test/simple/test-stream2-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); (function testErrorListenerCatches() { var count = 1000; diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js index 5e8e3cb..53b2616 100755 --- a/test/simple/test-stream2-pipe-error-once-listener.js +++ b/test/simple/test-stream2-pipe-error-once-listener.js @@ -24,7 +24,7 @@ var common = require('../common.js'); var assert = require('assert'); var util = require('util'); -var stream = require('stream'); +var stream = require('../../'); var Read = function() { diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js index b63edc3..eb2b0e9 100644 --- a/test/simple/test-stream2-push.js +++ b/test/simple/test-stream2-push.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; var assert = require('assert'); diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js index e8a7305..9740a47 100644 --- a/test/simple/test-stream2-read-sync-stack.js +++ b/test/simple/test-stream2-read-sync-stack.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var N = 256 * 1024; diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js index cd30178..4b1659d 100644 --- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js +++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js @@ -22,10 +22,9 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; test1(); -test2(); function test1() { var r = new Readable(); @@ -88,31 +87,3 @@ function test1() { console.log('ok'); }); } - -function test2() { - var r = new Readable({ encoding: 'base64' }); - var reads = 5; - r._read = function(n) { - if (!reads--) - return r.push(null); // EOF - else - return r.push(new Buffer('x')); - }; - - var results = []; - function flow() { - var chunk; - while (null !== (chunk = r.read())) - results.push(chunk + ''); - } - r.on('readable', flow); - r.on('end', function() { - results.push('EOF'); - }); - flow(); - - process.on('exit', function() { - assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]); - console.log('ok'); - }); -} diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js index 7c96ffe..04a96f5 100644 --- a/test/simple/test-stream2-readable-from-list.js +++ b/test/simple/test-stream2-readable-from-list.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var fromList = require('_stream_readable')._fromList; +var fromList = require('../../lib/_stream_readable')._fromList; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js index 675da8e..51fd3d5 100644 --- a/test/simple/test-stream2-readable-legacy-drain.js +++ b/test/simple/test-stream2-readable-legacy-drain.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream'); +var Stream = require('../../'); var Readable = Stream.Readable; var r = new Readable(); diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js index 7314ae7..c971898 100644 --- a/test/simple/test-stream2-readable-non-empty-end.js +++ b/test/simple/test-stream2-readable-non-empty-end.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var len = 0; var chunks = new Array(10); diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js index 2e5cf25..fd8a3dc 100644 --- a/test/simple/test-stream2-readable-wrap-empty.js +++ b/test/simple/test-stream2-readable-wrap-empty.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var EE = require('events').EventEmitter; var oldStream = new EE(); diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js index 90eea01..6b177f7 100644 --- a/test/simple/test-stream2-readable-wrap.js +++ b/test/simple/test-stream2-readable-wrap.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var EE = require('events').EventEmitter; var testRuns = 0, completedRuns = 0; diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js index 5d2c32a..685531b 100644 --- a/test/simple/test-stream2-set-encoding.js +++ b/test/simple/test-stream2-set-encoding.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var util = require('util'); // tiny node-tap lookalike. diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js index 9c9ddd8..a0cacc6 100644 --- a/test/simple/test-stream2-transform.js +++ b/test/simple/test-stream2-transform.js @@ -21,8 +21,8 @@ var assert = require('assert'); var common = require('../common.js'); -var PassThrough = require('_stream_passthrough'); -var Transform = require('_stream_transform'); +var PassThrough = require('../../').PassThrough; +var Transform = require('../../').Transform; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js index d66dc3c..365b327 100644 --- a/test/simple/test-stream2-unpipe-drain.js +++ b/test/simple/test-stream2-unpipe-drain.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var crypto = require('crypto'); var util = require('util'); diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js index 99f8746..17c92ae 100644 --- a/test/simple/test-stream2-unpipe-leak.js +++ b/test/simple/test-stream2-unpipe-leak.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var chunk = new Buffer('hallo'); diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js index 704100c..209c3a6 100644 --- a/test/simple/test-stream2-writable.js +++ b/test/simple/test-stream2-writable.js @@ -20,8 +20,8 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var W = require('_stream_writable'); -var D = require('_stream_duplex'); +var W = require('../../').Writable; +var D = require('../../').Duplex; var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js index b91bde3..2f72c15 100644 --- a/test/simple/test-stream3-pause-then-read.js +++ b/test/simple/test-stream3-pause-then-read.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/���000755 �000766 �000024 �00000000000 12456115117 042677� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/LICENSE000644 �000766 �000024 �00000002110 12455173731 043135� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/�����������000755 �000766 �000024 �00000000000 12456115117 044606� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/package.json������������000644 �000766 �000024 �00000003265 12455173731 044432� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������{ "name": "readable-stream", "version": "1.1.13", "description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "3b672fd7ae92acf5b4ffdbabf74b372a0a56b051", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.1.13", "_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "_from": "readable-stream@>=1.1.9 <1.2.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz", "readme": "ERROR: No README data found!" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/passthrough.js����������000644 �000766 �000024 �00000000071 12455173731 045041� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_passthrough.js") �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/readable.js000644 �000766 �000024 �00000000551 12455173731 044234� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = require('stream'); exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); �������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/README.md��000644 �000766 �000024 �00000002427 12455173731 043422� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������# readable-stream ***Node-core streams for userland*** [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/) This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core. **readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12. **readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/transform.js������������000644 �000766 �000024 �00000000067 12455173731 044512� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_transform.js") �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/writable.js000644 �000766 �000024 �00000000066 12455173731 044307� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������module.exports = require("./lib/_stream_writable.js") 
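The one-line entry points above (readable.js, writable.js, transform.js, passthrough.js and duplex.js) simply re-export the lib/_stream_*.js implementations, which is what makes the README's advice workable: depend on readable-stream and the same Streams3 code runs on every Node/io.js version. A minimal, hypothetical consumer sketch (not part of the bundled sources):

``` js
// Hypothetical consumer of readable-stream; mirrors the substitution the
// README above recommends (require readable-stream instead of core 'stream').
var Readable = require('readable-stream').Readable;

var r = new Readable();
r._read = function () {
  this.push('hello ');
  this.push('world\n');
  this.push(null); // signal EOF
};
r.pipe(process.stdout);
```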
node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/000755 000766 000024 00000000000 12456115117 047122 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client
npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/000755 000766 000024 00000000000 12456115117 046260 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules
node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/000755 000766 000024 00000000000 12456115117 047601 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client
node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/.npmignore000644 000766 000024 00000000013 12455173731 051577 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client
build
test
node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/index.js000644 000766 000024 00000015006 12455173731 051255 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client
var Buffer =
require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/LICENSE
Copyright Joyent, Inc. and other Node contributors.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
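The decoder in index.js above is easiest to understand from its observable behaviour: `write()` never returns a broken multi-byte character, it buffers the leading bytes of an incomplete character until the rest arrive. A minimal illustrative sketch of that behaviour (not part of the package; the byte values below are simply the UTF-8 encoding of '€'):

```js
var StringDecoder = require('string_decoder').StringDecoder;

var decoder = new StringDecoder('utf8');
var euro = new Buffer([0xe2, 0x82, 0xac]); // '€' is three bytes in UTF-8

// Only two of the three bytes so far: write() returns '' and keeps the
// partial character in the decoder's internal charBuffer.
console.log(JSON.stringify(decoder.write(euro.slice(0, 2)))); // ""

// The final byte completes the character, so the whole '€' is emitted.
console.log(JSON.stringify(decoder.write(euro.slice(2)))); // "€"
```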
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/package.json
{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_from": "string_decoder@>=0.10.0 <0.11.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "readme": "ERROR: No README data found!" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/README.md
**string_decoder.js** (`require('string_decoder')`) from Node.js core

Copyright Joyent, Inc. and other Node contributors. See LICENSE file for details.

Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10.
**Prefer the stable version over the unstable.**

The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/build/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/component.json
{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/index.js
module.exports = Array.isArray || function (arr) {
  return Object.prototype.toString.call(arr) == '[object Array]';
};

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/package.json
{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" },
"dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "_id": "isarray@0.0.1", "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "_from": "isarray@0.0.1", "_npmVersion": "1.2.18", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/README.md��������������000644 �000766 �000024 �00000003025 12455173731 047544� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client������������������������������������������������������������������������������������������� # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). 
With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/build/build.js���������000644 �000766 �000024 �00000007770 12455173731 051034� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client������������������������������������������������������������������������������������������� /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. 
* * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. 
*/ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ��������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/float.patch�������000644 �000766 �000024 �00000037626 12455173731 051273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. 
- */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? 
': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? ': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}����������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/lib/��������������000755 �000766 �000024 �00000000000 12456115117 047670� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/package.json������000644 �000766 �000024 �00000002523 12455173731 051417� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": 
"git://github.com/isaacs/core-util-is" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is", "_id": "core-util-is@1.0.1", "dist": { "shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" }, "_from": "core-util-is@>=1.0.0 <1.1.0", "_npmVersion": "1.3.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz", "scripts": {} } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/README.md���������000644 �000766 �000024 �00000000103 12455173731 050400� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������# core-util-is The `util.is*` functions introduced in Node v0.12. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/util.js�����������000644 �000766 �000024 �00000003526 12455173731 050450� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/lib/util.js�������000644 �000766 �000024 �00000003562 12455173731 051216� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������// NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }����������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_duplex.js���000644 �000766 �000024 �00000003215 12455173731 046256� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. 
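// (Editorial note, not part of the original file: a TCP socket is the typical
// example of a Duplex – the same object can be both piped from and piped to,
//     socket.pipe(destination); source.pipe(socket);
// and the options handled below (readable, writable, allowHalfOpen) tune that
// behaviour.)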
module.exports = Duplex; /*<replacement>*/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /*</replacement>*/ /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/concat-stream/node_modules/readable-stream/lib/_stream_passthrough.js������������������000644 �000766 �000024 �00000001121 12455173731 047316� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client�������������������������������������������������������������������������������������������// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 
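// (Editorial note, not part of the original file: because _transform below
// simply calls cb(null, chunk), data is forwarded unchanged, so
//     source.pipe(new PassThrough()).pipe(destination);
// delivers the same bytes to `destination` as piping the two streams directly.)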
module.exports = PassThrough; var Transform = require('./_stream_transform'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_readable.js�000644 �000766 �000024 �00000060371 12455173731 046522� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������module.exports = Readable; /*<replacement>*/ var isArray = require('isarray'); /*</replacement>*/ /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /*<replacement>*/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /*</replacement>*/ var Stream = require('stream'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var StringDecoder; /*<replacement>*/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /*</replacement>*/ util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // Crypto is kind of old and crusty. 
Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (util.isString(chunk) && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (util.isNullOrUndefined(chunk)) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. 
Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (isNaN(n) || util.isNull(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (!util.isNumber(n) || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. 
if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (util.isNull(ret)) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (!util.isNull(ret)) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. 
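// Example (sketch): the read()/'readable' contract implemented above, seen from
// the consumer side. read() returns null when nothing is buffered (or, with an
// argument, when less than the requested amount is buffered), and another
// 'readable' fires once more data or EOF arrives. Assumes the public `stream`
// API that this vendored copy mirrors.
var fs = require('fs');
var src = fs.createReadStream(__filename);
src.on('readable', function() {
  var chunk;
  while (null !== (chunk = src.read()))
    console.log('got %d bytes of data', chunk.length);
});
src.on('end', function() {
  console.log('there will be no more data.');
});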
Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. 
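// Example (sketch): the pipeOpts handled above. Passing { end: false } keeps the
// destination open so another source can be piped in afterwards; the doEnd check
// above also shows that process.stdout/stderr are never ended automatically.
// File names are hypothetical. Assumes the public `stream` API.
var fs = require('fs');
var out = fs.createWriteStream('combined.log');
fs.createReadStream('first.log')
  .on('end', function() {
    fs.createReadStream('second.log').pipe(out);   // default: ends `out` when done
  })
  .pipe(out, { end: false });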
if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { var self = this; process.nextTick(function() { debug('readable nexttick read 0'); self.read(0); }); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; if (!state.reading) { debug('resume read 0'); this.read(0); } resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; process.nextTick(function() { resume_(stream, state); }); } } function resume_(stream, state) { state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. 
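// Example (sketch): consuming an old-style streams1 emitter through wrap(), as
// described in the comment above. 'some-legacy-module' is a hypothetical module
// that only emits 'data'/'end'. Assumes the public `stream` API that this
// vendored copy mirrors.
var OldReader = require('some-legacy-module');   // hypothetical
var CoreReadable = require('stream').Readable;
var wrapped = new CoreReadable().wrap(new OldReader());
wrapped.on('readable', function() {
  var chunk = wrapped.read();
  if (chunk !== null) console.log('%d bytes', chunk.length);
});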
Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); if (!chunk || !state.objectMode && !chunk.length) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (util.isFunction(stream[i]) && util.isUndefined(this[i])) { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. 
      if (!state.endEmitted && state.length === 0) {
        state.endEmitted = true;
        stream.readable = false;
        stream.emit('end');
      }
    });
  }
}

function forEach (xs, f) {
  for (var i = 0, l = xs.length; i < l; i++) {
    f(xs[i], i);
  }
}

function indexOf (xs, x) {
  for (var i = 0, l = xs.length; i < l; i++) {
    if (xs[i] === x) return i;
  }
  return -1;
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_transform.js

// a transform stream is a readable/writable stream where you do
// something with the data. Sometimes it's called a "filter",
// but that's not a great name for it, since that implies a thing where
// some bits pass through, and others are simply ignored. (That would
// be a valid example of a transform, of course.)
//
// While the output is causally related to the input, it's not a
// necessarily symmetric or synchronous transformation. For example,
// a zlib stream might take multiple plain-text writes(), and then
// emit a single compressed chunk some time in the future.
//
// Here's how this works:
//
// The Transform stream has all the aspects of the readable and writable
// stream classes. When you write(chunk), that calls _write(chunk,cb)
// internally, and returns false if there's a lot of pending writes
// buffered up. When you call read(), that calls _read(n) until
// there's enough pending readable data buffered up.
//
// In a transform stream, the written data is placed in a buffer. When
// _read(n) is called, it transforms the queued up data, calling the
// buffered _write cb's as it consumes chunks. If consuming a single
// written chunk would result in multiple output chunks, then the first
// outputted bit calls the readcb, and subsequent chunks just go into
// the read buffer, and will cause it to emit 'readable' if necessary.
//
// This way, back-pressure is actually determined by the reading side,
// since _read has to be called to start processing a new chunk. However,
// a pathological inflate type of transform can cause excessive buffering
// here. For example, imagine a stream where every byte of input is
// interpreted as an integer from 0-255, and then results in that many
// bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in
// 1kb of data being output. In this case, you could write a very small
// amount of input, and end up with a very large amount of output. In
// such a pathological inflating mechanism, there'd be no way to tell
// the system to stop doing the transform. A single 4MB write could
// cause the system to run out of memory.
//
// However, even in such a pathological case, only a single written chunk
// would be consumed, and then the rest would wait (un-transformed) until
// the results of the previous transformed chunk were consumed.
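// Example (sketch): a minimal Transform of the kind described above -- an
// uppercasing "filter" in which each written chunk yields exactly one output
// chunk. Uses the public `stream` API that this vendored copy mirrors.
var CoreTransform = require('stream').Transform;
var inherits = require('util').inherits;

function Upper(options) {
  if (!(this instanceof Upper)) return new Upper(options);
  CoreTransform.call(this, options);
}
inherits(Upper, CoreTransform);

Upper.prototype._transform = function(chunk, encoding, cb) {
  // push() hands transformed data to the readable side; cb() releases the
  // buffered _write callback so the next written chunk can be consumed.
  this.push(chunk.toString('utf8').toUpperCase());
  cb();
};

// e.g. pipe stdin through it:  process.stdin.pipe(new Upper()).pipe(process.stdout);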
module.exports = Transform; var Duplex = require('./_stream_duplex'); /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (!util.isNullOrUndefined(data)) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('prefinish', function() { if (util.isFunction(this._flush)) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
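// Example (sketch): a pass-through Transform that also implements _flush, which
// the 'prefinish' handler above invokes once the writable side has ended, to
// append one trailing summary chunk. cb(null, data) is the shorthand that
// afterTransform() supports for pushing data and then releasing the write
// callback. Uses the public `stream` API that this vendored copy mirrors.
var CoreTransform = require('stream').Transform;
var inherits = require('util').inherits;

function ByteCounter(options) {
  if (!(this instanceof ByteCounter)) return new ByteCounter(options);
  CoreTransform.call(this, options);
  this.total = 0;
}
inherits(ByteCounter, CoreTransform);

ByteCounter.prototype._transform = function(chunk, encoding, cb) {
  this.total += chunk.length;   // count, then pass the chunk through unchanged
  cb(null, chunk);
};

ByteCounter.prototype._flush = function(cb) {
  this.push('\ntotal bytes: ' + this.total + '\n');
  cb();
};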
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_writable.js�000644 �000766 �000024 �00000027237 12455173731 046600� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules���������������������������������������������������������������������������������������������������������������// A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /*<replacement>*/ var Buffer = require('buffer').Buffer; /*</replacement>*/ Writable.WritableState = WritableState; /*<replacement>*/ var util = require('core-util-is'); util.inherits = require('inherits'); /*</replacement>*/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. 
this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
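// Example (sketch): the objectMode accounting described above. Every written
// value counts as length 1 against highWaterMark, so write() starts returning
// false once four un-flushed rows are queued here. Uses the public `stream` API
// that this vendored copy mirrors.
var CoreWritable = require('stream').Writable;
var inherits = require('util').inherits;

function RowLogger() {
  if (!(this instanceof RowLogger)) return new RowLogger();
  CoreWritable.call(this, { objectMode: true, highWaterMark: 4 });
}
inherits(RowLogger, CoreWritable);

RowLogger.prototype._write = function(row, encoding, cb) {
  console.log('row:', JSON.stringify(row));
  cb();   // signal that this chunk is flushed; the pending length drops by 1
};

var rows = new RowLogger();
console.log(rows.write({ id: 1 }));   // true: queued length is still below highWaterMark
rows.end({ id: 2 });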
function validChunk(stream, state, chunk, cb) { var valid = true; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (util.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (!util.isFunction(cb)) cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.buffer.length) clearBuffer(this, state); } }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && util.isString(chunk)) { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (util.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
if (!ret) state.needDrain = true; if (state.writing || state.corked) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, false, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { state.pendingcb--; cb(er); }); else { state.pendingcb--; cb(er); } stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.corked && !state.bufferProcessing && state.buffer.length) { clearBuffer(stream, state); } if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; if (stream._writev && state.buffer.length > 1) { // Fast case, write everything using _writev() var cbs = []; for (var c = 0; c < state.buffer.length; c++) cbs.push(state.buffer[c].callback); // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; doWrite(stream, state, true, state.length, state.buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer state.buffer = []; } else { // Slow case, write chunks one-by-one for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { c++; break; } } if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (util.isFunction(chunk)) { cb = chunk; chunk = null; encoding = null; } else if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (!util.isNullOrUndefined(chunk)) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(stream, state) { return (state.ending && state.length === 0 && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(stream, state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else prefinish(stream, state); } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) process.nextTick(cb); else stream.once('finish', cb); } state.ended = true; } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/adduser.js���������000644 �000766 �000024 �00000007446 12455173731 032041� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = adduser var url = require("url") var assert = require("assert") function adduser (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to adduser") assert( params && typeof params === "object", "must pass params to adduser" ) assert(typeof cb === "function", "must pass callback to adduser") assert(params.auth && typeof params.auth, "must pass auth to adduser") var auth = params.auth assert(typeof auth.username === "string", "must include username in auth") assert(typeof auth.password === "string", "must include password in auth") assert(typeof auth.email === "string", "must include email in auth") // normalize registry URL if (uri.slice(-1) !== "/") uri += "/" var username = auth.username.trim() var password = auth.password.trim() var email = auth.email.trim() // validation if (!username) return cb(new Error("No username supplied.")) if (!password) return cb(new Error("No password supplied.")) if (!email) return cb(new Error("No email address supplied.")) if (!email.match(/^[^@]+@[^\.]+\.[^\.]+/)) { return cb(new Error("Please use a real email address.")) } var userobj = { _id : "org.couchdb.user:"+username, name : username, password : password, email : email, type 
: "user", roles : [], date : new Date().toISOString() } var token = this.config.couchToken if (this.couchLogin) this.couchLogin.token = null cb = done.call(this, token, cb) var logObj = Object.keys(userobj).map(function (k) { if (k === "password") return [k, "XXXXX"] return [k, userobj[k]] }).reduce(function (s, kv) { s[kv[0]] = kv[1] return s }, {}) this.log.verbose("adduser", "before first PUT", logObj) var client = this uri = url.resolve(uri, "-/user/org.couchdb.user:" + encodeURIComponent(username)) var options = { method : "PUT", body : userobj, auth : auth } this.request( uri, options, function (error, data, json, response) { if (!error || !response || response.statusCode !== 409) { return cb(error, data, json, response) } client.log.verbose("adduser", "update existing user") return client.request( uri+"?write=true", { body : userobj, auth : auth }, function (er, data, json, response) { if (er || data.error) { return cb(er, data, json, response) } Object.keys(data).forEach(function (k) { if (!userobj[k] || k === "roles") { userobj[k] = data[k] } }) client.log.verbose("adduser", "userobj", logObj) client.request(uri+"/-rev/"+userobj._rev, options, cb) } ) } ) function done (token, cb) { return function (error, data, json, response) { if (!error && (!response || response.statusCode === 201)) { return cb(error, data, json, response) } // there was some kind of error, reinstate previous auth/token/etc. if (client.couchLogin) { client.couchLogin.token = token if (client.couchLogin.tokenSet) { client.couchLogin.tokenSet(token) } } client.log.verbose("adduser", "back", [error, data, json]) if (!error) { error = new Error( (response && response.statusCode || "") + " " + "Could not create user\n" + JSON.stringify(data) ) } if (response && (response.statusCode === 401 || response.statusCode === 403)) { client.log.warn("adduser", "Incorrect username or password\n" + "You can reset your account by visiting:\n" + "\n" + " https://npmjs.org/forgot\n") } return cb(error) } } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/attempt.js���������000644 �000766 �000024 �00000000733 12455173731 032060� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var retry = require("retry") module.exports = attempt function attempt(cb) { // Tuned to spread 3 attempts over about a minute. // See formula at <https://github.com/tim-kos/node-retry>. 
  var operation = retry.operation(this.config.retry)

  var client = this
  operation.attempt(function (currentAttempt) {
    client.log.info("attempt", "registry request try #"+currentAttempt+
      " at "+(new Date()).toLocaleTimeString())

    cb(operation)
  })
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/authify.js

module.exports = authify

function authify (authed, parsed, headers, credentials) {
  if (credentials && credentials.token) {
    this.log.verbose("request", "using bearer token for auth")
    headers.authorization = "Bearer " + credentials.token

    return null
  }

  if (authed) {
    if (credentials && credentials.username && credentials.password) {
      var username = encodeURIComponent(credentials.username)
      var password = encodeURIComponent(credentials.password)
      parsed.auth = username + ":" + password
    }
    else {
      return new Error(
        "This request requires auth credentials. Run `npm login` and repeat the request."
      )
    }
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/deprecate.js

module.exports = deprecate

var assert = require("assert")
var url = require("url")
var semver = require("semver")

function deprecate (uri, params, cb) {
  assert(typeof uri === "string", "must pass registry URI to deprecate")
  assert(params && typeof params === "object", "must pass params to deprecate")
  assert(typeof cb === "function", "must pass callback to deprecate")

  assert(typeof params.version === "string", "must pass version to deprecate")
  assert(typeof params.message === "string", "must pass message to deprecate")
  assert(
    params.auth && typeof params.auth === "object",
    "must pass auth to deprecate"
  )

  var version = params.version
  var message = params.message
  var auth = params.auth

  if (semver.validRange(version) === null) {
    return cb(new Error("invalid version range: "+version))
  }

  this.get(uri + "?write=true", { auth : auth }, function (er, data) {
    if (er) return cb(er)
    // filter all the versions that match
    Object.keys(data.versions).filter(function (v) {
      return semver.satisfies(v, version)
    }).forEach(function (v) {
      data.versions[v].deprecated = message
    })
    // now update the doc on the registry
    var options = {
      method : "PUT",
      body   : data,
      auth   : auth
    }
    this.request(url.resolve(uri, data._id), options, cb)
  }.bind(this))
}
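// Example (sketch): a deprecate() call through an npm-registry-client instance,
// matching the assertions above. The registry URI, package name and credentials
// are placeholders; client construction details (config, logger) are assumed.
var RegClient = require("npm-registry-client")
var client = new RegClient()

client.deprecate("https://registry.npmjs.org/some-package", {
  version : "<2.0.0",                       // any valid semver range
  message : "please upgrade to 2.x",
  auth    : {
    username : "someone",
    password : "secret",
    email    : "someone@example.com"
  }
}, function (er, data) {
  if (er) return console.error("deprecate failed:", er.message)
  console.log("deprecated:", data)
})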
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/fetch.js�����������000644 �000766 �000024 �00000004616 12455173731 031477� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") , url = require("url") var request = require("request") , once = require("once") module.exports = fetch function fetch (uri, params, cb) { assert(typeof uri === "string", "must pass uri to request") assert(params && typeof params === "object", "must pass params to request") assert(typeof cb === "function", "must pass callback to request") cb = once(cb) var client = this this.attempt(function (operation) { makeRequest.call(client, uri, params, function (er, req) { if (er) return cb(er) req.on("error", function (er) { if (operation.retry(er)) { client.log.info("retry", "will retry, error on last attempt: " + er) } }) req.on("response", function (res) { client.log.http("fetch", "" + res.statusCode, uri) var er var statusCode = res && res.statusCode if (statusCode === 200) { // Work around bug in node v0.10.0 where the CryptoStream // gets stuck and never starts reading again. res.resume() if (process.version === "v0.10.0") unstick(res) return cb(null, res) } // Only retry on 408, 5xx or no `response`. else if (statusCode === 408) { er = new Error("request timed out") } else if (statusCode >= 500) { er = new Error("server error " + statusCode) } if (er && operation.retry(er)) { client.log.info("retry", "will retry, error on last attempt: " + er) } else { cb(new Error("fetch failed with status code " + statusCode)) } }) }) }) } function unstick(response) { response.resume = function (orig) { return function() { var ret = orig.apply(response, arguments) if (response.socket.encrypted) response.socket.encrypted.read(0) return ret }}(response.resume) } function makeRequest (remote, params, cb) { var parsed = url.parse(remote) this.log.http("fetch", "GET", parsed.href) var headers = params.headers || {} var er = this.authify( params.auth && params.auth.alwaysAuth, parsed, headers, params.auth ) if (er) return cb(er) var opts = this.initialize( parsed, "GET", "application/x-tar, application/vnd.github+json; q=0.1", headers ) // always want to follow redirects for fetch opts.followRedirect = true cb(null, request(opts)) } ������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/get.js�������������000644 �000766 �000024 �00000001213 12455173731 031153� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = get var assert = require("assert") , url = require("url") /* * This is meant to be overridden in specific implementations if you * want 
specialized behavior for metadata (i.e. caching). */ function get (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to get") assert(params && typeof params === "object", "must pass params to get") assert(typeof cb === "function", "must pass callback to get") var parsed = url.parse(uri) assert( parsed.protocol === "http:" || parsed.protocol === "https:", "must have a URL that starts with http: or https:" ) this.request(uri, params, cb) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/initialize.js������000644 �000766 �000024 �00000002300 12455173731 032533� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var crypto = require("crypto") var pkg = require("../package.json") module.exports = initialize function initialize (uri, method, accept, headers) { if (!this.config.sessionToken) { this.config.sessionToken = crypto.randomBytes(8).toString("hex") this.log.verbose("request id", this.config.sessionToken) } var opts = { url : uri, method : method, headers : headers, localAddress : this.config.proxy.localAddress, strictSSL : this.config.ssl.strict, cert : this.config.ssl.certificate, key : this.config.ssl.key, ca : this.config.ssl.ca } // request will not pay attention to the NOPROXY environment variable if a // config value named proxy is passed in, even if it's set to null. 
var proxy if (uri.protocol === "https") { proxy = this.config.proxy.https } else { proxy = this.config.proxy.http } if (typeof proxy === "string") opts.proxy = proxy headers.version = this.version || pkg.version headers.accept = accept if (this.refer) headers.referer = this.refer headers["npm-session"] = this.config.sessionToken headers["user-agent"] = this.config.userAgent return opts } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/publish.js���������000644 �000766 �000024 �00000012212 12455173731 032043� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = publish var url = require("url") , semver = require("semver") , crypto = require("crypto") , Stream = require("stream").Stream , assert = require("assert") , fixer = require("normalize-package-data/lib/fixer.js") , concat = require("concat-stream") function escaped (name) { return name.replace("/", "%2f") } function publish (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to publish") assert(params && typeof params === "object", "must pass params to publish") assert(typeof cb === "function", "must pass callback to publish") var auth = params.auth assert(auth && typeof auth === "object", "must pass auth to publish") if (!(auth.token || (auth.password && auth.username && auth.email))) { var er = new Error("auth required for publishing") er.code = "ENEEDAUTH" return cb(er) } var metadata = params.metadata assert( metadata && typeof metadata === "object", "must pass package metadata to publish" ) try { fixer.fixNameField(metadata, true) } catch (er) { return cb(er) } var version = semver.clean(metadata.version) if (!version) return cb(new Error("invalid semver: " + metadata.version)) metadata.version = version var body = params.body assert(body, "must pass package body to publish") assert(body instanceof Stream, "package body passed to publish must be a stream") var client = this var sink = concat(function (tarbuffer) { putFirst.call(client, uri, metadata, tarbuffer, auth, cb) }) sink.on("error", cb) body.pipe(sink) } function putFirst (registry, data, tarbuffer, auth, cb) { // optimistically try to PUT all in one single atomic thing. // If 409, then GET and merge, try again. // If other error, then fail. 
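// Example (sketch): the publish() call that leads here, matching the assertions
// above -- registry root URI, parsed package.json metadata, a tarball stream,
// and either token or username/password/email auth. Paths and credentials are
// placeholders.
//
//   var fs = require("fs")
//   var metadata = JSON.parse(fs.readFileSync("package.json", "utf8"))
//   client.publish("https://registry.npmjs.org/", {
//     metadata : metadata,
//     body     : fs.createReadStream("some-package-1.0.0.tgz"),
//     auth     : { token : "xxxx" }
//   }, function (er, parsed, json, res) {
//     if (er) return console.error("publish failed:", er.message)
//     console.log("published", metadata.name + "@" + metadata.version)
//   })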
var root = { _id : data.name , name : data.name , description : data.description , "dist-tags" : {} , versions : {} , readme: data.readme || "" } if (!auth.token) { root.maintainers = [{name : auth.username, email : auth.email}] data.maintainers = JSON.parse(JSON.stringify(root.maintainers)) } root.versions[ data.version ] = data var tag = data.tag || this.config.defaultTag root["dist-tags"][tag] = data.version var tbName = data.name + "-" + data.version + ".tgz" , tbURI = data.name + "/-/" + tbName data._id = data.name+"@"+data.version data.dist = data.dist || {} data.dist.shasum = crypto.createHash("sha1").update(tarbuffer).digest("hex") data.dist.tarball = url.resolve(registry, tbURI) .replace(/^https:\/\//, "http://") root._attachments = {} root._attachments[ tbName ] = { "content_type": "application/octet-stream", "data": tarbuffer.toString("base64"), "length": tarbuffer.length } var fixed = url.resolve(registry, escaped(data.name)) var client = this var options = { method : "PUT", body : root, auth : auth } this.request(fixed, options, function (er, parsed, json, res) { var r409 = "must supply latest _rev to update existing package" var r409b = "Document update conflict." var conflict = res && res.statusCode === 409 if (parsed && (parsed.reason === r409 || parsed.reason === r409b)) conflict = true // a 409 is typical here. GET the data and merge in. if (er && !conflict) { client.log.error("publish", "Failed PUT "+(res && res.statusCode)) return cb(er) } if (!er && !conflict) return cb(er, parsed, json, res) // let's see what versions are already published. client.request(fixed+"?write=true", { auth : auth }, function (er, current) { if (er) return cb(er) putNext.call(client, registry, data.version, root, current, auth, cb) }) }) } function putNext (registry, newVersion, root, current, auth, cb) { // already have the tardata on the root object // just merge in existing stuff var curVers = Object.keys(current.versions || {}).map(function (v) { return semver.clean(v, true) }).concat(Object.keys(current.time || {}).map(function(v) { if (semver.valid(v, true)) return semver.clean(v, true) }).filter(function(v) { return v })) if (curVers.indexOf(newVersion) !== -1) { return cb(conflictError(root.name, newVersion)) } current.versions[newVersion] = root.versions[newVersion] current._attachments = current._attachments || {} for (var i in root) { switch (i) { // objects that copy over the new stuffs case "dist-tags": case "versions": case "_attachments": for (var j in root[i]) current[i][j] = root[i][j] break // ignore these case "maintainers": break // copy default: current[i] = root[i] } } var maint = JSON.parse(JSON.stringify(root.maintainers)) root.versions[newVersion].maintainers = maint var uri = url.resolve(registry, escaped(root.name)) var options = { method : "PUT", body : current, auth : auth } this.request(uri, options, cb) } function conflictError (pkgid, version) { var e = new Error("cannot modify pre-existing version") e.code = "EPUBLISHCONFLICT" e.pkgid = pkgid e.version = version return e } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/request.js���������000644 �000766 �000024 
�00000016355 12455173731 032101� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = regRequest // npm: means // 1. https // 2. send authorization // 3. content-type is 'application/json' -- metadata // var assert = require("assert") , url = require("url") , zlib = require("zlib") , Stream = require("stream").Stream var request = require("request") , once = require("once") function regRequest (uri, params, cb_) { assert(typeof uri === "string", "must pass uri to request") assert(params && typeof params === "object", "must pass params to request") assert(typeof cb_ === "function", "must pass callback to request") params.method = params.method || "GET" this.log.verbose("request", "uri", uri) // Since there are multiple places where an error could occur, // don't let the cb be called more than once. var cb = once(cb_) if (uri.match(/^\/?favicon.ico/)) { return cb(new Error("favicon.ico isn't a package, it's a picture.")) } var adduserChange = /\/?-\/user\/org\.couchdb\.user:([^/]+)\/-rev/ , isUserChange = uri.match(adduserChange) , adduserNew = /\/?-\/user\/org\.couchdb\.user:([^/?]+)$/ , isNewUser = uri.match(adduserNew) , alwaysAuth = params.auth && params.auth.alwaysAuth , isDelete = params.method === "DELETE" , isWrite = params.body || isDelete if (isUserChange && !isWrite) { return cb(new Error("trying to change user document without writing(?!)")) } // new users can *not* use auth, because they don't *have* auth yet if (isUserChange) { this.log.verbose("request", "updating existing user; sending authorization") params.authed = true } else if (isNewUser) { this.log.verbose("request", "new user, so can't send auth") params.authed = false } else if (alwaysAuth) { this.log.verbose("request", "always-auth set; sending authorization") params.authed = true } else if (isWrite) { this.log.verbose("request", "sending authorization for write operation") params.authed = true } else { // most of the time we don't want to auth this.log.verbose("request", "no auth needed") params.authed = false } var self = this this.attempt(function (operation) { makeRequest.call(self, uri, params, function (er, parsed, raw, response) { if (!er || (er.message && er.message.match(/^SSL Error/))) { if (er) er.code = "ESSL" return cb(er, parsed, raw, response) } // Only retry on 408, 5xx or no `response`. var statusCode = response && response.statusCode var timeout = statusCode === 408 var serverError = statusCode >= 500 var statusRetry = !statusCode || timeout || serverError if (er && statusRetry && operation.retry(er)) { self.log.info("retry", "will retry, error on last attempt: " + er) return undefined } if (response) { self.log.verbose("headers", response.headers) if (response.headers["npm-notice"]) { self.log.warn("notice", response.headers["npm-notice"]) } } cb.apply(null, arguments) }) }) } function makeRequest (uri, params, cb_) { var cb = once(cb_) var parsed = url.parse(uri) var headers = {} // metadata should be compressed headers["accept-encoding"] = "gzip" var er = this.authify(params.authed, parsed, headers, params.auth) if (er) return cb_(er) var opts = this.initialize( parsed, params.method, "application/json", headers ) opts.followRedirect = (typeof params.follow === "boolean" ? 
params.follow : true) opts.encoding = null // tell request let body be Buffer instance if (params.etag) { this.log.verbose("etag", params.etag) headers[params.method === "GET" ? "if-none-match" : "if-match"] = params.etag } // figure out wth body is if (params.body) { if (Buffer.isBuffer(params.body)) { opts.body = params.body headers["content-type"] = "application/json" headers["content-length"] = params.body.length } else if (typeof params.body === "string") { opts.body = params.body headers["content-type"] = "application/json" headers["content-length"] = Buffer.byteLength(params.body) } else if (params.body instanceof Stream) { headers["content-type"] = "application/octet-stream" if (params.body.size) headers["content-length"] = params.body.size } else { delete params.body._etag opts.json = params.body } } this.log.http("request", params.method, parsed.href || "/") var done = requestDone.call(this, params.method, uri, cb) var req = request(opts, decodeResponseBody(done)) req.on("error", cb) req.on("socket", function (s) { s.on("error", cb) }) if (params.body && (params.body instanceof Stream)) { params.body.pipe(req) } } function decodeResponseBody(cb) { return function (er, response, data) { if (er) return cb(er, response, data) // don't ever re-use connections that had server errors. // those sockets connect to the Bad Place! if (response.socket && response.statusCode > 500) { response.socket.destroy() } if (response.headers["content-encoding"] !== "gzip") { return cb(er, response, data) } zlib.gunzip(data, function (er, buf) { if (er) return cb(er, response, data) cb(null, response, buf) }) } } // cb(er, parsed, raw, response) function requestDone (method, where, cb) { return function (er, response, data) { if (er) return cb(er) var urlObj = url.parse(where) if (urlObj.auth) urlObj.auth = "***" this.log.http(response.statusCode, url.format(urlObj)) if (Buffer.isBuffer(data)) { data = data.toString() } var parsed if (data && typeof data === "string" && response.statusCode !== 304) { try { parsed = JSON.parse(data) } catch (ex) { ex.message += "\n" + data this.log.verbose("bad json", data) this.log.error("registry", "error parsing json") return cb(ex, null, data, response) } } else if (data) { parsed = data data = JSON.stringify(parsed) } // expect data with any error codes if (!data && response.statusCode >= 400) { return cb( response.statusCode + " " + require("http").STATUS_CODES[response.statusCode] , null, data, response ) } er = null if (parsed && response.headers.etag) { parsed._etag = response.headers.etag } // for the search endpoint, the "error" property can be an object if (parsed && parsed.error && typeof parsed.error !== "object" || response.statusCode >= 400) { var w = url.parse(where).pathname.substr(1) var name if (!w.match(/^-/)) { w = w.split("/") name = w[w.indexOf("_rewrite") + 1] } if (!parsed.error) { er = new Error( "Registry returned " + response.statusCode + " for " + method + " on " + where ) } else if (name && parsed.error === "not_found") { er = new Error("404 Not Found: " + name) } else { er = new Error( parsed.error + " " + (parsed.reason || "") + ": " + w ) } if (name) er.pkgid = name er.statusCode = response.statusCode er.code = "E" + er.statusCode } return cb(er, parsed, data, response) }.bind(this) } 
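// Example (sketch): issuing a raw registry request through the helper above.
// The params mirror what regRequest() reads -- method, body, etag, follow and
// auth -- and the callback receives (er, parsed, raw, response). The URI and
// token are placeholders; client construction details are assumed.
var RegClient = require("npm-registry-client")
var client = new RegClient()

client.request("https://registry.npmjs.org/some-package", {
  method : "GET",
  etag   : '"abc123"',          // sent as if-none-match on GET requests
  follow : true,
  auth   : { token : "xxxx", alwaysAuth : false }
}, function (er, parsed, raw, response) {
  if (er) return console.error(er.message)
  console.log(response.statusCode, parsed && parsed._etag)
})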
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/star.js

module.exports = star

var assert = require("assert")

function star (uri, params, cb) {
  assert(typeof uri === "string", "must pass registry URI to star")
  assert(params && typeof params === "object", "must pass params to star")
  assert(typeof cb === "function", "must pass callback to star")

  var starred = params.starred ? true : false

  var auth = params.auth
  assert(auth && typeof auth === "object", "must pass auth to star")
  if (auth.token) {
    return cb(new Error("This operation is unsupported for token-based auth"))
  }
  else if (!(auth.username && auth.password)) {
    return cb(new Error("Must be logged in to star/unstar packages"))
  }

  var client = this
  this.request(uri+"?write=true", { auth : auth }, function (er, fullData) {
    if (er) return cb(er)

    fullData = {
      _id   : fullData._id,
      _rev  : fullData._rev,
      users : fullData.users || {}
    }

    if (starred) {
      client.log.info("starring", fullData._id)
      fullData.users[auth.username] = true
      client.log.verbose("starring", fullData)
    } else {
      delete fullData.users[auth.username]
      client.log.info("unstarring", fullData._id)
      client.log.verbose("unstarring", fullData)
    }

    var options = {
      method : "PUT",
      body   : fullData,
      auth   : auth
    }
    return client.request(uri, options, cb)
  })
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/stars.js

module.exports = stars

var assert = require("assert")
var url = require("url")

function stars (uri, params, cb) {
  assert(typeof uri === "string", "must pass registry URI to stars")
  assert(params && typeof params === "object", "must pass params to stars")
  assert(typeof cb === "function", "must pass callback to stars")

  var auth = params.auth
  var name = params.username || (auth && auth.username)

  if (!name) return cb(new Error("must pass either username or auth to stars"))

  var encoded = encodeURIComponent(name)
  var path = "-/_view/starredByUser?key=\""+encoded+"\""

  this.request(url.resolve(uri, path), { auth : auth }, cb)
}
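`star` above fetches the write-enabled copy of the package document, reduces it to `_id`, `_rev` and `users`, toggles the current user's entry, and PUTs the result back. A small sketch of just that document edit; the helper name and the sample document are hypothetical, not part of the module:

```javascript
// Hypothetical helper showing the document edit star() performs before the PUT.
function toggleStar (fullData, username, starred) {
  var doc = {
    _id   : fullData._id,
    _rev  : fullData._rev,
    users : fullData.users || {}
  }
  if (starred) doc.users[username] = true   // star: add the user
  else delete doc.users[username]           // unstar: drop the user
  return doc
}

// e.g. toggleStar({ _id: "pkg", _rev: "3-x", users: { alice: true } }, "bob", true)
//      -> { _id: "pkg", _rev: "3-x", users: { alice: true, bob: true } }
```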
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/tag.js�������������000644 �000766 �000024 �00000001245 12455173731 031154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = tag var assert = require("assert") function tag (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to tag") assert(params && typeof params === "object", "must pass params to tag") assert(typeof cb === "function", "must pass callback to tag") assert(typeof params.version === "string", "must pass version to tag") assert(typeof params.tag === "string", "must pass tag name to tag") assert(params.auth && typeof params.auth === "object", "must pass auth to tag") var options = { method : "PUT", body : JSON.stringify(params.version), auth : params.auth } this.request(uri+"/"+params.tag, options, cb) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/unpublish.js�������000644 �000766 �000024 �00000007311 12455173731 032412� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = unpublish // fetch the data // modify to remove the version in question // If no versions remaining, then DELETE // else, PUT the modified data // delete the tarball var semver = require("semver") , url = require("url") , chain = require("slide").chain , assert = require("assert") function unpublish (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to unpublish") assert(params && typeof params === "object", "must pass params to unpublish") assert(typeof cb === "function", "must pass callback to unpublish") var ver = params.version var auth = params.auth assert(auth && typeof auth === "object", "must pass auth to unpublish") var options = { timeout : -1, follow : false, auth : auth } this.get(uri + "?write=true", options, function (er, data) { if (er) { this.log.info("unpublish", uri+" not published") return cb() } // remove all if no version specified if (!ver) { this.log.info("unpublish", "No version specified, removing all") return this.request(uri+"/-rev/"+data._rev, { method : "DELETE", auth : auth }, cb) } var versions = data.versions || {} , versionPublic = versions.hasOwnProperty(ver) var dist if 
(!versionPublic) { this.log.info("unpublish", uri+"@"+ver+" not published") } else { dist = versions[ver].dist this.log.verbose("unpublish", "removing attachments", dist) } delete versions[ver] // if it was the only version, then delete the whole package. if (!Object.keys(versions).length) { this.log.info("unpublish", "No versions remain, removing entire package") return this.request(uri+"/-rev/"+data._rev, { method : "DELETE", auth : auth }, cb) } if (!versionPublic) return cb() var latestVer = data["dist-tags"].latest for (var tag in data["dist-tags"]) { if (data["dist-tags"][tag] === ver) delete data["dist-tags"][tag] } if (latestVer === ver) { data["dist-tags"].latest = Object.getOwnPropertyNames(versions).sort(semver.compareLoose).pop() } var rev = data._rev delete data._revisions delete data._attachments var cb_ = detacher.call(this, uri, data, dist, auth, cb) this.request(uri+"/-rev/"+rev, { method : "PUT", body : data, auth : auth }, function (er) { if (er) { this.log.error("unpublish", "Failed to update data") } cb_(er) }.bind(this)) }.bind(this)) } function detacher (uri, data, dist, credentials, cb) { return function (er) { if (er) return cb(er) this.get(escape(uri, data.name), { auth : credentials }, function (er, data) { if (er) return cb(er) var tb = url.parse(dist.tarball) detach.call(this, uri, data, tb.pathname, data._rev, credentials, function (er) { if (er || !dist.bin) return cb(er) chain(Object.keys(dist.bin).map(function (bt) { return function (cb) { var d = dist.bin[bt] detach.call(this, uri, data, url.parse(d.tarball).pathname, null, credentials, cb) }.bind(this) }, this), cb) }.bind(this)) }.bind(this)) }.bind(this) } function detach (uri, data, path, rev, credentials, cb) { if (rev) { path += "/-rev/" + rev this.log.info("detach", path) return this.request(url.resolve(uri, path), { method : "DELETE", auth : credentials }, cb) } this.get(escape(uri, data.name), { auth : credentials }, function (er, data) { rev = data._rev if (!rev) return cb(new Error( "No _rev found in "+data._id)) detach.call(this, data, path, rev, cb) }.bind(this)) } function escape (base, name) { var escaped = name.replace(/\//, "%2f") return url.resolve(base, escaped) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-registry-client/lib/whoami.js����������000644 �000766 �000024 �00000001106 12455173731 031661� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = whoami var url = require("url") , assert = require("assert") function whoami (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to whoami") assert(params && typeof params === "object", "must pass params to whoami") assert(typeof cb === "function", "must pass callback to whoami") var auth = params.auth assert(auth && typeof auth === "object", "must pass auth to whoami") this.request(url.resolve(uri, "whoami"), { auth : auth }, function (er, userdata) { if (er) return cb(er) 
cb(null, userdata.username) }) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-package-arg/LICENSE��������������������000644 �000766 �000024 �00000001354 12455173731 027321� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-package-arg/npa.js���������������������000644 �000766 �000024 �00000011031 12455173731 027421� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var url = require("url") var assert = require("assert") var util = require("util") var semver = require("semver") var path = require("path") module.exports = npa var isWindows = process.platform === "win32" || global.FAKE_WINDOWS var slashRe = isWindows ? /\\|\// : /\// var parseName = /^(?:@([^\/]+?)\/)?([^\/]+?)$/ var nameAt = /^(@([^\/]+?)\/)?([^\/]+?)@/ var debug = util.debuglog ? util.debuglog("npa") : /\bnpa\b/i.test(process.env.NODE_DEBUG || "") ? function () { console.error("NPA: " + util.format.apply(util, arguments).split("\n").join("\nNPA: ")) } : function () {} function validName (name) { if (!name) { debug("not a name %j", name) return false } var n = name.trim() if (!n || n.charAt(0) === "." 
|| !n.match(/^[a-zA-Z0-9]/) || n.match(/[\/\(\)&\?#\|<>@:%\s\\\*'"!~`]/) || n.toLowerCase() === "node_modules" || n !== encodeURIComponent(n) || n.toLowerCase() === "favicon.ico") { debug("not a valid name %j", name) return false } return n } function npa (arg) { assert.equal(typeof arg, "string") arg = arg.trim() var res = new Result res.raw = arg res.scope = null // See if it's something like foo@... var nameparse = arg.match(nameAt) debug("nameparse", nameparse) if (nameparse && validName(nameparse[3]) && (!nameparse[2] || validName(nameparse[2]))) { res.name = (nameparse[1] || "") + nameparse[3] if (nameparse[2]) res.scope = "@" + nameparse[2] arg = arg.substr(nameparse[0].length) } else { res.name = null } res.rawSpec = arg res.spec = arg var urlparse = url.parse(arg) debug("urlparse", urlparse) // windows paths look like urls // don't be fooled! if (isWindows && urlparse && urlparse.protocol && urlparse.protocol.match(/^[a-zA-Z]:$/)) { debug("windows url-ish local path", urlparse) urlparse = {} } if (urlparse.protocol) { return parseUrl(res, arg, urlparse) } // parse git stuff // parse tag/range/local/remote if (maybeGitHubShorthand(arg)) { res.type = "github" res.spec = arg return res } // at this point, it's not a url, and not github // If it's a valid name, and doesn't already have a name, then assume // $name@"" range // // if it's got / chars in it, then assume that it's local. if (res.name) { var version = semver.valid(arg, true) var range = semver.validRange(arg, true) // foo@... if (version) { res.spec = version res.type = "version" } else if (range) { res.spec = range res.type = "range" } else if (slashRe.test(arg)) { parseLocal(res, arg) } else { res.type = "tag" res.spec = arg } } else { var p = arg.match(parseName) if (p && validName(p[2]) && (!p[1] || validName(p[1]))) { res.type = "range" res.spec = "*" res.rawSpec = "" res.name = arg if (p[1]) res.scope = "@" + p[1] } else { parseLocal(res, arg) } } return res } function parseLocal (res, arg) { // turns out nearly every character is allowed in fs paths if (/\0/.test(arg)) { throw new Error("Invalid Path: " + JSON.stringify(arg)) } res.type = "local" res.spec = path.resolve(arg) } function maybeGitHubShorthand (arg) { // Note: This does not fully test the git ref format. // See https://www.kernel.org/pub/software/scm/git/docs/git-check-ref-format.html // // The only way to do this properly would be to shell out to // git-check-ref-format, and as this is a fast sync function, // we don't want to do that. Just let git fail if it turns // out that the commit-ish is invalid. // GH usernames cannot start with . 
or - return /^[^@%\/\s\.-][^@%\/\s]*\/[^@\s\/%]+(?:#.*)?$/.test(arg) } function parseUrl (res, arg, urlparse) { // check the protocol, and then see if it's git or not switch (urlparse.protocol) { case "git:": case "git+http:": case "git+https:": case "git+rsync:": case "git+ftp:": case "git+ssh:": case "git+file:": res.type = 'git' res.spec = arg.replace(/^git\+/, '') break case 'http:': case 'https:': res.type = 'remote' res.spec = arg break case 'file:': res.type = 'local' res.spec = urlparse.pathname break; default: throw new Error('Unsupported URL Type: ' + arg) break } return res } function Result () { if (!(this instanceof Result)) return new Result } Result.prototype.name = null Result.prototype.type = null Result.prototype.spec = null Result.prototype.raw = null �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-package-arg/package.json���������������000644 �000766 �000024 �00000005326 12455173731 030605� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-package-arg", "version": "2.1.3", "description": "Parse the things that can be arguments to `npm install`", "main": "npa.js", "directories": { "test": "test" }, "dependencies": { "semver": "4" }, "devDependencies": { "tap": "^0.4.9" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/npm-package-arg" }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/npm-package-arg/issues" }, "homepage": "https://github.com/npm/npm-package-arg", "readme": "# npm-package-arg\n\nParse the things that can be arguments to `npm install`\n\nTakes an argument like `foo@1.2`, or `foo@user/foo`, or\n`http://x.com/foo.tgz`, or `git+https://github.com/user/foo`, and\nfigures out what type of thing it is.\n\n## USAGE\n\n```javascript\nvar assert = require(\"assert\")\nvar npa = require(\"npm-package-arg\")\n\n// Pass in the descriptor, and it'll return an object\nvar parsed = npa(\"foo@1.2\")\n\n// Returns an object like:\n// {\n// name: \"foo\", // The bit in front of the @\n// type: \"range\", // the type of descriptor this is\n// spec: \"1.2\" // the specifier for this descriptor\n// }\n\n// Completely unreasonable invalid garbage throws an error\n// Make sure you wrap this in a try/catch if you have not\n// already sanitized the inputs!\nassert.throws(function() {\n npa(\"this is not \\0 a valid package name or url\")\n})\n```\n\nFor more examples, see the test file.\n\n## Result Objects\n\nThe objects that are returned by npm-package-arg contain the following\nfields:\n\n* `name` - If known, the `name` field expected in the resulting pkg.\n* `type` - One of the following strings:\n * `git` - A git repo\n * `github` - A github shorthand, like `user/project`\n * `tag` - A tagged version, like `\"foo@latest\"`\n * `version` - A specific version number, like `\"foo@1.2.3\"`\n * `range` - A version range, like `\"foo@2.x\"`\n * `local` - A local file or folder path\n * `remote` - An http url (presumably to a tgz)\n* `spec` - The \"thing\". URL, the range, git repo, etc.\n* `raw` - The original un-modified string that was provided.\n* `rawSpec` - The part after the `name@...`, as it was originally\n provided.\n* `scope` - If a name is something like `@org/module` then the `scope`\n field will be set to `org`. If it doesn't have a scoped name, then\n scope is `null`.\n", "readmeFilename": "README.md", "gitHead": "9aaabc2aae746371a05f54cdb57a5f9ada003d8f", "_id": "npm-package-arg@2.1.3", "_shasum": "dfba34bd82dd327c10cb43a65c8db6ef0b812bf7", "_from": "npm-package-arg@~2.1.3" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-package-arg/README.md������������������000644 �000766 �000024 �00000003355 12455173731 027576� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# npm-package-arg Parse the things that can be arguments to `npm install` Takes an argument like `foo@1.2`, or `foo@user/foo`, or `http://x.com/foo.tgz`, or `git+https://github.com/user/foo`, and figures out what type of thing it is. 
## USAGE ```javascript var assert = require("assert") var npa = require("npm-package-arg") // Pass in the descriptor, and it'll return an object var parsed = npa("foo@1.2") // Returns an object like: // { // name: "foo", // The bit in front of the @ // type: "range", // the type of descriptor this is // spec: "1.2" // the specifier for this descriptor // } // Completely unreasonable invalid garbage throws an error // Make sure you wrap this in a try/catch if you have not // already sanitized the inputs! assert.throws(function() { npa("this is not \0 a valid package name or url") }) ``` For more examples, see the test file. ## Result Objects The objects that are returned by npm-package-arg contain the following fields: * `name` - If known, the `name` field expected in the resulting pkg. * `type` - One of the following strings: * `git` - A git repo * `github` - A github shorthand, like `user/project` * `tag` - A tagged version, like `"foo@latest"` * `version` - A specific version number, like `"foo@1.2.3"` * `range` - A version range, like `"foo@2.x"` * `local` - A local file or folder path * `remote` - An http url (presumably to a tgz) * `spec` - The "thing". URL, the range, git repo, etc. * `raw` - The original un-modified string that was provided. * `rawSpec` - The part after the `name@...`, as it was originally provided. * `scope` - If a name is something like `@org/module` then the `scope` field will be set to `org`. If it doesn't have a scoped name, then scope is `null`. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-install-checks/index.js����������������000644 �000766 �000024 �00000010022 12455173731 030513� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require("fs") var path = require("path") var log = require("npmlog") var semver = require("semver") exports.checkEngine = checkEngine function checkEngine (target, npmVer, nodeVer, force, strict, cb) { var nodev = force ? 
null : nodeVer , strict = strict || target.engineStrict , eng = target.engines if (!eng) return cb() if (nodev && eng.node && !semver.satisfies(nodev, eng.node) || eng.npm && !semver.satisfies(npmVer, eng.npm)) { if (strict) { var er = new Error("Unsupported") er.code = "ENOTSUP" er.required = eng er.pkgid = target._id return cb(er) } else { log.warn( "engine", "%s: wanted: %j (current: %j)" , target._id, eng, {node: nodev, npm: npmVer} ) } } return cb() } exports.checkPlatform = checkPlatform function checkPlatform (target, force, cb) { var platform = process.platform , arch = process.arch , osOk = true , cpuOk = true if (force) { return cb() } if (target.os) { osOk = checkList(platform, target.os) } if (target.cpu) { cpuOk = checkList(arch, target.cpu) } if (!osOk || !cpuOk) { var er = new Error("Unsupported") er.code = "EBADPLATFORM" er.os = target.os || ['any'] er.cpu = target.cpu || ['any'] er.pkgid = target._id return cb(er) } return cb() } function checkList (value, list) { var tmp , match = false , blc = 0 if (typeof list === "string") { list = [list] } if (list.length === 1 && list[0] === "any") { return true } for (var i = 0; i < list.length; ++i) { tmp = list[i] if (tmp[0] === '!') { tmp = tmp.slice(1) if (tmp === value) { return false } ++blc } else { match = match || tmp === value } } return match || blc === list.length } exports.checkCycle = checkCycle function checkCycle (target, ancestors, cb) { // there are some very rare and pathological edge-cases where // a cycle can cause npm to try to install a never-ending tree // of stuff. // Simplest: // // A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ... // // Solution: Simply flat-out refuse to install any name@version // that is already in the prototype tree of the ancestors object. // A more correct, but more complex, solution would be to symlink // the deeper thing into the new location. // Will do that if anyone whines about this irl. // // Note: `npm install foo` inside of the `foo` package will abort // earlier if `--force` is not set. However, if it IS set, then // we need to still fail here, but just skip the first level. Of // course, it'll still fail eventually if it's a true cycle, and // leave things in an undefined state, but that's what is to be // expected when `--force` is used. That is why getPrototypeOf // is used *twice* here: to skip the first level of repetition. var p = Object.getPrototypeOf(Object.getPrototypeOf(ancestors)) , name = target.name , version = target.version while (p && p !== Object.prototype && p[name] !== version) { p = Object.getPrototypeOf(p) } if (p[name] !== version) return cb() var er = new Error("Unresolvable cycle detected") var tree = [target._id, JSON.parse(JSON.stringify(ancestors))] , t = Object.getPrototypeOf(ancestors) while (t && t !== Object.prototype) { if (t === p) t.THIS_IS_P = true tree.push(JSON.parse(JSON.stringify(t))) t = Object.getPrototypeOf(t) } log.verbose("unresolvable dependency tree", tree) er.pkgid = target._id er.code = "ECYCLE" return cb(er) } exports.checkGit = checkGit function checkGit (folder, cb) { // if it's a git repo then don't touch it! 
fs.lstat(folder, function (er, s) { if (er || !s.isDirectory()) return cb() else checkGit_(folder, cb) }) } function checkGit_ (folder, cb) { fs.stat(path.resolve(folder, ".git"), function (er, s) { if (!er && s.isDirectory()) { var e = new Error("Appears to be a git repo or submodule.") e.path = folder e.code = "EISGIT" return cb(e) } cb() }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-install-checks/LICENSE�����������������000644 �000766 �000024 �00000002465 12455173731 030067� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Robert Kowalski and Isaac Z. Schlueter ("Authors") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHORS AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
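The exports shown in npm-install-checks' index.js take a package.json-style object plus the running npm/node versions. A minimal usage sketch follows; the `target` values are made up for illustration.

```javascript
var checks = require("npm-install-checks")

// A stripped-down package.json-style object; the values here are sample data.
var target = {
  _id     : "example-pkg@1.0.0",
  engines : { node: ">=0.10.0", npm: "2" },
  os      : ["darwin", "linux"],
  cpu     : ["x64"]
}

// strict=true turns an engine mismatch into an ENOTSUP error instead of a warning.
checks.checkEngine(target, "2.1.18", "1.0.2", false, true, function (er) {
  if (er) return console.error(er.code, er.required)
  checks.checkPlatform(target, false, function (er) {
    if (er) return console.error(er.code, er.os, er.cpu)
    console.log("engine and platform checks passed")
  })
})
```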
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-install-checks/package.json������������000644 �000766 �000024 �00000002766 12455173731 031354� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-install-checks", "version": "1.0.4", "description": "checks that npm runs during the installation of a module", "main": "index.js", "dependencies": { "npmlog": "0.1", "semver": "^2.3.0 || 3.x || 4" }, "devDependencies": { "tap": "~0.4.8", "rimraf": "~2.2.5", "mkdirp": "~0.3.5" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-install-checks.git" }, "homepage": "https://github.com/npm/npm-install-checks", "keywords": [ "npm,", "install" ], "author": { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, "license": "BSD-2-Clause", "bugs": { "url": "https://github.com/npm/npm-install-checks/issues" }, "gitHead": "05944f95860b0ac3769667551c4b7aa3d3fcdc32", "_id": "npm-install-checks@1.0.4", "_shasum": "9757c6f9d4d493c2489465da6d07a8ed416d44c8", "_from": "npm-install-checks@>=1.0.2-0 <1.1.0-0", "_npmVersion": "2.0.0-beta.3", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "robertkowalski", "email": "rok@kowalski.gd" }, { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "9757c6f9d4d493c2489465da6d07a8ed416d44c8", "tarball": "http://registry.npmjs.org/npm-install-checks/-/npm-install-checks-1.0.4.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/npm-install-checks/-/npm-install-checks-1.0.4.tgz" } ����������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-install-checks/README.md���������������000644 �000766 �000024 �00000001023 12455173731 030326� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# npm-install-checks A package that contains checks that npm runs during the installation. ## API ### .checkEngine(target, npmVer, nodeVer, force, strict, cb) Check if node/npm version is supported by the package. Error type: `ENOTSUP` ### .checkPlatform(target, force, cb) Check if OS/Arch is supported by the package. Error type: `EBADPLATFORM` ### .checkCycle(target, ancestors, cb) Check for cyclic dependencies. Error type: `ECYCLE` ### .checkGit(folder, cb) Check if a folder is a .git folder. 
Error type: `EISGIT`

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/index.js

var url = require('url');
var path = require('path');

module.exports = cf;

function cf(root, u) {
  if (!u) return cf.bind(null, root);

  u = url.parse(u);
  var h = u.host.replace(/:/g, '_');

  // Strip off any /-rev/... or ?rev=... bits
  var revre = /(\?rev=|\?.*?&rev=|\/-rev\/).*$/
  var parts = u.path.replace(revre, '').split('/').slice(1)
  var p = [root, h].concat(parts.map(function(part) {
    return encodeURIComponent(part).replace(/%/g, '_');
  }));

  return path.join.apply(path, p);
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/LICENSE

The ISC License

Copyright (c) npm, Inc. and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
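A couple of worked examples of the path mapping implemented in the index.js above; the registry URLs are arbitrary samples.

```javascript
var cf = require("npm-cache-filename")

// Ports in the host become "_", and any /-rev/... or ?rev=... suffix is stripped,
// so different revisions of the same document map to one cache path.
console.log(cf("/tmp/cache", "https://registry.npmjs.org:443/foo/-rev/3-abc"))
// -> /tmp/cache/registry.npmjs.org_443/foo

console.log(cf("/tmp/cache", "https://registry.npmjs.org/foo?rev=3-abc"))
// -> /tmp/cache/registry.npmjs.org/foo

// Calling with only a root returns a function bound to that root:
var getFile = cf("/tmp/cache")
console.log(getFile("https://registry.npmjs.org/foo"))
// -> /tmp/cache/registry.npmjs.org/foo
```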
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/package.json������������000644 �000766 �000024 �00000002740 12455173731 031261� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-cache-filename", "version": "1.0.1", "description": "Given a cache folder and url, return the appropriate cache folder.", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "^0.4.10" }, "scripts": { "test": "tap test.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-cache-filename" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/npm-cache-filename/issues" }, "homepage": "https://github.com/npm/npm-cache-filename", "readme": "# npm-cache-filename\n\nGiven a cache folder and url, return the appropriate cache folder.\n\n## USAGE\n\n```javascript\nvar cf = require('npm-cache-filename');\nconsole.log(cf('/tmp/cache', 'https://registry.npmjs.org:1234/foo/bar'));\n// outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar\n```\n\nAs a bonus, you can also bind it to a specific root path:\n\n```javascript\nvar cf = require('npm-cache-filename');\nvar getFile = cf('/tmp/cache');\n\nconsole.log(getFile('https://registry.npmjs.org:1234/foo/bar'));\n// outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar\n```\n", "readmeFilename": "README.md", "_id": "npm-cache-filename@1.0.1", "_shasum": "9b640f0c1a5ba1145659685372a9ff71f70c4323", "_from": "npm-cache-filename@latest", "_resolved": "https://registry.npmjs.org/npm-cache-filename/-/npm-cache-filename-1.0.1.tgz" } ��������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/README.md���������������000644 �000766 �000024 �00000001056 12455173731 030251� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# npm-cache-filename Given a cache folder and url, return the appropriate cache folder. 
## USAGE ```javascript var cf = require('npm-cache-filename'); console.log(cf('/tmp/cache', 'https://registry.npmjs.org:1234/foo/bar')); // outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar ``` As a bonus, you can also bind it to a specific root path: ```javascript var cf = require('npm-cache-filename'); var getFile = cf('/tmp/cache'); console.log(getFile('https://registry.npmjs.org:1234/foo/bar')); // outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar ``` ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/npm-cache-filename/test.js�����������������000644 �000766 �000024 �00000001372 12455173731 030310� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var test = require('tap').test;; test('it does the thing it says it does', function(t) { var cf = require('./');; t.equal(cf('/tmp/cache', 'https://foo:134/xyz?adf=foo:bar/baz'), '/tmp/cache/foo_134/xyz_3Fadf_3Dfoo_3Abar/baz');; var getFile = cf('/tmp/cache');; t.equal(getFile('https://foo:134/xyz?adf=foo:bar/baz'), '/tmp/cache/foo_134/xyz_3Fadf_3Dfoo_3Abar/baz');; t.equal(cf("/tmp", "https://foo:134/xyz/-rev/baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz/?rev=baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz/?foo&rev=baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz-rev/baz"), '/tmp/foo_134/xyz-rev/baz') t.end(); });; ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/.npmignore����������000644 �000766 �000024 �00000000016 12455173731 031653� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������/node_modules/������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/.travis.yml���������000644 �000766 �000024 �00000000046 
12455173731 031770� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.10" ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/AUTHORS�������������000644 �000766 �000024 �00000000227 12455173731 030730� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Names sorted by how much code was originally theirs. Isaac Z. Schlueter <i@izs.me> Meryn Stol <merynstol@gmail.com> Robert Kowalski <rok@kowalski.gd>�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/����������������000755 �000766 �000024 �00000000000 12456115117 030420� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/LICENSE�������������000644 �000766 �000024 �00000002563 12455173731 030672� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������This package contains code originally written by Isaac Z. Schlueter. Used with permission. Copyright (c) Meryn Stol ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. 
Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ���������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/package.json��������000644 �000766 �000024 �00000003562 12455173731 032153� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "normalize-package-data", "version": "1.0.3", "author": { "name": "Meryn Stol", "email": "merynstol@gmail.com" }, "description": "Normalizes data that can be found in package.json files.", "repository": { "type": "git", "url": "git://github.com/meryn/normalize-package-data.git" }, "main": "lib/normalize.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "github-url-from-git": "^1.3.0", "github-url-from-username-repo": "^1.0.0", "semver": "2 || 3 || 4" }, "devDependencies": { "tap": "~0.2.5", "underscore": "~1.4.4", "async": "~0.9.0" }, "contributors": [ { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me" }, { "name": "Meryn Stol", "email": "merynstol@gmail.com" }, { "name": "Robert Kowalski", "email": "rok@kowalski.gd" } ], "gitHead": "8c30091c83b1a41e113757148c4543ef61ff863d", "bugs": { "url": "https://github.com/meryn/normalize-package-data/issues" }, "homepage": "https://github.com/meryn/normalize-package-data", "_id": "normalize-package-data@1.0.3", "_shasum": "8be955b8907af975f1a4584ea8bb9b41492312f5", "_from": "normalize-package-data@>=1.0.3 <1.1.0", "_npmVersion": "2.1.0", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "meryn", "email": "merynstol@gmail.com" }, { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "dist": { "shasum": "8be955b8907af975f1a4584ea8bb9b41492312f5", "tarball": "http://registry.npmjs.org/normalize-package-data/-/normalize-package-data-1.0.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/normalize-package-data/-/normalize-package-data-1.0.3.tgz" } ����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/README.md�����������000644 �000766 �000024 �00000014400 12455173731 031135� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# normalize-package-data [![Build Status](https://travis-ci.org/meryn/normalize-package-data.png?branch=master)](https://travis-ci.org/meryn/normalize-package-data) normalize-package data exports a function that normalizes package metadata. This data is typically found in a package.json file, but in principle could come from any source - for example the npm registry. normalize-package-data is used by [read-package-json](https://npmjs.org/package/read-package-json) to normalize the data it reads from a package.json file. In turn, read-package-json is used by [npm](https://npmjs.org/package/npm) and various npm-related tools. ## Installation ``` npm install normalize-package-data ``` ## Usage Basic usage is really simple. You call the function that normalize-package-data exports. Let's call it `normalizeData`. ```javascript normalizeData = require('normalize-package-data') packageData = fs.readFileSync("package.json") normalizeData(packageData) // packageData is now normalized ``` #### Strict mode You may activate strict validation by passing true as the second argument. ```javascript normalizeData = require('normalize-package-data') packageData = fs.readFileSync("package.json") warnFn = function(msg) { console.error(msg) } normalizeData(packageData, true) // packageData is now normalized ``` If strict mode is activated, only Semver 2.0 version strings are accepted. Otherwise, Semver 1.0 strings are accepted as well. Packages must have a name, and the name field must not have contain leading or trailing whitespace. #### Warnings Optionally, you may pass a "warning" function. It gets called whenever the `normalizeData` function encounters something that doesn't look right. It indicates less than perfect input data. 
```javascript
normalizeData = require('normalize-package-data')
packageData = fs.readFileSync("package.json")
warnFn = function(msg) { console.error(msg) }
normalizeData(packageData, warnFn)
// packageData is now normalized. Any number of warnings may have been logged.
```

You may combine strict validation with warnings by passing `true` as the second argument, and `warnFn` as third.

When the `private` field is set to `true`, warnings will be suppressed.

### Potential exceptions

If the supplied data has an invalid name or version field, `normalizeData` will throw an error. Depending on where you call `normalizeData`, you may want to catch these errors so you can pass them to a callback.

## What normalization (currently) entails

* The value of `name` field gets trimmed (unless in strict mode).
* The value of the `version` field gets cleaned by `semver.clean`. See [documentation for the semver module](https://github.com/isaacs/node-semver).
* If `name` and/or `version` fields are missing, they are set to empty strings.
* If `files` field is not an array, it will be removed.
* If `bin` field is a string, then `bin` field will become an object with `name` set to the value of the `name` field, and `bin` set to the original string value.
* If `man` field is a string, it will become an array with the original string as its sole member.
* If `keywords` field is a string, it is considered to be a list of keywords separated by one or more white-space characters. It gets converted to an array by splitting on `\s+`.
* All people fields (`author`, `maintainers`, `contributors`) get converted into objects with name, email and url properties.
* If `bundledDependencies` field (a typo) exists and `bundleDependencies` field does not, `bundledDependencies` will get renamed to `bundleDependencies`.
* If the value of any of the dependencies fields (`dependencies`, `devDependencies`, `optionalDependencies`) is a string, it gets converted into an object with familiar `name=>value` pairs.
* The values in `optionalDependencies` get added to `dependencies`. The `optionalDependencies` array is left untouched.
* If `description` field does not exist, but `readme` field does, then (more or less) the first paragraph of text that's found in the readme is taken as value for `description`.
* If `repository` field is a string, it will become an object with `url` set to the original string value, and `type` set to `"git"`.
* If `repository.url` is not a valid url, but in the style of "[owner-name]/[repo-name]", `repository.url` will be set to git://github.com/[owner-name]/[repo-name]
* If `bugs` field is a string, the value of `bugs` field is changed into an object with `url` set to the original string value.
* If `bugs` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `bugs` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/issues . If the repository field points to a GitHub Gist repo url, the associated http url is chosen.
* If `bugs` field is an object, the resulting value only has email and url properties. If email and url properties are not strings, they are ignored. If no valid values for either email or url is found, bugs field will be removed.
* If `homepage` field is not a string, it will be removed.
* If the url in the `homepage` field does not specify a protocol, then http is assumed. For example, `myproject.org` will be changed to `http://myproject.org`.
* If `homepage` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `homepage` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/ . If the repository field points to a GitHub Gist repo url, the associated http url is chosen.

### Rules for name field

If `name` field is given, the value of the name field must be a string. The string may not:

* start with a period.
* contain the following characters: `/@\s+%`
* contain any characters that would need to be encoded for use in urls.
* resemble the word `node_modules` or `favicon.ico` (case doesn't matter).

### Rules for version field

If `version` field is given, the value of the version field must be a valid *semver* string, as determined by the `semver.valid` method. See [documentation for the semver module](https://github.com/isaacs/node-semver).

## Credits

This package contains code based on read-package-json written by Isaac Z. Schlueter. Used with permission.

## License

normalize-package-data is released under the [BSD 2-Clause License](http://opensource.org/licenses/BSD-2-Clause).

Copyright (c) 2013 Meryn Stol

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/core_module_names.json

[
  "http",
  "events",
  "util",
  "domain",
  "cluster",
  "buffer",
  "stream",
  "crypto",
  "tls",
  "fs",
  "string_decoder",
  "path",
  "net",
  "dgram",
  "dns",
  "https",
  "url",
  "punycode",
  "readline",
  "repl",
  "vm",
  "child_process",
  "assert",
  "zlib",
  "tty",
  "os",
  "querystring"
]

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/extract_description.js

module.exports = extractDescription

// Extracts description from contents of a readme file in markdown format
function extractDescription (d) {
  if (!d) return;
  if (d === "ERROR: No README data found!") return;
  // the first block of text before the first heading
  // that isn't the first line heading
  d = d.trim().split('\n')
  for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++);
  var l = d.length
  for (var e = s + 1; e < l && d[e].trim(); e ++);
  return d.slice(s, e).join(' ').trim()
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/fixer.js
�000024 �00000026054 12455173731 032107� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var semver = require("semver") var parseGitHubURL = require("github-url-from-git") var depTypes = ["dependencies","devDependencies","optionalDependencies"] var extractDescription = require("./extract_description") var url = require("url") var typos = require("./typos") var coreModuleNames = require("./core_module_names") var githubUserRepo = require("github-url-from-username-repo") var fixer = module.exports = { // default warning function warn: function() {}, fixRepositoryField: function(data) { if (data.repositories) { this.warn("repositories"); data.repository = data.repositories[0] } if (!data.repository) return this.warn("missingRepository") if (typeof data.repository === "string") { data.repository = { type: "git", url: data.repository } } var r = data.repository.url || "" if (r) { var ghurl = parseGitHubURL(r) if (ghurl) { r = ghurl.replace(/^https?:\/\//, 'git://') } else if (githubUserRepo(r)) { // repo has 'user/reponame' filled in as repo data.repository.url = githubUserRepo(r) } } if (r.match(/github.com\/[^\/]+\/[^\/]+\.git\.git$/)) { this.warn("brokenGitUrl", r) } } , fixTypos: function(data) { Object.keys(typos.topLevel).forEach(function (d) { if (data.hasOwnProperty(d)) { this.warn("typo", d, typos.topLevel[d]) } }, this) } , fixScriptsField: function(data) { if (!data.scripts) return if (typeof data.scripts !== "object") { this.warn("nonObjectScripts") delete data.scripts } Object.keys(data.scripts).forEach(function (k) { if (typeof data.scripts[k] !== "string") { this.warn("nonStringScript") delete data.scripts[k] } else if (typos.script[k]) { this.warn("typo", k, typos.script[k], "scripts") } }, this) } , fixFilesField: function(data) { var files = data.files if (files && !Array.isArray(files)) { this.warn("nonArrayFiles") delete data.files } else if (data.files) { data.files = data.files.filter(function(file) { if (!file || typeof file !== "string") { this.warn("invalidFilename", file) return false } else { return true } }, this) } } , fixBinField: function(data) { if (!data.bin) return; if (typeof data.bin === "string") { var b = {} b[data.name] = data.bin data.bin = b } } , fixManField: function(data) { if (!data.man) return; if (typeof data.man === "string") { data.man = [ data.man ] } } , fixBundleDependenciesField: function(data) { var bdd = "bundledDependencies" var bd = "bundleDependencies" if (data[bdd] && !data[bd]) { data[bd] = data[bdd] delete data[bdd] } if (data[bd] && !Array.isArray(data[bd])) { this.warn("nonArrayBundleDependencies") delete data[bd] } else if (data[bd]) { data[bd] = data[bd].filter(function(bd) { if (!bd || typeof bd !== 'string') { this.warn("nonStringBundleDependency", bd) return false } else { if (!data.dependencies) { data.dependencies = {} } if (!data.dependencies.hasOwnProperty(bd)) { this.warn("nonDependencyBundleDependency", bd) data.dependencies[bd] = "*" } return true } }, this) } } , fixDependencies: function(data, strict) { var loose = !strict objectifyDeps(data, this.warn) addOptionalDepsToDeps(data, this.warn) this.fixBundleDependenciesField(data) ;['dependencies','devDependencies'].forEach(function(deps) { if (!(deps in data)) return 
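// For each declared dependency map: drop it if it is not an object, remove
// entries whose version spec is not a string, and expand GitHub "user/repo"
// shorthand specs into full "git+" URLs.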
if (!data[deps] || typeof data[deps] !== "object") { this.warn("nonObjectDependencies", deps) delete data[deps] return } Object.keys(data[deps]).forEach(function (d) { var r = data[deps][d] if (typeof r !== 'string') { this.warn("nonStringDependency", d, JSON.stringify(r)) delete data[deps][d] } // "/" is not allowed as packagename for publishing, but for git-urls // normalize shorthand-urls if (githubUserRepo(data[deps][d])) { data[deps][d] = 'git+' + githubUserRepo(data[deps][d]) } }, this) }, this) } , fixModulesField: function (data) { if (data.modules) { this.warn("deprecatedModules") delete data.modules } } , fixKeywordsField: function (data) { if (typeof data.keywords === "string") { data.keywords = data.keywords.split(/,\s+/) } if (data.keywords && !Array.isArray(data.keywords)) { delete data.keywords this.warn("nonArrayKeywords") } else if (data.keywords) { data.keywords = data.keywords.filter(function(kw) { if (typeof kw !== "string" || !kw) { this.warn("nonStringKeyword"); return false } else { return true } }, this) } } , fixVersionField: function(data, strict) { // allow "loose" semver 1.0 versions in non-strict mode // enforce strict semver 2.0 compliance in strict mode var loose = !strict if (!data.version) { data.version = "" return true } if (!semver.valid(data.version, loose)) { throw new Error('Invalid version: "'+ data.version + '"') } data.version = semver.clean(data.version, loose) return true } , fixPeople: function(data) { modifyPeople(data, unParsePerson) modifyPeople(data, parsePerson) } , fixNameField: function(data, strict) { if (!data.name && !strict) { data.name = "" return } if (typeof data.name !== "string") { throw new Error("name field must be a string.") } if (!strict) data.name = data.name.trim() ensureValidName(data.name, strict) if (coreModuleNames.indexOf(data.name) !== -1) this.warn("conflictingName", data.name) } , fixDescriptionField: function (data) { if (data.description && typeof data.description !== 'string') { this.warn("nonStringDescription") delete data.description } if (data.readme && !data.description) data.description = extractDescription(data.readme) if(data.description === undefined) delete data.description; if (!data.description) this.warn("missingDescription") } , fixReadmeField: function (data) { if (!data.readme) { this.warn("missingReadme") data.readme = "ERROR: No README data found!" 
} } , fixBugsField: function(data) { if (!data.bugs && data.repository && data.repository.url) { var gh = parseGitHubURL(data.repository.url) if(gh) { if(gh.match(/^https:\/\/github.com\//)) data.bugs = {url: gh + "/issues"} else // gist url data.bugs = {url: gh} } } else if(data.bugs) { var emailRe = /^.+@.*\..+$/ if(typeof data.bugs == "string") { if(emailRe.test(data.bugs)) data.bugs = {email:data.bugs} else if(url.parse(data.bugs).protocol) data.bugs = {url: data.bugs} else this.warn("nonEmailUrlBugsString") } else { bugsTypos(data.bugs, this.warn) var oldBugs = data.bugs data.bugs = {} if(oldBugs.url) { if(typeof(oldBugs.url) == "string" && url.parse(oldBugs.url).protocol) data.bugs.url = oldBugs.url else this.warn("nonUrlBugsUrlField") } if(oldBugs.email) { if(typeof(oldBugs.email) == "string" && emailRe.test(oldBugs.email)) data.bugs.email = oldBugs.email else this.warn("nonEmailBugsEmailField") } } if(!data.bugs.email && !data.bugs.url) { delete data.bugs this.warn("emptyNormalizedBugs") } } } , fixHomepageField: function(data) { if (!data.homepage && data.repository && data.repository.url) { var gh = parseGitHubURL(data.repository.url) if (gh) data.homepage = gh else return true } else if (!data.homepage) return true if(typeof data.homepage !== "string") { this.warn("nonUrlHomepage") return delete data.homepage } if(!url.parse(data.homepage).protocol) { this.warn("missingProtocolHomepage") data.homepage = "http://" + data.homepage } } } function isValidScopedPackageName(spec) { if (spec.charAt(0) !== '@') return false var rest = spec.slice(1).split('/') if (rest.length !== 2) return false return rest[0] && rest[1] && rest[0] === encodeURIComponent(rest[0]) && rest[1] === encodeURIComponent(rest[1]) } function isCorrectlyEncodedName(spec) { return !spec.match(/[\/@\s\+%:]/) && spec === encodeURIComponent(spec) } function ensureValidName (name, strict) { if (name.charAt(0) === "." || !(isValidScopedPackageName(name) || isCorrectlyEncodedName(name)) || (strict && name !== name.toLowerCase()) || name.toLowerCase() === "node_modules" || name.toLowerCase() === "favicon.ico") { throw new Error("Invalid name: " + JSON.stringify(name)) } } function modifyPeople (data, fn) { if (data.author) data.author = fn(data.author) ;["maintainers", "contributors"].forEach(function (set) { if (!Array.isArray(data[set])) return; data[set] = data[set].map(fn) }) return data } function unParsePerson (person) { if (typeof person === "string") return person var name = person.name || "" var u = person.url || person.web var url = u ? (" ("+u+")") : "" var e = person.email || person.mail var email = e ? 
(" <"+e+">") : "" return name+email+url } function parsePerson (person) { if (typeof person !== "string") return person var name = person.match(/^([^\(<]+)/) var url = person.match(/\(([^\)]+)\)/) var email = person.match(/<([^>]+)>/) var obj = {} if (name && name[0].trim()) obj.name = name[0].trim() if (email) obj.email = email[1]; if (url) obj.url = url[1]; return obj } function addOptionalDepsToDeps (data, warn) { var o = data.optionalDependencies if (!o) return; var d = data.dependencies || {} Object.keys(o).forEach(function (k) { d[k] = o[k] }) data.dependencies = d } function depObjectify (deps, type, warn) { if (!deps) return {} if (typeof deps === "string") { deps = deps.trim().split(/[\n\r\s\t ,]+/) } if (!Array.isArray(deps)) return deps warn("deprecatedArrayDependencies", type) var o = {} deps.filter(function (d) { return typeof d === "string" }).forEach(function(d) { d = d.trim().split(/(:?[@\s><=])/) var dn = d.shift() var dv = d.join("") dv = dv.trim() dv = dv.replace(/^@/, "") o[dn] = dv }) return o } function objectifyDeps (data, warn) { depTypes.forEach(function (type) { if (!data[type]) return; data[type] = depObjectify(data[type], type, warn) }) } function bugsTypos(bugs, warn) { if (!bugs) return Object.keys(bugs).forEach(function (k) { if (typos.bugs[k]) { warn("typo", k, typos.bugs[k], "bugs") bugs[typos.bugs[k]] = bugs[k] delete bugs[k] } }) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/make_warning.js�000644 �000766 �000024 �00000001305 12455173731 033424� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var util = require("util") var messages = require("./warning_messages.json") module.exports = function() { var args = Array.prototype.slice.call(arguments, 0) var warningName = args.shift() if (warningName == "typo") { return makeTypoWarning.apply(null,args) } else { var msgTemplate = messages[warningName] ? 
messages[warningName] : warningName + ": '%s'" args.unshift(msgTemplate) return util.format.apply(null, args) } } function makeTypoWarning (providedName, probableName, field) { if (field) { providedName = field + "['" + providedName + "']" probableName = field + "['" + probableName + "']" } return util.format(messages.typo, providedName, probableName) }���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/normalize.js����000644 �000766 �000024 �00000002435 12455173731 032767� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = normalize var fixer = require("./fixer") var makeWarning = require("./make_warning") var fieldsToFix = ['name','version','description','repository','modules','scripts' ,'files','bin','man','bugs','keywords','readme','homepage'] var otherThingsToFix = ['dependencies','people', 'typos'] var thingsToFix = fieldsToFix.map(function(fieldName) { return ucFirst(fieldName) + "Field" }) // two ways to do this in CoffeeScript on only one line, sub-70 chars: // thingsToFix = fieldsToFix.map (name) -> ucFirst(name) + "Field" // thingsToFix = (ucFirst(name) + "Field" for name in fieldsToFix) thingsToFix = thingsToFix.concat(otherThingsToFix) function normalize (data, warn, strict) { if(warn === true) warn = null, strict = true if(!strict) strict = false if(!warn || data.private) warn = function(msg) { /* noop */ } if (data.scripts && data.scripts.install === "node-gyp rebuild" && !data.scripts.preinstall) { data.gypfile = true } fixer.warn = function() { warn(makeWarning.apply(null, arguments)) } thingsToFix.forEach(function(thingName) { fixer["fix" + ucFirst(thingName)](data, strict) }) data._id = data.name + "@" + data.version } function ucFirst (string) { return string.charAt(0).toUpperCase() + string.slice(1); } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/safe_format.js��000644 �000766 �000024 �00000000365 12455173731 033255� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var util = require('util') module.exports = function() { var args = Array.prototype.slice.call(arguments, 0) args.forEach(function(arg) { if (!arg) throw new TypeError('Bad arguments.') }) return util.format.apply(null, arguments) 
}���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-package-data/lib/typos.json������000644 �000766 �000024 �00000001354 12455173731 032501� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "topLevel": { "dependancies": "dependencies" ,"dependecies": "dependencies" ,"depdenencies": "dependencies" ,"devEependencies": "devDependencies" ,"depends": "dependencies" ,"dev-dependencies": "devDependencies" ,"devDependences": "devDependencies" ,"devDepenencies": "devDependencies" ,"devdependencies": "devDependencies" ,"repostitory": "repository" ,"repo": "repository" ,"prefereGlobal": "preferGlobal" ,"hompage": "homepage" ,"hampage": "homepage" ,"autohr": "author" ,"autor": "author" ,"contributers": "contributors" ,"publicationConfig": "publishConfig" ,"script": "scripts" }, "bugs": { "web": "url", "name": "url" }, "script": { "server": "start", "tests": "test" } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/normalize-package-data/lib/warning_messages.json������������������000644 �000766 �000024 �00000003337 12455173731 034603� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "repositories": "'repositories' (plural) Not supported. Please pick one as the 'repository' field" ,"missingRepository": "No repository field." ,"brokenGitUrl": "Probably broken git url: %s" ,"nonObjectScripts": "scripts must be an object" ,"nonStringScript": "script values must be string commands" ,"nonArrayFiles": "Invalid 'files' member" ,"invalidFilename": "Invalid filename in 'files' list: %s" ,"nonArrayBundleDependencies": "Invalid 'bundleDependencies' list. Must be array of package names" ,"nonStringBundleDependency": "Invalid bundleDependencies member: %s" ,"nonDependencyBundleDependency": "Non-dependency in bundleDependencies: %s" ,"nonObjectDependencies": "%s field must be an object" ,"nonStringDependency": "Invalid dependency: %s %s" ,"deprecatedArrayDependencies": "specifying %s as array is deprecated" ,"deprecatedModules": "modules field is deprecated" ,"nonArrayKeywords": "keywords should be an array of strings" ,"nonStringKeyword": "keywords should be an array of strings" ,"conflictingName": "%s is also the name of a node core module." 
,"nonStringDescription": "'description' field should be a string" ,"missingDescription": "No description" ,"missingReadme": "No README data" ,"nonEmailUrlBugsString": "Bug string field must be url, email, or {email,url}" ,"nonUrlBugsUrlField": "bugs.url field must be a string url. Deleted." ,"nonEmailBugsEmailField": "bugs.email field must be a string email. Deleted." ,"emptyNormalizedBugs": "Normalized value of bugs field is an empty object. Deleted." ,"nonUrlHomepage": "homepage field must be a string url. Deleted." ,"missingProtocolHomepage": "homepage field must start with a protocol." ,"typo": "%s should probably be %s." } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/.eslintrc����������������000644 �000766 �000024 �00000000637 12455173731 030572� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "env" : { "node" : true }, "rules" : { "semi": [2, "never"], "strict": 0, "quotes": [1, "double", "avoid-escape"], "no-use-before-define": 0, "curly": 0, "no-underscore-dangle": 0, "no-lonely-if": 1, "no-unused-vars": [2, {"vars" : "all", "args" : "after-used"}], "no-mixed-requires": 0, "space-infix-ops": 0, "key-spacing": 0, "no-multi-spaces": 0 } } �������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/.npmignore���������������000644 �000766 �000024 �00000000016 12455173731 030734� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/ ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/CHANGELOG.md�������������000644 �000766 �000024 �00000000324 12455173731 030550� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������### 1.0.0 (2014-12-25): * [`8b3d874`](https://github.com/npm/normalize-git-url/commit/8b3d874afd14f4cdde65d418e0a35a615c746bba) 
Initial version, with simple tests. ([@othiym23](https://github.com/othiym23)) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/normalize-git-url.js�����000644 �000766 �000024 �00000001340 12455173731 032655� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var url = require("url") module.exports = function normalize (u) { var parsed = url.parse(u, true) // git is so tricky! // if the path is like ssh://foo:22/some/path then it works, but // it needs the ssh:// // If the path is like ssh://foo:some/path then it works, but // only if you remove the ssh:// if (parsed.protocol) { parsed.protocol = parsed.protocol.replace(/^git\+/, "") // ssh paths that are scp-style urls don't need the ssh:// parsed.pathname = parsed.pathname.replace(/^\/?:/, "/") } // figure out what we should check out. var checkout = parsed.hash && parsed.hash.substr(1) || "master" parsed.hash = "" u = url.format(parsed) return { url : u, branch : checkout } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/package.json�������������000644 �000766 �000024 �00000004557 12455173731 031241� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "normalize-git-url", "version": "1.0.0", "description": "Normalizes Git URLs. For npm, but you can use it too.", "main": "normalize-git-url.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^0.4.13" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/normalize-git-url.git" }, "keywords": [ "git", "github", "url", "normalize", "npm" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/normalize-git-url/issues" }, "homepage": "https://github.com/npm/normalize-git-url", "readme": "# normalize-git-url\n\nYou have a bunch of Git URLs. You want to convert them to a canonical\nrepresentation, probably for use inside npm so that it doesn't end up creating\na bunch of superfluous cached origins. 
You use this package.\n\n## Usage\n\n```javascript\nvar ngu = require('normalize-git-url');\nvar normalized = ngu(\"git+ssh://git@github.com:organization/repo.git#hashbrowns\")\n// get back:\n// {\n// url : \"ssh://git@github.com/organization/repo.git\",\n// branch : \"hashbrowns\" // did u know hashbrowns are delicious?\n// }\n```\n\n## API\n\nThere's just the one function, and all it takes is a single parameter, a non-normalized Git URL.\n\n### normalizeGitUrl(url)\n\n* `url` {String} The Git URL (very loosely speaking) to be normalized.\n\nReturns an object with the following format:\n\n* `url` {String} The normalized URL.\n* `branch` {String} The treeish to be checked out once the repo at `url` is\n cloned. It doesn't have to be a branch, but it's a lot easier to intuit what\n the output is for with that name.\n\n## Limitations\n\nRight now this doesn't try to special-case GitHub too much -- it doesn't ensure\nthat `.git` is added to the end of URLs, it doesn't prefer `https:` over\n`http:` or `ssh:`, it doesn't deal with redirects, and it doesn't try to\nresolve symbolic names to treeish hashcodes. For now, it just tries to account\nfor minor differences in representation.\n", "readmeFilename": "README.md", "gitHead": "e51f43718af66ffbced4ccfd9a6514470fc4c553", "_id": "normalize-git-url@1.0.0", "_shasum": "80e59471f0616b579893973e3f1b3684bedbad48", "_from": "normalize-git-url@>=1.0.0 <1.1.0" } �������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/normalize-git-url/README.md����������������000644 �000766 �000024 �00000002533 12455173731 030222� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# normalize-git-url You have a bunch of Git URLs. You want to convert them to a canonical representation, probably for use inside npm so that it doesn't end up creating a bunch of superfluous cached origins. You use this package. ## Usage ```javascript var ngu = require('normalize-git-url'); var normalized = ngu("git+ssh://git@github.com:organization/repo.git#hashbrowns") // get back: // { // url : "ssh://git@github.com/organization/repo.git", // branch : "hashbrowns" // did u know hashbrowns are delicious? // } ``` ## API There's just the one function, and all it takes is a single parameter, a non-normalized Git URL. ### normalizeGitUrl(url) * `url` {String} The Git URL (very loosely speaking) to be normalized. Returns an object with the following format: * `url` {String} The normalized URL. * `branch` {String} The treeish to be checked out once the repo at `url` is cloned. It doesn't have to be a branch, but it's a lot easier to intuit what the output is for with that name. ## Limitations Right now this doesn't try to special-case GitHub too much -- it doesn't ensure that `.git` is added to the end of URLs, it doesn't prefer `https:` over `http:` or `ssh:`, it doesn't deal with redirects, and it doesn't try to resolve symbolic names to treeish hashcodes. For now, it just tries to account for minor differences in representation. 
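A couple more example inputs, as a rough sketch of the behaviour described above (the exact `url` strings may vary slightly between versions of this module):

```javascript
var ngu = require('normalize-git-url');

// no "#treeish" fragment, so `branch` falls back to "master"
ngu("git+https://github.com/npm/normalize-git-url.git");
// => { url: "https://github.com/npm/normalize-git-url.git", branch: "master" }

// an explicit fragment selects the treeish; the URL itself is left alone
ngu("ssh://git@github.com/npm/normalize-git-url.git#v1.0.0");
// => { url: "ssh://git@github.com/npm/normalize-git-url.git", branch: "v1.0.0" }
```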
���������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/.npmignore����������������������������000644 �000766 �000024 �00000000015 12455173731 026332� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/bin/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025102� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/examples/�����������������������������000755 �000766 �000024 �00000000000 12456115117 026150� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/lib/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025100� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/LICENSE�������������������������������000644 �000766 �000024 �00000002104 12455173731 025341� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/package.json��������������������������000644 �000766 �000024 �00000021257 12455173731 026634� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "nopt", "version": "3.0.1", "description": "Option parsing for Node, supporting types, shorthands, etc. Used by npm.", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "main": "lib/nopt.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "http://github.com/isaacs/nopt" }, "bin": { "nopt": "./bin/nopt.js" }, "license": { "type": "MIT", "url": "https://github.com/isaacs/nopt/raw/master/LICENSE" }, "dependencies": { "abbrev": "1" }, "devDependencies": { "tap": "~0.4.8" }, "readme": "If you want to write an option parser, and have it be good, there are\ntwo ways to do it. The Right Way, and the Wrong Way.\n\nThe Wrong Way is to sit down and write an option parser. We've all done\nthat.\n\nThe Right Way is to write some complex configurable program with so many\noptions that you go half-insane just trying to manage them all, and put\nit off with duct-tape solutions until you see exactly to the core of the\nproblem, and finally snap and write an awesome option parser.\n\nIf you want to write an option parser, don't write an option parser.\nWrite a package manager, or a source control system, or a service\nrestarter, or an operating system. 
You probably won't end up with a\ngood one of those, but if you don't give up, and you are relentless and\ndiligent enough in your procrastination, you may just end up with a very\nnice option parser.\n\n## USAGE\n\n // my-program.js\n var nopt = require(\"nopt\")\n , Stream = require(\"stream\").Stream\n , path = require(\"path\")\n , knownOpts = { \"foo\" : [String, null]\n , \"bar\" : [Stream, Number]\n , \"baz\" : path\n , \"bloo\" : [ \"big\", \"medium\", \"small\" ]\n , \"flag\" : Boolean\n , \"pick\" : Boolean\n , \"many\" : [String, Array]\n }\n , shortHands = { \"foofoo\" : [\"--foo\", \"Mr. Foo\"]\n , \"b7\" : [\"--bar\", \"7\"]\n , \"m\" : [\"--bloo\", \"medium\"]\n , \"p\" : [\"--pick\"]\n , \"f\" : [\"--flag\"]\n }\n // everything is optional.\n // knownOpts and shorthands default to {}\n // arg list defaults to process.argv\n // slice defaults to 2\n , parsed = nopt(knownOpts, shortHands, process.argv, 2)\n console.log(parsed)\n\nThis would give you support for any of the following:\n\n```bash\n$ node my-program.js --foo \"blerp\" --no-flag\n{ \"foo\" : \"blerp\", \"flag\" : false }\n\n$ node my-program.js ---bar 7 --foo \"Mr. Hand\" --flag\n{ bar: 7, foo: \"Mr. Hand\", flag: true }\n\n$ node my-program.js --foo \"blerp\" -f -----p\n{ foo: \"blerp\", flag: true, pick: true }\n\n$ node my-program.js -fp --foofoo\n{ foo: \"Mr. Foo\", flag: true, pick: true }\n\n$ node my-program.js --foofoo -- -fp # -- stops the flag parsing.\n{ foo: \"Mr. Foo\", argv: { remain: [\"-fp\"] } }\n\n$ node my-program.js --blatzk -fp # unknown opts are ok.\n{ blatzk: true, flag: true, pick: true }\n\n$ node my-program.js --blatzk=1000 -fp # but you need to use = if they have a value\n{ blatzk: 1000, flag: true, pick: true }\n\n$ node my-program.js --no-blatzk -fp # unless they start with \"no-\"\n{ blatzk: false, flag: true, pick: true }\n\n$ node my-program.js --baz b/a/z # known paths are resolved.\n{ baz: \"/Users/isaacs/b/a/z\" }\n\n# if Array is one of the types, then it can take many\n# values, and will always be an array. The other types provided\n# specify what types are allowed in the list.\n\n$ node my-program.js --many 1 --many null --many foo\n{ many: [\"1\", \"null\", \"foo\"] }\n\n$ node my-program.js --many foo\n{ many: [\"foo\"] }\n```\n\nRead the tests at the bottom of `lib/nopt.js` for more examples of\nwhat this puppy can do.\n\n## Types\n\nThe following types are supported, and defined on `nopt.typeDefs`\n\n* String: A normal string. No parsing is done.\n* path: A file system path. Gets resolved against cwd if not absolute.\n* url: A url. If it doesn't parse, it isn't accepted.\n* Number: Must be numeric.\n* Date: Must parse as a date. If it does, and `Date` is one of the options,\n then it will return a Date object, not a string.\n* Boolean: Must be either `true` or `false`. If an option is a boolean,\n then it does not need a value, and its presence will imply `true` as\n the value. To negate boolean flags, do `--no-whatever` or `--whatever\n false`\n* NaN: Means that the option is strictly not allowed. Any value will\n fail.\n* Stream: An object matching the \"Stream\" class in node. Valuable\n for use when validating programmatically. (npm uses this to let you\n supply any WriteStream on the `outfd` and `logfd` config options.)\n* Array: If `Array` is specified as one of the types, then the value\n will be parsed as a list of options. 
This means that multiple values\n can be specified, and that the value will always be an array.\n\nIf a type is an array of values not on this list, then those are\nconsidered valid values. For instance, in the example above, the\n`--bloo` option can only be one of `\"big\"`, `\"medium\"`, or `\"small\"`,\nand any other value will be rejected.\n\nWhen parsing unknown fields, `\"true\"`, `\"false\"`, and `\"null\"` will be\ninterpreted as their JavaScript equivalents.\n\nYou can also mix types and values, or multiple types, in a list. For\ninstance `{ blah: [Number, null] }` would allow a value to be set to\neither a Number or null. When types are ordered, this implies a\npreference, and the first type that can be used to properly interpret\nthe value will be used.\n\nTo define a new type, add it to `nopt.typeDefs`. Each item in that\nhash is an object with a `type` member and a `validate` method. The\n`type` member is an object that matches what goes in the type list. The\n`validate` method is a function that gets called with `validate(data,\nkey, val)`. Validate methods should assign `data[key]` to the valid\nvalue of `val` if it can be handled properly, or return boolean\n`false` if it cannot.\n\nYou can also call `nopt.clean(data, types, typeDefs)` to clean up a\nconfig object and remove its invalid properties.\n\n## Error Handling\n\nBy default, nopt outputs a warning to standard error when invalid\noptions are found. You can change this behavior by assigning a method\nto `nopt.invalidHandler`. This method will be called with\nthe offending `nopt.invalidHandler(key, val, types)`.\n\nIf no `nopt.invalidHandler` is assigned, then it will console.error\nits whining. If it is assigned to boolean `false` then the warning is\nsuppressed.\n\n## Abbreviations\n\nYes, they are supported. If you define options like this:\n\n```javascript\n{ \"foolhardyelephants\" : Boolean\n, \"pileofmonkeys\" : Boolean }\n```\n\nThen this will work:\n\n```bash\nnode program.js --foolhar --pil\nnode program.js --no-f --pileofmon\n# etc.\n```\n\n## Shorthands\n\nShorthands are a hash of shorter option names to a snippet of args that\nthey expand to.\n\nIf multiple one-character shorthands are all combined, and the\ncombination does not unambiguously match any other option or shorthand,\nthen they will be broken up into their constituent parts. For example:\n\n```json\n{ \"s\" : [\"--loglevel\", \"silent\"]\n, \"g\" : \"--global\"\n, \"f\" : \"--force\"\n, \"p\" : \"--parseable\"\n, \"l\" : \"--long\"\n}\n```\n\n```bash\nnpm ls -sgflp\n# just like doing this:\nnpm ls --loglevel silent --global --force --long --parseable\n```\n\n## The Rest of the args\n\nThe config object returned by nopt is given a special member called\n`argv`, which is an object with the following fields:\n\n* `remain`: The remaining args after all the parsing has occurred.\n* `original`: The args as they originally appeared.\n* `cooked`: The args after flags and shorthands are expanded.\n\n## Slicing\n\nNode programs are called with more or less the exact argv as it appears\nin C land, after the v8 and node-specific options have been plucked off.\nAs such, `argv[0]` is always `node` and `argv[1]` is always the\nJavaScript program being run.\n\nThat's usually not very useful to you. So they're sliced off by\ndefault. 
If you want them, then you can pass in `0` as the last\nargument, or any other number that you'd like to slice off the start of\nthe list.\n", "readmeFilename": "README.md", "gitHead": "4296f7aba7847c198fea2da594f9e1bec02817ec", "bugs": { "url": "https://github.com/isaacs/nopt/issues" }, "homepage": "https://github.com/isaacs/nopt", "_id": "nopt@3.0.1", "_shasum": "bce5c42446a3291f47622a370abbf158fbbacbfd", "_from": "nopt@latest" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/README.md�����������������������������000644 �000766 �000024 �00000016626 12455173731 025631� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������If you want to write an option parser, and have it be good, there are two ways to do it. The Right Way, and the Wrong Way. The Wrong Way is to sit down and write an option parser. We've all done that. The Right Way is to write some complex configurable program with so many options that you go half-insane just trying to manage them all, and put it off with duct-tape solutions until you see exactly to the core of the problem, and finally snap and write an awesome option parser. If you want to write an option parser, don't write an option parser. Write a package manager, or a source control system, or a service restarter, or an operating system. You probably won't end up with a good one of those, but if you don't give up, and you are relentless and diligent enough in your procrastination, you may just end up with a very nice option parser. ## USAGE // my-program.js var nopt = require("nopt") , Stream = require("stream").Stream , path = require("path") , knownOpts = { "foo" : [String, null] , "bar" : [Stream, Number] , "baz" : path , "bloo" : [ "big", "medium", "small" ] , "flag" : Boolean , "pick" : Boolean , "many" : [String, Array] } , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] , "b7" : ["--bar", "7"] , "m" : ["--bloo", "medium"] , "p" : ["--pick"] , "f" : ["--flag"] } // everything is optional. // knownOpts and shorthands default to {} // arg list defaults to process.argv // slice defaults to 2 , parsed = nopt(knownOpts, shortHands, process.argv, 2) console.log(parsed) This would give you support for any of the following: ```bash $ node my-program.js --foo "blerp" --no-flag { "foo" : "blerp", "flag" : false } $ node my-program.js ---bar 7 --foo "Mr. Hand" --flag { bar: 7, foo: "Mr. Hand", flag: true } $ node my-program.js --foo "blerp" -f -----p { foo: "blerp", flag: true, pick: true } $ node my-program.js -fp --foofoo { foo: "Mr. Foo", flag: true, pick: true } $ node my-program.js --foofoo -- -fp # -- stops the flag parsing. { foo: "Mr. Foo", argv: { remain: ["-fp"] } } $ node my-program.js --blatzk -fp # unknown opts are ok. 
{ blatzk: true, flag: true, pick: true } $ node my-program.js --blatzk=1000 -fp # but you need to use = if they have a value { blatzk: 1000, flag: true, pick: true } $ node my-program.js --no-blatzk -fp # unless they start with "no-" { blatzk: false, flag: true, pick: true } $ node my-program.js --baz b/a/z # known paths are resolved. { baz: "/Users/isaacs/b/a/z" } # if Array is one of the types, then it can take many # values, and will always be an array. The other types provided # specify what types are allowed in the list. $ node my-program.js --many 1 --many null --many foo { many: ["1", "null", "foo"] } $ node my-program.js --many foo { many: ["foo"] } ``` Read the tests at the bottom of `lib/nopt.js` for more examples of what this puppy can do. ## Types The following types are supported, and defined on `nopt.typeDefs` * String: A normal string. No parsing is done. * path: A file system path. Gets resolved against cwd if not absolute. * url: A url. If it doesn't parse, it isn't accepted. * Number: Must be numeric. * Date: Must parse as a date. If it does, and `Date` is one of the options, then it will return a Date object, not a string. * Boolean: Must be either `true` or `false`. If an option is a boolean, then it does not need a value, and its presence will imply `true` as the value. To negate boolean flags, do `--no-whatever` or `--whatever false` * NaN: Means that the option is strictly not allowed. Any value will fail. * Stream: An object matching the "Stream" class in node. Valuable for use when validating programmatically. (npm uses this to let you supply any WriteStream on the `outfd` and `logfd` config options.) * Array: If `Array` is specified as one of the types, then the value will be parsed as a list of options. This means that multiple values can be specified, and that the value will always be an array. If a type is an array of values not on this list, then those are considered valid values. For instance, in the example above, the `--bloo` option can only be one of `"big"`, `"medium"`, or `"small"`, and any other value will be rejected. When parsing unknown fields, `"true"`, `"false"`, and `"null"` will be interpreted as their JavaScript equivalents. You can also mix types and values, or multiple types, in a list. For instance `{ blah: [Number, null] }` would allow a value to be set to either a Number or null. When types are ordered, this implies a preference, and the first type that can be used to properly interpret the value will be used. To define a new type, add it to `nopt.typeDefs`. Each item in that hash is an object with a `type` member and a `validate` method. The `type` member is an object that matches what goes in the type list. The `validate` method is a function that gets called with `validate(data, key, val)`. Validate methods should assign `data[key]` to the valid value of `val` if it can be handled properly, or return boolean `false` if it cannot. You can also call `nopt.clean(data, types, typeDefs)` to clean up a config object and remove its invalid properties. ## Error Handling By default, nopt outputs a warning to standard error when invalid options are found. You can change this behavior by assigning a method to `nopt.invalidHandler`. This method will be called with the offending `nopt.invalidHandler(key, val, types)`. If no `nopt.invalidHandler` is assigned, then it will console.error its whining. If it is assigned to boolean `false` then the warning is suppressed. ## Abbreviations Yes, they are supported. 
If you define options like this: ```javascript { "foolhardyelephants" : Boolean , "pileofmonkeys" : Boolean } ``` Then this will work: ```bash node program.js --foolhar --pil node program.js --no-f --pileofmon # etc. ``` ## Shorthands Shorthands are a hash of shorter option names to a snippet of args that they expand to. If multiple one-character shorthands are all combined, and the combination does not unambiguously match any other option or shorthand, then they will be broken up into their constituent parts. For example: ```json { "s" : ["--loglevel", "silent"] , "g" : "--global" , "f" : "--force" , "p" : "--parseable" , "l" : "--long" } ``` ```bash npm ls -sgflp # just like doing this: npm ls --loglevel silent --global --force --long --parseable ``` ## The Rest of the args The config object returned by nopt is given a special member called `argv`, which is an object with the following fields: * `remain`: The remaining args after all the parsing has occurred. * `original`: The args as they originally appeared. * `cooked`: The args after flags and shorthands are expanded. ## Slicing Node programs are called with more or less the exact argv as it appears in C land, after the v8 and node-specific options have been plucked off. As such, `argv[0]` is always `node` and `argv[1]` is always the JavaScript program being run. That's usually not very useful to you. So they're sliced off by default. If you want them, then you can pass in `0` as the last argument, or any other number that you'd like to slice off the start of the list. ����������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/lib/nopt.js���������������������������000644 �000766 �000024 �00000026605 12455173731 026434� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// info about each config option. var debug = process.env.DEBUG_NOPT || process.env.NOPT_DEBUG ? 
function () { console.error.apply(console, arguments) } : function () {} var url = require("url") , path = require("path") , Stream = require("stream").Stream , abbrev = require("abbrev") module.exports = exports = nopt exports.clean = clean exports.typeDefs = { String : { type: String, validate: validateString } , Boolean : { type: Boolean, validate: validateBoolean } , url : { type: url, validate: validateUrl } , Number : { type: Number, validate: validateNumber } , path : { type: path, validate: validatePath } , Stream : { type: Stream, validate: validateStream } , Date : { type: Date, validate: validateDate } } function nopt (types, shorthands, args, slice) { args = args || process.argv types = types || {} shorthands = shorthands || {} if (typeof slice !== "number") slice = 2 debug(types, shorthands, args, slice) args = args.slice(slice) var data = {} , key , remain = [] , cooked = args , original = args.slice(0) parse(args, data, remain, types, shorthands) // now data is full clean(data, types, exports.typeDefs) data.argv = {remain:remain,cooked:cooked,original:original} Object.defineProperty(data.argv, 'toString', { value: function () { return this.original.map(JSON.stringify).join(" ") }, enumerable: false }) return data } function clean (data, types, typeDefs) { typeDefs = typeDefs || exports.typeDefs var remove = {} , typeDefault = [false, true, null, String, Array] Object.keys(data).forEach(function (k) { if (k === "argv") return var val = data[k] , isArray = Array.isArray(val) , type = types[k] if (!isArray) val = [val] if (!type) type = typeDefault if (type === Array) type = typeDefault.concat(Array) if (!Array.isArray(type)) type = [type] debug("val=%j", val) debug("types=", type) val = val.map(function (val) { // if it's an unknown value, then parse false/true/null/numbers/dates if (typeof val === "string") { debug("string %j", val) val = val.trim() if ((val === "null" && ~type.indexOf(null)) || (val === "true" && (~type.indexOf(true) || ~type.indexOf(Boolean))) || (val === "false" && (~type.indexOf(false) || ~type.indexOf(Boolean)))) { val = JSON.parse(val) debug("jsonable %j", val) } else if (~type.indexOf(Number) && !isNaN(val)) { debug("convert to number", val) val = +val } else if (~type.indexOf(Date) && !isNaN(Date.parse(val))) { debug("convert to date", val) val = new Date(val) } } if (!types.hasOwnProperty(k)) { return val } // allow `--no-blah` to set 'blah' to null if null is allowed if (val === false && ~type.indexOf(null) && !(~type.indexOf(false) || ~type.indexOf(Boolean))) { val = null } var d = {} d[k] = val debug("prevalidated val", d, val, types[k]) if (!validate(d, k, val, types[k], typeDefs)) { if (exports.invalidHandler) { exports.invalidHandler(k, val, types[k], data) } else if (exports.invalidHandler !== false) { debug("invalid: "+k+"="+val, types[k]) } return remove } debug("validated val", d, val, types[k]) return d[k] }).filter(function (val) { return val !== remove }) if (!val.length) delete data[k] else if (isArray) { debug(isArray, data[k], val) data[k] = val } else data[k] = val[0] debug("k=%s val=%j", k, val, data[k]) }) } function validateString (data, k, val) { data[k] = String(val) } function validatePath (data, k, val) { if (val === true) return false if (val === null) return true val = String(val) var homePattern = process.platform === 'win32' ? 
/^~(\/|\\)/ : /^~\// if (val.match(homePattern) && process.env.HOME) { val = path.resolve(process.env.HOME, val.substr(2)) } data[k] = path.resolve(String(val)) return true } function validateNumber (data, k, val) { debug("validate Number %j %j %j", k, val, isNaN(val)) if (isNaN(val)) return false data[k] = +val } function validateDate (data, k, val) { debug("validate Date %j %j %j", k, val, Date.parse(val)) var s = Date.parse(val) if (isNaN(s)) return false data[k] = new Date(val) } function validateBoolean (data, k, val) { if (val instanceof Boolean) val = val.valueOf() else if (typeof val === "string") { if (!isNaN(val)) val = !!(+val) else if (val === "null" || val === "false") val = false else val = true } else val = !!val data[k] = val } function validateUrl (data, k, val) { val = url.parse(String(val)) if (!val.host) return false data[k] = val.href } function validateStream (data, k, val) { if (!(val instanceof Stream)) return false data[k] = val } function validate (data, k, val, type, typeDefs) { // arrays are lists of types. if (Array.isArray(type)) { for (var i = 0, l = type.length; i < l; i ++) { if (type[i] === Array) continue if (validate(data, k, val, type[i], typeDefs)) return true } delete data[k] return false } // an array of anything? if (type === Array) return true // NaN is poisonous. Means that something is not allowed. if (type !== type) { debug("Poison NaN", k, val, type) delete data[k] return false } // explicit list of values if (val === type) { debug("Explicitly allowed %j", val) // if (isArray) (data[k] = data[k] || []).push(val) // else data[k] = val data[k] = val return true } // now go through the list of typeDefs, validate against each one. var ok = false , types = Object.keys(typeDefs) for (var i = 0, l = types.length; i < l; i ++) { debug("test type %j %j %j", k, val, types[i]) var t = typeDefs[types[i]] if (t && type === t.type) { var d = {} ok = false !== t.validate(d, k, val) val = d[k] if (ok) { // if (isArray) (data[k] = data[k] || []).push(val) // else data[k] = val data[k] = val break } } } debug("OK? %j (%j %j %j)", ok, k, val, types[i]) if (!ok) delete data[k] return ok } function parse (args, data, remain, types, shorthands) { debug("parse", args, data, remain) var key = null , abbrevs = abbrev(Object.keys(types)) , shortAbbr = abbrev(Object.keys(shorthands)) for (var i = 0; i < args.length; i ++) { var arg = args[i] debug("arg", arg) if (arg.match(/^-{2,}$/)) { // done with keys. // the rest are args. remain.push.apply(remain, args.slice(i + 1)) args[i] = "--" break } var hadEq = false if (arg.charAt(0) === "-" && arg.length > 1) { if (arg.indexOf("=") !== -1) { hadEq = true var v = arg.split("=") arg = v.shift() v = v.join("=") args.splice.apply(args, [i, 1].concat([arg, v])) } // see if it's a shorthand // if so, splice and back up to re-parse it. var shRes = resolveShort(arg, shorthands, shortAbbr, abbrevs) debug("arg=%j shRes=%j", arg, shRes) if (shRes) { debug(arg, shRes) args.splice.apply(args, [i, 1].concat(shRes)) if (arg !== shRes[0]) { i -- continue } } arg = arg.replace(/^-+/, "") var no = null while (arg.toLowerCase().indexOf("no-") === 0) { no = !no arg = arg.substr(3) } if (abbrevs[arg]) arg = abbrevs[arg] var isArray = types[arg] === Array || Array.isArray(types[arg]) && types[arg].indexOf(Array) !== -1 // allow unknown things to be arrays if specified multiple times. 
if (!types.hasOwnProperty(arg) && data.hasOwnProperty(arg)) { if (!Array.isArray(data[arg])) data[arg] = [data[arg]] isArray = true } var val , la = args[i + 1] var isBool = typeof no === 'boolean' || types[arg] === Boolean || Array.isArray(types[arg]) && types[arg].indexOf(Boolean) !== -1 || (typeof types[arg] === 'undefined' && !hadEq) || (la === "false" && (types[arg] === null || Array.isArray(types[arg]) && ~types[arg].indexOf(null))) if (isBool) { // just set and move along val = !no // however, also support --bool true or --bool false if (la === "true" || la === "false") { val = JSON.parse(la) la = null if (no) val = !val i ++ } // also support "foo":[Boolean, "bar"] and "--foo bar" if (Array.isArray(types[arg]) && la) { if (~types[arg].indexOf(la)) { // an explicit type val = la i ++ } else if ( la === "null" && ~types[arg].indexOf(null) ) { // null allowed val = null i ++ } else if ( !la.match(/^-{2,}[^-]/) && !isNaN(la) && ~types[arg].indexOf(Number) ) { // number val = +la i ++ } else if ( !la.match(/^-[^-]/) && ~types[arg].indexOf(String) ) { // string val = la i ++ } } if (isArray) (data[arg] = data[arg] || []).push(val) else data[arg] = val continue } if (types[arg] === String && la === undefined) la = "" if (la && la.match(/^-{2,}$/)) { la = undefined i -- } val = la === undefined ? true : la if (isArray) (data[arg] = data[arg] || []).push(val) else data[arg] = val i ++ continue } remain.push(arg) } } function resolveShort (arg, shorthands, shortAbbr, abbrevs) { // handle single-char shorthands glommed together, like // npm ls -glp, but only if there is one dash, and only if // all of the chars are single-char shorthands, and it's // not a match to some other abbrev. arg = arg.replace(/^-+/, '') // if it's an exact known option, then don't go any further if (abbrevs[arg] === arg) return null // if it's an exact known shortopt, same deal if (shorthands[arg]) { // make it an array, if it's a list of words if (shorthands[arg] && !Array.isArray(shorthands[arg])) shorthands[arg] = shorthands[arg].split(/\s+/) return shorthands[arg] } // first check to see if this arg is a set of single-char shorthands var singles = shorthands.___singles if (!singles) { singles = Object.keys(shorthands).filter(function (s) { return s.length === 1 }).reduce(function (l,r) { l[r] = true return l }, {}) shorthands.___singles = singles debug('shorthand singles', singles) } var chrs = arg.split("").filter(function (c) { return singles[c] }) if (chrs.join("") === arg) return chrs.map(function (c) { return shorthands[c] }).reduce(function (l, r) { return l.concat(r) }, []) // if it's an arg abbrev, and not a literal shorthand, then prefer the arg if (abbrevs[arg] && !shorthands[arg]) return null // if it's an abbr for a shorthand, then use that if (shortAbbr[arg]) arg = shortAbbr[arg] // make it an array, if it's a list of words if (shorthands[arg] && !Array.isArray(shorthands[arg])) shorthands[arg] = shorthands[arg].split(/\s+/) return shorthands[arg] } ���������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/examples/my-program.js����������������000755 �000766 �000024 �00000002010 12455173731 030601� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node //process.env.DEBUG_NOPT = 1 // my-program.js var nopt = require("../lib/nopt") , Stream = require("stream").Stream , path = require("path") , knownOpts = { "foo" : [String, null] , "bar" : [Stream, Number] , "baz" : path , "bloo" : [ "big", "medium", "small" ] , "flag" : Boolean , "pick" : Boolean } , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] , "b7" : ["--bar", "7"] , "m" : ["--bloo", "medium"] , "p" : ["--pick"] , "f" : ["--flag", "true"] , "g" : ["--flag"] , "s" : "--flag" } // everything is optional. // knownOpts and shorthands default to {} // arg list defaults to process.argv // slice defaults to 2 , parsed = nopt(knownOpts, shortHands, process.argv, 2) console.log("parsed =\n"+ require("util").inspect(parsed)) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/nopt/bin/nopt.js���������������������������000755 �000766 �000024 �00000003015 12455173731 026427� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var nopt = require("../lib/nopt") , path = require("path") , types = { num: Number , bool: Boolean , help: Boolean , list: Array , "num-list": [Number, Array] , "str-list": [String, Array] , "bool-list": [Boolean, Array] , str: String , clear: Boolean , config: Boolean , length: Number , file: path } , shorthands = { s: [ "--str", "astring" ] , b: [ "--bool" ] , nb: [ "--no-bool" ] , tft: [ "--bool-list", "--no-bool-list", "--bool-list", "true" ] , "?": ["--help"] , h: ["--help"] , H: ["--help"] , n: [ "--num", "125" ] , c: ["--config"] , l: ["--length"] , f: ["--file"] } , parsed = nopt( types , shorthands , process.argv , 2 ) console.log("parsed", parsed) if (parsed.help) { console.log("") console.log("nopt cli tester") console.log("") console.log("types") console.log(Object.keys(types).map(function M (t) { var type = types[t] if (Array.isArray(type)) { return [t, type.map(function (type) { return type.name })] } return [t, type && type.name] }).reduce(function (s, i) { s[i[0]] = i[1] return s }, {})) console.log("") console.log("shorthands") console.log(shorthands) } 
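// Illustrative usage sketch (added commentary, not part of the published
// bin/nopt.js): invoking the tester as, say,
//   ./nopt.js -n --str-list a --str-list b
// should print roughly
//   parsed { num: 125, 'str-list': [ 'a', 'b' ], argv: { ... } }
// because "-n" expands through the shorthand table to "--num 125" and
// "str-list" is declared as [String, Array], so repeated flags accumulate.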
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/.jshintrc�������������������������000644 �000766 �000024 �00000000130 12455173731 026720� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "asi": true, "laxcomma": true, "es5": true, "node": true, "strict": false } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/.npmignore������������������������000644 �000766 �000024 �00000000011 12455173731 027070� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������gyp/test �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/addon.gypi������������������������000644 �000766 �000024 �00000003370 12456106751 027062� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ 'target_defaults': { 'type': 'loadable_module', 'product_prefix': '', 'include_dirs': [ '<(node_root_dir)/src', '<(node_root_dir)/deps/uv/include', '<(node_root_dir)/deps/v8/include' ], 'target_conditions': [ ['_type=="loadable_module"', { 'product_extension': 'node', 'defines': [ 'BUILDING_NODE_EXTENSION' ], }], ['_type=="static_library"', { # set to `1` to *disable* the -T thin archive 'ld' flag. # older linkers don't support this flag. 
'standalone_static_library': '<(standalone_static_library)' }], ], 'conditions': [ [ 'OS=="mac"', { 'defines': [ '_DARWIN_USE_64_BIT_INODE=1' ], 'libraries': [ '-undefined dynamic_lookup' ], 'xcode_settings': { 'DYLIB_INSTALL_NAME_BASE': '@rpath' }, }], [ 'OS=="win"', { 'libraries': [ '-lkernel32.lib', '-luser32.lib', '-lgdi32.lib', '-lwinspool.lib', '-lcomdlg32.lib', '-ladvapi32.lib', '-lshell32.lib', '-lole32.lib', '-loleaut32.lib', '-luuid.lib', '-lodbc32.lib', '-lDelayImp.lib', '-l"<(node_root_dir)/$(ConfigurationName)/iojs.lib"' ], # warning C4251: 'node::ObjectWrap::handle_' : class 'v8::Persistent<T>' # needs to have dll-interface to be used by clients of class 'node::ObjectWrap' 'msvs_disabled_warnings': [ 4251 ], }, { # OS!="win" 'defines': [ '_LARGEFILE_SOURCE', '_FILE_OFFSET_BITS=64' ], }], [ 'OS=="freebsd" or OS=="openbsd" or OS=="solaris" or (OS=="linux" and target_arch!="ia32")', { 'cflags': [ '-fPIC' ], }] ] } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/bin/������������������������������000755 �000766 �000024 �00000000000 12456115117 025644� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/������������������������������000755 �000766 �000024 �00000000000 12456115117 025673� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/������������������������������000755 �000766 �000024 �00000000000 12456115117 025642� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/LICENSE���������������������������000644 �000766 �000024 �00000002116 12455173731 026106� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������(The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the 
rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/���������������������000755 �000766 �000024 �00000000000 12456115117 027551� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/package.json����������������������000644 �000766 �000024 �00000003530 12455173731 027370� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "node-gyp", "description": "Node.js native addon build tool", "keywords": [ "native", "addon", "module", "c", "c++", "bindings", "gyp" ], "version": "1.0.2", "installVersion": 9, "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/node-gyp.git" }, "preferGlobal": true, "bin": { "node-gyp": "./bin/node-gyp.js" }, "main": "./lib/node-gyp.js", "dependencies": { "fstream": "^1.0.0", "glob": "3 || 4", "graceful-fs": "3", "minimatch": "1", "mkdirp": "^0.5.0", "nopt": "2 || 3", "npmlog": "0", "osenv": "0", "request": "2", "rimraf": "2", "semver": "2.x || 3.x || 4", "tar": "^1.0.0", "which": "1" }, "engines": { "node": ">= 0.8.0" }, "gitHead": "1e399b471945b35f3bfbca4a10fba31a6739b5db", "bugs": { "url": "https://github.com/TooTallNate/node-gyp/issues" }, "homepage": "https://github.com/TooTallNate/node-gyp", "_id": "node-gyp@1.0.2", "scripts": {}, "_shasum": "b0bb6d2d762271408dd904853e7aa3000ed2eb57", "_from": "node-gyp@>=1.0.2 <1.1.0", "_npmVersion": "2.0.0-beta.3", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" 
}, { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "b0bb6d2d762271408dd904853e7aa3000ed2eb57", "tarball": "http://registry.npmjs.org/node-gyp/-/node-gyp-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/node-gyp/-/node-gyp-1.0.2.tgz", "readme": "ERROR: No README data found!" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/README.md�������������������������000644 �000766 �000024 �00000015341 12455173731 026364� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node-gyp ========= ### Node.js native addon build tool `node-gyp` is a cross-platform command-line tool written in Node.js for compiling native addon modules for Node.js. It bundles the [gyp](https://code.google.com/p/gyp/) project used by the Chromium team and takes away the pain of dealing with the various differences in build platforms. It is the replacement to the `node-waf` program which is removed for node `v0.8`. If you have a native addon for node that still has a `wscript` file, then you should definitely add a `binding.gyp` file to support the latest versions of node. Multiple target versions of node are supported (i.e. `0.8`, `0.9`, `0.10`, ..., `1.0`, etc.), regardless of what version of node is actually installed on your system (`node-gyp` downloads the necessary development files for the target version). #### Features: * Easy to use, consistent interface * Same commands to build your module on every platform * Supports multiple target versions of Node Installation ------------ You can install with `npm`: ``` bash $ npm install -g node-gyp ``` You will also need to install: * On Unix: * `python` (`v2.7` recommended, `v3.x.x` is __*not*__ supported) * `make` * A proper C/C++ compiler toolchain, like GCC * On Windows: * [Python][windows-python] ([`v2.7.3`][windows-python-v2.7.3] recommended, `v3.x.x` is __*not*__ supported) * Windows XP/Vista/7: * Microsoft Visual Studio C++ 2010 ([Express][msvc2010] version works well) * For 64-bit builds of node and native modules you will _**also**_ need the [Windows 7 64-bit SDK][win7sdk] * If the install fails, try uninstalling any C++ 2010 x64&x86 Redistributable that you have installed first. * If you get errors that the 64-bit compilers are not installed you may also need the [compiler update for the Windows SDK 7.1] * Windows 7/8: * Microsoft Visual Studio C++ 2012 for Windows Desktop ([Express][msvc2012] version works well) If you have multiple Python versions installed, you can identify which Python version `node-gyp` uses by setting the '--python' variable: ``` bash $ node-gyp --python /path/to/python2.7 ``` If `node-gyp` is called by way of `npm` *and* you have multiple versions of Python installed, then you can set `npm`'s 'python' config key to the appropriate value: ``` bash $ npm config set python /path/to/executable/python2.7 ``` Note that OS X is just a flavour of Unix and so needs `python`, `make`, and C/C++. 
An easy way to obtain these is to install XCode from Apple, and then use it to install the command line tools (under Preferences -> Downloads). How to Use ---------- To compile your native addon, first go to its root directory: ``` bash $ cd my_node_addon ``` The next step is to generate the appropriate project build files for the current platform. Use `configure` for that: ``` bash $ node-gyp configure ``` __Note__: The `configure` step looks for the `binding.gyp` file in the current directory to processs. See below for instructions on creating the `binding.gyp` file. Now you will have either a `Makefile` (on Unix platforms) or a `vcxproj` file (on Windows) in the `build/` directory. Next invoke the `build` command: ``` bash $ node-gyp build ``` Now you have your compiled `.node` bindings file! The compiled bindings end up in `build/Debug/` or `build/Release/`, depending on the build mode. At this point you can require the `.node` file with Node and run your tests! __Note:__ To create a _Debug_ build of the bindings file, pass the `--debug` (or `-d`) switch when running the either `configure` or `build` command. The "binding.gyp" file ---------------------- Previously when node had `node-waf` you had to write a `wscript` file. The replacement for that is the `binding.gyp` file, which describes the configuration to build your module in a JSON-like format. This file gets placed in the root of your package, alongside the `package.json` file. A barebones `gyp` file appropriate for building a node addon looks like: ``` python { "targets": [ { "target_name": "binding", "sources": [ "src/binding.cc" ] } ] } ``` Some additional resources for writing `gyp` files: * ["Hello World" node addon example](https://github.com/joyent/node/tree/master/test/addons/hello-world) * [gyp user documentation](http://code.google.com/p/gyp/wiki/GypUserDocumentation) * [gyp input format reference](http://code.google.com/p/gyp/wiki/InputFormatReference) * [*"binding.gyp" files out in the wild* wiki page](https://github.com/TooTallNate/node-gyp/wiki/%22binding.gyp%22-files-out-in-the-wild) Commands -------- `node-gyp` responds to the following commands: | **Command** | **Description** |:--------------|:--------------------------------------------------------------- | `build` | Invokes `make`/`msbuild.exe` and builds the native addon | `clean` | Removes any the `build` dir if it exists | `configure` | Generates project build files for the current platform | `rebuild` | Runs "clean", "configure" and "build" all in a row | `install` | Installs node development header files for the given version | `list` | Lists the currently installed node development file versions | `remove` | Removes the node development header files for the given version License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. [windows-python]: http://www.python.org/getit/windows [windows-python-v2.7.3]: http://www.python.org/download/releases/2.7.3#download [msvc2010]: http://go.microsoft.com/?linkid=9709949 [msvc2012]: http://go.microsoft.com/?linkid=9816758 [win7sdk]: http://www.microsoft.com/en-us/download/details.aspx?id=8279 [compiler update for the Windows SDK 7.1]: http://www.microsoft.com/en-us/download/details.aspx?id=4422 �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/�����������000755 �000766 �000024 �00000000000 12456115117 031522� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/.npmignore�000644 �000766 �000024 �00000000015 12455173731 033522� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/.travis.yml000644 �000766 �000024 �00000000055 12455173731 033640� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.10 - 0.11 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/LICENSE����000644 �000766 �000024 �00000002104 12455173731 032531� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/minimatch.js����������������������000644 �000766 �000024 �00000070212 12455173731 033761� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������;(function (require, exports, module, platform) { if (module) module.exports = minimatch else exports.minimatch = minimatch if (!require) { require = function (id) { switch (id) { case "sigmund": return function sigmund (obj) { return JSON.stringify(obj) } case "path": return { basename: function (f) { f = f.split(/[\/\\]/) var e = f.pop() if (!e) e = f.pop() return e }} case "lru-cache": return function LRUCache () { // not quite an LRU, but still space-limited. 
var cache = {} var cnt = 0 this.set = function (k, v) { cnt ++ if (cnt >= 100) cache = {} cache[k] = v } this.get = function (k) { return cache[k] } } } } } minimatch.Minimatch = Minimatch var LRU = require("lru-cache") , cache = minimatch.cache = new LRU({max: 100}) , GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} , sigmund = require("sigmund") var path = require("path") // any single thing other than / // don't need to escape / when using new RegExp() , qmark = "[^/]" // * => any number of characters , star = qmark + "*?" // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. , twoStarDot = "(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?" // not a ^ or / followed by a dot, // followed by anything, any number of times. , twoStarNoDot = "(?:(?!(?:\\\/|^)\\.).)*?" // characters that need to be escaped in RegExp. , reSpecials = charSet("().*{}+?[]^$\\!") // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split("").reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} // shortcut: comments match nothing. if (!options.nocomment && pattern.charAt(0) === "#") { return false } // "" only matches "" if (pattern.trim() === "") return p === "" return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options, cache) } if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} pattern = pattern.trim() // windows: need to use /, not \ // On other platforms, \ is a valid (albeit bad) filename char. if (platform === "win32") { pattern = pattern.split("\\").join("/") } // lru storage. // these things aren't particularly big, but walking down the string // and turning it into a regexp can get pretty costly. var cacheKey = pattern + "\n" + sigmund(options) var cached = minimatch.cache.get(cacheKey) if (cached) return cached minimatch.cache.set(cacheKey, this) this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function() {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. 
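 // (illustrative, added note: a pattern such as "#foo" therefore never
 //  matches anything unless options.nocomment is set)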
if (!options.nocomment && pattern.charAt(0) === "#") { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return -1 === s.indexOf(false) }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern , negate = false , options = this.options , negateOffset = 0 if (options.nonegate) return for ( var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === "!" ; i ++) { negate = !negate negateOffset ++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return new Minimatch(pattern, options).braceExpand() } Minimatch.prototype.braceExpand = braceExpand function pad(n, width, z) { z = z || '0'; n = n + ''; return n.length >= width ? n : new Array(width - n.length + 1).join(z) + n; } function braceExpand (pattern, options) { options = options || this.options pattern = typeof pattern === "undefined" ? this.pattern : pattern if (typeof pattern === "undefined") { throw new Error("undefined pattern") } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } var escaping = false // examples and comments refer to this crazy pattern: // a{b,c{d,e},{f,g}h}x{y,z} // expected: // abxy // abxz // acdxy // acdxz // acexy // acexz // afhxy // afhxz // aghxy // aghxz // everything before the first \{ is just a prefix. // So, we pluck that off, and work with the rest, // and then prepend it to everything we find. if (pattern.charAt(0) !== "{") { this.debug(pattern) var prefix = null for (var i = 0, l = pattern.length; i < l; i ++) { var c = pattern.charAt(i) this.debug(i, c) if (c === "\\") { escaping = !escaping } else if (c === "{" && !escaping) { prefix = pattern.substr(0, i) break } } // actually no sets, all { were escaped. if (prefix === null) { this.debug("no sets") return [pattern] } var tail = braceExpand.call(this, pattern.substr(i), options) return tail.map(function (t) { return prefix + t }) } // now we have something like: // {b,c{d,e},{f,g}h}x{y,z} // walk through the set, expanding each part, until // the set ends. then, we'll expand the suffix. 
// If the set only has a single member, then'll put the {} back // first, handle numeric sets, since they're easier var numset = pattern.match(/^\{(-?[0-9]+)\.\.(-?[0-9]+)\}/) if (numset) { this.debug("numset", numset[1], numset[2]) var suf = braceExpand.call(this, pattern.substr(numset[0].length), options) , start = +numset[1] , needPadding = numset[1][0] === '0' , startWidth = numset[1].length , padded , end = +numset[2] , inc = start > end ? -1 : 1 , set = [] for (var i = start; i != (end + inc); i += inc) { padded = needPadding ? pad(i, startWidth) : i + '' // append all the suffixes for (var ii = 0, ll = suf.length; ii < ll; ii ++) { set.push(padded + suf[ii]) } } return set } // ok, walk through the set // We hope, somewhat optimistically, that there // will be a } at the end. // If the closing brace isn't found, then the pattern is // interpreted as braceExpand("\\" + pattern) so that // the leading \{ will be interpreted literally. var i = 1 // skip the \{ , depth = 1 , set = [] , member = "" , sawEnd = false , escaping = false function addMember () { set.push(member) member = "" } this.debug("Entering for") FOR: for (i = 1, l = pattern.length; i < l; i ++) { var c = pattern.charAt(i) this.debug("", i, c) if (escaping) { escaping = false member += "\\" + c } else { switch (c) { case "\\": escaping = true continue case "{": depth ++ member += "{" continue case "}": depth -- // if this closes the actual set, then we're done if (depth === 0) { addMember() // pluck off the close-brace i ++ break FOR } else { member += c continue } case ",": if (depth === 1) { addMember() } else { member += c } continue default: member += c continue } // switch } // else } // for // now we've either finished the set, and the suffix is // pattern.substr(i), or we have *not* closed the set, // and need to escape the leading brace if (depth !== 0) { this.debug("didn't close", pattern) return braceExpand.call(this, "\\" + pattern, options) } // x{y,z} -> ["xy", "xz"] this.debug("set", set) this.debug("suffix", pattern.substr(i)) var suf = braceExpand.call(this, pattern.substr(i), options) // ["b", "c{d,e}","{f,g}h"] -> // [["b"], ["cd", "ce"], ["fh", "gh"]] var addBraces = set.length === 1 this.debug("set pre-expanded", set) set = set.map(function (p) { return braceExpand.call(this, p, options) }, this) this.debug("set expanded", set) // [["b"], ["cd", "ce"], ["fh", "gh"]] -> // ["b", "cd", "ce", "fh", "gh"] set = set.reduce(function (l, r) { return l.concat(r) }) if (addBraces) { set = set.map(function (s) { return "{" + s + "}" }) } // now attach the suffixes. var ret = [] for (var i = 0, l = set.length; i < l; i ++) { for (var ii = 0, ll = suf.length; ii < ll; ii ++) { ret.push(set[i] + suf[ii]) } } return ret } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. 
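// (illustrative examples, added commentary — not in the original source:
//  "a/**/b" matches "a/x/y/b" because "**" stands alone in its path portion,
//  while "a/**b" behaves like "a/*b": it matches "a/xb" but not "a/x/y/b")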
Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === "**") return GLOBSTAR if (pattern === "") return "" var re = "" , hasMagic = !!options.nocase , escaping = false // ? => one single character , patternListStack = [] , plType , stateChar , inClass = false , reClassStart = -1 , classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. , patternStart = pattern.charAt(0) === "." ? "" // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? "(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))" : "(?!\\.)" , self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case "*": re += star hasMagic = true break case "?": re += qmark hasMagic = true break default: re += "\\"+stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for ( var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i ++ ) { this.debug("%s\t%s %s %j", pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += "\\" + c escaping = false continue } SWITCH: switch (c) { case "/": // completely not allowed, even escaped. // Should already be path-split by now. return false case "\\": clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case "?": case "*": case "+": case "@": case "!": this.debug("%s\t%s %s %j <-- stateChar", pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === "!" && i === classStart + 1) c = "^" re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case "(": if (inClass) { re += "(" continue } if (!stateChar) { re += "\\(" continue } plType = stateChar patternListStack.push({ type: plType , start: i - 1 , reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === "!" ? "(?:(?!" : "(?:" this.debug('plType %j %j', stateChar, re) stateChar = false continue case ")": if (inClass || !patternListStack.length) { re += "\\)" continue } clearStateChar() hasMagic = true re += ")" plType = patternListStack.pop().type // negation is (?:(?!js)[^/]*) // The others are (?:<pattern>)<type> switch (plType) { case "!": re += "[^/]*?)" break case "?": case "+": case "*": re += plType case "@": break // the default anyway } continue case "|": if (inClass || !patternListStack.length || escaping) { re += "\\|" escaping = false continue } clearStateChar() re += "|" continue // these are mostly the same in regexp and glob case "[": // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += "\\" + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case "]": // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. 
-- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += "\\" + c escaping = false continue } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === "^" && inClass)) { re += "\\" } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is var cs = pattern.substr(classStart + 1) , sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + "\\[" + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. var pl while (pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = "\\" } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + "|" }) this.debug("tail=%j\n %s", tail, tail) var t = pl.type === "*" ? star : pl.type === "?" ? qmark : "\\" + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + "\\(" + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += "\\\\" } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case ".": case "[": case "(": addPatternStart = true } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== "" && hasMagic) re = "(?=.)" + re if (addPatternStart) re = patternStart + re // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [ re, hasMagic ] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? "i" : "" , regExp = new RegExp("^" + re + "$", flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. 
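// (illustrative, added note: e.g. new Minimatch("*.js").makeRe().test("x.js")
//  is true, while .test(".x.js") is false because dotfiles are excluded
//  unless options.dot is set)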
var set = this.set if (!set.length) return this.regexp = false var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot , flags = options.nocase ? "i" : "" var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === "string") ? regExpEscape(p) : p._src }).join("\\\/") }).join("|") // must match entire pattern // ending in a * or ** will make it less strict. re = "^(?:" + re + ")$" // can match anything, as long as it's not this. if (this.negate) re = "^(?!" + re + ").*$" try { return this.regexp = new RegExp(re, flags) } catch (ex) { return this.regexp = false } } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug("match", f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === "" if (f === "/" && partial) return true var options = this.options // windows: need to use /, not \ // On other platforms, \ is a valid (albeit bad) filename char. if (platform === "win32") { f = f.split("\\").join("/") } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, "split", f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. var set = this.set this.debug(this.pattern, "set", set) // Find the basename of the path by looking for the last non-empty segment var filename; for (var i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (var i = 0, l = set.length; i < l; i ++) { var pattern = set[i], file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug("matchOne", { "this": this , file: file , pattern: pattern }) this.debug("matchOne", file.length, pattern.length) for ( var fi = 0 , pi = 0 , fl = file.length , pl = pattern.length ; (fi < fl) && (pi < pl) ; fi ++, pi ++ ) { this.debug("matchOne loop") var p = pattern[pi] , f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. 
// // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi , pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for ( ; fi < fl; fi ++) { if (file[fi] === "." || file[fi] === ".." || (!options.dot && file[fi].charAt(0) === ".")) return false } return true } // ok, let's see if we can swallow whatever we can. WHILE: while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === "." || swallowee === ".." || (!options.dot && swallowee.charAt(0) === ".")) { this.debug("dot detected!", file, fr, pattern, pr) break WHILE } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr ++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug("\n>>> no match, partial?", file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === "string") { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug("string match", p, f, hit) } else { hit = f.match(p) this.debug("pattern match", p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === "") return emptyFileEnd } // should be unreachable. throw new Error("wtf?") } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, "$1") } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&") } })( typeof require === "function" ? require : null, this, typeof module === "object" ? module : null, typeof process === "object" ? 
process.platform : "win32" ) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/���������������������000755 �000766 �000024 �00000000000 12456115117 034120� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/package.json����������������������000644 �000766 �000024 �00000002600 12455173731 033734� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "minimatch", "description": "a glob matcher in javascript", "version": "1.0.0", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "main": "minimatch.js", "scripts": { "test": "tap test/*.js" }, "engines": { "node": "*" }, "dependencies": { "lru-cache": "2", "sigmund": "~1.0.0" }, "devDependencies": { "tap": "" }, "license": { "type": "MIT", "url": "http://github.com/isaacs/minimatch/raw/master/LICENSE" }, "gitHead": "b374a643976eb55cdc19c60b6dd51ebe9bcc607a", "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "homepage": "https://github.com/isaacs/minimatch", "_id": "minimatch@1.0.0", "_shasum": "e0dd2120b49e1b724ce8d714c520822a9438576d", "_from": "minimatch@>=1.0.0 <2.0.0", "_npmVersion": "1.4.21", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "e0dd2120b49e1b724ce8d714c520822a9438576d", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-1.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-1.0.0.tgz", "readme": "ERROR: No README data found!" } ��������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/README.md��000644 �000766 �000024 �00000015005 12455173731 033007� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# minimatch A minimal matching utility. [![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch) This is the matching library used internally by npm. 
Eventually, it will replace the C binding in node-glob. It works by converting glob expressions into JavaScript `RegExp` objects. ## Usage ```javascript var minimatch = require("minimatch") minimatch("bar.foo", "*.foo") // true! minimatch("bar.foo", "*.bar") // false! minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! ``` ## Features Supports these glob features: * Brace Expansion * Extended glob matching * "Globstar" `**` matching See: * `man sh` * `man bash` * `man 3 fnmatch` * `man 5 gitignore` ## Minimatch Class Create a minimatch object by instantiating the `minimatch.Minimatch` class. ```javascript var Minimatch = require("minimatch").Minimatch var mm = new Minimatch(pattern, options) ``` ### Properties * `pattern` The original pattern the minimatch object represents. * `options` The options supplied to the constructor. * `set` A 2-dimensional array of regexp or string expressions. Each row in the array corresponds to a brace-expanded pattern. Each item in the row corresponds to a single path-part. For example, the pattern `{a,b/c}/d` would expand to a set of patterns like: [ [ a, d ] , [ b, c, d ] ] If a portion of the pattern doesn't have any "magic" in it (that is, it's something like `"foo"` rather than `fo*o?`), then it will be left as a string rather than converted to a regular expression. * `regexp` Created by the `makeRe` method. A single regular expression expressing the entire pattern. This is useful in cases where you wish to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled. * `negate` True if the pattern is negated. * `comment` True if the pattern is a comment. * `empty` True if the pattern is `""`. ### Methods * `makeRe` Generate the `regexp` member if necessary, and return it. Will return `false` if the pattern is invalid. * `match(fname)` Return true if the filename matches the pattern, or false otherwise. * `matchOne(fileArray, patternArray, partial)` Take a `/`-split filename, and match it against a single row in the `regExpSet`. This method is mainly for internal use, but is exposed so that it can be used by a glob-walker that needs to avoid excessive filesystem calls. All other methods are internal, and will be called as necessary. ## Functions The top-level exported function has a `cache` property, which is an LRU cache set to store 100 items. So, calling these methods repeatedly with the same pattern and options will use the same Minimatch object, saving the cost of parsing it multiple times. ### minimatch(path, pattern, options) Main export. Tests a path against the pattern using the options. ```javascript var isJS = minimatch(file, "*.js", { matchBase: true }) ``` ### minimatch.filter(pattern, options) Returns a function that tests its supplied argument, suitable for use with `Array.filter`. Example: ```javascript var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) ``` ### minimatch.match(list, pattern, options) Match against the list of files, in the style of fnmatch or glob. If nothing is matched, and options.nonull is set, then return a list containing the pattern itself. ```javascript var javascripts = minimatch.match(fileList, "*.js", {matchBase: true}) ``` ### minimatch.makeRe(pattern, options) Make a regular expression object from the pattern. ## Options All options are `false` by default. ### debug Dump a ton of stuff to stderr. ### nobrace Do not expand `{a,b}` and `{1..3}` brace sets. ### noglobstar Disable `**` matching against multiple folder names.
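To make the last two flags concrete, here is a small illustrative sketch; the paths and patterns are invented for the example, and the expected results follow the descriptions above:

```javascript
var minimatch = require("minimatch")

// brace sets expand by default, so "b.js" matches the {a,b} alternative
minimatch("b.js", "{a,b}.js")                        // true
// with nobrace, "{a,b}.js" is taken literally
minimatch("b.js", "{a,b}.js", { nobrace: true })     // false
minimatch("{a,b}.js", "{a,b}.js", { nobrace: true }) // true

// "**" spans any number of directories by default...
minimatch("a/x/y/b", "a/**/b")                       // true
// ...but with noglobstar it behaves like a plain single-segment "*"
minimatch("a/x/y/b", "a/**/b", { noglobstar: true }) // false
minimatch("a/x/b", "a/**/b", { noglobstar: true })   // true
```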
### dot Allow patterns to match filenames starting with a period, even if the pattern does not explicitly have a period in that spot. Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` is set. ### noext Disable "extglob" style patterns like `+(a|b)`. ### nocase Perform a case-insensitive match. ### nonull When a match is not found by `minimatch.match`, return a list containing the pattern itself if this option is set. When not set, an empty list is returned if there are no matches. ### matchBase If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. ### nocomment Suppress the behavior of treating `#` at the start of a pattern as a comment. ### nonegate Suppress the behavior of treating a leading `!` character as negation. ### flipNegate Returns from negate expressions the same as if they were not negated. (Ie, true on a hit, false on a miss.) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between minimatch and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. 
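As a quick illustration of the rules in this section (the file names are invented for the example; the expected results follow the documented behavior):

```javascript
var minimatch = require("minimatch")

// "**" is only special when it is a whole path segment
minimatch("a/x/y/b", "a/**/b") // true
minimatch("a/x/y/b", "a/**b")  // false

// dotfiles are not crossed by "**" unless the dot option is set
minimatch("a/.d/b", "a/**/b")                // false
minimatch("a/.d/b", "a/**/b", { dot: true }) // true

// a leading "!" negates the whole pattern
minimatch("bar.foo", "!*.foo") // false

// a leading "#" makes the pattern a comment that matches nothing;
// escape it to match a literal "#"
minimatch("#fancy", "#fancy")   // false
minimatch("#fancy", "\\#fancy") // true
```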
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/bench.js
// different ways to id objects // use a req/res pair, since it's crazy deep and cyclical // sparseFE10 and sigmund are usually pretty close, which is to be expected, // since they are essentially the same algorithm, except that sigmund handles // regular expression objects properly.
var http = require('http') var util = require('util') var sigmund = require('./sigmund.js') var sreq, sres, creq, cres, test http.createServer(function (q, s) { sreq = q sres = s sres.end('ok') this.close(function () { setTimeout(function () { start() }, 200) }) }).listen(1337, function () { creq = http.get({ port: 1337 }) creq.on('response', function (s) { cres = s }) }) function start () { test = [sreq, sres, creq, cres] // test = sreq // sreq.sres = sres // sreq.creq = creq // sreq.cres = cres for (var i in exports.compare) { console.log(i) var hash = exports.compare[i]() console.log(hash) console.log(hash.length) console.log('') } require('bench').runMain() } function customWs (obj, md, d) { d = d || 0 var to = typeof obj if (to === 'undefined' || to === 'function' || to === null) return '' if (d > md || !obj || to !== 'object') return ('' + obj).replace(/[\n ]+/g, '') if (Array.isArray(obj)) { return obj.map(function (i, _, __) { return customWs(i, md, d + 1) }).reduce(function (a, b) { return a + b }, '') } var keys = Object.keys(obj) return keys.map(function (k, _, __) { return k + ':' + customWs(obj[k], md, d + 1) }).reduce(function (a, b) { return a + b }, '') } function custom (obj, md, d) { d = d || 0 var to = typeof obj if (to === 'undefined' || to === 'function' || to === null) return '' if (d > md || !obj || to !== 'object') return '' + obj if (Array.isArray(obj)) { return obj.map(function (i, _, __) { return custom(i, md, d + 1) }).reduce(function (a, b) { return a + b }, '') } var keys = Object.keys(obj) return keys.map(function (k, _, __) { return k + ':' + custom(obj[k], md, d + 1) }).reduce(function (a, b) { return a + b }, '') } function sparseFE2 (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' Object.keys(v).forEach(function (k, _, __) { // pseudo-private values. skip those. if (k.charAt(0) === '_') return var to = typeof v[k] if (to === 'function' || to === 'undefined') return soFar += k + ':' ch(v[k], depth + 1) }) soFar += '}' } ch(obj, 0) return soFar } function sparseFE (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' Object.keys(v).forEach(function (k, _, __) { // pseudo-private values. skip those. if (k.charAt(0) === '_') return var to = typeof v[k] if (to === 'function' || to === 'undefined') return soFar += k ch(v[k], depth + 1) }) } ch(obj, 0) return soFar } function sparse (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. 
if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k ch(v[k], depth + 1) } } ch(obj, 0) return soFar } function noCommas (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k + ':' ch(v[k], depth + 1) } soFar += '}' } ch(obj, 0) return soFar } function flatten (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k + ':' ch(v[k], depth + 1) soFar += ',' } soFar += '}' } ch(obj, 0) return soFar } exports.compare = { // 'custom 2': function () { // return custom(test, 2, 0) // }, // 'customWs 2': function () { // return customWs(test, 2, 0) // }, 'JSON.stringify (guarded)': function () { var seen = [] return JSON.stringify(test, function (k, v) { if (typeof v !== 'object' || !v) return v if (seen.indexOf(v) !== -1) return undefined seen.push(v) return v }) }, 'flatten 10': function () { return flatten(test, 10) }, // 'flattenFE 10': function () { // return flattenFE(test, 10) // }, 'noCommas 10': function () { return noCommas(test, 10) }, 'sparse 10': function () { return sparse(test, 10) }, 'sparseFE 10': function () { return sparseFE(test, 10) }, 'sparseFE2 10': function () { return sparseFE2(test, 10) }, sigmund: function() { return sigmund(test, 10) }, // 'util.inspect 1': function () { // return util.inspect(test, false, 1, false) // }, // 'util.inspect undefined': function () { // util.inspect(test) // }, // 'util.inspect 2': function () { // util.inspect(test, false, 2, false) // }, // 'util.inspect 3': function () { // util.inspect(test, false, 3, false) // }, // 'util.inspect 4': function () { // util.inspect(test, false, 4, false) // }, // 'util.inspect Infinity': function () { // util.inspect(test, false, Infinity, false) // } } /** results **/ ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/LICENSE������000644 �000766 �000024 �00000002436 12455173731 036605� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. 
The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/package.json�000644 �000766 �000024 �00000006176 12455173731 040073� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "sigmund", "version": "1.0.0", "description": "Quick and dirty signatures for Objects.", "main": "sigmund.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.3.0" }, "scripts": { "test": "tap test/*.js", "bench": "node bench.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/sigmund" }, "keywords": [ "object", "signature", "key", "data", "psychoanalysis" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "BSD", "readme": "# sigmund\n\nQuick and dirty signatures for Objects.\n\nThis is like a much faster `deepEquals` comparison, which returns a\nstring key suitable for caches and the like.\n\n## Usage\n\n```javascript\nfunction doSomething (someObj) {\n var key = sigmund(someObj, maxDepth) // max depth defaults to 10\n var cached = cache.get(key)\n if (cached) return cached)\n\n var result = expensiveCalculation(someObj)\n cache.set(key, result)\n return result\n}\n```\n\nThe resulting key will be as unique and reproducible as calling\n`JSON.stringify` or `util.inspect` on the object, but is much faster.\nIn order to achieve this speed, some differences are glossed over.\nFor example, the object `{0:'foo'}` will be treated identically to the\narray `['foo']`.\n\nAlso, just as there is no way to summon the soul from the scribblings\nof a cocain-addled psychoanalyst, there is no way to revive the object\nfrom the signature string that sigmund gives you. 
In fact, it's\nbarely even readable.\n\nAs with `sys.inspect` and `JSON.stringify`, larger objects will\nproduce larger signature strings.\n\nBecause sigmund is a bit less strict than the more thorough\nalternatives, the strings will be shorter, and also there is a\nslightly higher chance for collisions. For example, these objects\nhave the same signature:\n\n var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]}\n var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']}\n\nLike a good Freudian, sigmund is most effective when you already have\nsome understanding of what you're looking for. It can help you help\nyourself, but you must be willing to do some work as well.\n\nCycles are handled, and cyclical objects are silently omitted (though\nthe key is included in the signature output.)\n\nThe second argument is the maximum depth, which defaults to 10,\nbecause that is the maximum object traversal depth covered by most\ninsurance carriers.\n", "_id": "sigmund@1.0.0", "dist": { "shasum": "66a2b3a749ae8b5fb89efd4fcc01dc94fbe02296", "tarball": "http://registry.npmjs.org/sigmund/-/sigmund-1.0.0.tgz" }, "_npmVersion": "1.1.48", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "_shasum": "66a2b3a749ae8b5fb89efd4fcc01dc94fbe02296", "_resolved": "https://registry.npmjs.org/sigmund/-/sigmund-1.0.0.tgz", "_from": "sigmund@>=1.0.0 <1.1.0", "bugs": { "url": "https://github.com/isaacs/sigmund/issues" }, "homepage": "https://github.com/isaacs/sigmund" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/README.md����000644 �000766 �000024 �00000003475 12455173731 037063� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# sigmund Quick and dirty signatures for Objects. This is like a much faster `deepEquals` comparison, which returns a string key suitable for caches and the like. ## Usage ```javascript function doSomething (someObj) { var key = sigmund(someObj, maxDepth) // max depth defaults to 10 var cached = cache.get(key) if (cached) return cached) var result = expensiveCalculation(someObj) cache.set(key, result) return result } ``` The resulting key will be as unique and reproducible as calling `JSON.stringify` or `util.inspect` on the object, but is much faster. In order to achieve this speed, some differences are glossed over. For example, the object `{0:'foo'}` will be treated identically to the array `['foo']`. Also, just as there is no way to summon the soul from the scribblings of a cocain-addled psychoanalyst, there is no way to revive the object from the signature string that sigmund gives you. In fact, it's barely even readable. As with `sys.inspect` and `JSON.stringify`, larger objects will produce larger signature strings. 
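A self-contained variant of the usage pattern above; the `cache` object and `expensiveCalculation` function are placeholders invented for the example:

```javascript
var sigmund = require('sigmund')

var cache = {} // placeholder in-memory cache, keyed by signature string

function expensiveCalculation (obj) {
  // stand-in for real work; anything deterministic will do for the demo
  return Object.keys(obj).length
}

function doSomething (someObj) {
  var key = sigmund(someObj, 10) // second argument is max depth, default 10
  if (cache.hasOwnProperty(key)) return cache[key]

  var result = expensiveCalculation(someObj)
  cache[key] = result
  return result
}

doSomething({ a: 'b', c: /def/ }) // computed and cached
doSomething({ a: 'b', c: /def/ }) // same signature, served from the cache

// as noted above, some detail is glossed over for speed:
// {0: 'foo'} and ['foo'] produce the same key and would share a cache entry
sigmund({ 0: 'foo' }) === sigmund(['foo']) // true
```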
Because sigmund is a bit less strict than the more thorough alternatives, the strings will be shorter, and also there is a slightly higher chance for collisions. For example, these objects have the same signature: var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]} var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']} Like a good Freudian, sigmund is most effective when you already have some understanding of what you're looking for. It can help you help yourself, but you must be willing to do some work as well. Cycles are handled, and cyclical objects are silently omitted (though the key is included in the signature output.) The second argument is the maximum depth, which defaults to 10, because that is the maximum object traversal depth covered by most insurance carriers. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/sigmund.js���000644 �000766 �000024 �00000002172 12455173731 037601� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = sigmund function sigmund (subject, maxSessions) { maxSessions = maxSessions || 10; var notes = []; var analysis = ''; var RE = RegExp; function psychoAnalyze (subject, session) { if (session > maxSessions) return; if (typeof subject === 'function' || typeof subject === 'undefined') { return; } if (typeof subject !== 'object' || !subject || (subject instanceof RE)) { analysis += subject; return; } if (notes.indexOf(subject) !== -1 || session === maxSessions) return; notes.push(subject); analysis += '{'; Object.keys(subject).forEach(function (issue, _, __) { // pseudo-private values. skip those. if (issue.charAt(0) === '_') return; var to = typeof subject[issue]; if (to === 'function' || to === 'undefined') return; analysis += issue; psychoAnalyze(subject[issue], session + 1); }); } psychoAnalyze(subject, 0); return analysis; } // vim: set softtabstop=4 shiftwidth=4: ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/build.js����������������������000644 �000766 �000024 �00000017043 12456106751 027310� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = build /** * Module dependencies. 
*/ var fs = require('graceful-fs') , rm = require('rimraf') , path = require('path') , glob = require('glob') , log = require('npmlog') , which = require('which') , mkdirp = require('mkdirp') , exec = require('child_process').exec , win = process.platform == 'win32' exports.usage = 'Invokes `' + (win ? 'msbuild' : 'make') + '` and builds the module' function build (gyp, argv, callback) { var makeCommand = gyp.opts.make || process.env.MAKE || (process.platform.indexOf('bsd') != -1 && process.platform.indexOf('kfreebsd') == -1 ? 'gmake' : 'make') , command = win ? 'msbuild' : makeCommand , buildDir = path.resolve('build') , configPath = path.resolve(buildDir, 'config.gypi') , jobs = gyp.opts.jobs || process.env.JOBS , buildType , config , arch , nodeDir , copyDevLib loadConfigGypi() /** * Load the "config.gypi" file that was generated during "configure". */ function loadConfigGypi () { fs.readFile(configPath, 'utf8', function (err, data) { if (err) { if (err.code == 'ENOENT') { callback(new Error('You must run `node-gyp configure` first!')) } else { callback(err) } return } config = JSON.parse(data.replace(/\#.+\n/, '')) // get the 'arch', 'buildType', and 'nodeDir' vars from the config buildType = config.target_defaults.default_configuration arch = config.variables.target_arch nodeDir = config.variables.nodedir copyDevLib = config.variables.copy_dev_lib == 'true' if ('debug' in gyp.opts) { buildType = gyp.opts.debug ? 'Debug' : 'Release' } if (!buildType) { buildType = 'Release' } log.verbose('build type', buildType) log.verbose('architecture', arch) log.verbose('node dev dir', nodeDir) if (win) { findSolutionFile() } else { doWhich() } }) } /** * On Windows, find the first build/*.sln file. */ function findSolutionFile () { glob('build/*.sln', function (err, files) { if (err) return callback(err) if (files.length === 0) { return callback(new Error('Could not find *.sln file. Did you run "configure"?')) } guessedSolution = files[0] log.verbose('found first Solution file', guessedSolution) doWhich() }) } /** * Uses node-which to locate the msbuild / make executable. */ function doWhich () { // First make sure we have the build command in the PATH which(command, function (err, execPath) { if (err) { if (win && /not found/.test(err.message)) { // On windows and no 'msbuild' found. Let's guess where it is findMsbuild() } else { // Some other error or 'make' not found on Unix, report that to the user callback(err) } return } log.verbose('`which` succeeded for `' + command + '`', execPath) copyNodeLib() }) } /** * Search for the location of "msbuild.exe" file on Windows. */ function findMsbuild () { log.verbose('could not find "msbuild.exe" in PATH - finding location in registry') var notfoundErr = new Error('Can\'t find "msbuild.exe". Do you have Microsoft Visual Studio C++ 2008+ installed?') var cmd = 'reg query "HKLM\\Software\\Microsoft\\MSBuild\\ToolsVersions" /s' if (process.arch !== 'ia32') cmd += ' /reg:32' exec(cmd, function (err, stdout, stderr) { var reVers = /ToolsVersions\\([^\\]+)$/i , rePath = /\r\n[ \t]+MSBuildToolsPath[ \t]+REG_SZ[ \t]+([^\r]+)/i , msbuilds = [] , r , msbuildPath if (err) { return callback(notfoundErr) } stdout.split('\r\n\r\n').forEach(function(l) { if (!l) return l = l.trim() if (r = reVers.exec(l.substring(0, l.indexOf('\r\n')))) { var ver = parseFloat(r[1], 10) if (ver >= 3.5) { if (r = rePath.exec(l)) { msbuilds.push({ version: ver, path: r[1] }) } } } }) msbuilds.sort(function (x, y) { return (x.version < y.version ? 
-1 : 1) }) ;(function verifyMsbuild () { if (!msbuilds.length) return callback(notfoundErr); msbuildPath = path.resolve(msbuilds.pop().path, 'msbuild.exe') fs.stat(msbuildPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { if (msbuilds.length) { return verifyMsbuild() } else { callback(notfoundErr) } } else { callback(err) } return } command = msbuildPath copyNodeLib() }) })() }) } /** * Copies the iojs.lib file for the current target architecture into the * current proper dev dir location. */ function copyNodeLib () { if (!win || !copyDevLib) return doBuild() var buildDir = path.resolve(nodeDir, buildType) , archNodeLibPath = path.resolve(nodeDir, arch, 'iojs.lib') , buildNodeLibPath = path.resolve(buildDir, 'iojs.lib') mkdirp(buildDir, function (err, isNew) { if (err) return callback(err) log.verbose('"' + buildType + '" dir needed to be created?', isNew) var rs = fs.createReadStream(archNodeLibPath) , ws = fs.createWriteStream(buildNodeLibPath) log.verbose('copying "iojs.lib" for ' + arch, buildNodeLibPath) rs.pipe(ws) rs.on('error', callback) ws.on('error', callback) rs.on('end', doBuild) }) } /** * Actually spawn the process and compile the module. */ function doBuild () { // Enable Verbose build var verbose = log.levels[log.level] <= log.levels.verbose if (!win && verbose) { argv.push('V=1') } if (win && !verbose) { argv.push('/clp:Verbosity=minimal') } if (win) { // Turn off the Microsoft logo on Windows argv.push('/nologo') } // Specify the build type, Release by default if (win) { var p = arch === 'x64' ? 'x64' : 'Win32' argv.push('/p:Configuration=' + buildType + ';Platform=' + p) if (jobs) { if (!isNaN(parseInt(jobs, 10))) { argv.push('/m:' + parseInt(jobs, 10)) } else if (jobs.toUpperCase() === 'MAX') { argv.push('/m:' + require('os').cpus().length) } } } else { argv.push('BUILDTYPE=' + buildType) // Invoke the Makefile in the 'build' dir. argv.push('-C') argv.push('build') if (jobs) { if (!isNaN(parseInt(jobs, 10))) { argv.push('--jobs') argv.push(parseInt(jobs, 10)) } else if (jobs.toUpperCase() === 'MAX') { argv.push('--jobs') argv.push(require('os').cpus().length) } } } if (win) { // did the user specify their own .sln file? var hasSln = argv.some(function (arg) { return path.extname(arg) == '.sln' }) if (!hasSln) { argv.unshift(gyp.opts.solution || guessedSolution) } } var proc = gyp.spawn(command, argv) proc.on('exit', onExit) } /** * Invoked after the make/msbuild command exits. 
*/ function onExit (code, signal) { if (code !== 0) { return callback(new Error('`' + command + '` failed with exit code: ' + code)) } if (signal) { return callback(new Error('`' + command + '` got signal: ' + signal)) } callback() } } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/clean.js����������������������000644 �000766 �000024 �00000000572 12455173731 027273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = clean exports.usage = 'Removes any generated build files and the "out" dir' /** * Module dependencies. */ var rm = require('rimraf') var log = require('npmlog') function clean (gyp, argv, callback) { // Remove the 'build' dir var buildDir = 'build' log.verbose('clean', 'removing "%s" directory', buildDir) rm(buildDir, callback) } ��������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/configure.js������������������000644 �000766 �000024 �00000025404 12455173731 030173� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = exports = configure /** * Module dependencies. */ var fs = require('graceful-fs') , path = require('path') , glob = require('glob') , log = require('npmlog') , osenv = require('osenv') , which = require('which') , semver = require('semver') , mkdirp = require('mkdirp') , cp = require('child_process') , extend = require('util')._extend , spawn = cp.spawn , execFile = cp.execFile , win = process.platform == 'win32' exports.usage = 'Generates ' + (win ? 'MSVC project files' : 'a Makefile') + ' for the current module' function configure (gyp, argv, callback) { var python = gyp.opts.python || process.env.PYTHON || 'python' , buildDir = path.resolve('build') , configNames = [ 'config.gypi', 'common.gypi' ] , configs = [] , nodeDir checkPython() // Check if Python is in the $PATH function checkPython () { log.verbose('check python', 'checking for Python executable "%s" in the PATH', python) which(python, function (err, execPath) { if (err) { log.verbose('`which` failed', python, err) if (win) { guessPython() } else { failNoPython() } } else { log.verbose('`which` succeeded', python, execPath) checkPythonVersion() } }) } // Called on Windows when "python" isn't available in the current $PATH. // We're gonna check if "%SystemDrive%\python27\python.exe" exists. 
function guessPython () { log.verbose('could not find "' + python + '". guessing location') var rootDir = process.env.SystemDrive || 'C:\\' if (rootDir[rootDir.length - 1] !== '\\') { rootDir += '\\' } var pythonPath = path.resolve(rootDir, 'Python27', 'python.exe') log.verbose('ensuring that file exists:', pythonPath) fs.stat(pythonPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { failNoPython() } else { callback(err) } return } python = pythonPath checkPythonVersion() }) } function checkPythonVersion () { var env = extend({}, process.env); env.TERM = 'dumb'; execFile(python, ['-c', 'import platform; print(platform.python_version());'], { env: env }, function (err, stdout) { if (err) { return callback(err) } log.verbose('check python version', '`%s -c "import platform; print(platform.python_version());"` returned: %j', python, stdout) var version = stdout.trim() if (~version.indexOf('+')) { log.silly('stripping "+" sign(s) from version') version = version.replace(/\+/g, '') } if (~version.indexOf('rc')) { log.silly('stripping "rc" identifier from version') version = version.replace(/rc(.*)$/ig, '') } var range = semver.Range('>=2.5.0 <3.0.0') if (range.test(version)) { getNodeDir() } else { failPythonVersion(version) } }) } function failNoPython () { callback(new Error('Can\'t find Python executable "' + python + '", you can set the PYTHON env variable.')) } function failPythonVersion (badVersion) { callback(new Error('Python executable "' + python + '" is v' + badVersion + ', which is not supported by gyp.\n' + 'You can pass the --python switch to point to Python >= v2.5.0 & < 3.0.0.')) } function getNodeDir () { // 'python' should be set by now process.env.PYTHON = python if (gyp.opts.nodedir) { // --nodedir was specified. use that for the dev files nodeDir = gyp.opts.nodedir.replace(/^~/, osenv.home()) log.verbose('get node dir', 'compiling against specified --nodedir dev files: %s', nodeDir) createBuildDir() } else { // if no --nodedir specified, ensure node dependencies are installed var version var versionStr if (gyp.opts.target) { // if --target was given, then determine a target version to compile for versionStr = gyp.opts.target log.verbose('get node dir', 'compiling against --target node version: %s', versionStr) } else { // if no --target was specified then use the current host node version versionStr = process.version log.verbose('get node dir', 'no --target version specified, falling back to host node version: %s', versionStr) } // make sure we have a valid version try { version = semver.parse(versionStr) } catch (e) { return callback(e) } if (!version) { return callback(new Error('Invalid version number: ' + versionStr)) } // ensure that the target node version's dev files are installed gyp.opts.ensure = true gyp.commands.install([ versionStr ], function (err, version) { if (err) return callback(err) log.verbose('get node dir', 'target node version installed:', version) nodeDir = path.resolve(gyp.devDir, version) createBuildDir() }) } } function createBuildDir () { log.verbose('build dir', 'attempting to create "build" dir: %s', buildDir) mkdirp(buildDir, function (err, isNew) { if (err) return callback(err) log.verbose('build dir', '"build" dir needed to be created?', isNew) createConfigFile() }) } function createConfigFile (err) { if (err) return callback(err) var configFilename = 'config.gypi' var configPath = path.resolve(buildDir, configFilename) log.verbose('build/' + configFilename, 'creating config file') var config = process.config || {} , defaults 
= config.target_defaults , variables = config.variables // default "config.variables" if (!variables) variables = config.variables = {} // default "config.defaults" if (!defaults) defaults = config.target_defaults = {} // don't inherit the "defaults" from node's `process.config` object. // doing so could cause problems in cases where the `node` executable was // compiled on a different machine (with different lib/include paths) than // the machine where the addon is being built to defaults.cflags = [] defaults.defines = [] defaults.include_dirs = [] defaults.libraries = [] // set the default_configuration prop if ('debug' in gyp.opts) { defaults.default_configuration = gyp.opts.debug ? 'Debug' : 'Release' } if (!defaults.default_configuration) { defaults.default_configuration = 'Release' } // set the target_arch variable variables.target_arch = gyp.opts.arch || process.arch || 'ia32' // set the node development directory variables.nodedir = nodeDir // don't copy dev libraries with nodedir option variables.copy_dev_lib = !gyp.opts.nodedir // disable -T "thin" static archives by default variables.standalone_static_library = gyp.opts.thin ? 0 : 1 // loop through the rest of the opts and add the unknown ones as variables. // this allows for module-specific configure flags like: // // $ node-gyp configure --shared-libxml2 Object.keys(gyp.opts).forEach(function (opt) { if (opt === 'argv') return if (opt in gyp.configDefs) return variables[opt.replace(/-/g, '_')] = gyp.opts[opt] }) // ensures that any boolean values from `process.config` get stringified function boolsToString (k, v) { if (typeof v === 'boolean') return String(v) return v } log.silly('build/' + configFilename, config) // now write out the config.gypi file to the build/ dir var prefix = '# Do not edit. 
File was generated by node-gyp\'s "configure" step' , json = JSON.stringify(config, boolsToString, 2) log.verbose('build/' + configFilename, 'writing out config file: %s', configPath) configs.push(configPath) fs.writeFile(configPath, [prefix, json, ''].join('\n'), findConfigs) } function findConfigs (err) { if (err) return callback(err) var name = configNames.shift() if (!name) return runGyp() var fullPath = path.resolve(name) log.verbose(name, 'checking for gypi file: %s', fullPath) fs.stat(fullPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { findConfigs() // check next gypi filename } else { callback(err) } } else { log.verbose(name, 'found gypi file') configs.push(fullPath) findConfigs() } }) } function runGyp (err) { if (err) return callback(err) if (!~argv.indexOf('-f') && !~argv.indexOf('--format')) { if (win) { log.verbose('gyp', 'gyp format was not specified; forcing "msvs"') // force the 'make' target for non-Windows argv.push('-f', 'msvs') } else { log.verbose('gyp', 'gyp format was not specified; forcing "make"') // force the 'make' target for non-Windows argv.push('-f', 'make') } } function hasMsvsVersion () { return argv.some(function (arg) { return arg.indexOf('msvs_version') === 0 }) } if (win && !hasMsvsVersion()) { if ('msvs_version' in gyp.opts) { argv.push('-G', 'msvs_version=' + gyp.opts.msvs_version) } else { argv.push('-G', 'msvs_version=auto') } } // include all the ".gypi" files that were found configs.forEach(function (config) { argv.push('-I', config) }) // this logic ported from the old `gyp_addon` python file var gyp_script = path.resolve(__dirname, '..', 'gyp', 'gyp_main.py') var addon_gypi = path.resolve(__dirname, '..', 'addon.gypi') var common_gypi = path.resolve(nodeDir, 'common.gypi') var output_dir = 'build' if (win) { // Windows expects an absolute path output_dir = buildDir } argv.push('-I', addon_gypi) argv.push('-I', common_gypi) argv.push('-Dlibrary=shared_library') argv.push('-Dvisibility=default') argv.push('-Dnode_root_dir=' + nodeDir) argv.push('-Dmodule_root_dir=' + process.cwd()) argv.push('--depth=.') argv.push('--no-parallel') // tell gyp to write the Makefile/Solution files into output_dir argv.push('--generator-output', output_dir) // tell make to write its output into the same dir argv.push('-Goutput_dir=.') // enforce use of the "binding.gyp" file argv.unshift('binding.gyp') // execute `gyp` from the current target nodedir argv.unshift(gyp_script) // make sure python uses files that came with this particular node package process.env.PYTHONPATH = path.resolve(__dirname, '..', 'gyp', 'pylib') var cp = gyp.spawn(python, argv) cp.on('exit', onCpExit) } /** * Called when the `gyp` child process exits. 
*/ function onCpExit (code, signal) { if (code !== 0) { callback(new Error('`gyp` failed with exit code: ' + code)) } else { // we're done callback() } } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/install.js��������������������000644 �000766 �000024 �00000034353 12456106751 027662� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = install exports.usage = 'Install node development files for the specified node version.' /** * Module dependencies. */ var fs = require('graceful-fs') , osenv = require('osenv') , tar = require('tar') , rm = require('rimraf') , path = require('path') , crypto = require('crypto') , zlib = require('zlib') , log = require('npmlog') , semver = require('semver') , fstream = require('fstream') , request = require('request') , minimatch = require('minimatch') , mkdir = require('mkdirp') , win = process.platform == 'win32' function install (gyp, argv, callback) { // ensure no double-callbacks happen function cb (err) { if (cb.done) return cb.done = true if (err) { log.warn('install', 'got an error, rolling back install') // roll-back the install if anything went wrong gyp.commands.remove([ version ], function (err2) { callback(err) }) } else { callback(null, version) } } var distUrl = gyp.opts['dist-url'] || gyp.opts.disturl || 'https://iojs.org/dist' // Determine which node dev files version we are installing var versionStr = argv[0] || gyp.opts.target || process.version log.verbose('install', 'input version string %j', versionStr) // parse the version to normalize and ensure it's valid var version = semver.parse(versionStr) if (!version) { return callback(new Error('Invalid version number: ' + versionStr)) } if (semver.lt(versionStr, '0.8.0')) { return callback(new Error('Minimum target version is `0.8.0` or greater. Got: ' + versionStr)) } // 0.x.y-pre versions are not published yet and cannot be installed. Bail. if (version.prerelease[0] === 'pre') { log.verbose('detected "pre" node version', versionStr) if (gyp.opts.nodedir) { log.verbose('--nodedir flag was passed; skipping install', gyp.opts.nodedir) callback() } else { callback(new Error('"pre" versions of node cannot be installed, use the --nodedir flag instead')) } return } // flatten version into String version = version.version log.verbose('install', 'installing version: %s', version) // distributions starting with 0.10.0 contain sha256 checksums var checksumAlgo = semver.gte(version, '0.10.0') ? 
'sha256' : 'sha1' // the directory where the dev files will be installed var devDir = path.resolve(gyp.devDir, version) // If '--ensure' was passed, then don't *always* install the version; // check if it is already installed, and only install when needed if (gyp.opts.ensure) { log.verbose('install', '--ensure was passed, so won\'t reinstall if already installed') fs.stat(devDir, function (err, stat) { if (err) { if (err.code == 'ENOENT') { log.verbose('install', 'version not already installed, continuing with install', version) go() } else if (err.code == 'EACCES') { eaccesFallback() } else { cb(err) } return } log.verbose('install', 'version is already installed, need to check "installVersion"') var installVersionFile = path.resolve(devDir, 'installVersion') fs.readFile(installVersionFile, 'ascii', function (err, ver) { if (err && err.code != 'ENOENT') { return cb(err) } var installVersion = parseInt(ver, 10) || 0 log.verbose('got "installVersion"', installVersion) log.verbose('needs "installVersion"', gyp.package.installVersion) if (installVersion < gyp.package.installVersion) { log.verbose('install', 'version is no good; reinstalling') go() } else { log.verbose('install', 'version is good') cb() } }) }) } else { go() } function download (url) { log.http('GET', url) var req = null var requestOpts = { uri: url , headers: { 'User-Agent': 'node-gyp v' + gyp.version + ' (node ' + process.version + ')' } } // basic support for a proxy server var proxyUrl = gyp.opts.proxy || process.env.http_proxy || process.env.HTTP_PROXY || process.env.npm_config_proxy if (proxyUrl) { if (/^https?:\/\//i.test(proxyUrl)) { log.verbose('download', 'using proxy url: "%s"', proxyUrl) requestOpts.proxy = proxyUrl } else { log.warn('download', 'ignoring invalid "proxy" config setting: "%s"', proxyUrl) } } try { // The "request" constructor can throw sometimes apparently :( // See: https://github.com/TooTallNate/node-gyp/issues/114 req = request(requestOpts) } catch (e) { cb(e) } if (req) { req.on('response', function (res) { log.http(res.statusCode, url) }) } return req } function getContentSha(res, callback) { var shasum = crypto.createHash(checksumAlgo) res.on('data', function (chunk) { shasum.update(chunk) }).on('end', function () { callback(null, shasum.digest('hex')) }) } function go () { log.verbose('ensuring nodedir is created', devDir) // first create the dir for the node dev files mkdir(devDir, function (err, created) { if (err) { if (err.code == 'EACCES') { eaccesFallback() } else { cb(err) } return } if (created) { log.verbose('created nodedir', created) } // now download the node tarball var tarPath = gyp.opts['tarball'] var tarballUrl = tarPath ? tarPath : distUrl + '/v' + version + '/iojs-v' + version + '.tar.gz' , badDownload = false , extractCount = 0 , gunzip = zlib.createGunzip() , extracter = tar.Extract({ path: devDir, strip: 1, filter: isValid }) var contentShasums = {} var expectShasums = {} // checks if a file to be extracted from the tarball is valid. // only .h header files and the gyp files get extracted function isValid () { var name = this.path.substring(devDir.length + 1) var isValid = valid(name) if (name === '' && this.type === 'Directory') { // the first directory entry is ok return true } if (isValid) { log.verbose('extracted file from tarball', name) extractCount++ } else { // invalid log.silly('ignoring from tarball', name) } return isValid } gunzip.on('error', cb) extracter.on('error', cb) extracter.on('end', afterTarball) // download the tarball, gunzip and extract! 
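// (when a local --tarball path is given, tarballUrl is that path and the
// archive is streamed straight from disk; otherwise it is fetched over HTTP below)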
if (tarPath) { var input = fs.createReadStream(tarballUrl) input.pipe(gunzip).pipe(extracter) return } var req = download(tarballUrl) if (!req) return // something went wrong downloading the tarball? req.on('error', function (err) { badDownload = true cb(err) }) req.on('close', function () { if (extractCount === 0) { cb(new Error('Connection closed while downloading tarball file')) } }) req.on('response', function (res) { if (res.statusCode !== 200) { badDownload = true cb(new Error(res.statusCode + ' status code downloading tarball')) return } // content checksum getContentSha(res, function (_, checksum) { var filename = path.basename(tarballUrl).trim() contentShasums[filename] = checksum log.verbose('content checksum', filename, checksum) }) // start unzipping and untaring req.pipe(gunzip).pipe(extracter) }) // invoked after the tarball has finished being extracted function afterTarball () { if (badDownload) return if (extractCount === 0) { return cb(new Error('There was a fatal problem while downloading/extracting the tarball')) } log.verbose('tarball', 'done parsing tarball') var async = 0 if (win) { // need to download iojs.lib async++ downloadNodeLib(deref) } // write the "installVersion" file async++ var installVersionPath = path.resolve(devDir, 'installVersion') fs.writeFile(installVersionPath, gyp.package.installVersion + '\n', deref) // download SHASUMS.txt async++ downloadShasums(deref) if (async === 0) { // no async tasks required cb() } function deref (err) { if (err) return cb(err) async-- if (!async) { log.verbose('download contents checksum', JSON.stringify(contentShasums)) // check content shasums for (var k in contentShasums) { log.verbose('validating download checksum for ' + k, '(%s == %s)', contentShasums[k], expectShasums[k]) // TODO(piscisaureus) re-enable checksum verification when the correct files are in place. if (false || contentShasums[k] !== expectShasums[k]) { cb(new Error(k + ' local checksum ' + contentShasums[k] + ' not match remote ' + expectShasums[k])) return } } cb() } } } function downloadShasums(done) { var shasumsFile = (checksumAlgo === 'sha256') ? 
'SHASUMS256.txt' : 'SHASUMS.txt' log.verbose('check download content checksum, need to download `' + shasumsFile + '`...') var shasumsPath = path.resolve(devDir, shasumsFile) , shasumsUrl = distUrl + '/v' + version + '/' + shasumsFile log.verbose('checksum url', shasumsUrl) var req = download(shasumsUrl) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading checksum')) return } var chunks = [] res.on('data', function (chunk) { chunks.push(chunk) }) res.on('end', function () { var lines = Buffer.concat(chunks).toString().trim().split('\n') lines.forEach(function (line) { var items = line.trim().split(/\s+/) if (items.length !== 2) return // 0035d18e2dcf9aad669b1c7c07319e17abfe3762 ./node-v0.11.4.tar.gz var name = items[1].replace(/^\.\//, '') expectShasums[name] = items[0] }) log.verbose('checksum data', JSON.stringify(expectShasums)) done() }) }) } function downloadNodeLib (done) { log.verbose('on Windows; need to download `iojs.lib`...') var dir32 = path.resolve(devDir, 'ia32') , dir64 = path.resolve(devDir, 'x64') , nodeLibPath32 = path.resolve(dir32, 'iojs.lib') , nodeLibPath64 = path.resolve(dir64, 'iojs.lib') , nodeLibUrl32 = distUrl + '/v' + version + '/win-x86/iojs.lib' , nodeLibUrl64 = distUrl + '/v' + version + '/win-x64/iojs.lib' log.verbose('32-bit iojs.lib dir', dir32) log.verbose('64-bit iojs.lib dir', dir64) log.verbose('`iojs.lib` 32-bit url', nodeLibUrl32) log.verbose('`iojs.lib` 64-bit url', nodeLibUrl64) var async = 2 mkdir(dir32, function (err) { if (err) return done(err) log.verbose('streaming 32-bit iojs.lib to:', nodeLibPath32) var req = download(nodeLibUrl32) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading 32-bit iojs.lib')) return } getContentSha(res, function (_, checksum) { contentShasums['win-x86/iojs.lib'] = checksum log.verbose('content checksum', 'win-x86/iojs.lib', checksum) }) var ws = fs.createWriteStream(nodeLibPath32) ws.on('error', cb) req.pipe(ws) }) req.on('end', function () { --async || done() }) }) mkdir(dir64, function (err) { if (err) return done(err) log.verbose('streaming 64-bit iojs.lib to:', nodeLibPath64) var req = download(nodeLibUrl64) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading 64-bit iojs.lib')) return } getContentSha(res, function (_, checksum) { contentShasums['win-x64/iojs.lib'] = checksum log.verbose('content checksum', 'win-x64/iojs.lib', checksum) }) var ws = fs.createWriteStream(nodeLibPath64) ws.on('error', cb) req.pipe(ws) }) req.on('end', function () { --async || done() }) }) } // downloadNodeLib() }) // mkdir() } // go() /** * Checks if a given filename is "valid" for this installation. */ function valid (file) { // header files return minimatch(file, '*.h', { matchBase: true }) || minimatch(file, '*.gypi', { matchBase: true }) } /** * The EACCES fallback is a workaround for npm's `sudo` behavior, where * it drops the permissions before invoking any child processes (like * node-gyp). So what happens is the "nobody" user doesn't have * permission to create the dev dir. As a fallback, make the tmpdir() be * the dev dir for this installation. This is not ideal, but at least * the compilation will succeed... 
*/ function eaccesFallback () { var tmpdir = osenv.tmpdir() gyp.devDir = path.resolve(tmpdir, '.node-gyp') log.warn('EACCES', 'user "%s" does not have permission to access the dev dir "%s"', osenv.user(), devDir) log.warn('EACCES', 'attempting to reinstall using temporary dev dir "%s"', gyp.devDir) if (process.cwd() == tmpdir) { log.verbose('tmpdir == cwd', 'automatically will remove dev files after to save disk space') gyp.todo.push({ name: 'remove', args: argv }) } gyp.commands.install(argv, cb) } } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/list.js�����������������������000644 �000766 �000024 �00000001316 12455173731 027161� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = list exports.usage = 'Prints a listing of the currently installed node development files' /** * Module dependencies. */ var fs = require('graceful-fs') , path = require('path') , log = require('npmlog') function list (gyp, args, callback) { var devDir = gyp.devDir log.verbose('list', 'using node-gyp dir:', devDir) // readdir() the node-gyp dir fs.readdir(devDir, onreaddir) function onreaddir (err, versions) { if (err && err.code != 'ENOENT') { return callback(err) } if (Array.isArray(versions)) { versions = versions.filter(function (v) { return v != 'current' }) } else { versions = [] } callback(null, versions) } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/node-gyp.js�������������������000644 �000766 �000024 �00000011743 12455173731 027735� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ /** * Module exports. */ module.exports = exports = gyp /** * Module dependencies. */ var fs = require('graceful-fs') , path = require('path') , nopt = require('nopt') , log = require('npmlog') , child_process = require('child_process') , EE = require('events').EventEmitter , inherits = require('util').inherits , commands = [ // Module build commands 'build' , 'clean' , 'configure' , 'rebuild' // Development Header File management commands , 'install' , 'list' , 'remove' ] , aliases = { 'ls': 'list' , 'rm': 'remove' } // differentiate node-gyp's logs from npm's log.heading = 'gyp' /** * The `gyp` function. */ function gyp () { return new Gyp() } function Gyp () { var self = this // set the dir where node-gyp dev files get installed // TODO: make this *more* configurable? 
// see: https://github.com/TooTallNate/node-gyp/issues/21 var homeDir = process.env.HOME || process.env.USERPROFILE if (!homeDir) { throw new Error( "node-gyp requires that the user's home directory is specified " + "in either of the environmental variables HOME or USERPROFILE" ); } this.devDir = path.resolve(homeDir, '.node-gyp') this.commands = {} commands.forEach(function (command) { self.commands[command] = function (argv, callback) { log.verbose('command', command, argv) return require('./' + command)(self, argv, callback) } }) } inherits(Gyp, EE) exports.Gyp = Gyp var proto = Gyp.prototype /** * Export the contents of the package.json. */ proto.package = require('../package') /** * nopt configuration definitions */ proto.configDefs = { help: Boolean // everywhere , arch: String // 'configure' , debug: Boolean // 'build' , directory: String // bin , make: String // 'build' , msvs_version: String // 'configure' , ensure: Boolean // 'install' , solution: String // 'build' (windows only) , proxy: String // 'install' , nodedir: String // 'configure' , loglevel: String // everywhere , python: String // 'configure' , 'dist-url': String // 'install' , 'tarball': String // 'install' , jobs: String // 'build' , thin: String // 'configure' } /** * nopt shorthands */ proto.shorthands = { release: '--no-debug' , C: '--directory' , debug: '--debug' , j: '--jobs' , silly: '--loglevel=silly' , verbose: '--loglevel=verbose' } /** * expose the command aliases for the bin file to use. */ proto.aliases = aliases /** * Parses the given argv array and sets the 'opts', * 'argv' and 'command' properties. */ proto.parseArgv = function parseOpts (argv) { this.opts = nopt(this.configDefs, this.shorthands, argv) this.argv = this.opts.argv.remain.slice() var commands = this.todo = [] // create a copy of the argv array with aliases mapped argv = this.argv.map(function (arg) { // is this an alias? if (arg in this.aliases) { arg = this.aliases[arg] } return arg }, this) // process the mapped args into "command" objects ("name" and "args" props) argv.slice().forEach(function (arg) { if (arg in this.commands) { var args = argv.splice(0, argv.indexOf(arg)) argv.shift() if (commands.length > 0) { commands[commands.length - 1].args = args } commands.push({ name: arg, args: [] }) } }, this) if (commands.length > 0) { commands[commands.length - 1].args = argv.splice(0) } // support for inheriting config env variables from npm var npm_config_prefix = 'npm_config_' Object.keys(process.env).forEach(function (name) { if (name.indexOf(npm_config_prefix) !== 0) return var val = process.env[name] if (name === npm_config_prefix + 'loglevel') { log.level = val } else { // add the user-defined options to the config name = name.substring(npm_config_prefix.length) this.opts[name] = val } }, this) if (this.opts.loglevel) { log.level = this.opts.loglevel } log.resume() } /** * Spawns a child process and emits a 'spawn' event. */ proto.spawn = function spawn (command, args, opts) { if (!opts) opts = {} if (!opts.silent && !opts.customFds) { opts.customFds = [ 0, 1, 2 ] } var cp = child_process.spawn(command, args, opts) log.info('spawn', command) log.info('spawn args', args) return cp } /** * Returns the usage instructions for node-gyp. 
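 *
 * For illustration (added note, not part of the original source), the
 * returned string has roughly this shape, built from each command module's
 * `usage` string and the current package version:
 *
 *   Usage: node-gyp <command> [options]
 *
 *   where <command> is one of:
 *     - rebuild - Runs "clean", "configure" and "build" all at once
 *     - list - Prints a listing of the currently installed node development files
 *     ...
 *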
 */

proto.usage = function usage () {
  var str = [
      ''
    , ' Usage: node-gyp <command> [options]'
    , ''
    , ' where <command> is one of:'
    , commands.map(function (c) {
        return ' - ' + c + ' - ' + require('./' + c).usage
      }).join('\n')
    , ''
    , 'node-gyp@' + this.version + ' ' + path.resolve(__dirname, '..')
    , 'node@' + process.versions.node
  ].join('\n')
  return str
}

/**
 * Version number getter.
 */

Object.defineProperty(proto, 'version', {
    get: function () {
      return this.package.version
    }
  , enumerable: true
})
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/rebuild.js000644 000766 000024 00000000517 12455173731 027636 0ustar00iojsstaff000000 000000
module.exports = exports = rebuild

exports.usage = 'Runs "clean", "configure" and "build" all at once'

var log = require('npmlog')

function rebuild (gyp, argv, callback) {

  gyp.todo.push(
      { name: 'clean', args: [] }
    , { name: 'configure', args: [] }
    , { name: 'build', args: [] }
  )
  process.nextTick(callback)
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/lib/remove.js000644 000766 000024 00000002570 12455173731 027506 0ustar00iojsstaff000000 000000
module.exports = exports = remove

exports.usage = 'Removes the node development files for the specified version'

/**
 * Module dependencies.
 */

var fs = require('fs')
  , rm = require('rimraf')
  , path = require('path')
  , log = require('npmlog')
  , semver = require('semver')

function remove (gyp, argv, callback) {

  var devDir = gyp.devDir
  log.verbose('remove', 'using node-gyp dir:', devDir)

  // get the user-specified version to remove
  var v = argv[0] || gyp.opts.target
  log.verbose('remove', 'removing target version:', v)

  if (!v) {
    return callback(new Error('You must specify a version number to remove. Ex: "' + process.version + '"'))
  }

  // parse the version to normalize and make sure it's valid
  var version = semver.parse(v)
  if (!version) {
    return callback(new Error('Invalid version number: ' + v))
  }

  // flatten the version Array into a String
  version = version.version

  var versionPath = path.resolve(gyp.devDir, version)
  log.verbose('remove', 'removing development files for version:', version)

  // first check if it's even installed
  fs.stat(versionPath, function (err, stat) {
    if (err) {
      if (err.code == 'ENOENT') {
        callback(null, 'version was already uninstalled: ' + version)
      } else {
        callback(err)
      }
      return
    }
    // Go ahead and delete the dir
    rm(versionPath, callback)
  })
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/.npmignore000644 000766 000024 00000000006 12455173731 027673 0ustar00iojsstaff000000 000000
*.pyc
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/AUTHORS000644 000766 000024 00000000434 12455173731 026751 0ustar00iojsstaff000000 000000
# Names should be added to this file like so:
# Name or Organization <email address>

Google Inc.
Bloomberg Finance L.P.
Yandex LLC

Steven Knight <knight@baldmt.com>
Ryan Norton <rnorton10@gmail.com>
David J. Sankel <david@sankelsoftware.com>
Eric N. Vander Weele <ericvw@gmail.com>
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/buildbot/000755 000766 000024 00000000000 12456115117 027477 5ustar00iojsstaff000000 000000
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/codereview.settings000644 000766 000024 00000000546 12455173731 031623 0ustar00iojsstaff000000 000000
# This file is used by gcl to get repository specific information.
CODE_REVIEW_SERVER: codereview.chromium.org
CC_LIST: gyp-developer@googlegroups.com
VIEW_VC: http://code.google.com/p/gyp/source/detail?r=
TRY_ON_UPLOAD: True
TRYSERVER_PROJECT: gyp
TRYSERVER_PATCHLEVEL: 0
TRYSERVER_ROOT: trunk
TRYSERVER_SVN_URL: svn://svn.chromium.org/chrome-try/try-nacl
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/data/000755 000766 000024 00000000000 12456115117 026604 5ustar00iojsstaff000000 000000
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/DEPS000644 000766 000024 00000001066 12455173731 026361 0ustar00iojsstaff000000 000000
# DEPS file for gclient use in buildbot execution of gyp tests.
#
# (You don't need to use gclient for normal GYP development work.)

vars = {
  "chrome_trunk": "http://src.chromium.org/svn/trunk",
  "googlecode_url": "http://%s.googlecode.com/svn",
}

deps = {
}

deps_os = {
  "win": {
    "third_party/cygwin":
      Var("chrome_trunk") + "/deps/third_party/cygwin@66844",
    "third_party/python_26":
      Var("chrome_trunk") + "/tools/third_party/python_26@89111",
    "src/third_party/pefile":
      (Var("googlecode_url") % "pefile") + "/trunk@63",
  },
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp000755 000766 000024 00000000362 12455173731 026426 0ustar00iojsstaff000000 000000
#!/bin/bash
# Copyright 2013 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

set -e
base=$(dirname "$0")
exec python "${base}/gyp_main.py" "$@"
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp.bat000755 000766 000024 00000000311 12455173731 027165 0ustar00iojsstaff000000 000000
@rem Copyright (c) 2009 Google Inc. All rights reserved.
@rem Use of this source code is governed by a BSD-style license that can be
@rem found in the LICENSE file.

@python "%~dp0gyp_main.py" %*
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_dummy.c000644 000766 000024 00000000276 12455173731 030063 0ustar00iojsstaff000000 000000
/* Copyright (c) 2009 Google Inc. All rights reserved.
* Use of this source code is governed by a BSD-style license that can be * found in the LICENSE file. */ int main() { return 0; } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_main.py�������������������000755 �000766 �000024 �00000000715 12455173731 030063� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2009 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import sys # TODO(mark): sys.path manipulation is some temporary testing stuff. try: import gyp except ImportError, e: import os.path sys.path.append(os.path.join(os.path.dirname(sys.argv[0]), 'pylib')) import gyp if __name__ == '__main__': sys.exit(gyp.script_main()) ���������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/gyptest.py��������������������000755 �000766 �000024 �00000017464 12455173731 027770� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. __doc__ = """ gyptest.py -- test runner for GYP tests. """ import os import optparse import subprocess import sys class CommandRunner: """ Executor class for commands, including "commands" implemented by Python functions. """ verbose = True active = True def __init__(self, dictionary={}): self.subst_dictionary(dictionary) def subst_dictionary(self, dictionary): self._subst_dictionary = dictionary def subst(self, string, dictionary=None): """ Substitutes (via the format operator) the values in the specified dictionary into the specified command. The command can be an (action, string) tuple. In all cases, we perform substitution on strings and don't worry if something isn't a string. (It's probably a Python function to be executed.) """ if dictionary is None: dictionary = self._subst_dictionary if dictionary: try: string = string % dictionary except TypeError: pass return string def display(self, command, stdout=None, stderr=None): if not self.verbose: return if type(command) == type(()): func = command[0] args = command[1:] s = '%s(%s)' % (func.__name__, ', '.join(map(repr, args))) if type(command) == type([]): # TODO: quote arguments containing spaces # TODO: handle meta characters? 
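      # Added note (not in the original file): when `command` is a list of
      # argv-style strings it is displayed joined with single spaces, e.g.
      # ['python', 'test/gyptest-foo.py'] (hypothetical) would be shown as
      # "python test/gyptest-foo.py".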
s = ' '.join(command) else: s = self.subst(command) if not s.endswith('\n'): s += '\n' sys.stdout.write(s) sys.stdout.flush() def execute(self, command, stdout=None, stderr=None): """ Executes a single command. """ if not self.active: return 0 if type(command) == type(''): command = self.subst(command) cmdargs = shlex.split(command) if cmdargs[0] == 'cd': command = (os.chdir,) + tuple(cmdargs[1:]) if type(command) == type(()): func = command[0] args = command[1:] return func(*args) else: if stdout is sys.stdout: # Same as passing sys.stdout, except python2.4 doesn't fail on it. subout = None else: # Open pipe for anything else so Popen works on python2.4. subout = subprocess.PIPE if stderr is sys.stderr: # Same as passing sys.stderr, except python2.4 doesn't fail on it. suberr = None elif stderr is None: # Merge with stdout if stderr isn't specified. suberr = subprocess.STDOUT else: # Open pipe for anything else so Popen works on python2.4. suberr = subprocess.PIPE p = subprocess.Popen(command, shell=(sys.platform == 'win32'), stdout=subout, stderr=suberr) p.wait() if stdout is None: self.stdout = p.stdout.read() elif stdout is not sys.stdout: stdout.write(p.stdout.read()) if stderr not in (None, sys.stderr): stderr.write(p.stderr.read()) return p.returncode def run(self, command, display=None, stdout=None, stderr=None): """ Runs a single command, displaying it first. """ if display is None: display = command self.display(display) return self.execute(command, stdout, stderr) class Unbuffered: def __init__(self, fp): self.fp = fp def write(self, arg): self.fp.write(arg) self.fp.flush() def __getattr__(self, attr): return getattr(self.fp, attr) sys.stdout = Unbuffered(sys.stdout) sys.stderr = Unbuffered(sys.stderr) def is_test_name(f): return f.startswith('gyptest') and f.endswith('.py') def find_all_gyptest_files(directory): result = [] for root, dirs, files in os.walk(directory): if '.svn' in dirs: dirs.remove('.svn') result.extend([ os.path.join(root, f) for f in files if is_test_name(f) ]) result.sort() return result def main(argv=None): if argv is None: argv = sys.argv usage = "gyptest.py [-ahlnq] [-f formats] [test ...]" parser = optparse.OptionParser(usage=usage) parser.add_option("-a", "--all", action="store_true", help="run all tests") parser.add_option("-C", "--chdir", action="store", default=None, help="chdir to the specified directory") parser.add_option("-f", "--format", action="store", default='', help="run tests with the specified formats") parser.add_option("-G", '--gyp_option', action="append", default=[], help="Add -G options to the gyp command line") parser.add_option("-l", "--list", action="store_true", help="list available tests and exit") parser.add_option("-n", "--no-exec", action="store_true", help="no execute, just print the command line") parser.add_option("--passed", action="store_true", help="report passed tests") parser.add_option("--path", action="append", default=[], help="additional $PATH directory") parser.add_option("-q", "--quiet", action="store_true", help="quiet, don't print test command lines") opts, args = parser.parse_args(argv[1:]) if opts.chdir: os.chdir(opts.chdir) if opts.path: extra_path = [os.path.abspath(p) for p in opts.path] extra_path = os.pathsep.join(extra_path) os.environ['PATH'] = extra_path + os.pathsep + os.environ['PATH'] if not args: if not opts.all: sys.stderr.write('Specify -a to get all tests.\n') return 1 args = ['test'] tests = [] for arg in args: if os.path.isdir(arg): tests.extend(find_all_gyptest_files(os.path.normpath(arg))) 
else: if not is_test_name(os.path.basename(arg)): print >>sys.stderr, arg, 'is not a valid gyp test name.' sys.exit(1) tests.append(arg) if opts.list: for test in tests: print test sys.exit(0) CommandRunner.verbose = not opts.quiet CommandRunner.active = not opts.no_exec cr = CommandRunner() os.environ['PYTHONPATH'] = os.path.abspath('test/lib') if not opts.quiet: sys.stdout.write('PYTHONPATH=%s\n' % os.environ['PYTHONPATH']) passed = [] failed = [] no_result = [] if opts.format: format_list = opts.format.split(',') else: # TODO: not duplicate this mapping from pylib/gyp/__init__.py format_list = { 'aix5': ['make'], 'freebsd7': ['make'], 'freebsd8': ['make'], 'openbsd5': ['make'], 'cygwin': ['msvs'], 'win32': ['msvs', 'ninja'], 'linux2': ['make', 'ninja'], 'linux3': ['make', 'ninja'], 'darwin': ['make', 'ninja', 'xcode'], }[sys.platform] for format in format_list: os.environ['TESTGYP_FORMAT'] = format if not opts.quiet: sys.stdout.write('TESTGYP_FORMAT=%s\n' % format) gyp_options = [] for option in opts.gyp_option: gyp_options += ['-G', option] if gyp_options and not opts.quiet: sys.stdout.write('Extra Gyp options: %s\n' % gyp_options) for test in tests: status = cr.run([sys.executable, test] + gyp_options, stdout=sys.stdout, stderr=sys.stderr) if status == 2: no_result.append(test) elif status: failed.append(test) else: passed.append(test) if not opts.quiet: def report(description, tests): if tests: if len(tests) == 1: sys.stdout.write("\n%s the following test:\n" % description) else: fmt = "\n%s the following %d tests:\n" sys.stdout.write(fmt % (description, len(tests))) sys.stdout.write("\t" + "\n\t".join(tests) + "\n") if opts.passed: report("Passed", passed) report("Failed", failed) report("No result from", no_result) if failed: return 1 else: return 0 if __name__ == "__main__": sys.exit(main()) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/LICENSE�����������������������000644 �000766 �000024 �00000002703 12455173731 026707� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2009 Google Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Google Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. �������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/OWNERS������������������������000644 �000766 �000024 �00000000002 12455173731 026630� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������* ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/PRESUBMIT.py������������������000644 �000766 �000024 �00000006455 12455173731 027636� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Top-level presubmit script for GYP. See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts for more details about the presubmit API built into gcl. """ PYLINT_BLACKLIST = [ # TODO: fix me. # From SCons, not done in google style. 'test/lib/TestCmd.py', 'test/lib/TestCommon.py', 'test/lib/TestGyp.py', # Needs style fix. 'pylib/gyp/generator/xcode.py', ] PYLINT_DISABLED_WARNINGS = [ # TODO: fix me. # Many tests include modules they don't use. 'W0611', # Include order doesn't properly include local files? 'F0401', # Some use of built-in names. 'W0622', # Some unused variables. 'W0612', # Operator not preceded/followed by space. 'C0323', 'C0322', # Unnecessary semicolon. 'W0301', # Unused argument. 'W0613', # String has no effect (docstring in wrong place). 'W0105', # Comma not followed by space. 'C0324', # Access to a protected member. 'W0212', # Bad indent. 'W0311', # Line too long. 'C0301', # Undefined variable. 'E0602', # Not exception type specified. 'W0702', # No member of that name. 'E1101', # Dangerous default {}. 'W0102', # Others, too many to sort. 
'W0201', 'W0232', 'E1103', 'W0621', 'W0108', 'W0223', 'W0231', 'R0201', 'E0101', 'C0321', # ************* Module copy # W0104:427,12:_test.odict.__setitem__: Statement seems to have no effect 'W0104', ] def CheckChangeOnUpload(input_api, output_api): report = [] report.extend(input_api.canned_checks.PanProjectChecks( input_api, output_api)) return report def CheckChangeOnCommit(input_api, output_api): report = [] # Accept any year number from 2009 to the current year. current_year = int(input_api.time.strftime('%Y')) allowed_years = (str(s) for s in reversed(xrange(2009, current_year + 1))) years_re = '(' + '|'.join(allowed_years) + ')' # The (c) is deprecated, but tolerate it until it's removed from all files. license = ( r'.*? Copyright (\(c\) )?%(year)s Google Inc\. All rights reserved\.\n' r'.*? Use of this source code is governed by a BSD-style license that ' r'can be\n' r'.*? found in the LICENSE file\.\n' ) % { 'year': years_re, } report.extend(input_api.canned_checks.PanProjectChecks( input_api, output_api, license_header=license)) report.extend(input_api.canned_checks.CheckTreeIsOpen( input_api, output_api, 'http://gyp-status.appspot.com/status', 'http://gyp-status.appspot.com/current')) import os import sys old_sys_path = sys.path try: sys.path = ['pylib', 'test/lib'] + sys.path blacklist = PYLINT_BLACKLIST if sys.platform == 'win32': blacklist = [os.path.normpath(x).replace('\\', '\\\\') for x in PYLINT_BLACKLIST] report.extend(input_api.canned_checks.RunPylint( input_api, output_api, black_list=blacklist, disabled_warnings=PYLINT_DISABLED_WARNINGS)) finally: sys.path = old_sys_path return report def GetPreferredTrySlaves(): return ['gyp-win32', 'gyp-win64', 'gyp-linux', 'gyp-mac', 'gyp-android'] �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/������������������������000755 �000766 �000024 �00000000000 12456115117 027012� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylintrc����������������������000644 �000766 �000024 �00000023425 12455173731 027475� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������[MASTER] # Specify a configuration file. #rcfile= # Python code to execute, usually for sys.path manipulation such as # pygtk.require(). #init-hook= # Profiled execution. profile=no # Add files or directories to the blacklist. They should be base names, not # paths. ignore=CVS # Pickle collected data for later comparisons. persistent=yes # List of plugins (as comma separated values of python modules names) to load, # usually to register additional checkers. load-plugins= [MESSAGES CONTROL] # Enable the message, report, category or checker with the given id(s). 
You can # either give multiple identifier separated by comma (,) or put this option # multiple time. #enable= # Disable the message, report, category or checker with the given id(s). You # can either give multiple identifier separated by comma (,) or put this option # multiple time (only on the command line, not in the configuration file where # it should appear only once). # C0103: Invalid name "NN" (should match [a-z_][a-z0-9_]{2,30}$) # C0111: Missing docstring # C0302: Too many lines in module (NN) # R0902: Too many instance attributes (N/7) # R0903: Too few public methods (N/2) # R0904: Too many public methods (NN/20) # R0912: Too many branches (NN/12) # R0913: Too many arguments (N/5) # R0914: Too many local variables (NN/15) # R0915: Too many statements (NN/50) # W0141: Used builtin function 'map' # W0142: Used * or ** magic # W0232: Class has no __init__ method # W0511: TODO # W0603: Using the global statement # # These should be enabled eventually: # C0112: Empty docstring # C0301: Line too long (NN/80) # C0321: More than one statement on single line # C0322: Operator not preceded by a space # C0323: Operator not followed by a space # C0324: Comma not followed by a space # E0101: Explicit return in __init__ # E0102: function already defined line NN # E1002: Use of super on an old style class # E1101: Instance of 'XX' has no 'YY' member # E1103: Instance of 'XX' has no 'XX' member (but some types could not be inferred) # E0602: Undefined variable 'XX' # F0401: Unable to import 'XX' # R0201: Method could be a function # R0801: Similar lines in N files # W0102: Dangerous default value {} as argument # W0104: Statement seems to have no effect # W0105: String statement has no effect # W0108: Lambda may not be necessary # W0201: Attribute 'XX' defined outside __init__ # W0212: Access to a protected member XX of a client class # W0221: Arguments number differs from overridden method # W0223: Method 'XX' is abstract in class 'YY' but is not overridden # W0231: __init__ method from base class 'XX' is not called # W0301: Unnecessary semicolon # W0311: Bad indentation. Found NN spaces, expected NN # W0401: Wildcard import XX # W0402: Uses of a deprecated module 'string' # W0403: Relative import 'XX', should be 'YY.XX' # W0404: Reimport 'XX' (imported line NN) # W0601: Global variable 'XX' undefined at the module level # W0602: Using global for 'XX' but no assignment is done # W0611: Unused import pprint # W0612: Unused variable 'XX' # W0613: Unused argument 'XX' # W0614: Unused import XX from wildcard import # W0621: Redefining name 'XX' from outer scope (line NN) # W0622: Redefining built-in 'NN' # W0631: Using possibly undefined loop variable 'XX' # W0701: Raising a string exception # W0702: No exception type(s) specified disable=C0103,C0111,C0302,R0902,R0903,R0904,R0912,R0913,R0914,R0915,W0141,W0142,W0232,W0511,W0603,C0112,C0301,C0321,C0322,C0323,C0324,E0101,E0102,E1002,E1101,E1103,E0602,F0401,R0201,R0801,W0102,W0104,W0105,W0108,W0201,W0212,W0221,W0223,W0231,W0301,W0311,W0401,W0402,W0403,W0404,W0601,W0602,W0611,W0612,W0613,W0614,W0621,W0622,W0631,W0701,W0702 [REPORTS] # Set the output format. Available formats are text, parseable, colorized, msvs # (visual studio) and html output-format=text # Include message's id in output include-ids=yes # Put messages in a separate file for each module / package specified on the # command line instead of printing them on stdout. Reports (if any) will be # written in a file name "pylint_global.[txt|html]". 
files-output=no # Tells whether to display a full report or only the messages reports=no # Python expression which should return a note less than 10 (10 is the highest # note). You have access to the variables errors warning, statement which # respectively contain the number of errors / warnings messages and the total # number of statements analyzed. This is used by the global evaluation report # (RP0004). evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10) # Add a comment according to your evaluation note. This is used by the global # evaluation report (RP0004). comment=no [VARIABLES] # Tells whether we should check for unused import in __init__ files. init-import=no # A regular expression matching the beginning of the name of dummy variables # (i.e. not used). dummy-variables-rgx=_|dummy # List of additional names supposed to be defined in builtins. Remember that # you should avoid to define new builtins when possible. additional-builtins= [TYPECHECK] # Tells whether missing members accessed in mixin class should be ignored. A # mixin class is detected if its name ends with "mixin" (case insensitive). ignore-mixin-members=yes # List of classes names for which member attributes should not be checked # (useful for classes with attributes dynamically set). ignored-classes=SQLObject # When zope mode is activated, add a predefined set of Zope acquired attributes # to generated-members. zope=no # List of members which are set dynamically and missed by pylint inference # system, and so shouldn't trigger E0201 when accessed. Python regular # expressions are accepted. generated-members=REQUEST,acl_users,aq_parent [MISCELLANEOUS] # List of note tags to take in consideration, separated by a comma. notes=FIXME,XXX,TODO [SIMILARITIES] # Minimum lines number of a similarity. min-similarity-lines=4 # Ignore comments when computing similarities. ignore-comments=yes # Ignore docstrings when computing similarities. ignore-docstrings=yes [FORMAT] # Maximum number of characters on a single line. max-line-length=80 # Maximum number of lines in a module max-module-lines=1000 # String used as indentation unit. This is usually " " (4 spaces) or "\t" (1 # tab). 
indent-string=' ' [BASIC] # Required attributes for module, separated by a comma required-attributes= # List of builtins function names that should not be used, separated by a comma bad-functions=map,filter,apply,input # Regular expression which should only match correct module names module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$ # Regular expression which should only match correct module level names const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$ # Regular expression which should only match correct class names class-rgx=[A-Z_][a-zA-Z0-9]+$ # Regular expression which should only match correct function names function-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression which should only match correct method names method-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression which should only match correct instance attribute names attr-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression which should only match correct argument names argument-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression which should only match correct variable names variable-rgx=[a-z_][a-z0-9_]{2,30}$ # Regular expression which should only match correct list comprehension / # generator expression variable names inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$ # Good variable names which should always be accepted, separated by a comma good-names=i,j,k,ex,Run,_ # Bad variable names which should always be refused, separated by a comma bad-names=foo,bar,baz,toto,tutu,tata # Regular expression which should only match functions or classes name which do # not require a docstring no-docstring-rgx=__.*__ [DESIGN] # Maximum number of arguments for function / method max-args=5 # Argument names that match this expression will be ignored. Default to name # with leading underscore ignored-argument-names=_.* # Maximum number of locals for function / method body max-locals=15 # Maximum number of return / yield for function / method body max-returns=6 # Maximum number of branch for function / method body max-branchs=12 # Maximum number of statements in function / method body max-statements=50 # Maximum number of parents for a class (see R0901). max-parents=7 # Maximum number of attributes for a class (see R0902). max-attributes=7 # Minimum number of public methods for a class (see R0903). min-public-methods=2 # Maximum number of public methods for a class (see R0904). max-public-methods=20 [CLASSES] # List of interface methods to ignore, separated by a comma. This is used for # instance to not check methods defines in Zope's Interface base class. ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by # List of method names used to declare (i.e. assign) instance attributes. defining-attr-methods=__init__,__new__,setUp # List of valid names for the first argument in a class method. valid-classmethod-first-arg=cls [IMPORTS] # Deprecated modules which should not be used, separated by a comma deprecated-modules=regsub,string,TERMIOS,Bastion,rexec # Create a graph of every (i.e. 
internal and external) dependencies in the # given file (report RP0402 must not be disabled)
import-graph=

# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=

# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=Exception
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/000755 000766 000024 00000000000 12456115117 027337 5ustar00iojsstaff000000 000000
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/setup.py000755 000766 000024 00000001030 12455173731 027407 0ustar00iojsstaff000000 000000
#!/usr/bin/env python

# Copyright (c) 2009 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

from setuptools import setup

setup(
  name='gyp',
  version='0.1',
  description='Generate Your Projects',
  author='Chromium Authors',
  author_email='chromium-dev@googlegroups.com',
  url='http://code.google.com/p/gyp',
  package_dir = {'': 'pylib'},
  packages=['gyp', 'gyp.generator'],
  entry_points = {'console_scripts': ['gyp=gyp:script_main'] }
)
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/000755 000766 000024 00000000000 12456115117 027033 5ustar00iojsstaff000000 000000
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/000755 000766 000024 00000000000 12456115117 030123 5ustar00iojsstaff000000 000000
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/graphviz.py000755 000766 000024 00000005476 12455173731 031253 0ustar00iojsstaff000000 000000
#!/usr/bin/env python
# Copyright (c) 2011 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

"""Using the JSON dumped by the dump-dependency-json generator,
generate input suitable for graphviz to render a dependency graph of
targets."""

import collections
import json
import sys


def ParseTarget(target):
  target, _, suffix = target.partition('#')
  filename, _, target = target.partition(':')
  return filename, target, suffix


def LoadEdges(filename, targets):
  """Load the edges map from the dump file, and filter it to only
  show targets in |targets| and their dependents."""

  file = open('dump.json')
  edges = json.load(file)
  file.close()

  # Copy out only the edges we're interested in from the full edge list.
  target_edges = {}
  to_visit = targets[:]
  while to_visit:
    src = to_visit.pop()
    if src in target_edges:
      continue
    target_edges[src] = edges[src]
    to_visit.extend(edges[src])
  return target_edges


def WriteGraph(edges):
  """Print a graphviz graph to stdout.
  |edges| is a map of target to a list of other targets it depends on."""

  # Bucket targets by file.
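  # Added note (not in the original file): collections.defaultdict(list)
  # groups targets by the .gyp file they come from; a key such as
  # "foo/foo.gyp" (hypothetical) collects every "foo/foo.gyp:target#toolset"
  # string that ParseTarget() attributes to that build file.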
files = collections.defaultdict(list) for src, dst in edges.items(): build_file, target_name, toolset = ParseTarget(src) files[build_file].append(src) print 'digraph D {' print ' fontsize=8' # Used by subgraphs. print ' node [fontsize=8]' # Output nodes by file. We must first write out each node within # its file grouping before writing out any edges that may refer # to those nodes. for filename, targets in files.items(): if len(targets) == 1: # If there's only one node for this file, simplify # the display by making it a box without an internal node. target = targets[0] build_file, target_name, toolset = ParseTarget(target) print ' "%s" [shape=box, label="%s\\n%s"]' % (target, filename, target_name) else: # Group multiple nodes together in a subgraph. print ' subgraph "cluster_%s" {' % filename print ' label = "%s"' % filename for target in targets: build_file, target_name, toolset = ParseTarget(target) print ' "%s" [label="%s"]' % (target, target_name) print ' }' # Now that we've placed all the nodes within subgraphs, output all # the edges between nodes. for src, dsts in edges.items(): for dst in dsts: print ' "%s" -> "%s"' % (src, dst) print '}' def main(): if len(sys.argv) < 2: print >>sys.stderr, __doc__ print >>sys.stderr print >>sys.stderr, 'usage: %s target1 target2...' % (sys.argv[0]) return 1 edges = LoadEdges('dump.json', sys.argv[1:]) WriteGraph(edges) return 0 if __name__ == '__main__': sys.exit(main()) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_gyp.py�����������000755 �000766 �000024 �00000011224 12455173731 031623� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Pretty-prints the contents of a GYP file.""" import sys import re # Regex to remove comments when we're counting braces. COMMENT_RE = re.compile(r'\s*#.*') # Regex to remove quoted strings when we're counting braces. # It takes into account quoted quotes, and makes sure that the quotes match. # NOTE: It does not handle quotes that span more than one line, or # cases where an escaped quote is preceeded by an escaped backslash. 
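# Added illustration (not part of the original file): mask_quotes() below uses
# this pattern to blank out string contents so that braces inside string
# literals are not counted as structure. A hypothetical input line
#   'deps': ['{foo}'],
# would be masked to
#   'xxxx': ['xxxxx'],
# before the brace-splitting and indentation passes run.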
QUOTE_RE_STR = r'(?P<q>[\'"])(.*?)(?<![^\\][\\])(?P=q)' QUOTE_RE = re.compile(QUOTE_RE_STR) def comment_replace(matchobj): return matchobj.group(1) + matchobj.group(2) + '#' * len(matchobj.group(3)) def mask_comments(input): """Mask the quoted strings so we skip braces inside quoted strings.""" search_re = re.compile(r'(.*?)(#)(.*)') return [search_re.sub(comment_replace, line) for line in input] def quote_replace(matchobj): return "%s%s%s%s" % (matchobj.group(1), matchobj.group(2), 'x'*len(matchobj.group(3)), matchobj.group(2)) def mask_quotes(input): """Mask the quoted strings so we skip braces inside quoted strings.""" search_re = re.compile(r'(.*?)' + QUOTE_RE_STR) return [search_re.sub(quote_replace, line) for line in input] def do_split(input, masked_input, search_re): output = [] mask_output = [] for (line, masked_line) in zip(input, masked_input): m = search_re.match(masked_line) while m: split = len(m.group(1)) line = line[:split] + r'\n' + line[split:] masked_line = masked_line[:split] + r'\n' + masked_line[split:] m = search_re.match(masked_line) output.extend(line.split(r'\n')) mask_output.extend(masked_line.split(r'\n')) return (output, mask_output) def split_double_braces(input): """Masks out the quotes and comments, and then splits appropriate lines (lines that matche the double_*_brace re's above) before indenting them below. These are used to split lines which have multiple braces on them, so that the indentation looks prettier when all laid out (e.g. closing braces make a nice diagonal line). """ double_open_brace_re = re.compile(r'(.*?[\[\{\(,])(\s*)([\[\{\(])') double_close_brace_re = re.compile(r'(.*?[\]\}\)],?)(\s*)([\]\}\)])') masked_input = mask_quotes(input) masked_input = mask_comments(masked_input) (output, mask_output) = do_split(input, masked_input, double_open_brace_re) (output, mask_output) = do_split(output, mask_output, double_close_brace_re) return output def count_braces(line): """keeps track of the number of braces on a given line and returns the result. It starts at zero and subtracts for closed braces, and adds for open braces. """ open_braces = ['[', '(', '{'] close_braces = [']', ')', '}'] closing_prefix_re = re.compile(r'(.*?[^\s\]\}\)]+.*?)([\]\}\)],?)\s*$') cnt = 0 stripline = COMMENT_RE.sub(r'', line) stripline = QUOTE_RE.sub(r"''", stripline) for char in stripline: for brace in open_braces: if char == brace: cnt += 1 for brace in close_braces: if char == brace: cnt -= 1 after = False if cnt > 0: after = True # This catches the special case of a closing brace having something # other than just whitespace ahead of it -- we don't want to # unindent that until after this line is printed so it stays with # the previous indentation level. if cnt < 0 and closing_prefix_re.match(stripline): after = True return (cnt, after) def prettyprint_input(lines): """Does the main work of indenting the input based on the brace counts.""" indent = 0 basic_offset = 2 last_line = "" for line in lines: if COMMENT_RE.match(line): print line else: line = line.strip('\r\n\t ') # Otherwise doesn't strip \r on Unix. if len(line) > 0: (brace_diff, after) = count_braces(line) if brace_diff != 0: if after: print " " * (basic_offset * indent) + line indent += brace_diff else: indent += brace_diff print " " * (basic_offset * indent) + line else: print " " * (basic_offset * indent) + line else: print "" last_line = line def main(): if len(sys.argv) > 1: data = open(sys.argv[1]).read().splitlines() else: data = sys.stdin.read().splitlines() # Split up the double braces. 
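  # Added note (not in the original file): split_double_braces() first masks
  # quotes and comments, then breaks apart lines that open or close more than
  # one brace, so prettyprint_input() can indent one level per brace below.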
lines = split_double_braces(data) # Indent and print the output. prettyprint_input(lines) return 0 if __name__ == '__main__': sys.exit(main()) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_sln.py�����������000755 �000766 �000024 �00000011744 12455173731 031627� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Prints the information in a sln file in a diffable way. It first outputs each projects in alphabetical order with their dependencies. Then it outputs a possible build order. """ __author__ = 'nsylvain (Nicolas Sylvain)' import os import re import sys import pretty_vcproj def BuildProject(project, built, projects, deps): # if all dependencies are done, we can build it, otherwise we try to build the # dependency. # This is not infinite-recursion proof. for dep in deps[project]: if dep not in built: BuildProject(dep, built, projects, deps) print project built.append(project) def ParseSolution(solution_file): # All projects, their clsid and paths. projects = dict() # A list of dependencies associated with a project. dependencies = dict() # Regular expressions that matches the SLN format. # The first line of a project definition. begin_project = re.compile(('^Project\("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942' '}"\) = "(.*)", "(.*)", "(.*)"$')) # The last line of a project definition. end_project = re.compile('^EndProject$') # The first line of a dependency list. begin_dep = re.compile('ProjectSection\(ProjectDependencies\) = postProject$') # The last line of a dependency list. end_dep = re.compile('EndProjectSection$') # A line describing a dependency. dep_line = re.compile(' *({.*}) = ({.*})$') in_deps = False solution = open(solution_file) for line in solution: results = begin_project.search(line) if results: # Hack to remove icu because the diff is too different. if results.group(1).find('icu') != -1: continue # We remove "_gyp" from the names because it helps to diff them. current_project = results.group(1).replace('_gyp', '') projects[current_project] = [results.group(2).replace('_gyp', ''), results.group(3), results.group(2)] dependencies[current_project] = [] continue results = end_project.search(line) if results: current_project = None continue results = begin_dep.search(line) if results: in_deps = True continue results = end_dep.search(line) if results: in_deps = False continue results = dep_line.search(line) if results and in_deps and current_project: dependencies[current_project].append(results.group(1)) continue # Change all dependencies clsid to name instead. 
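  # Added note (not in the original file): at this point each recorded
  # dependency is still the raw "{...}" clsid captured from the .sln file;
  # the loop below maps it back to a project name by matching it against
  # projects[name][1].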
for project in dependencies: # For each dependencies in this project new_dep_array = [] for dep in dependencies[project]: # Look for the project name matching this cldis for project_info in projects: if projects[project_info][1] == dep: new_dep_array.append(project_info) dependencies[project] = sorted(new_dep_array) return (projects, dependencies) def PrintDependencies(projects, deps): print "---------------------------------------" print "Dependencies for all projects" print "---------------------------------------" print "-- --" for (project, dep_list) in sorted(deps.items()): print "Project : %s" % project print "Path : %s" % projects[project][0] if dep_list: for dep in dep_list: print " - %s" % dep print "" print "-- --" def PrintBuildOrder(projects, deps): print "---------------------------------------" print "Build order " print "---------------------------------------" print "-- --" built = [] for (project, _) in sorted(deps.items()): if project not in built: BuildProject(project, built, projects, deps) print "-- --" def PrintVCProj(projects): for project in projects: print "-------------------------------------" print "-------------------------------------" print project print project print project print "-------------------------------------" print "-------------------------------------" project_path = os.path.abspath(os.path.join(os.path.dirname(sys.argv[1]), projects[project][2])) pretty = pretty_vcproj argv = [ '', project_path, '$(SolutionDir)=%s\\' % os.path.dirname(sys.argv[1]), ] argv.extend(sys.argv[3:]) pretty.main(argv) def main(): # check if we have exactly 1 parameter. if len(sys.argv) < 2: print 'Usage: %s "c:\\path\\to\\project.sln"' % sys.argv[0] return 1 (projects, deps) = ParseSolution(sys.argv[1]) PrintDependencies(projects, deps) PrintBuildOrder(projects, deps) if '--recursive' in sys.argv: PrintVCProj(projects) return 0 if __name__ == '__main__': sys.exit(main()) ����������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/pretty_vcproj.py��������000755 �000766 �000024 �00000022562 12455173731 032336� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Make the format of a vcproj really pretty. This script normalize and sort an xml. It also fetches all the properties inside linked vsprops and include them explicitly in the vcproj. It outputs the resulting xml to stdout. """ __author__ = 'nsylvain (Nicolas Sylvain)' import os import sys from xml.dom.minidom import parse from xml.dom.minidom import Node REPLACEMENTS = dict() ARGUMENTS = None class CmpTuple(object): """Compare function between 2 tuple.""" def __call__(self, x, y): return cmp(x[0], y[0]) class CmpNode(object): """Compare function between 2 xml nodes.""" def __call__(self, x, y): def get_string(node): node_string = "node" node_string += node.nodeName if node.nodeValue: node_string += node.nodeValue if node.attributes: # We first sort by name, if present. 
node_string += node.getAttribute("Name") all_nodes = [] for (name, value) in node.attributes.items(): all_nodes.append((name, value)) all_nodes.sort(CmpTuple()) for (name, value) in all_nodes: node_string += name node_string += value return node_string return cmp(get_string(x), get_string(y)) def PrettyPrintNode(node, indent=0): if node.nodeType == Node.TEXT_NODE: if node.data.strip(): print '%s%s' % (' '*indent, node.data.strip()) return if node.childNodes: node.normalize() # Get the number of attributes attr_count = 0 if node.attributes: attr_count = node.attributes.length # Print the main tag if attr_count == 0: print '%s<%s>' % (' '*indent, node.nodeName) else: print '%s<%s' % (' '*indent, node.nodeName) all_attributes = [] for (name, value) in node.attributes.items(): all_attributes.append((name, value)) all_attributes.sort(CmpTuple()) for (name, value) in all_attributes: print '%s %s="%s"' % (' '*indent, name, value) print '%s>' % (' '*indent) if node.nodeValue: print '%s %s' % (' '*indent, node.nodeValue) for sub_node in node.childNodes: PrettyPrintNode(sub_node, indent=indent+2) print '%s</%s>' % (' '*indent, node.nodeName) def FlattenFilter(node): """Returns a list of all the node and sub nodes.""" node_list = [] if (node.attributes and node.getAttribute('Name') == '_excluded_files'): # We don't add the "_excluded_files" filter. return [] for current in node.childNodes: if current.nodeName == 'Filter': node_list.extend(FlattenFilter(current)) else: node_list.append(current) return node_list def FixFilenames(filenames, current_directory): new_list = [] for filename in filenames: if filename: for key in REPLACEMENTS: filename = filename.replace(key, REPLACEMENTS[key]) os.chdir(current_directory) filename = filename.strip('"\' ') if filename.startswith('$'): new_list.append(filename) else: new_list.append(os.path.abspath(filename)) return new_list def AbsoluteNode(node): """Makes all the properties we know about in this node absolute.""" if node.attributes: for (name, value) in node.attributes.items(): if name in ['InheritedPropertySheets', 'RelativePath', 'AdditionalIncludeDirectories', 'IntermediateDirectory', 'OutputDirectory', 'AdditionalLibraryDirectories']: # We want to fix up these paths path_list = value.split(';') new_list = FixFilenames(path_list, os.path.dirname(ARGUMENTS[1])) node.setAttribute(name, ';'.join(new_list)) if not value: node.removeAttribute(name) def CleanupVcproj(node): """For each sub node, we call recursively this function.""" for sub_node in node.childNodes: AbsoluteNode(sub_node) CleanupVcproj(sub_node) # Normalize the node, and remove all extranous whitespaces. for sub_node in node.childNodes: if sub_node.nodeType == Node.TEXT_NODE: sub_node.data = sub_node.data.replace("\r", "") sub_node.data = sub_node.data.replace("\n", "") sub_node.data = sub_node.data.rstrip() # Fix all the semicolon separated attributes to be sorted, and we also # remove the dups. if node.attributes: for (name, value) in node.attributes.items(): sorted_list = sorted(value.split(';')) unique_list = [] for i in sorted_list: if not unique_list.count(i): unique_list.append(i) node.setAttribute(name, ';'.join(unique_list)) if not value: node.removeAttribute(name) if node.childNodes: node.normalize() # For each node, take a copy, and remove it from the list. node_array = [] while node.childNodes and node.childNodes[0]: # Take a copy of the node and remove it from the list. 
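    # Detaching every child here lets 'Filter' wrappers be flattened and the
    # surviving nodes re-inserted in sorted order further down (empty 'Tool'
    # nodes and all 'UserMacro' nodes are dropped along the way).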
current = node.childNodes[0] node.removeChild(current) # If the child is a filter, we want to append all its children # to this same list. if current.nodeName == 'Filter': node_array.extend(FlattenFilter(current)) else: node_array.append(current) # Sort the list. node_array.sort(CmpNode()) # Insert the nodes in the correct order. for new_node in node_array: # But don't append empty tool node. if new_node.nodeName == 'Tool': if new_node.attributes and new_node.attributes.length == 1: # This one was empty. continue if new_node.nodeName == 'UserMacro': continue node.appendChild(new_node) def GetConfiguationNodes(vcproj): #TODO(nsylvain): Find a better way to navigate the xml. nodes = [] for node in vcproj.childNodes: if node.nodeName == "Configurations": for sub_node in node.childNodes: if sub_node.nodeName == "Configuration": nodes.append(sub_node) return nodes def GetChildrenVsprops(filename): dom = parse(filename) if dom.documentElement.attributes: vsprops = dom.documentElement.getAttribute('InheritedPropertySheets') return FixFilenames(vsprops.split(';'), os.path.dirname(filename)) return [] def SeekToNode(node1, child2): # A text node does not have properties. if child2.nodeType == Node.TEXT_NODE: return None # Get the name of the current node. current_name = child2.getAttribute("Name") if not current_name: # There is no name. We don't know how to merge. return None # Look through all the nodes to find a match. for sub_node in node1.childNodes: if sub_node.nodeName == child2.nodeName: name = sub_node.getAttribute("Name") if name == current_name: return sub_node # No match. We give up. return None def MergeAttributes(node1, node2): # No attributes to merge? if not node2.attributes: return for (name, value2) in node2.attributes.items(): # Don't merge the 'Name' attribute. if name == 'Name': continue value1 = node1.getAttribute(name) if value1: # The attribute exist in the main node. If it's equal, we leave it # untouched, otherwise we concatenate it. if value1 != value2: node1.setAttribute(name, ';'.join([value1, value2])) else: # The attribute does nto exist in the main node. We append this one. node1.setAttribute(name, value2) # If the attribute was a property sheet attributes, we remove it, since # they are useless. if name == 'InheritedPropertySheets': node1.removeAttribute(name) def MergeProperties(node1, node2): MergeAttributes(node1, node2) for child2 in node2.childNodes: child1 = SeekToNode(node1, child2) if child1: MergeProperties(child1, child2) else: node1.appendChild(child2.cloneNode(True)) def main(argv): """Main function of this vcproj prettifier.""" global ARGUMENTS ARGUMENTS = argv # check if we have exactly 1 parameter. if len(argv) < 2: print ('Usage: %s "c:\\path\\to\\vcproj.vcproj" [key1=value1] ' '[key2=value2]' % argv[0]) return 1 # Parse the keys for i in range(2, len(argv)): (key, value) = argv[i].split('=') REPLACEMENTS[key] = value # Open the vcproj and parse the xml. dom = parse(argv[1]) # First thing we need to do is find the Configuration Node and merge them # with the vsprops they include. for configuration_node in GetConfiguationNodes(dom.documentElement): # Get the property sheets associated with this configuration. vsprops = configuration_node.getAttribute('InheritedPropertySheets') # Fix the filenames to be absolute. vsprops_list = FixFilenames(vsprops.strip().split(';'), os.path.dirname(argv[1])) # Extend the list of vsprops with all vsprops contained in the current # vsprops. 
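    # Note that vsprops_list grows while it is being iterated, so property
    # sheets included by other property sheets are picked up transitively.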
for current_vsprops in vsprops_list: vsprops_list.extend(GetChildrenVsprops(current_vsprops)) # Now that we have all the vsprops, we need to merge them. for current_vsprops in vsprops_list: MergeProperties(configuration_node, parse(current_vsprops).documentElement) # Now that everything is merged, we need to cleanup the xml. CleanupVcproj(dom.documentElement) # Finally, we use the prett xml function to print the vcproj back to the # user. #print dom.toprettyxml(newl="\n") PrettyPrintNode(dom.documentElement) return 0 if __name__ == '__main__': sys.exit(main(sys.argv)) ����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/README������������������000644 �000766 �000024 �00000001505 12455173731 027721� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������pretty_vcproj: Usage: pretty_vcproj.py "c:\path\to\vcproj.vcproj" [key1=value1] [key2=value2] They key/value pair are used to resolve vsprops name. For example, if I want to diff the base.vcproj project: pretty_vcproj.py z:\dev\src-chrome\src\base\build\base.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > orignal.txt pretty_vcproj.py z:\dev\src-chrome\src\base\base_gyp.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > gyp.txt And you can use your favorite diff tool to see the changes. Note: In the case of base.vcproj, the original vcproj is one level up the generated one. I suggest you do a search and replace for '"..\' and replace it with '"' in original.txt before you perform the diff.�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/������������������000755 �000766 �000024 �00000000000 12456115117 030075� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/README������������000644 �000766 �000024 �00000000441 12455173731 030761� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Specifications contains syntax formatters for Xcode 3. These do not appear to be supported yet on Xcode 4. 
To use these with Xcode 3 please install both the gyp.pbfilespec and gyp.xclangspec files in ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ and restart Xcode.�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/���000755 �000766 �000024 �00000000000 12456115117 033040� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.pbfilespec������������000644 �000766 �000024 �00000001275 12455173731 035630� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* gyp.pbfilespec GYP source file spec for Xcode 3 There is not much documentation available regarding the format of .pbfilespec files. As a starting point, see for instance the outdated documentation at: http://maxao.free.fr/xcode-plugin-interface/specifications.html and the files in: /Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/ Place this file in directory: ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ */ ( { Identifier = sourcecode.gyp; BasedOn = sourcecode; Name = "GYP Files"; Extensions = ("gyp", "gypi"); MIMETypes = ("text/gyp"); Language = "xcode.lang.gyp"; IsTextFile = YES; IsSourceFile = YES; } ) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.xclangspec������������000644 �000766 �000024 �00000011740 12455173731 035641� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������/* Copyright (c) 2011 Google Inc. All rights reserved. Use of this source code is governed by a BSD-style license that can be found in the LICENSE file. gyp.xclangspec GYP language specification for Xcode 3 There is not much documentation available regarding the format of .xclangspec files. 
As a starting point, see for instance the outdated documentation at: http://maxao.free.fr/xcode-plugin-interface/specifications.html and the files in: /Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/ Place this file in directory: ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ */ ( { Identifier = "xcode.lang.gyp.keyword"; Syntax = { Words = ( "and", "or", "<!", "<", ); Type = "xcode.syntax.keyword"; }; }, { Identifier = "xcode.lang.gyp.target.declarator"; Syntax = { Words = ( "'target_name'", ); Type = "xcode.syntax.identifier.type"; }; }, { Identifier = "xcode.lang.gyp.string.singlequote"; Syntax = { IncludeRules = ( "xcode.lang.string", "xcode.lang.gyp.keyword", "xcode.lang.number", ); Start = "'"; End = "'"; }; }, { Identifier = "xcode.lang.gyp.comma"; Syntax = { Words = ( ",", ); }; }, { Identifier = "xcode.lang.gyp"; Description = "GYP Coloring"; BasedOn = "xcode.lang.simpleColoring"; IncludeInMenu = YES; Name = "GYP"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer.toplevel"; IncludeRules = ( "xcode.lang.gyp.dictionary", ); Type = "xcode.syntax.plain"; }; }, // The following rule returns tokens to the other rules { Identifier = "xcode.lang.gyp.lexer"; Syntax = { IncludeRules = ( "xcode.lang.gyp.comment", "xcode.lang.string", 'xcode.lang.gyp.targetname.declarator', "xcode.lang.gyp.string.singlequote", "xcode.lang.number", "xcode.lang.gyp.comma", ); }; }, { Identifier = "xcode.lang.gyp.lexer.toplevel"; Syntax = { IncludeRules = ( "xcode.lang.gyp.comment", ); }; }, { Identifier = "xcode.lang.gyp.assignment"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Rules = ( "xcode.lang.gyp.assignment.lhs", ":", "xcode.lang.gyp.assignment.rhs", ); }; }, { Identifier = "xcode.lang.gyp.target.declaration"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Rules = ( "xcode.lang.gyp.target.declarator", ":", "xcode.lang.gyp.target.name", ); }; }, { Identifier = "xcode.lang.gyp.target.name"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Rules = ( "xcode.lang.gyp.string.singlequote", ); Type = "xcode.syntax.definition.function"; }; }, { Identifier = "xcode.lang.gyp.assignment.lhs"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Rules = ( "xcode.lang.gyp.string.singlequote", ); Type = "xcode.syntax.identifier.type"; }; }, { Identifier = "xcode.lang.gyp.assignment.rhs"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Rules = ( "xcode.lang.gyp.string.singlequote?", "xcode.lang.gyp.array?", "xcode.lang.gyp.dictionary?", "xcode.lang.number?", ); }; }, { Identifier = "xcode.lang.gyp.dictionary"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Start = "{"; End = "}"; Foldable = YES; Recursive = YES; IncludeRules = ( "xcode.lang.gyp.target.declaration", "xcode.lang.gyp.assignment", ); }; }, { Identifier = "xcode.lang.gyp.array"; Syntax = { Tokenizer = "xcode.lang.gyp.lexer"; Start = "["; End = "]"; Foldable = YES; Recursive = YES; IncludeRules = ( "xcode.lang.gyp.array", "xcode.lang.gyp.dictionary", "xcode.lang.gyp.string.singlequote", ); }; }, { Identifier = "xcode.lang.gyp.todo.mark"; Syntax = { StartChars = "T"; Match = ( "^\(TODO\(.*\):[ \t]+.*\)$", // include "TODO: " in the markers list ); // This is the order of captures. All of the match strings above need the same order. 
CaptureTypes = ( "xcode.syntax.mark" ); Type = "xcode.syntax.comment"; }; }, { Identifier = "xcode.lang.gyp.comment"; BasedOn = "xcode.lang.comment"; // for text macros Syntax = { Start = "#"; End = "\n"; IncludeRules = ( "xcode.lang.url", "xcode.lang.url.mail", "xcode.lang.comment.mark", "xcode.lang.gyp.todo.mark", ); Type = "xcode.syntax.comment"; }; }, ) ��������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp-tests.el������000644 �000766 �000024 �00000004203 12455173731 032410� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;;; gyp-tests.el - unit tests for gyp-mode. ;; Copyright (c) 2012 Google Inc. All rights reserved. ;; Use of this source code is governed by a BSD-style license that can be ;; found in the LICENSE file. ;; The recommended way to run these tests is to run them from the command-line, ;; with the run-unit-tests.sh script. (require 'cl) (require 'ert) (require 'gyp) (defconst samples (directory-files "testdata" t ".gyp$") "List of golden samples to check") (defun fontify (filename) (with-temp-buffer (insert-file-contents-literally filename) (gyp-mode) (font-lock-fontify-buffer) (buffer-string))) (defun read-golden-sample (filename) (with-temp-buffer (insert-file-contents-literally (concat filename ".fontified")) (read (current-buffer)))) (defun equivalent-face (face) "For the purposes of face comparison, we're not interested in the differences between certain faces. For example, the difference between font-lock-comment-delimiter and font-lock-comment-face." 
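  ;; Collapse faces that should compare equal onto one representative; at
  ;; present only the comment-delimiter face is folded into font-lock-comment-face.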
(case face ((font-lock-comment-delimiter-face) font-lock-comment-face) (t face))) (defun text-face-properties (s) "Extract the text properties from s" (let ((result (list t))) (dotimes (i (length s)) (setq result (cons (equivalent-face (get-text-property i 'face s)) result))) (nreverse result))) (ert-deftest test-golden-samples () "Check that fontification produces the same results as the golden samples" (dolist (sample samples) (let ((golden (read-golden-sample sample)) (fontified (fontify sample))) (should (equal golden fontified)) (should (equal (text-face-properties golden) (text-face-properties fontified)))))) (defun create-golden-sample (filename) "Create a golden sample by fontifying filename and writing out the printable representation of the fontified buffer (with text properties) to the FILENAME.fontified" (with-temp-file (concat filename ".fontified") (print (fontify filename) (current-buffer)))) (defun create-golden-samples () "Recreate the golden samples" (dolist (sample samples) (create-golden-sample sample))) ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/gyp.el������������000644 �000766 �000024 �00000025323 12455173731 031256� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;;; gyp.el - font-lock-mode support for gyp files. ;; Copyright (c) 2012 Google Inc. All rights reserved. ;; Use of this source code is governed by a BSD-style license that can be ;; found in the LICENSE file. ;; Put this somewhere in your load-path and ;; (require 'gyp) (require 'python) (require 'cl) (when (string-match "python-mode.el" (symbol-file 'python-mode 'defun)) (error (concat "python-mode must be loaded from python.el (bundled with " "recent emacsen), not from the older and less maintained " "python-mode.el"))) (defadvice python-calculate-indentation (after ami-outdent-closing-parens activate) "De-indent closing parens, braces, and brackets in gyp-mode." (if (and (eq major-mode 'gyp-mode) (string-match "^ *[])}][],)}]* *$" (buffer-substring-no-properties (line-beginning-position) (line-end-position)))) (setq ad-return-value (- ad-return-value 2)))) (define-derived-mode gyp-mode python-mode "Gyp" "Major mode for editing .gyp files. See http://code.google.com/p/gyp/" ;; gyp-parse-history is a stack of (POSITION . PARSE-STATE) tuples, ;; with greater positions at the top of the stack. PARSE-STATE ;; is a list of section symbols (see gyp-section-name and gyp-parse-to) ;; with most nested section symbol at the front of the list. (set (make-local-variable 'gyp-parse-history) '((1 . (list)))) (gyp-add-font-lock-keywords)) (defun gyp-set-indentation () "Hook function to configure python indentation to suit gyp mode." (setq python-continuation-offset 2 python-indent 2 python-guess-indent nil)) (add-hook 'gyp-mode-hook 'gyp-set-indentation) (add-to-list 'auto-mode-alist '("\\.gyp\\'" . 
gyp-mode)) (add-to-list 'auto-mode-alist '("\\.gypi\\'" . gyp-mode)) (add-to-list 'auto-mode-alist '("/\\.gclient\\'" . gyp-mode)) ;;; Font-lock support (defconst gyp-dependencies-regexp (regexp-opt (list "dependencies" "export_dependent_settings")) "Regular expression to introduce 'dependencies' section") (defconst gyp-sources-regexp (regexp-opt (list "action" "files" "include_dirs" "includes" "inputs" "libraries" "outputs" "sources")) "Regular expression to introduce 'sources' sections") (defconst gyp-conditions-regexp (regexp-opt (list "conditions" "target_conditions")) "Regular expression to introduce conditions sections") (defconst gyp-variables-regexp "^variables" "Regular expression to introduce variables sections") (defconst gyp-defines-regexp "^defines" "Regular expression to introduce 'defines' sections") (defconst gyp-targets-regexp "^targets" "Regular expression to introduce 'targets' sections") (defun gyp-section-name (section) "Map the sections we are interested in from SECTION to symbol. SECTION is a string from the buffer that introduces a section. The result is a symbol representing the kind of section. This allows us to treat (for the purposes of font-lock) several different section names as the same kind of section. For example, a 'sources section can be introduced by the 'sources', 'inputs', 'outputs' keyword. 'other is the default section kind when a more specific match is not made." (cond ((string-match-p gyp-dependencies-regexp section) 'dependencies) ((string-match-p gyp-sources-regexp section) 'sources) ((string-match-p gyp-variables-regexp section) 'variables) ((string-match-p gyp-conditions-regexp section) 'conditions) ((string-match-p gyp-targets-regexp section) 'targets) ((string-match-p gyp-defines-regexp section) 'defines) (t 'other))) (defun gyp-invalidate-parse-states-after (target-point) "Erase any parse information after target-point." (while (> (caar gyp-parse-history) target-point) (setq gyp-parse-history (cdr gyp-parse-history)))) (defun gyp-parse-point () "The point of the last parse state added by gyp-parse-to." (caar gyp-parse-history)) (defun gyp-parse-sections () "A list of section symbols holding at the last parse state point." (cdar gyp-parse-history)) (defun gyp-inside-dictionary-p () "Predicate returning true if the parser is inside a dictionary." (not (eq (cadar gyp-parse-history) 'list))) (defun gyp-add-parse-history (point sections) "Add parse state SECTIONS to the parse history at POINT so that parsing can be resumed instantly." (while (>= (caar gyp-parse-history) point) (setq gyp-parse-history (cdr gyp-parse-history))) (setq gyp-parse-history (cons (cons point sections) gyp-parse-history))) (defun gyp-parse-to (target-point) "Parses from (point) to TARGET-POINT adding the parse state information to gyp-parse-state-history. Parsing stops if TARGET-POINT is reached or if a string literal has been parsed. Returns nil if no further parsing can be done, otherwise returns the position of the start of a parsed string, leaving the point at the end of the string." (let ((parsing t) string-start) (while parsing (setq string-start nil) ;; Parse up to a character that starts a sexp, or if the nesting ;; level decreases. 
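      ;; parse-partial-sexp is called with TARGETDEPTH -1 and STOPBEFORE t, so it
      ;; halts either just before a character that opens a string or sexp, or once
      ;; the nesting depth has dropped by one relative to where parsing started;
      ;; (nth 0 state) reports that relative depth.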
(let ((state (parse-partial-sexp (gyp-parse-point) target-point -1 t)) (sections (gyp-parse-sections))) (if (= (nth 0 state) -1) (setq sections (cdr sections)) ; pop out a level (cond ((looking-at-p "['\"]") ; a string (setq string-start (point)) (goto-char (scan-sexps (point) 1)) (if (gyp-inside-dictionary-p) ;; Look for sections inside a dictionary (let ((section (gyp-section-name (buffer-substring-no-properties (+ 1 string-start) (- (point) 1))))) (setq sections (cons section (cdr sections))))) ;; Stop after the string so it can be fontified. (setq target-point (point))) ((looking-at-p "{") ;; Inside a dictionary. Increase nesting. (forward-char 1) (setq sections (cons 'unknown sections))) ((looking-at-p "\\[") ;; Inside a list. Increase nesting (forward-char 1) (setq sections (cons 'list sections))) ((not (eobp)) ;; other (forward-char 1)))) (gyp-add-parse-history (point) sections) (setq parsing (< (point) target-point)))) string-start)) (defun gyp-section-at-point () "Transform the last parse state, which is a list of nested sections and return the section symbol that should be used to determine font-lock information for the string. Can return nil indicating the string should not have any attached section." (let ((sections (gyp-parse-sections))) (cond ((eq (car sections) 'conditions) ;; conditions can occur in a variables section, but we still want to ;; highlight it as a keyword. nil) ((and (eq (car sections) 'list) (eq (cadr sections) 'list)) ;; conditions and sources can have items in [[ ]] (caddr sections)) (t (cadr sections))))) (defun gyp-section-match (limit) "Parse from (point) to LIMIT returning by means of match data what was matched. The group of the match indicates what style font-lock should apply. See also `gyp-add-font-lock-keywords'." (gyp-invalidate-parse-states-after (point)) (let ((group nil) (string-start t)) (while (and (< (point) limit) (not group) string-start) (setq string-start (gyp-parse-to limit)) (if string-start (setq group (case (gyp-section-at-point) ('dependencies 1) ('variables 2) ('conditions 2) ('sources 3) ('defines 4) (nil nil))))) (if group (progn ;; Set the match data to indicate to the font-lock mechanism the ;; highlighting to be performed. (set-match-data (append (list string-start (point)) (make-list (* (1- group) 2) nil) (list (1+ string-start) (1- (point))))) t)))) ;;; Please see http://code.google.com/p/gyp/wiki/GypLanguageSpecification for ;;; canonical list of keywords. (defun gyp-add-font-lock-keywords () "Add gyp-mode keywords to font-lock mechanism." ;; TODO(jknotten): Move all the keyword highlighting into gyp-section-match ;; so that we can do the font-locking in a single font-lock pass. 
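  ;; The list handed to font-lock below mixes literal regexps (top-level keywords,
  ;; target types, target/action names, variable and command expansions) with the
  ;; gyp-section-match matcher, whose numbered match groups select the face for
  ;; strings inside particular sections.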
(font-lock-add-keywords nil (list ;; Top-level keywords (list (concat "['\"]\\(" (regexp-opt (list "action" "action_name" "actions" "cflags" "conditions" "configurations" "copies" "defines" "dependencies" "destination" "direct_dependent_settings" "export_dependent_settings" "extension" "files" "include_dirs" "includes" "inputs" "libraries" "link_settings" "mac_bundle" "message" "msvs_external_rule" "outputs" "product_name" "process_outputs_as_sources" "rules" "rule_name" "sources" "suppress_wildcard" "target_conditions" "target_defaults" "target_defines" "target_name" "toolsets" "targets" "type" "variables" "xcode_settings")) "[!/+=]?\\)") 1 'font-lock-keyword-face t) ;; Type of target (list (concat "['\"]\\(" (regexp-opt (list "loadable_module" "static_library" "shared_library" "executable" "none")) "\\)") 1 'font-lock-type-face t) (list "\\(?:target\\|action\\)_name['\"]\\s-*:\\s-*['\"]\\([^ '\"]*\\)" 1 'font-lock-function-name-face t) (list 'gyp-section-match (list 1 'font-lock-function-name-face t t) ; dependencies (list 2 'font-lock-variable-name-face t t) ; variables, conditions (list 3 'font-lock-constant-face t t) ; sources (list 4 'font-lock-preprocessor-face t t)) ; preprocessor ;; Variable expansion (list "<@?(\\([^\n )]+\\))" 1 'font-lock-variable-name-face t) ;; Command expansion (list "<!@?(\\([^\n )]+\\))" 1 'font-lock-variable-name-face t) ))) (provide 'gyp) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/README������������000644 �000766 �000024 �00000000632 12455173731 031011� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������How to install gyp-mode for emacs: Add the following to your ~/.emacs (replace ... with the path to your gyp checkout). (setq load-path (cons ".../tools/emacs" load-path)) (require 'gyp) Restart emacs (or eval-region the added lines) and you should be all set. Please note that ert is required for running the tests, which is included in Emacs 24, or available separately from https://github.com/ohler/ert ������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/run-unit-tests.sh�000755 �000766 �000024 �00000000462 12455173731 033412� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
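# Runs the gyp-mode test suite with ERT in batch mode; ert.el, gyp.el and
# gyp-tests.el are loaded relative to the current directory (or load-path),
# so the script is normally run from this directory.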
emacs --no-site-file --no-init-file --batch \ --load ert.el --load gyp.el --load gyp-tests.el \ -f ert-run-tests-batch-and-exit ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/���������000755 �000766 �000024 �00000000000 12456115117 031734� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp000644 �000766 �000024 �00000110453 12455173731 033545� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 The Chromium Authors. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. { 'variables': { 'chromium_code': 1, # Override to dynamically link the PulseAudio library. 'use_pulseaudio%': 0, # Override to dynamically link the cras (ChromeOS audio) library. 'use_cras%': 0, }, 'targets': [ { 'target_name': 'media', 'type': '<(component)', 'dependencies': [ 'yuv_convert', '../base/base.gyp:base', '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations', '../build/temp_gyp/googleurl.gyp:googleurl', '../crypto/crypto.gyp:crypto', '../third_party/openmax/openmax.gyp:il', '../ui/ui.gyp:ui', ], 'defines': [ 'MEDIA_IMPLEMENTATION', ], 'include_dirs': [ '..', ], 'sources': [ 'audio/android/audio_manager_android.cc', 'audio/android/audio_manager_android.h', 'audio/android/audio_track_output_android.cc', 'audio/android/audio_track_output_android.h', 'audio/android/opensles_input.cc', 'audio/android/opensles_input.h', 'audio/android/opensles_output.cc', 'audio/android/opensles_output.h', 'audio/async_socket_io_handler.h', 'audio/async_socket_io_handler_posix.cc', 'audio/async_socket_io_handler_win.cc', 'audio/audio_buffers_state.cc', 'audio/audio_buffers_state.h', 'audio/audio_io.h', 'audio/audio_input_controller.cc', 'audio/audio_input_controller.h', 'audio/audio_input_stream_impl.cc', 'audio/audio_input_stream_impl.h', 'audio/audio_device_name.cc', 'audio/audio_device_name.h', 'audio/audio_manager.cc', 'audio/audio_manager.h', 'audio/audio_manager_base.cc', 'audio/audio_manager_base.h', 'audio/audio_output_controller.cc', 'audio/audio_output_controller.h', 'audio/audio_output_dispatcher.cc', 'audio/audio_output_dispatcher.h', 'audio/audio_output_dispatcher_impl.cc', 'audio/audio_output_dispatcher_impl.h', 'audio/audio_output_mixer.cc', 'audio/audio_output_mixer.h', 'audio/audio_output_proxy.cc', 'audio/audio_output_proxy.h', 'audio/audio_parameters.cc', 'audio/audio_parameters.h', 'audio/audio_util.cc', 'audio/audio_util.h', 'audio/cross_process_notification.cc', 'audio/cross_process_notification.h', 'audio/cross_process_notification_win.cc', 
'audio/cross_process_notification_posix.cc', 'audio/fake_audio_input_stream.cc', 'audio/fake_audio_input_stream.h', 'audio/fake_audio_output_stream.cc', 'audio/fake_audio_output_stream.h', 'audio/linux/audio_manager_linux.cc', 'audio/linux/audio_manager_linux.h', 'audio/linux/alsa_input.cc', 'audio/linux/alsa_input.h', 'audio/linux/alsa_output.cc', 'audio/linux/alsa_output.h', 'audio/linux/alsa_util.cc', 'audio/linux/alsa_util.h', 'audio/linux/alsa_wrapper.cc', 'audio/linux/alsa_wrapper.h', 'audio/linux/cras_output.cc', 'audio/linux/cras_output.h', 'audio/openbsd/audio_manager_openbsd.cc', 'audio/openbsd/audio_manager_openbsd.h', 'audio/mac/audio_input_mac.cc', 'audio/mac/audio_input_mac.h', 'audio/mac/audio_low_latency_input_mac.cc', 'audio/mac/audio_low_latency_input_mac.h', 'audio/mac/audio_low_latency_output_mac.cc', 'audio/mac/audio_low_latency_output_mac.h', 'audio/mac/audio_manager_mac.cc', 'audio/mac/audio_manager_mac.h', 'audio/mac/audio_output_mac.cc', 'audio/mac/audio_output_mac.h', 'audio/null_audio_sink.cc', 'audio/null_audio_sink.h', 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', 'audio/sample_rates.cc', 'audio/sample_rates.h', 'audio/simple_sources.cc', 'audio/simple_sources.h', 'audio/win/audio_low_latency_input_win.cc', 'audio/win/audio_low_latency_input_win.h', 'audio/win/audio_low_latency_output_win.cc', 'audio/win/audio_low_latency_output_win.h', 'audio/win/audio_manager_win.cc', 'audio/win/audio_manager_win.h', 'audio/win/avrt_wrapper_win.cc', 'audio/win/avrt_wrapper_win.h', 'audio/win/device_enumeration_win.cc', 'audio/win/device_enumeration_win.h', 'audio/win/wavein_input_win.cc', 'audio/win/wavein_input_win.h', 'audio/win/waveout_output_win.cc', 'audio/win/waveout_output_win.h', 'base/android/media_jni_registrar.cc', 'base/android/media_jni_registrar.h', 'base/audio_decoder.cc', 'base/audio_decoder.h', 'base/audio_decoder_config.cc', 'base/audio_decoder_config.h', 'base/audio_renderer.h', 'base/audio_renderer_mixer.cc', 'base/audio_renderer_mixer.h', 'base/audio_renderer_mixer_input.cc', 'base/audio_renderer_mixer_input.h', 'base/bitstream_buffer.h', 'base/buffers.cc', 'base/buffers.h', 'base/byte_queue.cc', 'base/byte_queue.h', 'base/channel_layout.cc', 'base/channel_layout.h', 'base/clock.cc', 'base/clock.h', 'base/composite_filter.cc', 'base/composite_filter.h', 'base/data_buffer.cc', 'base/data_buffer.h', 'base/data_source.cc', 'base/data_source.h', 'base/decoder_buffer.cc', 'base/decoder_buffer.h', 'base/decrypt_config.cc', 'base/decrypt_config.h', 'base/decryptor.h', 'base/decryptor_client.h', 'base/demuxer.cc', 'base/demuxer.h', 'base/demuxer_stream.cc', 'base/demuxer_stream.h', 'base/djb2.cc', 'base/djb2.h', 'base/filter_collection.cc', 'base/filter_collection.h', 'base/filter_host.h', 'base/filters.cc', 'base/filters.h', 'base/h264_bitstream_converter.cc', 'base/h264_bitstream_converter.h', 'base/media.h', 'base/media_android.cc', 'base/media_export.h', 'base/media_log.cc', 'base/media_log.h', 'base/media_log_event.h', 'base/media_posix.cc', 'base/media_switches.cc', 'base/media_switches.h', 'base/media_win.cc', 'base/message_loop_factory.cc', 'base/message_loop_factory.h', 'base/pipeline.cc', 'base/pipeline.h', 'base/pipeline_status.cc', 'base/pipeline_status.h', 'base/ranges.cc', 'base/ranges.h', 'base/seekable_buffer.cc', 'base/seekable_buffer.h', 'base/state_matrix.cc', 'base/state_matrix.h', 'base/stream_parser.cc', 'base/stream_parser.h', 'base/stream_parser_buffer.cc', 'base/stream_parser_buffer.h', 'base/video_decoder.cc', 
'base/video_decoder.h', 'base/video_decoder_config.cc', 'base/video_decoder_config.h', 'base/video_frame.cc', 'base/video_frame.h', 'base/video_renderer.h', 'base/video_util.cc', 'base/video_util.h', 'crypto/aes_decryptor.cc', 'crypto/aes_decryptor.h', 'ffmpeg/ffmpeg_common.cc', 'ffmpeg/ffmpeg_common.h', 'ffmpeg/file_protocol.cc', 'ffmpeg/file_protocol.h', 'filters/audio_file_reader.cc', 'filters/audio_file_reader.h', 'filters/audio_renderer_algorithm.cc', 'filters/audio_renderer_algorithm.h', 'filters/audio_renderer_impl.cc', 'filters/audio_renderer_impl.h', 'filters/bitstream_converter.cc', 'filters/bitstream_converter.h', 'filters/chunk_demuxer.cc', 'filters/chunk_demuxer.h', 'filters/chunk_demuxer_client.h', 'filters/dummy_demuxer.cc', 'filters/dummy_demuxer.h', 'filters/ffmpeg_audio_decoder.cc', 'filters/ffmpeg_audio_decoder.h', 'filters/ffmpeg_demuxer.cc', 'filters/ffmpeg_demuxer.h', 'filters/ffmpeg_h264_bitstream_converter.cc', 'filters/ffmpeg_h264_bitstream_converter.h', 'filters/ffmpeg_glue.cc', 'filters/ffmpeg_glue.h', 'filters/ffmpeg_video_decoder.cc', 'filters/ffmpeg_video_decoder.h', 'filters/file_data_source.cc', 'filters/file_data_source.h', 'filters/gpu_video_decoder.cc', 'filters/gpu_video_decoder.h', 'filters/in_memory_url_protocol.cc', 'filters/in_memory_url_protocol.h', 'filters/source_buffer_stream.cc', 'filters/source_buffer_stream.h', 'filters/video_frame_generator.cc', 'filters/video_frame_generator.h', 'filters/video_renderer_base.cc', 'filters/video_renderer_base.h', 'video/capture/fake_video_capture_device.cc', 'video/capture/fake_video_capture_device.h', 'video/capture/linux/video_capture_device_linux.cc', 'video/capture/linux/video_capture_device_linux.h', 'video/capture/mac/video_capture_device_mac.h', 'video/capture/mac/video_capture_device_mac.mm', 'video/capture/mac/video_capture_device_qtkit_mac.h', 'video/capture/mac/video_capture_device_qtkit_mac.mm', 'video/capture/video_capture.h', 'video/capture/video_capture_device.h', 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', 'video/capture/video_capture_proxy.cc', 'video/capture/video_capture_proxy.h', 'video/capture/video_capture_types.h', 'video/capture/win/filter_base_win.cc', 'video/capture/win/filter_base_win.h', 'video/capture/win/pin_base_win.cc', 'video/capture/win/pin_base_win.h', 'video/capture/win/sink_filter_observer_win.h', 'video/capture/win/sink_filter_win.cc', 'video/capture/win/sink_filter_win.h', 'video/capture/win/sink_input_pin_win.cc', 'video/capture/win/sink_input_pin_win.h', 'video/capture/win/video_capture_device_win.cc', 'video/capture/win/video_capture_device_win.h', 'video/picture.cc', 'video/picture.h', 'video/video_decode_accelerator.cc', 'video/video_decode_accelerator.h', 'webm/webm_constants.h', 'webm/webm_cluster_parser.cc', 'webm/webm_cluster_parser.h', 'webm/webm_content_encodings.cc', 'webm/webm_content_encodings.h', 'webm/webm_content_encodings_client.cc', 'webm/webm_content_encodings_client.h', 'webm/webm_info_parser.cc', 'webm/webm_info_parser.h', 'webm/webm_parser.cc', 'webm/webm_parser.h', 'webm/webm_stream_parser.cc', 'webm/webm_stream_parser.h', 'webm/webm_tracks_parser.cc', 'webm/webm_tracks_parser.h', ], 'direct_dependent_settings': { 'include_dirs': [ '..', ], }, 'conditions': [ # Android doesn't use ffmpeg, so make the dependency conditional # and exclude the sources which depend on ffmpeg. 
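        # The first condition below adds the ffmpeg dependency on every platform
        # except Android; the matching 'OS == "android"' block then strips the
        # ffmpeg-dependent sources via 'sources!'.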
['OS != "android"', { 'dependencies': [ '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], }], ['OS == "android"', { 'sources!': [ 'base/media_posix.cc', 'ffmpeg/ffmpeg_common.cc', 'ffmpeg/ffmpeg_common.h', 'ffmpeg/file_protocol.cc', 'ffmpeg/file_protocol.h', 'filters/audio_file_reader.cc', 'filters/audio_file_reader.h', 'filters/bitstream_converter.cc', 'filters/bitstream_converter.h', 'filters/chunk_demuxer.cc', 'filters/chunk_demuxer.h', 'filters/chunk_demuxer_client.h', 'filters/ffmpeg_audio_decoder.cc', 'filters/ffmpeg_audio_decoder.h', 'filters/ffmpeg_demuxer.cc', 'filters/ffmpeg_demuxer.h', 'filters/ffmpeg_h264_bitstream_converter.cc', 'filters/ffmpeg_h264_bitstream_converter.h', 'filters/ffmpeg_glue.cc', 'filters/ffmpeg_glue.h', 'filters/ffmpeg_video_decoder.cc', 'filters/ffmpeg_video_decoder.h', 'filters/gpu_video_decoder.cc', 'filters/gpu_video_decoder.h', 'webm/webm_cluster_parser.cc', 'webm/webm_cluster_parser.h', 'webm/webm_stream_parser.cc', 'webm/webm_stream_parser.h', ], }], # The below 'android' condition were added temporarily and should be # removed in downstream, because there is no Java environment setup in # upstream yet. ['OS == "android"', { 'sources!':[ 'audio/android/audio_track_output_android.cc', ], 'sources':[ 'audio/android/audio_track_output_stub_android.cc', ], 'link_settings': { 'libraries': [ '-lOpenSLES', ], }, }], ['OS=="linux" or OS=="freebsd" or OS=="solaris"', { 'link_settings': { 'libraries': [ '-lasound', ], }, }], ['OS=="openbsd"', { 'sources/': [ ['exclude', '/alsa_' ], ['exclude', '/audio_manager_linux' ] ], 'link_settings': { 'libraries': [ ], }, }], ['OS!="openbsd"', { 'sources!': [ 'audio/openbsd/audio_manager_openbsd.cc', 'audio/openbsd/audio_manager_openbsd.h', ], }], ['OS=="linux"', { 'variables': { 'conditions': [ ['sysroot!=""', { 'pkg-config': '../build/linux/pkg-config-wrapper "<(sysroot)" "<(target_arch)"', }, { 'pkg-config': 'pkg-config' }], ], }, 'conditions': [ ['use_cras == 1', { 'cflags': [ '<!@(<(pkg-config) --cflags libcras)', ], 'link_settings': { 'libraries': [ '<!@(<(pkg-config) --libs libcras)', ], }, 'defines': [ 'USE_CRAS', ], }, { # else: use_cras == 0 'sources!': [ 'audio/linux/cras_output.cc', 'audio/linux/cras_output.h', ], }], ], }], ['os_posix == 1', { 'conditions': [ ['use_pulseaudio == 1', { 'cflags': [ '<!@(pkg-config --cflags libpulse)', ], 'link_settings': { 'libraries': [ '<!@(pkg-config --libs-only-l libpulse)', ], }, 'defines': [ 'USE_PULSEAUDIO', ], }, { # else: use_pulseaudio == 0 'sources!': [ 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', ], }], ], }], ['os_posix == 1 and OS != "android"', { # Video capture isn't supported in Android yet. 
'sources!': [ 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', ], }], ['OS=="mac"', { 'link_settings': { 'libraries': [ '$(SDKROOT)/System/Library/Frameworks/AudioUnit.framework', '$(SDKROOT)/System/Library/Frameworks/AudioToolbox.framework', '$(SDKROOT)/System/Library/Frameworks/CoreAudio.framework', '$(SDKROOT)/System/Library/Frameworks/CoreVideo.framework', '$(SDKROOT)/System/Library/Frameworks/QTKit.framework', ], }, }], ['OS=="win"', { 'sources!': [ 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', ], }], ['proprietary_codecs==1 or branding=="Chrome"', { 'sources': [ 'mp4/avc.cc', 'mp4/avc.h', 'mp4/box_definitions.cc', 'mp4/box_definitions.h', 'mp4/box_reader.cc', 'mp4/box_reader.h', 'mp4/cenc.cc', 'mp4/cenc.h', 'mp4/mp4_stream_parser.cc', 'mp4/mp4_stream_parser.h', 'mp4/offset_byte_queue.cc', 'mp4/offset_byte_queue.h', 'mp4/track_run_iterator.cc', 'mp4/track_run_iterator.h', ], }], ], }, { 'target_name': 'yuv_convert', 'type': 'static_library', 'include_dirs': [ '..', ], 'conditions': [ ['order_profiling != 0', { 'target_conditions' : [ ['_toolset=="target"', { 'cflags!': [ '-finstrument-functions' ], }], ], }], [ 'target_arch == "ia32" or target_arch == "x64"', { 'dependencies': [ 'yuv_convert_simd_x86', ], }], [ 'target_arch == "arm"', { 'dependencies': [ 'yuv_convert_simd_arm', ], }], ], 'sources': [ 'base/yuv_convert.cc', 'base/yuv_convert.h', ], }, { 'target_name': 'yuv_convert_simd_x86', 'type': 'static_library', 'include_dirs': [ '..', ], 'sources': [ 'base/simd/convert_rgb_to_yuv_c.cc', 'base/simd/convert_rgb_to_yuv_sse2.cc', 'base/simd/convert_rgb_to_yuv_ssse3.asm', 'base/simd/convert_rgb_to_yuv_ssse3.cc', 'base/simd/convert_rgb_to_yuv_ssse3.inc', 'base/simd/convert_yuv_to_rgb_c.cc', 'base/simd/convert_yuv_to_rgb_x86.cc', 'base/simd/convert_yuv_to_rgb_mmx.asm', 'base/simd/convert_yuv_to_rgb_mmx.inc', 'base/simd/convert_yuv_to_rgb_sse.asm', 'base/simd/filter_yuv.h', 'base/simd/filter_yuv_c.cc', 'base/simd/filter_yuv_mmx.cc', 'base/simd/filter_yuv_sse2.cc', 'base/simd/linear_scale_yuv_to_rgb_mmx.asm', 'base/simd/linear_scale_yuv_to_rgb_mmx.inc', 'base/simd/linear_scale_yuv_to_rgb_sse.asm', 'base/simd/scale_yuv_to_rgb_mmx.asm', 'base/simd/scale_yuv_to_rgb_mmx.inc', 'base/simd/scale_yuv_to_rgb_sse.asm', 'base/simd/yuv_to_rgb_table.cc', 'base/simd/yuv_to_rgb_table.h', ], 'conditions': [ ['order_profiling != 0', { 'target_conditions' : [ ['_toolset=="target"', { 'cflags!': [ '-finstrument-functions' ], }], ], }], [ 'target_arch == "x64"', { # Source files optimized for X64 systems. 'sources': [ 'base/simd/linear_scale_yuv_to_rgb_mmx_x64.asm', 'base/simd/scale_yuv_to_rgb_sse2_x64.asm', ], }], [ 'os_posix == 1 and OS != "mac" and OS != "android"', { 'cflags': [ '-msse2', ], }], [ 'OS == "mac"', { 'configurations': { 'Debug': { 'xcode_settings': { # gcc on the mac builds horribly unoptimized sse code in debug # mode. Since this is rarely going to be debugged, run with full # optimizations in Debug as well as Release. 
'GCC_OPTIMIZATION_LEVEL': '3', # -O3 }, }, }, }], [ 'OS=="win"', { 'variables': { 'yasm_flags': [ '-DWIN32', '-DMSVC', '-DCHROMIUM', '-Isimd', ], }, }], [ 'OS=="mac"', { 'variables': { 'yasm_flags': [ '-DPREFIX', '-DMACHO', '-DCHROMIUM', '-Isimd', ], }, }], [ 'os_posix==1 and OS!="mac"', { 'variables': { 'conditions': [ [ 'target_arch=="ia32"', { 'yasm_flags': [ '-DX86_32', '-DELF', '-DCHROMIUM', '-Isimd', ], }, { 'yasm_flags': [ '-DARCH_X86_64', '-DELF', '-DPIC', '-DCHROMIUM', '-Isimd', ], }], ], }, }], ], 'variables': { 'yasm_output_path': '<(SHARED_INTERMEDIATE_DIR)/media', }, 'msvs_2010_disable_uldi_when_referenced': 1, 'includes': [ '../third_party/yasm/yasm_compile.gypi', ], }, { 'target_name': 'yuv_convert_simd_arm', 'type': 'static_library', 'include_dirs': [ '..', ], 'sources': [ 'base/simd/convert_rgb_to_yuv_c.cc', 'base/simd/convert_rgb_to_yuv.h', 'base/simd/convert_yuv_to_rgb_c.cc', 'base/simd/convert_yuv_to_rgb.h', 'base/simd/filter_yuv.h', 'base/simd/filter_yuv_c.cc', 'base/simd/yuv_to_rgb_table.cc', 'base/simd/yuv_to_rgb_table.h', ], }, { 'target_name': 'media_unittests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', 'yuv_convert', '../base/base.gyp:base', '../base/base.gyp:base_i18n', '../base/base.gyp:test_support_base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', '../ui/ui.gyp:ui', ], 'sources': [ 'audio/async_socket_io_handler_unittest.cc', 'audio/audio_input_controller_unittest.cc', 'audio/audio_input_device_unittest.cc', 'audio/audio_input_unittest.cc', 'audio/audio_input_volume_unittest.cc', 'audio/audio_low_latency_input_output_unittest.cc', 'audio/audio_output_controller_unittest.cc', 'audio/audio_output_proxy_unittest.cc', 'audio/audio_parameters_unittest.cc', 'audio/audio_util_unittest.cc', 'audio/cross_process_notification_unittest.cc', 'audio/linux/alsa_output_unittest.cc', 'audio/mac/audio_low_latency_input_mac_unittest.cc', 'audio/mac/audio_output_mac_unittest.cc', 'audio/simple_sources_unittest.cc', 'audio/win/audio_low_latency_input_win_unittest.cc', 'audio/win/audio_low_latency_output_win_unittest.cc', 'audio/win/audio_output_win_unittest.cc', 'base/audio_renderer_mixer_unittest.cc', 'base/audio_renderer_mixer_input_unittest.cc', 'base/buffers_unittest.cc', 'base/clock_unittest.cc', 'base/composite_filter_unittest.cc', 'base/data_buffer_unittest.cc', 'base/decoder_buffer_unittest.cc', 'base/djb2_unittest.cc', 'base/fake_audio_render_callback.cc', 'base/fake_audio_render_callback.h', 'base/filter_collection_unittest.cc', 'base/h264_bitstream_converter_unittest.cc', 'base/pipeline_unittest.cc', 'base/ranges_unittest.cc', 'base/run_all_unittests.cc', 'base/seekable_buffer_unittest.cc', 'base/state_matrix_unittest.cc', 'base/test_data_util.cc', 'base/test_data_util.h', 'base/video_frame_unittest.cc', 'base/video_util_unittest.cc', 'base/yuv_convert_unittest.cc', 'crypto/aes_decryptor_unittest.cc', 'ffmpeg/ffmpeg_common_unittest.cc', 'filters/audio_renderer_algorithm_unittest.cc', 'filters/audio_renderer_impl_unittest.cc', 'filters/bitstream_converter_unittest.cc', 'filters/chunk_demuxer_unittest.cc', 'filters/ffmpeg_audio_decoder_unittest.cc', 'filters/ffmpeg_decoder_unittest.h', 'filters/ffmpeg_demuxer_unittest.cc', 'filters/ffmpeg_glue_unittest.cc', 'filters/ffmpeg_h264_bitstream_converter_unittest.cc', 'filters/ffmpeg_video_decoder_unittest.cc', 'filters/file_data_source_unittest.cc', 'filters/pipeline_integration_test.cc', 'filters/pipeline_integration_test_base.cc', 'filters/source_buffer_stream_unittest.cc', 
'filters/video_renderer_base_unittest.cc', 'video/capture/video_capture_device_unittest.cc', 'webm/cluster_builder.cc', 'webm/cluster_builder.h', 'webm/webm_cluster_parser_unittest.cc', 'webm/webm_content_encodings_client_unittest.cc', 'webm/webm_parser_unittest.cc', ], 'conditions': [ ['os_posix==1 and OS!="mac"', { 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ['OS != "android"', { 'dependencies': [ '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], }], ['OS == "android"', { 'sources!': [ 'audio/audio_input_volume_unittest.cc', 'base/test_data_util.cc', 'base/test_data_util.h', 'ffmpeg/ffmpeg_common_unittest.cc', 'filters/ffmpeg_audio_decoder_unittest.cc', 'filters/bitstream_converter_unittest.cc', 'filters/chunk_demuxer_unittest.cc', 'filters/ffmpeg_demuxer_unittest.cc', 'filters/ffmpeg_glue_unittest.cc', 'filters/ffmpeg_h264_bitstream_converter_unittest.cc', 'filters/ffmpeg_video_decoder_unittest.cc', 'filters/pipeline_integration_test.cc', 'filters/pipeline_integration_test_base.cc', 'mp4/mp4_stream_parser_unittest.cc', 'webm/webm_cluster_parser_unittest.cc', ], }], ['OS == "linux"', { 'conditions': [ ['use_cras == 1', { 'sources': [ 'audio/linux/cras_output_unittest.cc', ], 'defines': [ 'USE_CRAS', ], }], ], }], [ 'target_arch=="ia32" or target_arch=="x64"', { 'sources': [ 'base/simd/convert_rgb_to_yuv_unittest.cc', ], }], ['proprietary_codecs==1 or branding=="Chrome"', { 'sources': [ 'mp4/avc_unittest.cc', 'mp4/box_reader_unittest.cc', 'mp4/mp4_stream_parser_unittest.cc', 'mp4/offset_byte_queue_unittest.cc', ], }], ], }, { 'target_name': 'media_test_support', 'type': 'static_library', 'dependencies': [ 'media', '../base/base.gyp:base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', ], 'sources': [ 'audio/test_audio_input_controller_factory.cc', 'audio/test_audio_input_controller_factory.h', 'base/mock_callback.cc', 'base/mock_callback.h', 'base/mock_data_source_host.cc', 'base/mock_data_source_host.h', 'base/mock_demuxer_host.cc', 'base/mock_demuxer_host.h', 'base/mock_filter_host.cc', 'base/mock_filter_host.h', 'base/mock_filters.cc', 'base/mock_filters.h', ], }, { 'target_name': 'scaler_bench', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../skia/skia.gyp:skia', ], 'sources': [ 'tools/scaler_bench/scaler_bench.cc', ], }, { 'target_name': 'qt_faststart', 'type': 'executable', 'sources': [ 'tools/qt_faststart/qt_faststart.c' ], }, { 'target_name': 'seek_tester', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', ], 'sources': [ 'tools/seek_tester/seek_tester.cc', ], }, ], 'conditions': [ ['OS=="win"', { 'targets': [ { 'target_name': 'player_wtl', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations', '../ui/ui.gyp:ui', ], 'include_dirs': [ '<(DEPTH)/third_party/wtl/include', ], 'sources': [ 'tools/player_wtl/list.h', 'tools/player_wtl/mainfrm.h', 'tools/player_wtl/movie.cc', 'tools/player_wtl/movie.h', 'tools/player_wtl/player_wtl.cc', 'tools/player_wtl/player_wtl.rc', 'tools/player_wtl/props.h', 'tools/player_wtl/seek.h', 'tools/player_wtl/resource.h', 'tools/player_wtl/view.h', ], 'msvs_settings': { 'VCLinkerTool': { 'SubSystem': '2', # Set /SUBSYSTEM:WINDOWS }, }, 'defines': [ '_CRT_SECURE_NO_WARNINGS=1', ], }, ], }], ['OS == "win" or toolkit_uses_gtk == 1', { 'targets': [ { 'target_name': 'shader_bench', 'type': 
'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'sources': [ 'tools/shader_bench/shader_bench.cc', 'tools/shader_bench/cpu_color_painter.cc', 'tools/shader_bench/cpu_color_painter.h', 'tools/shader_bench/gpu_color_painter.cc', 'tools/shader_bench/gpu_color_painter.h', 'tools/shader_bench/gpu_painter.cc', 'tools/shader_bench/gpu_painter.h', 'tools/shader_bench/painter.cc', 'tools/shader_bench/painter.h', 'tools/shader_bench/window.cc', 'tools/shader_bench/window.h', ], 'conditions': [ ['toolkit_uses_gtk == 1', { 'dependencies': [ '../build/linux/system.gyp:gtk', ], 'sources': [ 'tools/shader_bench/window_linux.cc', ], }], ['OS=="win"', { 'dependencies': [ '../third_party/angle/src/build_angle.gyp:libEGL', '../third_party/angle/src/build_angle.gyp:libGLESv2', ], 'sources': [ 'tools/shader_bench/window_win.cc', ], }], ], }, ], }], ['OS == "linux" and target_arch != "arm"', { 'targets': [ { 'target_name': 'tile_render_bench', 'type': 'executable', 'dependencies': [ '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'libraries': [ '-lGL', '-ldl', ], 'sources': [ 'tools/tile_render_bench/tile_render_bench.cc', ], }, ], }], ['os_posix == 1 and OS != "mac" and OS != "android"', { 'targets': [ { 'target_name': 'player_x11', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'link_settings': { 'libraries': [ '-ldl', '-lX11', '-lXrender', '-lXext', ], }, 'sources': [ 'tools/player_x11/data_source_logger.cc', 'tools/player_x11/data_source_logger.h', 'tools/player_x11/gl_video_renderer.cc', 'tools/player_x11/gl_video_renderer.h', 'tools/player_x11/player_x11.cc', 'tools/player_x11/x11_video_renderer.cc', 'tools/player_x11/x11_video_renderer.h', ], }, ], }], ['OS == "android"', { 'targets': [ { 'target_name': 'player_android', 'type': 'static_library', 'sources': [ 'base/android/media_player_bridge.cc', 'base/android/media_player_bridge.h', ], 'dependencies': [ '../base/base.gyp:base', ], 'include_dirs': [ '<(SHARED_INTERMEDIATE_DIR)/media', ], 'actions': [ { 'action_name': 'generate-jni-headers', 'inputs': [ '../base/android/jni_generator/jni_generator.py', 'base/android/java/src/org/chromium/media/MediaPlayerListener.java', ], 'outputs': [ '<(SHARED_INTERMEDIATE_DIR)/media/jni/media_player_listener_jni.h', ], 'action': [ 'python', '<(DEPTH)/base/android/jni_generator/jni_generator.py', '-o', '<@(_inputs)', '<@(_outputs)', ], }, ], }, { 'target_name': 'media_java', 'type': 'none', 'dependencies': [ '../base/base.gyp:base_java' ], 'variables': { 'package_name': 'media', 'java_in_dir': 'base/android/java', }, 'includes': [ '../build/java.gypi' ], }, ], }, { # OS != "android"' # Android does not use ffmpeg, so disable the targets which require it. 
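      # The targets in this branch (ffmpeg_unittests, ffmpeg_regression_tests,
      # ffmpeg_tests, media_bench) all link against ffmpeg, so they are only
      # generated on non-Android platforms.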
'targets': [ { 'target_name': 'ffmpeg_unittests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', '../base/base.gyp:base', '../base/base.gyp:base_i18n', '../base/base.gyp:test_support_base', '../base/base.gyp:test_support_perf', '../testing/gtest.gyp:gtest', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'ffmpeg/ffmpeg_unittest.cc', ], 'conditions': [ ['toolkit_uses_gtk == 1', { 'dependencies': [ # Needed for the following #include chain: # base/run_all_unittests.cc # ../base/test_suite.h # gtk/gtk.h '../build/linux/system.gyp:gtk', ], 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ], }, { 'target_name': 'ffmpeg_regression_tests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', '../base/base.gyp:test_support_base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'base/test_data_util.cc', 'base/run_all_unittests.cc', 'ffmpeg/ffmpeg_regression_tests.cc', 'filters/pipeline_integration_test_base.cc', ], 'conditions': [ ['os_posix==1 and OS!="mac"', { 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ], }, { 'target_name': 'ffmpeg_tests', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'test/ffmpeg_tests/ffmpeg_tests.cc', ], }, { 'target_name': 'media_bench', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'tools/media_bench/media_bench.cc', ], }, ], }] ], } lib/node_modules/npm/node_modules/node-gyp/gyp/tools/emacs/testdata/media.gyp.fontified000644 000766 000024 00000476046 12455173731 035451 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64 #("# Copyright (c) 2012 The Chromium Authors. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. { 'variables': { 'chromium_code': 1, # Override to dynamically link the PulseAudio library. 'use_pulseaudio%': 0, # Override to dynamically link the cras (ChromeOS audio) library. 
'use_cras%': 0, }, 'targets': [ { 'target_name': 'media', 'type': '<(component)', 'dependencies': [ 'yuv_convert', '../base/base.gyp:base', '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations', '../build/temp_gyp/googleurl.gyp:googleurl', '../crypto/crypto.gyp:crypto', '../third_party/openmax/openmax.gyp:il', '../ui/ui.gyp:ui', ], 'defines': [ 'MEDIA_IMPLEMENTATION', ], 'include_dirs': [ '..', ], 'sources': [ 'audio/android/audio_manager_android.cc', 'audio/android/audio_manager_android.h', 'audio/android/audio_track_output_android.cc', 'audio/android/audio_track_output_android.h', 'audio/android/opensles_input.cc', 'audio/android/opensles_input.h', 'audio/android/opensles_output.cc', 'audio/android/opensles_output.h', 'audio/async_socket_io_handler.h', 'audio/async_socket_io_handler_posix.cc', 'audio/async_socket_io_handler_win.cc', 'audio/audio_buffers_state.cc', 'audio/audio_buffers_state.h', 'audio/audio_io.h', 'audio/audio_input_controller.cc', 'audio/audio_input_controller.h', 'audio/audio_input_stream_impl.cc', 'audio/audio_input_stream_impl.h', 'audio/audio_device_name.cc', 'audio/audio_device_name.h', 'audio/audio_manager.cc', 'audio/audio_manager.h', 'audio/audio_manager_base.cc', 'audio/audio_manager_base.h', 'audio/audio_output_controller.cc', 'audio/audio_output_controller.h', 'audio/audio_output_dispatcher.cc', 'audio/audio_output_dispatcher.h', 'audio/audio_output_dispatcher_impl.cc', 'audio/audio_output_dispatcher_impl.h', 'audio/audio_output_mixer.cc', 'audio/audio_output_mixer.h', 'audio/audio_output_proxy.cc', 'audio/audio_output_proxy.h', 'audio/audio_parameters.cc', 'audio/audio_parameters.h', 'audio/audio_util.cc', 'audio/audio_util.h', 'audio/cross_process_notification.cc', 'audio/cross_process_notification.h', 'audio/cross_process_notification_win.cc', 'audio/cross_process_notification_posix.cc', 'audio/fake_audio_input_stream.cc', 'audio/fake_audio_input_stream.h', 'audio/fake_audio_output_stream.cc', 'audio/fake_audio_output_stream.h', 'audio/linux/audio_manager_linux.cc', 'audio/linux/audio_manager_linux.h', 'audio/linux/alsa_input.cc', 'audio/linux/alsa_input.h', 'audio/linux/alsa_output.cc', 'audio/linux/alsa_output.h', 'audio/linux/alsa_util.cc', 'audio/linux/alsa_util.h', 'audio/linux/alsa_wrapper.cc', 'audio/linux/alsa_wrapper.h', 'audio/linux/cras_output.cc', 'audio/linux/cras_output.h', 'audio/openbsd/audio_manager_openbsd.cc', 'audio/openbsd/audio_manager_openbsd.h', 'audio/mac/audio_input_mac.cc', 'audio/mac/audio_input_mac.h', 'audio/mac/audio_low_latency_input_mac.cc', 'audio/mac/audio_low_latency_input_mac.h', 'audio/mac/audio_low_latency_output_mac.cc', 'audio/mac/audio_low_latency_output_mac.h', 'audio/mac/audio_manager_mac.cc', 'audio/mac/audio_manager_mac.h', 'audio/mac/audio_output_mac.cc', 'audio/mac/audio_output_mac.h', 'audio/null_audio_sink.cc', 'audio/null_audio_sink.h', 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', 'audio/sample_rates.cc', 'audio/sample_rates.h', 'audio/simple_sources.cc', 'audio/simple_sources.h', 'audio/win/audio_low_latency_input_win.cc', 'audio/win/audio_low_latency_input_win.h', 'audio/win/audio_low_latency_output_win.cc', 'audio/win/audio_low_latency_output_win.h', 'audio/win/audio_manager_win.cc', 'audio/win/audio_manager_win.h', 'audio/win/avrt_wrapper_win.cc', 'audio/win/avrt_wrapper_win.h', 'audio/win/device_enumeration_win.cc', 'audio/win/device_enumeration_win.h', 'audio/win/wavein_input_win.cc', 'audio/win/wavein_input_win.h', 'audio/win/waveout_output_win.cc', 
'audio/win/waveout_output_win.h', 'base/android/media_jni_registrar.cc', 'base/android/media_jni_registrar.h', 'base/audio_decoder.cc', 'base/audio_decoder.h', 'base/audio_decoder_config.cc', 'base/audio_decoder_config.h', 'base/audio_renderer.h', 'base/audio_renderer_mixer.cc', 'base/audio_renderer_mixer.h', 'base/audio_renderer_mixer_input.cc', 'base/audio_renderer_mixer_input.h', 'base/bitstream_buffer.h', 'base/buffers.cc', 'base/buffers.h', 'base/byte_queue.cc', 'base/byte_queue.h', 'base/channel_layout.cc', 'base/channel_layout.h', 'base/clock.cc', 'base/clock.h', 'base/composite_filter.cc', 'base/composite_filter.h', 'base/data_buffer.cc', 'base/data_buffer.h', 'base/data_source.cc', 'base/data_source.h', 'base/decoder_buffer.cc', 'base/decoder_buffer.h', 'base/decrypt_config.cc', 'base/decrypt_config.h', 'base/decryptor.h', 'base/decryptor_client.h', 'base/demuxer.cc', 'base/demuxer.h', 'base/demuxer_stream.cc', 'base/demuxer_stream.h', 'base/djb2.cc', 'base/djb2.h', 'base/filter_collection.cc', 'base/filter_collection.h', 'base/filter_host.h', 'base/filters.cc', 'base/filters.h', 'base/h264_bitstream_converter.cc', 'base/h264_bitstream_converter.h', 'base/media.h', 'base/media_android.cc', 'base/media_export.h', 'base/media_log.cc', 'base/media_log.h', 'base/media_log_event.h', 'base/media_posix.cc', 'base/media_switches.cc', 'base/media_switches.h', 'base/media_win.cc', 'base/message_loop_factory.cc', 'base/message_loop_factory.h', 'base/pipeline.cc', 'base/pipeline.h', 'base/pipeline_status.cc', 'base/pipeline_status.h', 'base/ranges.cc', 'base/ranges.h', 'base/seekable_buffer.cc', 'base/seekable_buffer.h', 'base/state_matrix.cc', 'base/state_matrix.h', 'base/stream_parser.cc', 'base/stream_parser.h', 'base/stream_parser_buffer.cc', 'base/stream_parser_buffer.h', 'base/video_decoder.cc', 'base/video_decoder.h', 'base/video_decoder_config.cc', 'base/video_decoder_config.h', 'base/video_frame.cc', 'base/video_frame.h', 'base/video_renderer.h', 'base/video_util.cc', 'base/video_util.h', 'crypto/aes_decryptor.cc', 'crypto/aes_decryptor.h', 'ffmpeg/ffmpeg_common.cc', 'ffmpeg/ffmpeg_common.h', 'ffmpeg/file_protocol.cc', 'ffmpeg/file_protocol.h', 'filters/audio_file_reader.cc', 'filters/audio_file_reader.h', 'filters/audio_renderer_algorithm.cc', 'filters/audio_renderer_algorithm.h', 'filters/audio_renderer_impl.cc', 'filters/audio_renderer_impl.h', 'filters/bitstream_converter.cc', 'filters/bitstream_converter.h', 'filters/chunk_demuxer.cc', 'filters/chunk_demuxer.h', 'filters/chunk_demuxer_client.h', 'filters/dummy_demuxer.cc', 'filters/dummy_demuxer.h', 'filters/ffmpeg_audio_decoder.cc', 'filters/ffmpeg_audio_decoder.h', 'filters/ffmpeg_demuxer.cc', 'filters/ffmpeg_demuxer.h', 'filters/ffmpeg_h264_bitstream_converter.cc', 'filters/ffmpeg_h264_bitstream_converter.h', 'filters/ffmpeg_glue.cc', 'filters/ffmpeg_glue.h', 'filters/ffmpeg_video_decoder.cc', 'filters/ffmpeg_video_decoder.h', 'filters/file_data_source.cc', 'filters/file_data_source.h', 'filters/gpu_video_decoder.cc', 'filters/gpu_video_decoder.h', 'filters/in_memory_url_protocol.cc', 'filters/in_memory_url_protocol.h', 'filters/source_buffer_stream.cc', 'filters/source_buffer_stream.h', 'filters/video_frame_generator.cc', 'filters/video_frame_generator.h', 'filters/video_renderer_base.cc', 'filters/video_renderer_base.h', 'video/capture/fake_video_capture_device.cc', 'video/capture/fake_video_capture_device.h', 'video/capture/linux/video_capture_device_linux.cc', 'video/capture/linux/video_capture_device_linux.h', 
'video/capture/mac/video_capture_device_mac.h', 'video/capture/mac/video_capture_device_mac.mm', 'video/capture/mac/video_capture_device_qtkit_mac.h', 'video/capture/mac/video_capture_device_qtkit_mac.mm', 'video/capture/video_capture.h', 'video/capture/video_capture_device.h', 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', 'video/capture/video_capture_proxy.cc', 'video/capture/video_capture_proxy.h', 'video/capture/video_capture_types.h', 'video/capture/win/filter_base_win.cc', 'video/capture/win/filter_base_win.h', 'video/capture/win/pin_base_win.cc', 'video/capture/win/pin_base_win.h', 'video/capture/win/sink_filter_observer_win.h', 'video/capture/win/sink_filter_win.cc', 'video/capture/win/sink_filter_win.h', 'video/capture/win/sink_input_pin_win.cc', 'video/capture/win/sink_input_pin_win.h', 'video/capture/win/video_capture_device_win.cc', 'video/capture/win/video_capture_device_win.h', 'video/picture.cc', 'video/picture.h', 'video/video_decode_accelerator.cc', 'video/video_decode_accelerator.h', 'webm/webm_constants.h', 'webm/webm_cluster_parser.cc', 'webm/webm_cluster_parser.h', 'webm/webm_content_encodings.cc', 'webm/webm_content_encodings.h', 'webm/webm_content_encodings_client.cc', 'webm/webm_content_encodings_client.h', 'webm/webm_info_parser.cc', 'webm/webm_info_parser.h', 'webm/webm_parser.cc', 'webm/webm_parser.h', 'webm/webm_stream_parser.cc', 'webm/webm_stream_parser.h', 'webm/webm_tracks_parser.cc', 'webm/webm_tracks_parser.h', ], 'direct_dependent_settings': { 'include_dirs': [ '..', ], }, 'conditions': [ # Android doesn't use ffmpeg, so make the dependency conditional # and exclude the sources which depend on ffmpeg. ['OS != \"android\"', { 'dependencies': [ '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], }], ['OS == \"android\"', { 'sources!': [ 'base/media_posix.cc', 'ffmpeg/ffmpeg_common.cc', 'ffmpeg/ffmpeg_common.h', 'ffmpeg/file_protocol.cc', 'ffmpeg/file_protocol.h', 'filters/audio_file_reader.cc', 'filters/audio_file_reader.h', 'filters/bitstream_converter.cc', 'filters/bitstream_converter.h', 'filters/chunk_demuxer.cc', 'filters/chunk_demuxer.h', 'filters/chunk_demuxer_client.h', 'filters/ffmpeg_audio_decoder.cc', 'filters/ffmpeg_audio_decoder.h', 'filters/ffmpeg_demuxer.cc', 'filters/ffmpeg_demuxer.h', 'filters/ffmpeg_h264_bitstream_converter.cc', 'filters/ffmpeg_h264_bitstream_converter.h', 'filters/ffmpeg_glue.cc', 'filters/ffmpeg_glue.h', 'filters/ffmpeg_video_decoder.cc', 'filters/ffmpeg_video_decoder.h', 'filters/gpu_video_decoder.cc', 'filters/gpu_video_decoder.h', 'webm/webm_cluster_parser.cc', 'webm/webm_cluster_parser.h', 'webm/webm_stream_parser.cc', 'webm/webm_stream_parser.h', ], }], # The below 'android' condition were added temporarily and should be # removed in downstream, because there is no Java environment setup in # upstream yet. 
['OS == \"android\"', { 'sources!':[ 'audio/android/audio_track_output_android.cc', ], 'sources':[ 'audio/android/audio_track_output_stub_android.cc', ], 'link_settings': { 'libraries': [ '-lOpenSLES', ], }, }], ['OS==\"linux\" or OS==\"freebsd\" or OS==\"solaris\"', { 'link_settings': { 'libraries': [ '-lasound', ], }, }], ['OS==\"openbsd\"', { 'sources/': [ ['exclude', '/alsa_' ], ['exclude', '/audio_manager_linux' ] ], 'link_settings': { 'libraries': [ ], }, }], ['OS!=\"openbsd\"', { 'sources!': [ 'audio/openbsd/audio_manager_openbsd.cc', 'audio/openbsd/audio_manager_openbsd.h', ], }], ['OS==\"linux\"', { 'variables': { 'conditions': [ ['sysroot!=\"\"', { 'pkg-config': '../build/linux/pkg-config-wrapper \"<(sysroot)\" \"<(target_arch)\"', }, { 'pkg-config': 'pkg-config' }], ], }, 'conditions': [ ['use_cras == 1', { 'cflags': [ '<!@(<(pkg-config) --cflags libcras)', ], 'link_settings': { 'libraries': [ '<!@(<(pkg-config) --libs libcras)', ], }, 'defines': [ 'USE_CRAS', ], }, { # else: use_cras == 0 'sources!': [ 'audio/linux/cras_output.cc', 'audio/linux/cras_output.h', ], }], ], }], ['os_posix == 1', { 'conditions': [ ['use_pulseaudio == 1', { 'cflags': [ '<!@(pkg-config --cflags libpulse)', ], 'link_settings': { 'libraries': [ '<!@(pkg-config --libs-only-l libpulse)', ], }, 'defines': [ 'USE_PULSEAUDIO', ], }, { # else: use_pulseaudio == 0 'sources!': [ 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', ], }], ], }], ['os_posix == 1 and OS != \"android\"', { # Video capture isn't supported in Android yet. 'sources!': [ 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', ], }], ['OS==\"mac\"', { 'link_settings': { 'libraries': [ '$(SDKROOT)/System/Library/Frameworks/AudioUnit.framework', '$(SDKROOT)/System/Library/Frameworks/AudioToolbox.framework', '$(SDKROOT)/System/Library/Frameworks/CoreAudio.framework', '$(SDKROOT)/System/Library/Frameworks/CoreVideo.framework', '$(SDKROOT)/System/Library/Frameworks/QTKit.framework', ], }, }], ['OS==\"win\"', { 'sources!': [ 'audio/pulse/pulse_output.cc', 'audio/pulse/pulse_output.h', 'video/capture/video_capture_device_dummy.cc', 'video/capture/video_capture_device_dummy.h', ], }], ['proprietary_codecs==1 or branding==\"Chrome\"', { 'sources': [ 'mp4/avc.cc', 'mp4/avc.h', 'mp4/box_definitions.cc', 'mp4/box_definitions.h', 'mp4/box_reader.cc', 'mp4/box_reader.h', 'mp4/cenc.cc', 'mp4/cenc.h', 'mp4/mp4_stream_parser.cc', 'mp4/mp4_stream_parser.h', 'mp4/offset_byte_queue.cc', 'mp4/offset_byte_queue.h', 'mp4/track_run_iterator.cc', 'mp4/track_run_iterator.h', ], }], ], }, { 'target_name': 'yuv_convert', 'type': 'static_library', 'include_dirs': [ '..', ], 'conditions': [ ['order_profiling != 0', { 'target_conditions' : [ ['_toolset==\"target\"', { 'cflags!': [ '-finstrument-functions' ], }], ], }], [ 'target_arch == \"ia32\" or target_arch == \"x64\"', { 'dependencies': [ 'yuv_convert_simd_x86', ], }], [ 'target_arch == \"arm\"', { 'dependencies': [ 'yuv_convert_simd_arm', ], }], ], 'sources': [ 'base/yuv_convert.cc', 'base/yuv_convert.h', ], }, { 'target_name': 'yuv_convert_simd_x86', 'type': 'static_library', 'include_dirs': [ '..', ], 'sources': [ 'base/simd/convert_rgb_to_yuv_c.cc', 'base/simd/convert_rgb_to_yuv_sse2.cc', 'base/simd/convert_rgb_to_yuv_ssse3.asm', 'base/simd/convert_rgb_to_yuv_ssse3.cc', 'base/simd/convert_rgb_to_yuv_ssse3.inc', 'base/simd/convert_yuv_to_rgb_c.cc', 'base/simd/convert_yuv_to_rgb_x86.cc', 'base/simd/convert_yuv_to_rgb_mmx.asm', 'base/simd/convert_yuv_to_rgb_mmx.inc', 
'base/simd/convert_yuv_to_rgb_sse.asm', 'base/simd/filter_yuv.h', 'base/simd/filter_yuv_c.cc', 'base/simd/filter_yuv_mmx.cc', 'base/simd/filter_yuv_sse2.cc', 'base/simd/linear_scale_yuv_to_rgb_mmx.asm', 'base/simd/linear_scale_yuv_to_rgb_mmx.inc', 'base/simd/linear_scale_yuv_to_rgb_sse.asm', 'base/simd/scale_yuv_to_rgb_mmx.asm', 'base/simd/scale_yuv_to_rgb_mmx.inc', 'base/simd/scale_yuv_to_rgb_sse.asm', 'base/simd/yuv_to_rgb_table.cc', 'base/simd/yuv_to_rgb_table.h', ], 'conditions': [ ['order_profiling != 0', { 'target_conditions' : [ ['_toolset==\"target\"', { 'cflags!': [ '-finstrument-functions' ], }], ], }], [ 'target_arch == \"x64\"', { # Source files optimized for X64 systems. 'sources': [ 'base/simd/linear_scale_yuv_to_rgb_mmx_x64.asm', 'base/simd/scale_yuv_to_rgb_sse2_x64.asm', ], }], [ 'os_posix == 1 and OS != \"mac\" and OS != \"android\"', { 'cflags': [ '-msse2', ], }], [ 'OS == \"mac\"', { 'configurations': { 'Debug': { 'xcode_settings': { # gcc on the mac builds horribly unoptimized sse code in debug # mode. Since this is rarely going to be debugged, run with full # optimizations in Debug as well as Release. 'GCC_OPTIMIZATION_LEVEL': '3', # -O3 }, }, }, }], [ 'OS==\"win\"', { 'variables': { 'yasm_flags': [ '-DWIN32', '-DMSVC', '-DCHROMIUM', '-Isimd', ], }, }], [ 'OS==\"mac\"', { 'variables': { 'yasm_flags': [ '-DPREFIX', '-DMACHO', '-DCHROMIUM', '-Isimd', ], }, }], [ 'os_posix==1 and OS!=\"mac\"', { 'variables': { 'conditions': [ [ 'target_arch==\"ia32\"', { 'yasm_flags': [ '-DX86_32', '-DELF', '-DCHROMIUM', '-Isimd', ], }, { 'yasm_flags': [ '-DARCH_X86_64', '-DELF', '-DPIC', '-DCHROMIUM', '-Isimd', ], }], ], }, }], ], 'variables': { 'yasm_output_path': '<(SHARED_INTERMEDIATE_DIR)/media', }, 'msvs_2010_disable_uldi_when_referenced': 1, 'includes': [ '../third_party/yasm/yasm_compile.gypi', ], }, { 'target_name': 'yuv_convert_simd_arm', 'type': 'static_library', 'include_dirs': [ '..', ], 'sources': [ 'base/simd/convert_rgb_to_yuv_c.cc', 'base/simd/convert_rgb_to_yuv.h', 'base/simd/convert_yuv_to_rgb_c.cc', 'base/simd/convert_yuv_to_rgb.h', 'base/simd/filter_yuv.h', 'base/simd/filter_yuv_c.cc', 'base/simd/yuv_to_rgb_table.cc', 'base/simd/yuv_to_rgb_table.h', ], }, { 'target_name': 'media_unittests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', 'yuv_convert', '../base/base.gyp:base', '../base/base.gyp:base_i18n', '../base/base.gyp:test_support_base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', '../ui/ui.gyp:ui', ], 'sources': [ 'audio/async_socket_io_handler_unittest.cc', 'audio/audio_input_controller_unittest.cc', 'audio/audio_input_device_unittest.cc', 'audio/audio_input_unittest.cc', 'audio/audio_input_volume_unittest.cc', 'audio/audio_low_latency_input_output_unittest.cc', 'audio/audio_output_controller_unittest.cc', 'audio/audio_output_proxy_unittest.cc', 'audio/audio_parameters_unittest.cc', 'audio/audio_util_unittest.cc', 'audio/cross_process_notification_unittest.cc', 'audio/linux/alsa_output_unittest.cc', 'audio/mac/audio_low_latency_input_mac_unittest.cc', 'audio/mac/audio_output_mac_unittest.cc', 'audio/simple_sources_unittest.cc', 'audio/win/audio_low_latency_input_win_unittest.cc', 'audio/win/audio_low_latency_output_win_unittest.cc', 'audio/win/audio_output_win_unittest.cc', 'base/audio_renderer_mixer_unittest.cc', 'base/audio_renderer_mixer_input_unittest.cc', 'base/buffers_unittest.cc', 'base/clock_unittest.cc', 'base/composite_filter_unittest.cc', 'base/data_buffer_unittest.cc', 'base/decoder_buffer_unittest.cc', 
'base/djb2_unittest.cc', 'base/fake_audio_render_callback.cc', 'base/fake_audio_render_callback.h', 'base/filter_collection_unittest.cc', 'base/h264_bitstream_converter_unittest.cc', 'base/pipeline_unittest.cc', 'base/ranges_unittest.cc', 'base/run_all_unittests.cc', 'base/seekable_buffer_unittest.cc', 'base/state_matrix_unittest.cc', 'base/test_data_util.cc', 'base/test_data_util.h', 'base/video_frame_unittest.cc', 'base/video_util_unittest.cc', 'base/yuv_convert_unittest.cc', 'crypto/aes_decryptor_unittest.cc', 'ffmpeg/ffmpeg_common_unittest.cc', 'filters/audio_renderer_algorithm_unittest.cc', 'filters/audio_renderer_impl_unittest.cc', 'filters/bitstream_converter_unittest.cc', 'filters/chunk_demuxer_unittest.cc', 'filters/ffmpeg_audio_decoder_unittest.cc', 'filters/ffmpeg_decoder_unittest.h', 'filters/ffmpeg_demuxer_unittest.cc', 'filters/ffmpeg_glue_unittest.cc', 'filters/ffmpeg_h264_bitstream_converter_unittest.cc', 'filters/ffmpeg_video_decoder_unittest.cc', 'filters/file_data_source_unittest.cc', 'filters/pipeline_integration_test.cc', 'filters/pipeline_integration_test_base.cc', 'filters/source_buffer_stream_unittest.cc', 'filters/video_renderer_base_unittest.cc', 'video/capture/video_capture_device_unittest.cc', 'webm/cluster_builder.cc', 'webm/cluster_builder.h', 'webm/webm_cluster_parser_unittest.cc', 'webm/webm_content_encodings_client_unittest.cc', 'webm/webm_parser_unittest.cc', ], 'conditions': [ ['os_posix==1 and OS!=\"mac\"', { 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ['OS != \"android\"', { 'dependencies': [ '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], }], ['OS == \"android\"', { 'sources!': [ 'audio/audio_input_volume_unittest.cc', 'base/test_data_util.cc', 'base/test_data_util.h', 'ffmpeg/ffmpeg_common_unittest.cc', 'filters/ffmpeg_audio_decoder_unittest.cc', 'filters/bitstream_converter_unittest.cc', 'filters/chunk_demuxer_unittest.cc', 'filters/ffmpeg_demuxer_unittest.cc', 'filters/ffmpeg_glue_unittest.cc', 'filters/ffmpeg_h264_bitstream_converter_unittest.cc', 'filters/ffmpeg_video_decoder_unittest.cc', 'filters/pipeline_integration_test.cc', 'filters/pipeline_integration_test_base.cc', 'mp4/mp4_stream_parser_unittest.cc', 'webm/webm_cluster_parser_unittest.cc', ], }], ['OS == \"linux\"', { 'conditions': [ ['use_cras == 1', { 'sources': [ 'audio/linux/cras_output_unittest.cc', ], 'defines': [ 'USE_CRAS', ], }], ], }], [ 'target_arch==\"ia32\" or target_arch==\"x64\"', { 'sources': [ 'base/simd/convert_rgb_to_yuv_unittest.cc', ], }], ['proprietary_codecs==1 or branding==\"Chrome\"', { 'sources': [ 'mp4/avc_unittest.cc', 'mp4/box_reader_unittest.cc', 'mp4/mp4_stream_parser_unittest.cc', 'mp4/offset_byte_queue_unittest.cc', ], }], ], }, { 'target_name': 'media_test_support', 'type': 'static_library', 'dependencies': [ 'media', '../base/base.gyp:base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', ], 'sources': [ 'audio/test_audio_input_controller_factory.cc', 'audio/test_audio_input_controller_factory.h', 'base/mock_callback.cc', 'base/mock_callback.h', 'base/mock_data_source_host.cc', 'base/mock_data_source_host.h', 'base/mock_demuxer_host.cc', 'base/mock_demuxer_host.h', 'base/mock_filter_host.cc', 'base/mock_filter_host.h', 'base/mock_filters.cc', 'base/mock_filters.h', ], }, { 'target_name': 'scaler_bench', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../skia/skia.gyp:skia', ], 'sources': [ 'tools/scaler_bench/scaler_bench.cc', 
], }, { 'target_name': 'qt_faststart', 'type': 'executable', 'sources': [ 'tools/qt_faststart/qt_faststart.c' ], }, { 'target_name': 'seek_tester', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', ], 'sources': [ 'tools/seek_tester/seek_tester.cc', ], }, ], 'conditions': [ ['OS==\"win\"', { 'targets': [ { 'target_name': 'player_wtl', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../base/third_party/dynamic_annotations/dynamic_annotations.gyp:dynamic_annotations', '../ui/ui.gyp:ui', ], 'include_dirs': [ '<(DEPTH)/third_party/wtl/include', ], 'sources': [ 'tools/player_wtl/list.h', 'tools/player_wtl/mainfrm.h', 'tools/player_wtl/movie.cc', 'tools/player_wtl/movie.h', 'tools/player_wtl/player_wtl.cc', 'tools/player_wtl/player_wtl.rc', 'tools/player_wtl/props.h', 'tools/player_wtl/seek.h', 'tools/player_wtl/resource.h', 'tools/player_wtl/view.h', ], 'msvs_settings': { 'VCLinkerTool': { 'SubSystem': '2', # Set /SUBSYSTEM:WINDOWS }, }, 'defines': [ '_CRT_SECURE_NO_WARNINGS=1', ], }, ], }], ['OS == \"win\" or toolkit_uses_gtk == 1', { 'targets': [ { 'target_name': 'shader_bench', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'sources': [ 'tools/shader_bench/shader_bench.cc', 'tools/shader_bench/cpu_color_painter.cc', 'tools/shader_bench/cpu_color_painter.h', 'tools/shader_bench/gpu_color_painter.cc', 'tools/shader_bench/gpu_color_painter.h', 'tools/shader_bench/gpu_painter.cc', 'tools/shader_bench/gpu_painter.h', 'tools/shader_bench/painter.cc', 'tools/shader_bench/painter.h', 'tools/shader_bench/window.cc', 'tools/shader_bench/window.h', ], 'conditions': [ ['toolkit_uses_gtk == 1', { 'dependencies': [ '../build/linux/system.gyp:gtk', ], 'sources': [ 'tools/shader_bench/window_linux.cc', ], }], ['OS==\"win\"', { 'dependencies': [ '../third_party/angle/src/build_angle.gyp:libEGL', '../third_party/angle/src/build_angle.gyp:libGLESv2', ], 'sources': [ 'tools/shader_bench/window_win.cc', ], }], ], }, ], }], ['OS == \"linux\" and target_arch != \"arm\"', { 'targets': [ { 'target_name': 'tile_render_bench', 'type': 'executable', 'dependencies': [ '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'libraries': [ '-lGL', '-ldl', ], 'sources': [ 'tools/tile_render_bench/tile_render_bench.cc', ], }, ], }], ['os_posix == 1 and OS != \"mac\" and OS != \"android\"', { 'targets': [ { 'target_name': 'player_x11', 'type': 'executable', 'dependencies': [ 'media', 'yuv_convert', '../base/base.gyp:base', '../ui/gl/gl.gyp:gl', ], 'link_settings': { 'libraries': [ '-ldl', '-lX11', '-lXrender', '-lXext', ], }, 'sources': [ 'tools/player_x11/data_source_logger.cc', 'tools/player_x11/data_source_logger.h', 'tools/player_x11/gl_video_renderer.cc', 'tools/player_x11/gl_video_renderer.h', 'tools/player_x11/player_x11.cc', 'tools/player_x11/x11_video_renderer.cc', 'tools/player_x11/x11_video_renderer.h', ], }, ], }], ['OS == \"android\"', { 'targets': [ { 'target_name': 'player_android', 'type': 'static_library', 'sources': [ 'base/android/media_player_bridge.cc', 'base/android/media_player_bridge.h', ], 'dependencies': [ '../base/base.gyp:base', ], 'include_dirs': [ '<(SHARED_INTERMEDIATE_DIR)/media', ], 'actions': [ { 'action_name': 'generate-jni-headers', 'inputs': [ '../base/android/jni_generator/jni_generator.py', 'base/android/java/src/org/chromium/media/MediaPlayerListener.java', ], 'outputs': [ '<(SHARED_INTERMEDIATE_DIR)/media/jni/media_player_listener_jni.h', ], 'action': [ 
'python', '<(DEPTH)/base/android/jni_generator/jni_generator.py', '-o', '<@(_inputs)', '<@(_outputs)', ], }, ], }, { 'target_name': 'media_java', 'type': 'none', 'dependencies': [ '../base/base.gyp:base_java' ], 'variables': { 'package_name': 'media', 'java_in_dir': 'base/android/java', }, 'includes': [ '../build/java.gypi' ], }, ], }, { # OS != \"android\"' # Android does not use ffmpeg, so disable the targets which require it. 'targets': [ { 'target_name': 'ffmpeg_unittests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', '../base/base.gyp:base', '../base/base.gyp:base_i18n', '../base/base.gyp:test_support_base', '../base/base.gyp:test_support_perf', '../testing/gtest.gyp:gtest', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'ffmpeg/ffmpeg_unittest.cc', ], 'conditions': [ ['toolkit_uses_gtk == 1', { 'dependencies': [ # Needed for the following #include chain: # base/run_all_unittests.cc # ../base/test_suite.h # gtk/gtk.h '../build/linux/system.gyp:gtk', ], 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ], }, { 'target_name': 'ffmpeg_regression_tests', 'type': 'executable', 'dependencies': [ 'media', 'media_test_support', '../base/base.gyp:test_support_base', '../testing/gmock.gyp:gmock', '../testing/gtest.gyp:gtest', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'base/test_data_util.cc', 'base/run_all_unittests.cc', 'ffmpeg/ffmpeg_regression_tests.cc', 'filters/pipeline_integration_test_base.cc', ], 'conditions': [ ['os_posix==1 and OS!=\"mac\"', { 'conditions': [ ['linux_use_tcmalloc==1', { 'dependencies': [ '../base/allocator/allocator.gyp:allocator', ], }], ], }], ], }, { 'target_name': 'ffmpeg_tests', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'test/ffmpeg_tests/ffmpeg_tests.cc', ], }, { 'target_name': 'media_bench', 'type': 'executable', 'dependencies': [ 'media', '../base/base.gyp:base', '../third_party/ffmpeg/ffmpeg.gyp:ffmpeg', ], 'sources': [ 'tools/media_bench/media_bench.cc', ], }, ], }] ], } " 0 64 (face font-lock-comment-face) 64 137 (face font-lock-comment-face) 137 166 (face font-lock-comment-face) 166 171 nil 171 172 (face font-lock-string-face) 172 181 (face font-lock-keyword-face) 181 182 (face font-lock-string-face) 182 190 nil 190 191 (face font-lock-string-face) 191 204 (face font-lock-variable-name-face) 204 205 (face font-lock-string-face) 205 214 nil 214 269 (face font-lock-comment-face) 269 273 nil 273 274 (face font-lock-string-face) 274 289 (face font-lock-variable-name-face) 289 290 (face font-lock-string-face) 290 299 nil 299 365 (face font-lock-comment-face) 365 369 nil 369 370 (face font-lock-string-face) 370 379 (face font-lock-variable-name-face) 379 380 (face font-lock-string-face) 380 392 nil 392 393 (face font-lock-string-face) 393 400 (face font-lock-keyword-face) 400 401 (face font-lock-string-face) 401 417 nil 417 418 (face font-lock-string-face) 418 429 (face font-lock-keyword-face) 429 430 (face font-lock-string-face) 430 432 nil 432 433 (face font-lock-string-face) 433 438 (face font-lock-function-name-face) 438 439 (face font-lock-string-face) 439 447 nil 447 448 (face font-lock-string-face) 448 452 (face font-lock-keyword-face) 452 453 (face font-lock-string-face) 453 455 nil 455 458 (face font-lock-string-face) 458 467 (face font-lock-variable-name-face) 467 469 (face font-lock-string-face) 469 477 nil 477 478 (face 
font-lock-string-face) 478 490 (face font-lock-keyword-face) 490 491 (face font-lock-string-face) 491 503 nil 503 504 (face font-lock-string-face) 504 515 (face font-lock-function-name-face) 515 516 (face font-lock-string-face) 516 526 nil 526 527 (face font-lock-string-face) 527 548 (face font-lock-function-name-face) 548 549 (face font-lock-string-face) 549 559 nil 559 560 (face font-lock-string-face) 560 643 (face font-lock-function-name-face) 643 644 (face font-lock-string-face) 644 654 nil 654 655 (face font-lock-string-face) 655 696 (face font-lock-function-name-face) 696 697 (face font-lock-string-face) 697 707 nil 707 708 (face font-lock-string-face) 708 735 (face font-lock-function-name-face) 735 736 (face font-lock-string-face) 736 746 nil 746 747 (face font-lock-string-face) 747 784 (face font-lock-function-name-face) 784 785 (face font-lock-string-face) 785 795 nil 795 796 (face font-lock-string-face) 796 811 (face font-lock-function-name-face) 811 812 (face font-lock-string-face) 812 829 nil 829 830 (face font-lock-string-face) 830 837 (face font-lock-keyword-face) 837 838 (face font-lock-string-face) 838 850 nil 850 851 (face font-lock-string-face) 851 871 (face font-lock-preprocessor-face) 871 872 (face font-lock-string-face) 872 889 nil 889 890 (face font-lock-string-face) 890 902 (face font-lock-keyword-face) 902 903 (face font-lock-string-face) 903 915 nil 915 916 (face font-lock-string-face) 916 918 (face font-lock-constant-face) 918 919 (face font-lock-string-face) 919 936 nil 936 937 (face font-lock-string-face) 937 944 (face font-lock-keyword-face) 944 945 (face font-lock-string-face) 945 957 nil 957 958 (face font-lock-string-face) 958 996 (face font-lock-constant-face) 996 997 (face font-lock-string-face) 997 1007 nil 1007 1008 (face font-lock-string-face) 1008 1045 (face font-lock-constant-face) 1045 1046 (face font-lock-string-face) 1046 1056 nil 1056 1057 (face font-lock-string-face) 1057 1100 (face font-lock-constant-face) 1100 1101 (face font-lock-string-face) 1101 1111 nil 1111 1112 (face font-lock-string-face) 1112 1154 (face font-lock-constant-face) 1154 1155 (face font-lock-string-face) 1155 1165 nil 1165 1166 (face font-lock-string-face) 1166 1197 (face font-lock-constant-face) 1197 1198 (face font-lock-string-face) 1198 1208 nil 1208 1209 (face font-lock-string-face) 1209 1239 (face font-lock-constant-face) 1239 1240 (face font-lock-string-face) 1240 1250 nil 1250 1251 (face font-lock-string-face) 1251 1283 (face font-lock-constant-face) 1283 1284 (face font-lock-string-face) 1284 1294 nil 1294 1295 (face font-lock-string-face) 1295 1326 (face font-lock-constant-face) 1326 1327 (face font-lock-string-face) 1327 1337 nil 1337 1338 (face font-lock-string-face) 1338 1369 (face font-lock-constant-face) 1369 1370 (face font-lock-string-face) 1370 1380 nil 1380 1381 (face font-lock-string-face) 1381 1419 (face font-lock-constant-face) 1419 1420 (face font-lock-string-face) 1420 1430 nil 1430 1431 (face font-lock-string-face) 1431 1467 (face font-lock-constant-face) 1467 1468 (face font-lock-string-face) 1468 1478 nil 1478 1479 (face font-lock-string-face) 1479 1507 (face font-lock-constant-face) 1507 1508 (face font-lock-string-face) 1508 1518 nil 1518 1519 (face font-lock-string-face) 1519 1546 (face font-lock-constant-face) 1546 1547 (face font-lock-string-face) 1547 1557 nil 1557 1558 (face font-lock-string-face) 1558 1574 (face font-lock-constant-face) 1574 1575 (face font-lock-string-face) 1575 1585 nil 1585 1586 (face font-lock-string-face) 1586 1617 
(face font-lock-constant-face) 1617 1618 (face font-lock-string-face) 1618 1628 nil 1628 1629 (face font-lock-string-face) 1629 1659 (face font-lock-constant-face) 1659 1660 (face font-lock-string-face) 1660 1670 nil 1670 1671 (face font-lock-string-face) 1671 1703 (face font-lock-constant-face) 1703 1704 (face font-lock-string-face) 1704 1714 nil 1714 1715 (face font-lock-string-face) 1715 1746 (face font-lock-constant-face) 1746 1747 (face font-lock-string-face) 1747 1757 nil 1757 1758 (face font-lock-string-face) 1758 1784 (face font-lock-constant-face) 1784 1785 (face font-lock-string-face) 1785 1795 nil 1795 1796 (face font-lock-string-face) 1796 1821 (face font-lock-constant-face) 1821 1822 (face font-lock-string-face) 1822 1832 nil 1832 1833 (face font-lock-string-face) 1833 1855 (face font-lock-constant-face) 1855 1856 (face font-lock-string-face) 1856 1866 nil 1866 1867 (face font-lock-string-face) 1867 1888 (face font-lock-constant-face) 1888 1889 (face font-lock-string-face) 1889 1899 nil 1899 1900 (face font-lock-string-face) 1900 1927 (face font-lock-constant-face) 1927 1928 (face font-lock-string-face) 1928 1938 nil 1938 1939 (face font-lock-string-face) 1939 1965 (face font-lock-constant-face) 1965 1966 (face font-lock-string-face) 1966 1976 nil 1976 1977 (face font-lock-string-face) 1977 2009 (face font-lock-constant-face) 2009 2010 (face font-lock-string-face) 2010 2020 nil 2020 2021 (face font-lock-string-face) 2021 2052 (face font-lock-constant-face) 2052 2053 (face font-lock-string-face) 2053 2063 nil 2063 2064 (face font-lock-string-face) 2064 2096 (face font-lock-constant-face) 2096 2097 (face font-lock-string-face) 2097 2107 nil 2107 2108 (face font-lock-string-face) 2108 2139 (face font-lock-constant-face) 2139 2140 (face font-lock-string-face) 2140 2150 nil 2150 2151 (face font-lock-string-face) 2151 2188 (face font-lock-constant-face) 2188 2189 (face font-lock-string-face) 2189 2199 nil 2199 2200 (face font-lock-string-face) 2200 2236 (face font-lock-constant-face) 2236 2237 (face font-lock-string-face) 2237 2247 nil 2247 2248 (face font-lock-string-face) 2248 2275 (face font-lock-constant-face) 2275 2276 (face font-lock-string-face) 2276 2286 nil 2286 2287 (face font-lock-string-face) 2287 2313 (face font-lock-constant-face) 2313 2314 (face font-lock-string-face) 2314 2324 nil 2324 2325 (face font-lock-string-face) 2325 2352 (face font-lock-constant-face) 2352 2353 (face font-lock-string-face) 2353 2363 nil 2363 2364 (face font-lock-string-face) 2364 2390 (face font-lock-constant-face) 2390 2391 (face font-lock-string-face) 2391 2401 nil 2401 2402 (face font-lock-string-face) 2402 2427 (face font-lock-constant-face) 2427 2428 (face font-lock-string-face) 2428 2438 nil 2438 2439 (face font-lock-string-face) 2439 2463 (face font-lock-constant-face) 2463 2464 (face font-lock-string-face) 2464 2474 nil 2474 2475 (face font-lock-string-face) 2475 2494 (face font-lock-constant-face) 2494 2495 (face font-lock-string-face) 2495 2505 nil 2505 2506 (face font-lock-string-face) 2506 2524 (face font-lock-constant-face) 2524 2525 (face font-lock-string-face) 2525 2535 nil 2535 2536 (face font-lock-string-face) 2536 2571 (face font-lock-constant-face) 2571 2572 (face font-lock-string-face) 2572 2582 nil 2582 2583 (face font-lock-string-face) 2583 2617 (face font-lock-constant-face) 2617 2618 (face font-lock-string-face) 2618 2628 nil 2628 2629 (face font-lock-string-face) 2629 2668 (face font-lock-constant-face) 2668 2669 (face font-lock-string-face) 2669 2679 nil 2679 2680 
(face font-lock-string-face) 2680 2721 (face font-lock-constant-face) 2721 2722 (face font-lock-string-face) 2722 2732 nil 2732 2733 (face font-lock-string-face) 2733 2765 (face font-lock-constant-face) 2765 2766 (face font-lock-string-face) 2766 2776 nil 2776 2777 (face font-lock-string-face) 2777 2808 (face font-lock-constant-face) 2808 2809 (face font-lock-string-face) 2809 2819 nil 2819 2820 (face font-lock-string-face) 2820 2853 (face font-lock-constant-face) 2853 2854 (face font-lock-string-face) 2854 2864 nil 2864 2865 (face font-lock-string-face) 2865 2897 (face font-lock-constant-face) 2897 2898 (face font-lock-string-face) 2898 2908 nil 2908 2909 (face font-lock-string-face) 2909 2943 (face font-lock-constant-face) 2943 2944 (face font-lock-string-face) 2944 2954 nil 2954 2955 (face font-lock-string-face) 2955 2988 (face font-lock-constant-face) 2988 2989 (face font-lock-string-face) 2989 2999 nil 2999 3000 (face font-lock-string-face) 3000 3025 (face font-lock-constant-face) 3025 3026 (face font-lock-string-face) 3026 3036 nil 3036 3037 (face font-lock-string-face) 3037 3061 (face font-lock-constant-face) 3061 3062 (face font-lock-string-face) 3062 3072 nil 3072 3073 (face font-lock-string-face) 3073 3099 (face font-lock-constant-face) 3099 3100 (face font-lock-string-face) 3100 3110 nil 3110 3111 (face font-lock-string-face) 3111 3136 (face font-lock-constant-face) 3136 3137 (face font-lock-string-face) 3137 3147 nil 3147 3148 (face font-lock-string-face) 3148 3172 (face font-lock-constant-face) 3172 3173 (face font-lock-string-face) 3173 3183 nil 3183 3184 (face font-lock-string-face) 3184 3207 (face font-lock-constant-face) 3207 3208 (face font-lock-string-face) 3208 3218 nil 3218 3219 (face font-lock-string-face) 3219 3246 (face font-lock-constant-face) 3246 3247 (face font-lock-string-face) 3247 3257 nil 3257 3258 (face font-lock-string-face) 3258 3284 (face font-lock-constant-face) 3284 3285 (face font-lock-string-face) 3285 3295 nil 3295 3296 (face font-lock-string-face) 3296 3322 (face font-lock-constant-face) 3322 3323 (face font-lock-string-face) 3323 3333 nil 3333 3334 (face font-lock-string-face) 3334 3359 (face font-lock-constant-face) 3359 3360 (face font-lock-string-face) 3360 3370 nil 3370 3371 (face font-lock-string-face) 3371 3409 (face font-lock-constant-face) 3409 3410 (face font-lock-string-face) 3410 3420 nil 3420 3421 (face font-lock-string-face) 3421 3458 (face font-lock-constant-face) 3458 3459 (face font-lock-string-face) 3459 3469 nil 3469 3470 (face font-lock-string-face) 3470 3498 (face font-lock-constant-face) 3498 3499 (face font-lock-string-face) 3499 3509 nil 3509 3510 (face font-lock-string-face) 3510 3537 (face font-lock-constant-face) 3537 3538 (face font-lock-string-face) 3538 3548 nil 3548 3549 (face font-lock-string-face) 3549 3589 (face font-lock-constant-face) 3589 3590 (face font-lock-string-face) 3590 3600 nil 3600 3601 (face font-lock-string-face) 3601 3640 (face font-lock-constant-face) 3640 3641 (face font-lock-string-face) 3641 3651 nil 3651 3652 (face font-lock-string-face) 3652 3693 (face font-lock-constant-face) 3693 3694 (face font-lock-string-face) 3694 3704 nil 3704 3705 (face font-lock-string-face) 3705 3745 (face font-lock-constant-face) 3745 3746 (face font-lock-string-face) 3746 3756 nil 3756 3757 (face font-lock-string-face) 3757 3787 (face font-lock-constant-face) 3787 3788 (face font-lock-string-face) 3788 3798 nil 3798 3799 (face font-lock-string-face) 3799 3828 (face font-lock-constant-face) 3828 3829 (face 
font-lock-string-face) 3829 3839 nil 3839 3840 (face font-lock-string-face) 3840 3869 (face font-lock-constant-face) 3869 3870 (face font-lock-string-face) 3870 3880 nil 3880 3881 (face font-lock-string-face) 3881 3909 (face font-lock-constant-face) 3909 3910 (face font-lock-string-face) 3910 3920 nil 3920 3921 (face font-lock-string-face) 3921 3945 (face font-lock-constant-face) 3945 3946 (face font-lock-string-face) 3946 3956 nil 3956 3957 (face font-lock-string-face) 3957 3980 (face font-lock-constant-face) 3980 3981 (face font-lock-string-face) 3981 3991 nil 3991 3992 (face font-lock-string-face) 3992 4019 (face font-lock-constant-face) 4019 4020 (face font-lock-string-face) 4020 4030 nil 4030 4031 (face font-lock-string-face) 4031 4057 (face font-lock-constant-face) 4057 4058 (face font-lock-string-face) 4058 4068 nil 4068 4069 (face font-lock-string-face) 4069 4090 (face font-lock-constant-face) 4090 4091 (face font-lock-string-face) 4091 4101 nil 4101 4102 (face font-lock-string-face) 4102 4122 (face font-lock-constant-face) 4122 4123 (face font-lock-string-face) 4123 4133 nil 4133 4134 (face font-lock-string-face) 4134 4157 (face font-lock-constant-face) 4157 4158 (face font-lock-string-face) 4158 4168 nil 4168 4169 (face font-lock-string-face) 4169 4191 (face font-lock-constant-face) 4191 4192 (face font-lock-string-face) 4192 4202 nil 4202 4203 (face font-lock-string-face) 4203 4243 (face font-lock-constant-face) 4243 4244 (face font-lock-string-face) 4244 4254 nil 4254 4255 (face font-lock-string-face) 4255 4294 (face font-lock-constant-face) 4294 4295 (face font-lock-string-face) 4295 4305 nil 4305 4306 (face font-lock-string-face) 4306 4347 (face font-lock-constant-face) 4347 4348 (face font-lock-string-face) 4348 4358 nil 4358 4359 (face font-lock-string-face) 4359 4399 (face font-lock-constant-face) 4399 4400 (face font-lock-string-face) 4400 4410 nil 4410 4411 (face font-lock-string-face) 4411 4441 (face font-lock-constant-face) 4441 4442 (face font-lock-string-face) 4442 4452 nil 4452 4453 (face font-lock-string-face) 4453 4482 (face font-lock-constant-face) 4482 4483 (face font-lock-string-face) 4483 4493 nil 4493 4494 (face font-lock-string-face) 4494 4523 (face font-lock-constant-face) 4523 4524 (face font-lock-string-face) 4524 4534 nil 4534 4535 (face font-lock-string-face) 4535 4563 (face font-lock-constant-face) 4563 4564 (face font-lock-string-face) 4564 4574 nil 4574 4575 (face font-lock-string-face) 4575 4610 (face font-lock-constant-face) 4610 4611 (face font-lock-string-face) 4611 4621 nil 4621 4622 (face font-lock-string-face) 4622 4656 (face font-lock-constant-face) 4656 4657 (face font-lock-string-face) 4657 4667 nil 4667 4668 (face font-lock-string-face) 4668 4697 (face font-lock-constant-face) 4697 4698 (face font-lock-string-face) 4698 4708 nil 4708 4709 (face font-lock-string-face) 4709 4737 (face font-lock-constant-face) 4737 4738 (face font-lock-string-face) 4738 4748 nil 4748 4749 (face font-lock-string-face) 4749 4780 (face font-lock-constant-face) 4780 4781 (face font-lock-string-face) 4781 4791 nil 4791 4792 (face font-lock-string-face) 4792 4822 (face font-lock-constant-face) 4822 4823 (face font-lock-string-face) 4823 4833 nil 4833 4834 (face font-lock-string-face) 4834 4869 (face font-lock-constant-face) 4869 4870 (face font-lock-string-face) 4870 4880 nil 4880 4881 (face font-lock-string-face) 4881 4915 (face font-lock-constant-face) 4915 4916 (face font-lock-string-face) 4916 4926 nil 4926 4927 (face font-lock-string-face) 4927 4948 (face 
font-lock-constant-face) 4948 4949 (face font-lock-string-face) 4949 4959 nil 4959 4960 (face font-lock-string-face) 4960 4980 (face font-lock-constant-face) 4980 4981 (face font-lock-string-face) 4981 4991 nil 4991 4992 (face font-lock-string-face) 4992 5020 (face font-lock-constant-face) 5020 5021 (face font-lock-string-face) 5021 5031 nil 5031 5032 (face font-lock-string-face) 5032 5059 (face font-lock-constant-face) 5059 5060 (face font-lock-string-face) 5060 5070 nil 5070 5071 (face font-lock-string-face) 5071 5092 (face font-lock-constant-face) 5092 5093 (face font-lock-string-face) 5093 5103 nil 5103 5104 (face font-lock-string-face) 5104 5132 (face font-lock-constant-face) 5132 5133 (face font-lock-string-face) 5133 5143 nil 5143 5144 (face font-lock-string-face) 5144 5171 (face font-lock-constant-face) 5171 5172 (face font-lock-string-face) 5172 5182 nil 5182 5183 (face font-lock-string-face) 5183 5217 (face font-lock-constant-face) 5217 5218 (face font-lock-string-face) 5218 5228 nil 5228 5229 (face font-lock-string-face) 5229 5262 (face font-lock-constant-face) 5262 5263 (face font-lock-string-face) 5263 5273 nil 5273 5274 (face font-lock-string-face) 5274 5297 (face font-lock-constant-face) 5297 5298 (face font-lock-string-face) 5298 5308 nil 5308 5309 (face font-lock-string-face) 5309 5324 (face font-lock-constant-face) 5324 5325 (face font-lock-string-face) 5325 5335 nil 5335 5336 (face font-lock-string-face) 5336 5350 (face font-lock-constant-face) 5350 5351 (face font-lock-string-face) 5351 5361 nil 5361 5362 (face font-lock-string-face) 5362 5380 (face font-lock-constant-face) 5380 5381 (face font-lock-string-face) 5381 5391 nil 5391 5392 (face font-lock-string-face) 5392 5409 (face font-lock-constant-face) 5409 5410 (face font-lock-string-face) 5410 5420 nil 5420 5421 (face font-lock-string-face) 5421 5443 (face font-lock-constant-face) 5443 5444 (face font-lock-string-face) 5444 5454 nil 5454 5455 (face font-lock-string-face) 5455 5476 (face font-lock-constant-face) 5476 5477 (face font-lock-string-face) 5477 5487 nil 5487 5488 (face font-lock-string-face) 5488 5501 (face font-lock-constant-face) 5501 5502 (face font-lock-string-face) 5502 5512 nil 5512 5513 (face font-lock-string-face) 5513 5525 (face font-lock-constant-face) 5525 5526 (face font-lock-string-face) 5526 5536 nil 5536 5537 (face font-lock-string-face) 5537 5561 (face font-lock-constant-face) 5561 5562 (face font-lock-string-face) 5562 5572 nil 5572 5573 (face font-lock-string-face) 5573 5596 (face font-lock-constant-face) 5596 5597 (face font-lock-string-face) 5597 5607 nil 5607 5608 (face font-lock-string-face) 5608 5627 (face font-lock-constant-face) 5627 5628 (face font-lock-string-face) 5628 5638 nil 5638 5639 (face font-lock-string-face) 5639 5657 (face font-lock-constant-face) 5657 5658 (face font-lock-string-face) 5658 5668 nil 5668 5669 (face font-lock-string-face) 5669 5688 (face font-lock-constant-face) 5688 5689 (face font-lock-string-face) 5689 5699 nil 5699 5700 (face font-lock-string-face) 5700 5718 (face font-lock-constant-face) 5718 5719 (face font-lock-string-face) 5719 5729 nil 5729 5730 (face font-lock-string-face) 5730 5752 (face font-lock-constant-face) 5752 5753 (face font-lock-string-face) 5753 5763 nil 5763 5764 (face font-lock-string-face) 5764 5785 (face font-lock-constant-face) 5785 5786 (face font-lock-string-face) 5786 5796 nil 5796 5797 (face font-lock-string-face) 5797 5819 (face font-lock-constant-face) 5819 5820 (face font-lock-string-face) 5820 5830 nil 5830 5831 (face 
font-lock-string-face) 5831 5852 (face font-lock-constant-face) 5852 5853 (face font-lock-string-face) 5853 5863 nil 5863 5864 (face font-lock-string-face) 5864 5880 (face font-lock-constant-face) 5880 5881 (face font-lock-string-face) 5881 5891 nil 5891 5892 (face font-lock-string-face) 5892 5915 (face font-lock-constant-face) 5915 5916 (face font-lock-string-face) 5916 5926 nil 5926 5927 (face font-lock-string-face) 5927 5942 (face font-lock-constant-face) 5942 5943 (face font-lock-string-face) 5943 5953 nil 5953 5954 (face font-lock-string-face) 5954 5968 (face font-lock-constant-face) 5968 5969 (face font-lock-string-face) 5969 5979 nil 5979 5980 (face font-lock-string-face) 5980 6002 (face font-lock-constant-face) 6002 6003 (face font-lock-string-face) 6003 6013 nil 6013 6014 (face font-lock-string-face) 6014 6035 (face font-lock-constant-face) 6035 6036 (face font-lock-string-face) 6036 6046 nil 6046 6047 (face font-lock-string-face) 6047 6059 (face font-lock-constant-face) 6059 6060 (face font-lock-string-face) 6060 6070 nil 6070 6071 (face font-lock-string-face) 6071 6082 (face font-lock-constant-face) 6082 6083 (face font-lock-string-face) 6083 6093 nil 6093 6094 (face font-lock-string-face) 6094 6119 (face font-lock-constant-face) 6119 6120 (face font-lock-string-face) 6120 6130 nil 6130 6131 (face font-lock-string-face) 6131 6155 (face font-lock-constant-face) 6155 6156 (face font-lock-string-face) 6156 6166 nil 6166 6167 (face font-lock-string-face) 6167 6185 (face font-lock-constant-face) 6185 6186 (face font-lock-string-face) 6186 6196 nil 6196 6197 (face font-lock-string-face) 6197 6212 (face font-lock-constant-face) 6212 6213 (face font-lock-string-face) 6213 6223 nil 6223 6224 (face font-lock-string-face) 6224 6238 (face font-lock-constant-face) 6238 6239 (face font-lock-string-face) 6239 6249 nil 6249 6250 (face font-lock-string-face) 6250 6282 (face font-lock-constant-face) 6282 6283 (face font-lock-string-face) 6283 6293 nil 6293 6294 (face font-lock-string-face) 6294 6325 (face font-lock-constant-face) 6325 6326 (face font-lock-string-face) 6326 6336 nil 6336 6337 (face font-lock-string-face) 6337 6349 (face font-lock-constant-face) 6349 6350 (face font-lock-string-face) 6350 6360 nil 6360 6361 (face font-lock-string-face) 6361 6382 (face font-lock-constant-face) 6382 6383 (face font-lock-string-face) 6383 6393 nil 6393 6394 (face font-lock-string-face) 6394 6413 (face font-lock-constant-face) 6413 6414 (face font-lock-string-face) 6414 6424 nil 6424 6425 (face font-lock-string-face) 6425 6442 (face font-lock-constant-face) 6442 6443 (face font-lock-string-face) 6443 6453 nil 6453 6454 (face font-lock-string-face) 6454 6470 (face font-lock-constant-face) 6470 6471 (face font-lock-string-face) 6471 6481 nil 6481 6482 (face font-lock-string-face) 6482 6504 (face font-lock-constant-face) 6504 6505 (face font-lock-string-face) 6505 6515 nil 6515 6516 (face font-lock-string-face) 6516 6535 (face font-lock-constant-face) 6535 6536 (face font-lock-string-face) 6536 6546 nil 6546 6547 (face font-lock-string-face) 6547 6569 (face font-lock-constant-face) 6569 6570 (face font-lock-string-face) 6570 6580 nil 6580 6581 (face font-lock-string-face) 6581 6602 (face font-lock-constant-face) 6602 6603 (face font-lock-string-face) 6603 6613 nil 6613 6614 (face font-lock-string-face) 6614 6631 (face font-lock-constant-face) 6631 6632 (face font-lock-string-face) 6632 6642 nil 6642 6643 (face font-lock-string-face) 6643 6671 (face font-lock-constant-face) 6671 6672 (face 
font-lock-string-face) 29198 29214 nil 29214 29215 (face font-lock-string-face) 29215 29238 (face font-lock-constant-face) 29238 29239 (face font-lock-string-face) 29239 29253 nil 29253 29254 (face font-lock-string-face) 29254 29280 (face font-lock-constant-face) 29280 29281 (face font-lock-string-face) 29281 29295 nil 29295 29296 (face font-lock-string-face) 29296 29321 (face font-lock-constant-face) 29321 29322 (face font-lock-string-face) 29322 29336 nil 29336 29337 (face font-lock-string-face) 29337 29361 (face font-lock-constant-face) 29361 29362 (face font-lock-string-face) 29362 29376 nil 29376 29377 (face font-lock-string-face) 29377 29407 (face font-lock-constant-face) 29407 29408 (face font-lock-string-face) 29408 29422 nil 29422 29423 (face font-lock-string-face) 29423 29453 (face font-lock-constant-face) 29453 29454 (face font-lock-string-face) 29454 29468 nil 29468 29469 (face font-lock-string-face) 29469 29493 (face font-lock-constant-face) 29493 29494 (face font-lock-string-face) 29494 29508 nil 29508 29509 (face font-lock-string-face) 29509 29532 (face font-lock-constant-face) 29532 29533 (face font-lock-string-face) 29533 29547 nil 29547 29548 (face font-lock-string-face) 29548 29575 (face font-lock-constant-face) 29575 29576 (face font-lock-string-face) 29576 29590 nil 29590 29591 (face font-lock-string-face) 29591 29614 (face font-lock-constant-face) 29614 29615 (face font-lock-string-face) 29615 29640 nil 29640 29655 (face font-lock-string-face) 29655 29671 nil 29671 29685 (face font-lock-string-face) 29685 29703 nil 29703 29714 (face font-lock-string-face) 29714 29716 nil 29716 29719 (face font-lock-string-face) 29719 29729 nil 29729 29754 (face font-lock-comment-face) 29754 29792 nil 29792 29793 (face font-lock-string-face) 29793 29800 (face font-lock-keyword-face) 29800 29801 (face font-lock-string-face) 29801 29817 nil 29817 29818 (face font-lock-string-face) 29818 29843 (face font-lock-preprocessor-face) 29843 29844 (face font-lock-string-face) 29844 29892 nil 29892 29893 (face font-lock-string-face) 29893 29929 (face font-lock-variable-name-face) 29929 29930 (face font-lock-string-face) 29930 29940 nil 29940 29941 (face font-lock-string-face) 29941 29948 (face font-lock-keyword-face) 29948 29949 (face font-lock-string-face) 29949 29973 nil 29973 29974 (face font-lock-string-face) 29974 29985 (face font-lock-keyword-face) 29985 29986 (face font-lock-string-face) 29986 29988 nil 29988 29989 (face font-lock-string-face) 29989 30001 (face font-lock-function-name-face) 30001 30002 (face font-lock-string-face) 30002 30014 nil 30014 30015 (face font-lock-string-face) 30015 30019 (face font-lock-keyword-face) 30019 30020 (face font-lock-string-face) 30020 30022 nil 30022 30023 (face font-lock-string-face) 30023 30033 (face font-lock-type-face) 30033 30034 (face font-lock-string-face) 30034 30046 nil 30046 30047 (face font-lock-string-face) 30047 30059 (face font-lock-keyword-face) 30059 30060 (face font-lock-string-face) 30060 30076 nil 30076 30077 (face font-lock-string-face) 30077 30082 (face font-lock-function-name-face) 30082 30083 (face font-lock-string-face) 30083 30097 nil 30097 30098 (face font-lock-string-face) 30098 30109 (face font-lock-function-name-face) 30109 30110 (face font-lock-string-face) 30110 30124 nil 30124 30125 (face font-lock-string-face) 30125 30146 (face font-lock-function-name-face) 30146 30147 (face font-lock-string-face) 30147 30161 nil 30161 30162 (face font-lock-string-face) 30162 30180 (face font-lock-function-name-face) 30180 30181 (face 
font-lock-string-face) 30181 30206 nil 30206 30207 (face font-lock-string-face) 30207 30214 (face font-lock-keyword-face) 30214 30215 (face font-lock-string-face) 30215 30231 nil 30231 30232 (face font-lock-string-face) 30232 30266 (face font-lock-constant-face) 30266 30267 (face font-lock-string-face) 30267 30281 nil 30281 30282 (face font-lock-string-face) 30282 30321 (face font-lock-constant-face) 30321 30322 (face font-lock-string-face) 30322 30336 nil 30336 30337 (face font-lock-string-face) 30337 30375 (face font-lock-constant-face) 30375 30376 (face font-lock-string-face) 30376 30390 nil 30390 30391 (face font-lock-string-face) 30391 30430 (face font-lock-constant-face) 30430 30431 (face font-lock-string-face) 30431 30445 nil 30445 30446 (face font-lock-string-face) 30446 30484 (face font-lock-constant-face) 30484 30485 (face font-lock-string-face) 30485 30499 nil 30499 30500 (face font-lock-string-face) 30500 30533 (face font-lock-constant-face) 30533 30534 (face font-lock-string-face) 30534 30548 nil 30548 30549 (face font-lock-string-face) 30549 30581 (face font-lock-constant-face) 30581 30582 (face font-lock-string-face) 30582 30596 nil 30596 30597 (face font-lock-string-face) 30597 30626 (face font-lock-constant-face) 30626 30627 (face font-lock-string-face) 30627 30641 nil 30641 30642 (face font-lock-string-face) 30642 30670 (face font-lock-constant-face) 30670 30671 (face font-lock-string-face) 30671 30685 nil 30685 30686 (face font-lock-string-face) 30686 30714 (face font-lock-constant-face) 30714 30715 (face font-lock-string-face) 30715 30729 nil 30729 30730 (face font-lock-string-face) 30730 30757 (face font-lock-constant-face) 30757 30758 (face font-lock-string-face) 30758 30783 nil 30783 30784 (face font-lock-string-face) 30784 30794 (face font-lock-keyword-face) 30794 30795 (face font-lock-string-face) 30795 30812 nil 30812 30813 (face font-lock-string-face) 30813 30834 (face font-lock-variable-name-face) 30834 30835 (face font-lock-string-face) 30835 30853 nil 30853 30854 (face font-lock-string-face) 30854 30866 (face font-lock-keyword-face) 30866 30867 (face font-lock-string-face) 30867 30887 nil 30887 30888 (face font-lock-string-face) 30888 30917 (face font-lock-function-name-face) 30917 30918 (face font-lock-string-face) 30918 30951 nil 30951 30952 (face font-lock-string-face) 30952 30959 (face font-lock-keyword-face) 30959 30960 (face font-lock-string-face) 30960 30980 nil 30980 30981 (face font-lock-string-face) 30981 31015 (face font-lock-constant-face) 31015 31016 (face font-lock-string-face) 31016 31064 nil 31064 31065 (face font-lock-string-face) 31065 31074 (face font-lock-variable-name-face) 31074 31075 (face font-lock-string-face) 31075 31093 nil 31093 31094 (face font-lock-string-face) 31094 31106 (face font-lock-keyword-face) 31106 31107 (face font-lock-string-face) 31107 31127 nil 31127 31128 (face font-lock-string-face) 31128 31175 (face font-lock-function-name-face) 31175 31176 (face font-lock-string-face) 31176 31194 nil 31194 31195 (face font-lock-string-face) 31195 31245 (face font-lock-function-name-face) 31245 31246 (face font-lock-string-face) 31246 31279 nil 31279 31280 (face font-lock-string-face) 31280 31287 (face font-lock-keyword-face) 31287 31288 (face font-lock-string-face) 31288 31308 nil 31308 31309 (face font-lock-string-face) 31309 31341 (face font-lock-constant-face) 31341 31342 (face font-lock-string-face) 31342 31423 nil 31423 31424 (face font-lock-string-face) 31424 31462 (face font-lock-variable-name-face) 31462 31463 (face 
font-lock-string-face) 31463 31473 nil 31473 31474 (face font-lock-string-face) 31474 31481 (face font-lock-keyword-face) 31481 31482 (face font-lock-string-face) 31482 31506 nil 31506 31507 (face font-lock-string-face) 31507 31518 (face font-lock-keyword-face) 31518 31519 (face font-lock-string-face) 31519 31521 nil 31521 31522 (face font-lock-string-face) 31522 31539 (face font-lock-function-name-face) 31539 31540 (face font-lock-string-face) 31540 31552 nil 31552 31553 (face font-lock-string-face) 31553 31557 (face font-lock-keyword-face) 31557 31558 (face font-lock-string-face) 31558 31560 nil 31560 31561 (face font-lock-string-face) 31561 31571 (face font-lock-type-face) 31571 31572 (face font-lock-string-face) 31572 31584 nil 31584 31585 (face font-lock-string-face) 31585 31597 (face font-lock-keyword-face) 31597 31598 (face font-lock-string-face) 31598 31614 nil 31614 31615 (face font-lock-string-face) 31615 31636 (face font-lock-function-name-face) 31636 31637 (face font-lock-string-face) 31637 31651 nil 31651 31652 (face font-lock-string-face) 31652 31670 (face font-lock-function-name-face) 31670 31671 (face font-lock-string-face) 31671 31696 nil 31696 31697 (face font-lock-string-face) 31697 31706 (face font-lock-keyword-face) 31706 31707 (face font-lock-string-face) 31707 31723 nil 31723 31724 (face font-lock-string-face) 31724 31728 (face font-lock-constant-face) 31728 31729 (face font-lock-string-face) 31729 31743 nil 31743 31744 (face font-lock-string-face) 31744 31748 (face font-lock-constant-face) 31748 31749 (face font-lock-string-face) 31749 31774 nil 31774 31775 (face font-lock-string-face) 31775 31782 (face font-lock-keyword-face) 31782 31783 (face font-lock-string-face) 31783 31799 nil 31799 31800 (face font-lock-string-face) 31800 31844 (face font-lock-constant-face) 31844 31845 (face font-lock-string-face) 31845 31893 nil 31893 31894 (face font-lock-string-face) 31894 31943 (face font-lock-variable-name-face) 31943 31944 (face font-lock-string-face) 31944 31954 nil 31954 31955 (face font-lock-string-face) 31955 31962 (face font-lock-keyword-face) 31962 31963 (face font-lock-string-face) 31963 31987 nil 31987 31988 (face font-lock-string-face) 31988 31999 (face font-lock-keyword-face) 31999 32000 (face font-lock-string-face) 32000 32002 nil 32002 32003 (face font-lock-string-face) 32003 32013 (face font-lock-function-name-face) 32013 32014 (face font-lock-string-face) 32014 32026 nil 32026 32027 (face font-lock-string-face) 32027 32031 (face font-lock-keyword-face) 32031 32032 (face font-lock-string-face) 32032 32034 nil 32034 32035 (face font-lock-string-face) 32035 32045 (face font-lock-type-face) 32045 32046 (face font-lock-string-face) 32046 32058 nil 32058 32059 (face font-lock-string-face) 32059 32071 (face font-lock-keyword-face) 32071 32072 (face font-lock-string-face) 32072 32088 nil 32088 32089 (face font-lock-string-face) 32089 32094 (face font-lock-function-name-face) 32094 32095 (face font-lock-string-face) 32095 32109 nil 32109 32110 (face font-lock-string-face) 32110 32121 (face font-lock-function-name-face) 32121 32122 (face font-lock-string-face) 32122 32136 nil 32136 32137 (face font-lock-string-face) 32137 32158 (face font-lock-function-name-face) 32158 32159 (face font-lock-string-face) 32159 32173 nil 32173 32174 (face font-lock-string-face) 32174 32192 (face font-lock-function-name-face) 32192 32193 (face font-lock-string-face) 32193 32218 nil 32218 32219 (face font-lock-string-face) 32219 32232 (face font-lock-keyword-face) 32232 32233 (face 
font-lock-string-face) 32233 32249 nil 32249 32250 (face font-lock-string-face) 32250 32259 (face font-lock-keyword-face) 32259 32260 (face font-lock-string-face) 32260 32278 nil 32278 32279 (face font-lock-string-face) 32279 32283 (face font-lock-constant-face) 32283 32284 (face font-lock-string-face) 32284 32300 nil 32300 32301 (face font-lock-string-face) 32301 32306 (face font-lock-constant-face) 32306 32307 (face font-lock-string-face) 32307 32323 nil 32323 32324 (face font-lock-string-face) 32324 32333 (face font-lock-constant-face) 32333 32334 (face font-lock-string-face) 32334 32350 nil 32350 32351 (face font-lock-string-face) 32351 32357 (face font-lock-constant-face) 32357 32358 (face font-lock-string-face) 32358 32398 nil 32398 32399 (face font-lock-string-face) 32399 32406 (face font-lock-keyword-face) 32406 32407 (face font-lock-string-face) 32407 32423 nil 32423 32424 (face font-lock-string-face) 32424 32462 (face font-lock-constant-face) 32462 32463 (face font-lock-string-face) 32463 32477 nil 32477 32478 (face font-lock-string-face) 32478 32515 (face font-lock-constant-face) 32515 32516 (face font-lock-string-face) 32516 32530 nil 32530 32531 (face font-lock-string-face) 32531 32568 (face font-lock-constant-face) 32568 32569 (face font-lock-string-face) 32569 32583 nil 32583 32584 (face font-lock-string-face) 32584 32620 (face font-lock-constant-face) 32620 32621 (face font-lock-string-face) 32621 32635 nil 32635 32636 (face font-lock-string-face) 32636 32666 (face font-lock-constant-face) 32666 32667 (face font-lock-string-face) 32667 32681 nil 32681 32682 (face font-lock-string-face) 32682 32720 (face font-lock-constant-face) 32720 32721 (face font-lock-string-face) 32721 32735 nil 32735 32736 (face font-lock-string-face) 32736 32773 (face font-lock-constant-face) 32773 32774 (face font-lock-string-face) 32774 32822 nil 32822 32823 (face font-lock-string-face) 32823 32838 (face font-lock-variable-name-face) 32838 32839 (face font-lock-string-face) 32839 32849 nil 32849 32850 (face font-lock-string-face) 32850 32857 (face font-lock-keyword-face) 32857 32858 (face font-lock-string-face) 32858 32882 nil 32882 32883 (face font-lock-string-face) 32883 32894 (face font-lock-keyword-face) 32894 32895 (face font-lock-string-face) 32895 32897 nil 32897 32898 (face font-lock-string-face) 32898 32912 (face font-lock-function-name-face) 32912 32913 (face font-lock-string-face) 32913 32925 nil 32925 32926 (face font-lock-string-face) 32926 32930 (face font-lock-keyword-face) 32930 32931 (face font-lock-string-face) 32931 32933 nil 32933 32934 (face font-lock-string-face) 32934 32948 (face font-lock-type-face) 32948 32949 (face font-lock-string-face) 32949 32961 nil 32961 32962 (face font-lock-string-face) 32962 32969 (face font-lock-keyword-face) 32969 32970 (face font-lock-string-face) 32970 32986 nil 32986 32987 (face font-lock-string-face) 32987 33022 (face font-lock-constant-face) 33022 33023 (face font-lock-string-face) 33023 33037 nil 33037 33038 (face font-lock-string-face) 33038 33072 (face font-lock-constant-face) 33072 33073 (face font-lock-string-face) 33073 33098 nil 33098 33099 (face font-lock-string-face) 33099 33111 (face font-lock-keyword-face) 33111 33112 (face font-lock-string-face) 33112 33128 nil 33128 33129 (face font-lock-string-face) 33129 33150 (face font-lock-function-name-face) 33150 33151 (face font-lock-string-face) 33151 33176 nil 33176 33177 (face font-lock-string-face) 33177 33189 (face font-lock-keyword-face) 33189 33190 (face font-lock-string-face) 
33190 33206 nil 33206 33207 (face font-lock-string-face) 33207 33209 (face font-lock-constant-face) 33209 33232 (face font-lock-variable-name-face) 33232 33239 (face font-lock-constant-face) 33239 33240 (face font-lock-string-face) 33240 33265 nil 33265 33266 (face font-lock-string-face) 33266 33273 (face font-lock-keyword-face) 33273 33274 (face font-lock-string-face) 33274 33306 nil 33306 33307 (face font-lock-string-face) 33307 33318 (face font-lock-keyword-face) 33318 33319 (face font-lock-string-face) 33319 33321 nil 33321 33322 (face font-lock-string-face) 33322 33342 (face font-lock-function-name-face) 33342 33343 (face font-lock-string-face) 33343 33359 nil 33359 33360 (face font-lock-string-face) 33360 33366 (face font-lock-keyword-face) 33366 33367 (face font-lock-string-face) 33367 33387 nil 33387 33388 (face font-lock-string-face) 33388 33434 (face font-lock-constant-face) 33434 33435 (face font-lock-string-face) 33435 33453 nil 33453 33454 (face font-lock-string-face) 33454 33519 (face font-lock-constant-face) 33519 33520 (face font-lock-string-face) 33520 33553 nil 33553 33554 (face font-lock-string-face) 33554 33561 (face font-lock-keyword-face) 33561 33562 (face font-lock-string-face) 33562 33582 nil 33582 33583 (face font-lock-string-face) 33583 33585 (face font-lock-constant-face) 33585 33608 (face font-lock-variable-name-face) 33608 33647 (face font-lock-constant-face) 33647 33648 (face font-lock-string-face) 33648 33681 nil 33681 33682 (face font-lock-string-face) 33682 33688 (face font-lock-keyword-face) 33688 33689 (face font-lock-string-face) 33689 33709 nil 33709 33710 (face font-lock-string-face) 33710 33716 (face font-lock-constant-face) 33716 33717 (face font-lock-string-face) 33717 33735 nil 33735 33736 (face font-lock-string-face) 33736 33738 (face font-lock-constant-face) 33738 33743 (face font-lock-variable-name-face) 33743 33788 (face font-lock-constant-face) 33788 33789 (face font-lock-string-face) 33789 33807 nil 33807 33808 (face font-lock-string-face) 33808 33810 (face font-lock-constant-face) 33810 33811 (face font-lock-string-face) 33811 33829 nil 33829 33830 (face font-lock-string-face) 33830 33833 (face font-lock-constant-face) 33833 33840 (face font-lock-variable-name-face) 33840 33841 (face font-lock-constant-face) 33841 33842 (face font-lock-string-face) 33842 33860 nil 33860 33861 (face font-lock-string-face) 33861 33864 (face font-lock-constant-face) 33864 33872 (face font-lock-variable-name-face) 33872 33873 (face font-lock-constant-face) 33873 33874 (face font-lock-string-face) 33874 33952 nil 33952 33953 (face font-lock-string-face) 33953 33964 (face font-lock-keyword-face) 33964 33965 (face font-lock-string-face) 33965 33967 nil 33967 33968 (face font-lock-string-face) 33968 33978 (face font-lock-function-name-face) 33978 33979 (face font-lock-string-face) 33979 33991 nil 33991 33992 (face font-lock-string-face) 33992 33996 (face font-lock-keyword-face) 33996 33997 (face font-lock-string-face) 33997 33999 nil 33999 34000 (face font-lock-string-face) 34000 34004 (face font-lock-type-face) 34004 34005 (face font-lock-string-face) 34005 34017 nil 34017 34018 (face font-lock-string-face) 34018 34030 (face font-lock-keyword-face) 34030 34031 (face font-lock-string-face) 34031 34035 nil 34035 34036 (face font-lock-string-face) 34036 34062 (face font-lock-function-name-face) 34062 34063 (face font-lock-string-face) 34063 34077 nil 34077 34078 (face font-lock-string-face) 34078 34087 (face font-lock-keyword-face) 34087 34088 (face 
font-lock-string-face) 34088 34104 nil 34104 34105 (face font-lock-string-face) 34105 34117 (face font-lock-variable-name-face) 34117 34118 (face font-lock-string-face) 34118 34120 nil 34120 34121 (face font-lock-string-face) 34121 34126 (face font-lock-variable-name-face) 34126 34127 (face font-lock-string-face) 34127 34141 nil 34141 34142 (face font-lock-string-face) 34142 34153 (face font-lock-variable-name-face) 34153 34154 (face font-lock-string-face) 34154 34156 nil 34156 34157 (face font-lock-string-face) 34157 34174 (face font-lock-variable-name-face) 34174 34175 (face font-lock-string-face) 34175 34200 nil 34200 34201 (face font-lock-string-face) 34201 34209 (face font-lock-keyword-face) 34209 34210 (face font-lock-string-face) 34210 34214 nil 34214 34215 (face font-lock-string-face) 34215 34233 (face font-lock-constant-face) 34233 34234 (face font-lock-string-face) 34234 34268 nil 34268 34287 (face font-lock-comment-face) 34287 34293 nil 34293 34365 (face font-lock-comment-face) 34365 34371 nil 34371 34372 (face font-lock-string-face) 34372 34379 (face font-lock-keyword-face) 34379 34380 (face font-lock-string-face) 34380 34404 nil 34404 34405 (face font-lock-string-face) 34405 34416 (face font-lock-keyword-face) 34416 34417 (face font-lock-string-face) 34417 34419 nil 34419 34420 (face font-lock-string-face) 34420 34436 (face font-lock-function-name-face) 34436 34437 (face font-lock-string-face) 34437 34449 nil 34449 34450 (face font-lock-string-face) 34450 34454 (face font-lock-keyword-face) 34454 34455 (face font-lock-string-face) 34455 34457 nil 34457 34458 (face font-lock-string-face) 34458 34468 (face font-lock-type-face) 34468 34469 (face font-lock-string-face) 34469 34481 nil 34481 34482 (face font-lock-string-face) 34482 34494 (face font-lock-keyword-face) 34494 34495 (face font-lock-string-face) 34495 34511 nil 34511 34512 (face font-lock-string-face) 34512 34517 (face font-lock-function-name-face) 34517 34518 (face font-lock-string-face) 34518 34532 nil 34532 34533 (face font-lock-string-face) 34533 34551 (face font-lock-function-name-face) 34551 34552 (face font-lock-string-face) 34552 34566 nil 34566 34567 (face font-lock-string-face) 34567 34588 (face font-lock-function-name-face) 34588 34589 (face font-lock-string-face) 34589 34603 nil 34603 34604 (face font-lock-string-face) 34604 34630 (face font-lock-function-name-face) 34630 34631 (face font-lock-string-face) 34631 34645 nil 34645 34646 (face font-lock-string-face) 34646 34680 (face font-lock-function-name-face) 34680 34681 (face font-lock-string-face) 34681 34695 nil 34695 34696 (face font-lock-string-face) 34696 34730 (face font-lock-function-name-face) 34730 34731 (face font-lock-string-face) 34731 34745 nil 34745 34746 (face font-lock-string-face) 34746 34772 (face font-lock-function-name-face) 34772 34773 (face font-lock-string-face) 34773 34787 nil 34787 34788 (face font-lock-string-face) 34788 34827 (face font-lock-function-name-face) 34827 34828 (face font-lock-string-face) 34828 34853 nil 34853 34854 (face font-lock-string-face) 34854 34861 (face font-lock-keyword-face) 34861 34862 (face font-lock-string-face) 34862 34878 nil 34878 34879 (face font-lock-string-face) 34879 34904 (face font-lock-constant-face) 34904 34905 (face font-lock-string-face) 34905 34930 nil 34930 34931 (face font-lock-string-face) 34931 34941 (face font-lock-keyword-face) 34941 34942 (face font-lock-string-face) 34942 34959 nil 34959 34960 (face font-lock-string-face) 34960 34981 (face font-lock-variable-name-face) 34981 34982 
(face font-lock-string-face) 34982 35000 nil 35000 35001 (face font-lock-string-face) 35001 35013 (face font-lock-keyword-face) 35013 35014 (face font-lock-string-face) 35014 35034 nil 35034 35077 (face font-lock-comment-face) 35077 35093 nil 35093 35123 (face font-lock-comment-face) 35123 35139 nil 35139 35164 (face font-lock-comment-face) 35164 35180 nil 35180 35194 (face font-lock-comment-face) 35194 35210 nil 35210 35211 (face font-lock-string-face) 35211 35240 (face font-lock-function-name-face) 35240 35241 (face font-lock-string-face) 35241 35274 nil 35274 35275 (face font-lock-string-face) 35275 35285 (face font-lock-keyword-face) 35285 35286 (face font-lock-string-face) 35286 35307 nil 35307 35308 (face font-lock-string-face) 35308 35329 (face font-lock-variable-name-face) 35329 35330 (face font-lock-string-face) 35330 35352 nil 35352 35353 (face font-lock-string-face) 35353 35365 (face font-lock-keyword-face) 35365 35366 (face font-lock-string-face) 35366 35390 nil 35390 35391 (face font-lock-string-face) 35391 35432 (face font-lock-function-name-face) 35432 35433 (face font-lock-string-face) 35433 35553 nil 35553 35554 (face font-lock-string-face) 35554 35565 (face font-lock-keyword-face) 35565 35566 (face font-lock-string-face) 35566 35568 nil 35568 35569 (face font-lock-string-face) 35569 35592 (face font-lock-function-name-face) 35592 35593 (face font-lock-string-face) 35593 35605 nil 35605 35606 (face font-lock-string-face) 35606 35610 (face font-lock-keyword-face) 35610 35611 (face font-lock-string-face) 35611 35613 nil 35613 35614 (face font-lock-string-face) 35614 35624 (face font-lock-type-face) 35624 35625 (face font-lock-string-face) 35625 35637 nil 35637 35638 (face font-lock-string-face) 35638 35650 (face font-lock-keyword-face) 35650 35651 (face font-lock-string-face) 35651 35667 nil 35667 35668 (face font-lock-string-face) 35668 35673 (face font-lock-function-name-face) 35673 35674 (face font-lock-string-face) 35674 35688 nil 35688 35689 (face font-lock-string-face) 35689 35707 (face font-lock-function-name-face) 35707 35708 (face font-lock-string-face) 35708 35722 nil 35722 35723 (face font-lock-string-face) 35723 35757 (face font-lock-function-name-face) 35757 35758 (face font-lock-string-face) 35758 35772 nil 35772 35773 (face font-lock-string-face) 35773 35799 (face font-lock-function-name-face) 35799 35800 (face font-lock-string-face) 35800 35814 nil 35814 35815 (face font-lock-string-face) 35815 35841 (face font-lock-function-name-face) 35841 35842 (face font-lock-string-face) 35842 35856 nil 35856 35857 (face font-lock-string-face) 35857 35896 (face font-lock-function-name-face) 35896 35897 (face font-lock-string-face) 35897 35922 nil 35922 35923 (face font-lock-string-face) 35923 35930 (face font-lock-keyword-face) 35930 35931 (face font-lock-string-face) 35931 35947 nil 35947 35948 (face font-lock-string-face) 35948 35970 (face font-lock-constant-face) 35970 35971 (face font-lock-string-face) 35971 35985 nil 35985 35986 (face font-lock-string-face) 35986 36011 (face font-lock-constant-face) 36011 36012 (face font-lock-string-face) 36012 36026 nil 36026 36027 (face font-lock-string-face) 36027 36060 (face font-lock-constant-face) 36060 36061 (face font-lock-string-face) 36061 36075 nil 36075 36076 (face font-lock-string-face) 36076 36117 (face font-lock-constant-face) 36117 36118 (face font-lock-string-face) 36118 36143 nil 36143 36144 (face font-lock-string-face) 36144 36154 (face font-lock-keyword-face) 36154 36155 (face font-lock-string-face) 36155 36172 
nil 36172 36173 (face font-lock-string-face) 36173 36198 (face font-lock-variable-name-face) 36198 36199 (face font-lock-string-face) 36199 36217 nil 36217 36218 (face font-lock-string-face) 36218 36228 (face font-lock-keyword-face) 36228 36229 (face font-lock-string-face) 36229 36250 nil 36250 36251 (face font-lock-string-face) 36251 36272 (face font-lock-variable-name-face) 36272 36273 (face font-lock-string-face) 36273 36295 nil 36295 36296 (face font-lock-string-face) 36296 36308 (face font-lock-keyword-face) 36308 36309 (face font-lock-string-face) 36309 36333 nil 36333 36334 (face font-lock-string-face) 36334 36375 (face font-lock-function-name-face) 36375 36376 (face font-lock-string-face) 36376 36496 nil 36496 36497 (face font-lock-string-face) 36497 36508 (face font-lock-keyword-face) 36508 36509 (face font-lock-string-face) 36509 36511 nil 36511 36512 (face font-lock-string-face) 36512 36524 (face font-lock-function-name-face) 36524 36525 (face font-lock-string-face) 36525 36537 nil 36537 36538 (face font-lock-string-face) 36538 36542 (face font-lock-keyword-face) 36542 36543 (face font-lock-string-face) 36543 36545 nil 36545 36546 (face font-lock-string-face) 36546 36556 (face font-lock-type-face) 36556 36557 (face font-lock-string-face) 36557 36569 nil 36569 36570 (face font-lock-string-face) 36570 36582 (face font-lock-keyword-face) 36582 36583 (face font-lock-string-face) 36583 36599 nil 36599 36600 (face font-lock-string-face) 36600 36605 (face font-lock-function-name-face) 36605 36606 (face font-lock-string-face) 36606 36620 nil 36620 36621 (face font-lock-string-face) 36621 36642 (face font-lock-function-name-face) 36642 36643 (face font-lock-string-face) 36643 36657 nil 36657 36658 (face font-lock-string-face) 36658 36697 (face font-lock-function-name-face) 36697 36698 (face font-lock-string-face) 36698 36723 nil 36723 36724 (face font-lock-string-face) 36724 36731 (face font-lock-keyword-face) 36731 36732 (face font-lock-string-face) 36732 36748 nil 36748 36749 (face font-lock-string-face) 36749 36782 (face font-lock-constant-face) 36782 36783 (face font-lock-string-face) 36783 36829 nil 36829 36830 (face font-lock-string-face) 36830 36841 (face font-lock-keyword-face) 36841 36842 (face font-lock-string-face) 36842 36844 nil 36844 36845 (face font-lock-string-face) 36845 36856 (face font-lock-function-name-face) 36856 36857 (face font-lock-string-face) 36857 36869 nil 36869 36870 (face font-lock-string-face) 36870 36874 (face font-lock-keyword-face) 36874 36875 (face font-lock-string-face) 36875 36877 nil 36877 36878 (face font-lock-string-face) 36878 36888 (face font-lock-type-face) 36888 36889 (face font-lock-string-face) 36889 36901 nil 36901 36902 (face font-lock-string-face) 36902 36914 (face font-lock-keyword-face) 36914 36915 (face font-lock-string-face) 36915 36931 nil 36931 36932 (face font-lock-string-face) 36932 36937 (face font-lock-function-name-face) 36937 36938 (face font-lock-string-face) 36938 36952 nil 36952 36953 (face font-lock-string-face) 36953 36974 (face font-lock-function-name-face) 36974 36975 (face font-lock-string-face) 36975 36989 nil 36989 36990 (face font-lock-string-face) 36990 37029 (face font-lock-function-name-face) 37029 37030 (face font-lock-string-face) 37030 37055 nil 37055 37056 (face font-lock-string-face) 37056 37063 (face font-lock-keyword-face) 37063 37064 (face font-lock-string-face) 37064 37080 nil 37080 37081 (face font-lock-string-face) 37081 37113 (face font-lock-constant-face) 37113 37114 (face font-lock-string-face) 
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples000755 000766 000024 00000004500 12455173731 030735 0ustar00iojsstaff000000 000000 #!/usr/bin/python
# Copyright (c) 2009 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

import os.path
import shutil
import sys

gyps = [
    'app/app.gyp',
    'base/base.gyp',
    'build/temp_gyp/googleurl.gyp',
    'build/all.gyp',
    'build/common.gypi',
    'build/external_code.gypi',
    'chrome/test/security_tests/security_tests.gyp',
    'chrome/third_party/hunspell/hunspell.gyp',
    'chrome/chrome.gyp',
    'media/media.gyp',
    'net/net.gyp',
    'printing/printing.gyp',
    'sdch/sdch.gyp',
    'skia/skia.gyp',
    'testing/gmock.gyp',
    'testing/gtest.gyp',
    'third_party/bzip2/bzip2.gyp',
    'third_party/icu38/icu38.gyp',
    'third_party/libevent/libevent.gyp',
    'third_party/libjpeg/libjpeg.gyp',
    'third_party/libpng/libpng.gyp',
    'third_party/libxml/libxml.gyp',
    'third_party/libxslt/libxslt.gyp',
    'third_party/lzma_sdk/lzma_sdk.gyp',
    'third_party/modp_b64/modp_b64.gyp',
    'third_party/npapi/npapi.gyp',
    'third_party/sqlite/sqlite.gyp',
    'third_party/zlib/zlib.gyp',
    'v8/tools/gyp/v8.gyp',
    'webkit/activex_shim/activex_shim.gyp',
    'webkit/activex_shim_dll/activex_shim_dll.gyp',
    'webkit/build/action_csspropertynames.py',
    'webkit/build/action_cssvaluekeywords.py',
    'webkit/build/action_jsconfig.py',
    'webkit/build/action_makenames.py',
    'webkit/build/action_maketokenizer.py',
    'webkit/build/action_useragentstylesheets.py',
    'webkit/build/rule_binding.py',
    'webkit/build/rule_bison.py',
    'webkit/build/rule_gperf.py',
    'webkit/tools/test_shell/test_shell.gyp',
    'webkit/webkit.gyp',
]

def Main(argv):
  if len(argv) != 3 or argv[1] not in ['push', 'pull']:
    print 'Usage: %s push/pull PATH_TO_CHROME' % argv[0]
    return 1

  path_to_chrome = argv[2]

  for g in gyps:
    chrome_file = os.path.join(path_to_chrome, g)
    local_file = os.path.join(os.path.dirname(argv[0]), os.path.split(g)[1])
    if argv[1] == 'push':
      print 'Copying %s to %s' % (local_file, chrome_file)
      shutil.copyfile(local_file, chrome_file)
    elif argv[1] == 'pull':
      print 'Copying %s to %s' % (chrome_file, local_file)
      shutil.copyfile(chrome_file, local_file)
    else:
      assert False

  return 0

if __name__ == '__main__':
  sys.exit(Main(sys.argv))
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/samples/samples.bat000644 000766 000024 00000000304 12455173731 031475
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@rem Copyright (c) 2009 Google Inc. All rights reserved. @rem Use of this source code is governed by a BSD-style license that can be @rem found in the LICENSE file. @python %~dp0/samples %* ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/��������������������000755 �000766 �000024 �00000000000 12456115117 027611� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/__init__.py���������000755 �000766 �000024 �00000051741 12455173731 031742� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import copy import gyp.input import optparse import os.path import re import shlex import sys import traceback from gyp.common import GypError # Default debug modes for GYP debug = {} # List of "official" debug modes, but you can use anything you like. DEBUG_GENERAL = 'general' DEBUG_VARIABLES = 'variables' DEBUG_INCLUDES = 'includes' def DebugOutput(mode, message, *args): if 'all' in gyp.debug or mode in gyp.debug: ctx = ('unknown', 0, 'unknown') try: f = traceback.extract_stack(limit=2) if f: ctx = f[0][:3] except: pass if args: message %= args print '%s:%s:%d:%s %s' % (mode.upper(), os.path.basename(ctx[0]), ctx[1], ctx[2], message) def FindBuildFiles(): extension = '.gyp' files = os.listdir(os.getcwd()) build_files = [] for file in files: if file.endswith(extension): build_files.append(file) return build_files def Load(build_files, format, default_variables={}, includes=[], depth='.', params=None, check=False, circular_check=True): """ Loads one or more specified build files. default_variables and includes will be copied before use. Returns the generator for the specified format and the data returned by loading the specified build files. 
""" if params is None: params = {} flavor = None if '-' in format: format, params['flavor'] = format.split('-', 1) default_variables = copy.copy(default_variables) # Default variables provided by this program and its modules should be # named WITH_CAPITAL_LETTERS to provide a distinct "best practice" namespace, # avoiding collisions with user and automatic variables. default_variables['GENERATOR'] = format # Format can be a custom python file, or by default the name of a module # within gyp.generator. if format.endswith('.py'): generator_name = os.path.splitext(format)[0] path, generator_name = os.path.split(generator_name) # Make sure the path to the custom generator is in sys.path # Don't worry about removing it once we are done. Keeping the path # to each generator that is used in sys.path is likely harmless and # arguably a good idea. path = os.path.abspath(path) if path not in sys.path: sys.path.insert(0, path) else: generator_name = 'gyp.generator.' + format # These parameters are passed in order (as opposed to by key) # because ActivePython cannot handle key parameters to __import__. generator = __import__(generator_name, globals(), locals(), generator_name) for (key, val) in generator.generator_default_variables.items(): default_variables.setdefault(key, val) # Give the generator the opportunity to set additional variables based on # the params it will receive in the output phase. if getattr(generator, 'CalculateVariables', None): generator.CalculateVariables(default_variables, params) # Give the generator the opportunity to set generator_input_info based on # the params it will receive in the output phase. if getattr(generator, 'CalculateGeneratorInputInfo', None): generator.CalculateGeneratorInputInfo(params) # Fetch the generator specific info that gets fed to input, we use getattr # so we can default things and the generators only have to provide what # they need. generator_input_info = { 'non_configuration_keys': getattr(generator, 'generator_additional_non_configuration_keys', []), 'path_sections': getattr(generator, 'generator_additional_path_sections', []), 'extra_sources_for_rules': getattr(generator, 'generator_extra_sources_for_rules', []), 'generator_supports_multiple_toolsets': getattr(generator, 'generator_supports_multiple_toolsets', False), 'generator_wants_static_library_dependencies_adjusted': getattr(generator, 'generator_wants_static_library_dependencies_adjusted', True), 'generator_wants_sorted_dependencies': getattr(generator, 'generator_wants_sorted_dependencies', False), 'generator_filelist_paths': getattr(generator, 'generator_filelist_paths', None), } # Process the input specific to this generator. result = gyp.input.Load(build_files, default_variables, includes[:], depth, generator_input_info, check, circular_check, params['parallel'], params['root_targets']) return [generator] + result def NameValueListToDict(name_value_list): """ Takes an array of strings of the form 'NAME=VALUE' and creates a dictionary of the pairs. If a string is simply NAME, then the value in the dictionary is set to True. If VALUE can be converted to an integer, it is. """ result = { } for item in name_value_list: tokens = item.split('=', 1) if len(tokens) == 2: # If we can make it an int, use that, otherwise, use the string. try: token_value = int(tokens[1]) except ValueError: token_value = tokens[1] # Set the variable to the supplied value. result[tokens[0]] = token_value else: # No value supplied, treat it as a boolean and set it. 
result[tokens[0]] = True return result def ShlexEnv(env_name): flags = os.environ.get(env_name, []) if flags: flags = shlex.split(flags) return flags def FormatOpt(opt, value): if opt.startswith('--'): return '%s=%s' % (opt, value) return opt + value def RegenerateAppendFlag(flag, values, predicate, env_name, options): """Regenerate a list of command line flags, for an option of action='append'. The |env_name|, if given, is checked in the environment and used to generate an initial list of options, then the options that were specified on the command line (given in |values|) are appended. This matches the handling of environment variables and command line flags where command line flags override the environment, while not requiring the environment to be set when the flags are used again. """ flags = [] if options.use_environment and env_name: for flag_value in ShlexEnv(env_name): value = FormatOpt(flag, predicate(flag_value)) if value in flags: flags.remove(value) flags.append(value) if values: for flag_value in values: flags.append(FormatOpt(flag, predicate(flag_value))) return flags def RegenerateFlags(options): """Given a parsed options object, and taking the environment variables into account, returns a list of flags that should regenerate an equivalent options object (even in the absence of the environment variables.) Any path options will be normalized relative to depth. The format flag is not included, as it is assumed the calling generator will set that as appropriate. """ def FixPath(path): path = gyp.common.FixIfRelativePath(path, options.depth) if not path: return os.path.curdir return path def Noop(value): return value # We always want to ignore the environment when regenerating, to avoid # duplicate or changed flags in the environment at the time of regeneration. flags = ['--ignore-environment'] for name, metadata in options._regeneration_metadata.iteritems(): opt = metadata['opt'] value = getattr(options, name) value_predicate = metadata['type'] == 'path' and FixPath or Noop action = metadata['action'] env_name = metadata['env_name'] if action == 'append': flags.extend(RegenerateAppendFlag(opt, value, value_predicate, env_name, options)) elif action in ('store', None): # None is a synonym for 'store'. if value: flags.append(FormatOpt(opt, value_predicate(value))) elif options.use_environment and env_name and os.environ.get(env_name): flags.append(FormatOpt(opt, value_predicate(os.environ.get(env_name)))) elif action in ('store_true', 'store_false'): if ((action == 'store_true' and value) or (action == 'store_false' and not value)): flags.append(opt) elif options.use_environment and env_name: print >>sys.stderr, ('Warning: environment regeneration unimplemented ' 'for %s flag %r env_name %r' % (action, opt, env_name)) else: print >>sys.stderr, ('Warning: regeneration unimplemented for action %r ' 'flag %r' % (action, opt)) return flags class RegeneratableOptionParser(optparse.OptionParser): def __init__(self): self.__regeneratable_options = {} optparse.OptionParser.__init__(self) def add_option(self, *args, **kw): """Add an option to the parser. This accepts the same arguments as OptionParser.add_option, plus the following: regenerate: can be set to False to prevent this option from being included in regeneration. env_name: name of environment variable that additional values for this option come from. 
type: adds type='path', to tell the regenerator that the values of this option need to be made relative to options.depth """ env_name = kw.pop('env_name', None) if 'dest' in kw and kw.pop('regenerate', True): dest = kw['dest'] # The path type is needed for regenerating, for optparse we can just treat # it as a string. type = kw.get('type') if type == 'path': kw['type'] = 'string' self.__regeneratable_options[dest] = { 'action': kw.get('action'), 'type': type, 'env_name': env_name, 'opt': args[0], } optparse.OptionParser.add_option(self, *args, **kw) def parse_args(self, *args): values, args = optparse.OptionParser.parse_args(self, *args) values._regeneration_metadata = self.__regeneratable_options return values, args def gyp_main(args): my_name = os.path.basename(sys.argv[0]) parser = RegeneratableOptionParser() usage = 'usage: %s [options ...] [build_file ...]' parser.set_usage(usage.replace('%s', '%prog')) parser.add_option('--build', dest='configs', action='append', help='configuration for build after project generation') parser.add_option('--check', dest='check', action='store_true', help='check format of gyp files') parser.add_option('--config-dir', dest='config_dir', action='store', env_name='GYP_CONFIG_DIR', default=None, help='The location for configuration files like ' 'include.gypi.') parser.add_option('-d', '--debug', dest='debug', metavar='DEBUGMODE', action='append', default=[], help='turn on a debugging ' 'mode for debugging GYP. Supported modes are "variables", ' '"includes" and "general" or "all" for all of them.') parser.add_option('-D', dest='defines', action='append', metavar='VAR=VAL', env_name='GYP_DEFINES', help='sets variable VAR to value VAL') parser.add_option('--depth', dest='depth', metavar='PATH', type='path', help='set DEPTH gyp variable to a relative path to PATH') parser.add_option('-f', '--format', dest='formats', action='append', env_name='GYP_GENERATORS', regenerate=False, help='output formats to generate') parser.add_option('-G', dest='generator_flags', action='append', default=[], metavar='FLAG=VAL', env_name='GYP_GENERATOR_FLAGS', help='sets generator flag FLAG to VAL') parser.add_option('--generator-output', dest='generator_output', action='store', default=None, metavar='DIR', type='path', env_name='GYP_GENERATOR_OUTPUT', help='puts generated build files under DIR') parser.add_option('--ignore-environment', dest='use_environment', action='store_false', default=True, regenerate=False, help='do not read options from environment variables') parser.add_option('-I', '--include', dest='includes', action='append', metavar='INCLUDE', type='path', help='files to include in all loaded .gyp files') # --no-circular-check disables the check for circular relationships between # .gyp files. These relationships should not exist, but they've only been # observed to be harmful with the Xcode generator. Chromium's .gyp files # currently have some circular relationships on non-Mac platforms, so this # option allows the strict behavior to be used on Macs and the lenient # behavior to be used elsewhere. # TODO(mark): Remove this option when http://crbug.com/35878 is fixed. 
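# ---------------------------------------------------------------------------
# Editor's sketch (not part of gyp): a minimal standalone illustration of the
# flag-regeneration scheme documented above. Values taken from an environment
# variable (split with shlex) are rendered first, then command-line values are
# appended so they override the environment, and each value is turned back
# into a flag the way FormatOpt() does ('--opt=val' for long options, '-Dval'
# for short ones). The underscored helper names below are illustrative
# stand-ins, not gyp's API.
# ---------------------------------------------------------------------------
import os
import shlex

def _format_opt(opt, value):
  # Long options use '=', short options are concatenated, as in FormatOpt().
  if opt.startswith('--'):
    return '%s=%s' % (opt, value)
  return opt + value

def _regenerate_append_flag(flag, cli_values, env_name, environ=None):
  environ = os.environ if environ is None else environ
  flags = []
  for env_value in shlex.split(environ.get(env_name, '')):
    rendered = _format_opt(flag, env_value)
    if rendered in flags:
      flags.remove(rendered)
    flags.append(rendered)
  for cli_value in (cli_values or []):
    flags.append(_format_opt(flag, cli_value))
  return flags

# Example: GYP_DEFINES from the environment plus one -D given on the command
# line regenerate into a single, ordered flag list.
print(_regenerate_append_flag('-D', ['clang=1'], 'GYP_DEFINES',
                              {'GYP_DEFINES': 'component=shared_library'}))
# -> ['-Dcomponent=shared_library', '-Dclang=1']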
parser.add_option('--no-circular-check', dest='circular_check', action='store_false', default=True, regenerate=False, help="don't check for circular relationships between files") parser.add_option('--no-parallel', action='store_true', default=False, help='Disable multiprocessing') parser.add_option('-S', '--suffix', dest='suffix', default='', help='suffix to add to generated files') parser.add_option('--toplevel-dir', dest='toplevel_dir', action='store', default=None, metavar='DIR', type='path', help='directory to use as the root of the source tree') parser.add_option('-R', '--root-target', dest='root_targets', action='append', metavar='TARGET', help='include only TARGET and its deep dependencies') options, build_files_arg = parser.parse_args(args) build_files = build_files_arg # Set up the configuration directory (defaults to ~/.gyp) if not options.config_dir: home = None home_dot_gyp = None if options.use_environment: home_dot_gyp = os.environ.get('GYP_CONFIG_DIR', None) if home_dot_gyp: home_dot_gyp = os.path.expanduser(home_dot_gyp) if not home_dot_gyp: home_vars = ['HOME'] if sys.platform in ('cygwin', 'win32'): home_vars.append('USERPROFILE') for home_var in home_vars: home = os.getenv(home_var) if home != None: home_dot_gyp = os.path.join(home, '.gyp') if not os.path.exists(home_dot_gyp): home_dot_gyp = None else: break else: home_dot_gyp = os.path.expanduser(options.config_dir) if home_dot_gyp and not os.path.exists(home_dot_gyp): home_dot_gyp = None if not options.formats: # If no format was given on the command line, then check the env variable. generate_formats = [] if options.use_environment: generate_formats = os.environ.get('GYP_GENERATORS', []) if generate_formats: generate_formats = re.split('[\s,]', generate_formats) if generate_formats: options.formats = generate_formats else: # Nothing in the variable, default based on platform. if sys.platform == 'darwin': options.formats = ['xcode'] elif sys.platform in ('win32', 'cygwin'): options.formats = ['msvs'] else: options.formats = ['make'] if not options.generator_output and options.use_environment: g_o = os.environ.get('GYP_GENERATOR_OUTPUT') if g_o: options.generator_output = g_o options.parallel = not options.no_parallel for mode in options.debug: gyp.debug[mode] = 1 # Do an extra check to avoid work when we're not debugging. if DEBUG_GENERAL in gyp.debug: DebugOutput(DEBUG_GENERAL, 'running with these options:') for option, value in sorted(options.__dict__.items()): if option[0] == '_': continue if isinstance(value, basestring): DebugOutput(DEBUG_GENERAL, " %s: '%s'", option, value) else: DebugOutput(DEBUG_GENERAL, " %s: %s", option, value) if not build_files: build_files = FindBuildFiles() if not build_files: raise GypError((usage + '\n\n%s: error: no build_file') % (my_name, my_name)) # TODO(mark): Chromium-specific hack! # For Chromium, the gyp "depth" variable should always be a relative path # to Chromium's top-level "src" directory. If no depth variable was set # on the command line, try to find a "src" directory by looking at the # absolute path to each build file's directory. The first "src" component # found will be treated as though it were the path used for --depth. 
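# ---------------------------------------------------------------------------
# Editor's sketch (not part of gyp): the generator-format fallback implemented
# just above, pulled out as a standalone helper for clarity. GYP_GENERATORS
# wins when set (split on whitespace or commas); otherwise the default depends
# on the platform: xcode on OS X, msvs on Windows/Cygwin, make elsewhere. The
# helper name and the sample inputs are illustrative only.
# ---------------------------------------------------------------------------
import os
import re
import sys

def _default_formats(environ=None, platform=None):
  environ = os.environ if environ is None else environ
  platform = sys.platform if platform is None else platform
  generate_formats = environ.get('GYP_GENERATORS', '')
  if generate_formats:
    return re.split(r'[\s,]', generate_formats)
  if platform == 'darwin':
    return ['xcode']
  if platform in ('win32', 'cygwin'):
    return ['msvs']
  return ['make']

print(_default_formats({}, 'darwin'))                                # ['xcode']
print(_default_formats({'GYP_GENERATORS': 'ninja,make'}, 'linux2'))  # ['ninja', 'make']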
if not options.depth: for build_file in build_files: build_file_dir = os.path.abspath(os.path.dirname(build_file)) build_file_dir_components = build_file_dir.split(os.path.sep) components_len = len(build_file_dir_components) for index in xrange(components_len - 1, -1, -1): if build_file_dir_components[index] == 'src': options.depth = os.path.sep.join(build_file_dir_components) break del build_file_dir_components[index] # If the inner loop found something, break without advancing to another # build file. if options.depth: break if not options.depth: raise GypError('Could not automatically locate src directory. This is' 'a temporary Chromium feature that will be removed. Use' '--depth as a workaround.') # If toplevel-dir is not set, we assume that depth is the root of our source # tree. if not options.toplevel_dir: options.toplevel_dir = options.depth # -D on the command line sets variable defaults - D isn't just for define, # it's for default. Perhaps there should be a way to force (-F?) a # variable's value so that it can't be overridden by anything else. cmdline_default_variables = {} defines = [] if options.use_environment: defines += ShlexEnv('GYP_DEFINES') if options.defines: defines += options.defines cmdline_default_variables = NameValueListToDict(defines) if DEBUG_GENERAL in gyp.debug: DebugOutput(DEBUG_GENERAL, "cmdline_default_variables: %s", cmdline_default_variables) # Set up includes. includes = [] # If ~/.gyp/include.gypi exists, it'll be forcibly included into every # .gyp file that's loaded, before anything else is included. if home_dot_gyp != None: default_include = os.path.join(home_dot_gyp, 'include.gypi') if os.path.exists(default_include): print 'Using overrides found in ' + default_include includes.append(default_include) # Command-line --include files come after the default include. if options.includes: includes.extend(options.includes) # Generator flags should be prefixed with the target generator since they # are global across all generator runs. gen_flags = [] if options.use_environment: gen_flags += ShlexEnv('GYP_GENERATOR_FLAGS') if options.generator_flags: gen_flags += options.generator_flags generator_flags = NameValueListToDict(gen_flags) if DEBUG_GENERAL in gyp.debug.keys(): DebugOutput(DEBUG_GENERAL, "generator_flags: %s", generator_flags) # Generate all requested formats (use a set in case we got one format request # twice) for format in set(options.formats): params = {'options': options, 'build_files': build_files, 'generator_flags': generator_flags, 'cwd': os.getcwd(), 'build_files_arg': build_files_arg, 'gyp_binary': sys.argv[0], 'home_dot_gyp': home_dot_gyp, 'parallel': options.parallel, 'root_targets': options.root_targets} # Start with the default variables from the command line. [generator, flat_list, targets, data] = Load(build_files, format, cmdline_default_variables, includes, options.depth, params, options.check, options.circular_check) # TODO(mark): Pass |data| for now because the generator needs a list of # build files that came in. In the future, maybe it should just accept # a list, and not the whole data dict. # NOTE: flat_list is the flattened dependency graph specifying the order # that targets may be built. Build systems that operate serially or that # need to have dependencies defined before dependents reference them should # generate targets in the order specified in flat_list. 
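# ---------------------------------------------------------------------------
# Editor's sketch (not part of gyp): how the -D/GYP_DEFINES values gathered
# above become the cmdline_default_variables dict, mirroring
# NameValueListToDict(): a bare NAME maps to True, and a VALUE that parses as
# an integer is stored as an int. The helper name and sample input are
# illustrative only.
# ---------------------------------------------------------------------------
import shlex

def _name_value_list_to_dict(name_value_list):
  result = {}
  for item in name_value_list:
    tokens = item.split('=', 1)
    if len(tokens) == 2:
      try:
        result[tokens[0]] = int(tokens[1])
      except ValueError:
        result[tokens[0]] = tokens[1]
    else:
      # No value supplied; treat it as a boolean switch.
      result[tokens[0]] = True
  return result

# e.g. GYP_DEFINES='component=shared_library use_goma=1 clang'
print(_name_value_list_to_dict(
    shlex.split('component=shared_library use_goma=1 clang')))
# -> {'component': 'shared_library', 'use_goma': 1, 'clang': True}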
generator.GenerateOutput(flat_list, targets, data, params) if options.configs: valid_configs = targets[flat_list[0]]['configurations'].keys() for conf in options.configs: if conf not in valid_configs: raise GypError('Invalid config specified via --build: %s' % conf) generator.PerformBuild(data, options.configs, params) # Done return 0 def main(args): try: return gyp_main(args) except GypError, e: sys.stderr.write("gyp: %s\n" % e) return 1 # NOTE: setuptools generated console_scripts calls function with no arguments def script_main(): return main(sys.argv[1:]) if __name__ == '__main__': sys.exit(script_main()) �������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common.py�����������000644 �000766 �000024 �00000041776 12455173731 031477� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. from __future__ import with_statement import errno import filecmp import os.path import re import tempfile import sys # A minimal memoizing decorator. It'll blow up if the args aren't immutable, # among other "problems". class memoize(object): def __init__(self, func): self.func = func self.cache = {} def __call__(self, *args): try: return self.cache[args] except KeyError: result = self.func(*args) self.cache[args] = result return result class GypError(Exception): """Error class representing an error, which is to be presented to the user. The main entry point will catch and display this. """ pass def ExceptionAppend(e, msg): """Append a message to the given exception's message.""" if not e.args: e.args = (msg,) elif len(e.args) == 1: e.args = (str(e.args[0]) + ' ' + msg,) else: e.args = (str(e.args[0]) + ' ' + msg,) + e.args[1:] def FindQualifiedTargets(target, qualified_list): """ Given a list of qualified targets, return the qualified targets for the specified |target|. """ return [t for t in qualified_list if ParseQualifiedTarget(t)[1] == target] def ParseQualifiedTarget(target): # Splits a qualified target into a build file, target name and toolset. # NOTE: rsplit is used to disambiguate the Windows drive letter separator. target_split = target.rsplit(':', 1) if len(target_split) == 2: [build_file, target] = target_split else: build_file = None target_split = target.rsplit('#', 1) if len(target_split) == 2: [target, toolset] = target_split else: toolset = None return [build_file, target, toolset] def ResolveTarget(build_file, target, toolset): # This function resolves a target into a canonical form: # - a fully defined build file, either absolute or relative to the current # directory # - a target name # - a toolset # # build_file is the file relative to which 'target' is defined. # target is the qualified target. # toolset is the default toolset for that target. [parsed_build_file, target, parsed_toolset] = ParseQualifiedTarget(target) if parsed_build_file: if build_file: # If a relative path, parsed_build_file is relative to the directory # containing build_file. If build_file is not in the current directory, # parsed_build_file is not a usable path as-is. 
Resolve it by # interpreting it as relative to build_file. If parsed_build_file is # absolute, it is usable as a path regardless of the current directory, # and os.path.join will return it as-is. build_file = os.path.normpath(os.path.join(os.path.dirname(build_file), parsed_build_file)) # Further (to handle cases like ../cwd), make it relative to cwd) if not os.path.isabs(build_file): build_file = RelativePath(build_file, '.') else: build_file = parsed_build_file if parsed_toolset: toolset = parsed_toolset return [build_file, target, toolset] def BuildFile(fully_qualified_target): # Extracts the build file from the fully qualified target. return ParseQualifiedTarget(fully_qualified_target)[0] def GetEnvironFallback(var_list, default): """Look up a key in the environment, with fallback to secondary keys and finally falling back to a default value.""" for var in var_list: if var in os.environ: return os.environ[var] return default def QualifiedTarget(build_file, target, toolset): # "Qualified" means the file that a target was defined in and the target # name, separated by a colon, suffixed by a # and the toolset name: # /path/to/file.gyp:target_name#toolset fully_qualified = build_file + ':' + target if toolset: fully_qualified = fully_qualified + '#' + toolset return fully_qualified @memoize def RelativePath(path, relative_to): # Assuming both |path| and |relative_to| are relative to the current # directory, returns a relative path that identifies path relative to # relative_to. # Convert to normalized (and therefore absolute paths). path = os.path.realpath(path) relative_to = os.path.realpath(relative_to) # On Windows, we can't create a relative path to a different drive, so just # use the absolute path. if sys.platform == 'win32': if (os.path.splitdrive(path)[0].lower() != os.path.splitdrive(relative_to)[0].lower()): return path # Split the paths into components. path_split = path.split(os.path.sep) relative_to_split = relative_to.split(os.path.sep) # Determine how much of the prefix the two paths share. prefix_len = len(os.path.commonprefix([path_split, relative_to_split])) # Put enough ".." components to back up out of relative_to to the common # prefix, and then append the part of path_split after the common prefix. relative_split = [os.path.pardir] * (len(relative_to_split) - prefix_len) + \ path_split[prefix_len:] if len(relative_split) == 0: # The paths were the same. return '' # Turn it back into a string and we're done. return os.path.join(*relative_split) @memoize def InvertRelativePath(path, toplevel_dir=None): """Given a path like foo/bar that is relative to toplevel_dir, return the inverse relative path back to the toplevel_dir. E.g. os.path.normpath(os.path.join(path, InvertRelativePath(path))) should always produce the empty string, unless the path contains symlinks. """ if not path: return path toplevel_dir = '.' if toplevel_dir is None else toplevel_dir return RelativePath(toplevel_dir, os.path.join(toplevel_dir, path)) def FixIfRelativePath(path, relative_to): # Like RelativePath but returns |path| unchanged if it is absolute. if os.path.isabs(path): return path return RelativePath(path, relative_to) def UnrelativePath(path, relative_to): # Assuming that |relative_to| is relative to the current directory, and |path| # is a path relative to the dirname of |relative_to|, returns a path that # identifies |path| relative to the current directory. 
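# Illustrative example (paths are hypothetical): UnrelativePath('baz.gypi',
# 'foo/bar.gyp') returns 'foo/baz.gypi'.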
rel_dir = os.path.dirname(relative_to) return os.path.normpath(os.path.join(rel_dir, path)) # re objects used by EncodePOSIXShellArgument. See IEEE 1003.1 XCU.2.2 at # http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_02 # and the documentation for various shells. # _quote is a pattern that should match any argument that needs to be quoted # with double-quotes by EncodePOSIXShellArgument. It matches the following # characters appearing anywhere in an argument: # \t, \n, space parameter separators # # comments # $ expansions (quoted to always expand within one argument) # % called out by IEEE 1003.1 XCU.2.2 # & job control # ' quoting # (, ) subshell execution # *, ?, [ pathname expansion # ; command delimiter # <, >, | redirection # = assignment # {, } brace expansion (bash) # ~ tilde expansion # It also matches the empty string, because "" (or '') is the only way to # represent an empty string literal argument to a POSIX shell. # # This does not match the characters in _escape, because those need to be # backslash-escaped regardless of whether they appear in a double-quoted # string. _quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$') # _escape is a pattern that should match any character that needs to be # escaped with a backslash, whether or not the argument matched the _quote # pattern. _escape is used with re.sub to backslash anything in _escape's # first match group, hence the (parentheses) in the regular expression. # # _escape matches the following characters appearing anywhere in an argument: # " to prevent POSIX shells from interpreting this character for quoting # \ to prevent POSIX shells from interpreting this character for escaping # ` to prevent POSIX shells from interpreting this character for command # substitution # Missing from this list is $, because the desired behavior of # EncodePOSIXShellArgument is to permit parameter (variable) expansion. # # Also missing from this list is !, which bash will interpret as the history # expansion character when history is enabled. bash does not enable history # by default in non-interactive shells, so this is not thought to be a problem. # ! was omitted from this list because bash interprets "\!" as a literal string # including the backslash character (avoiding history expansion but retaining # the backslash), which would not be correct for argument encoding. Handling # this case properly would also be problematic because bash allows the history # character to be changed with the histchars shell variable. Fortunately, # as history is not enabled in non-interactive shells and # EncodePOSIXShellArgument is only expected to encode for non-interactive # shells, there is no room for error here by ignoring !. _escape = re.compile(r'(["\\`])') def EncodePOSIXShellArgument(argument): """Encodes |argument| suitably for consumption by POSIX shells. argument may be quoted and escaped as necessary to ensure that POSIX shells treat the returned value as a literal representing the argument passed to this function. Parameter (variable) expansions beginning with $ are allowed to remain intact without escaping the $, to allow the argument to contain references to variables to be expanded by the shell. """ if not isinstance(argument, str): argument = str(argument) if _quote.search(argument): quote = '"' else: quote = '' encoded = quote + re.sub(_escape, r'\\\1', argument) + quote return encoded def EncodePOSIXShellList(list): """Encodes |list| suitably for consumption by POSIX shells. 
Returns EncodePOSIXShellArgument for each item in list, and joins them together using the space character as an argument separator. """ encoded_arguments = [] for argument in list: encoded_arguments.append(EncodePOSIXShellArgument(argument)) return ' '.join(encoded_arguments) def DeepDependencyTargets(target_dicts, roots): """Returns the recursive list of target dependencies.""" dependencies = set() pending = set(roots) while pending: # Pluck out one. r = pending.pop() # Skip if visited already. if r in dependencies: continue # Add it. dependencies.add(r) # Add its children. spec = target_dicts[r] pending.update(set(spec.get('dependencies', []))) pending.update(set(spec.get('dependencies_original', []))) return list(dependencies - set(roots)) def BuildFileTargets(target_list, build_file): """From a target_list, returns the subset from the specified build_file. """ return [p for p in target_list if BuildFile(p) == build_file] def AllTargets(target_list, target_dicts, build_file): """Returns all targets (direct and dependencies) for the specified build_file. """ bftargets = BuildFileTargets(target_list, build_file) deptargets = DeepDependencyTargets(target_dicts, bftargets) return bftargets + deptargets def WriteOnDiff(filename): """Write to a file only if the new contents differ. Arguments: filename: name of the file to potentially write to. Returns: A file like object which will write to temporary file and only overwrite the target if it differs (on close). """ class Writer: """Wrapper around file which only covers the target if it differs.""" def __init__(self): # Pick temporary file. tmp_fd, self.tmp_path = tempfile.mkstemp( suffix='.tmp', prefix=os.path.split(filename)[1] + '.gyp.', dir=os.path.split(filename)[0]) try: self.tmp_file = os.fdopen(tmp_fd, 'wb') except Exception: # Don't leave turds behind. os.unlink(self.tmp_path) raise def __getattr__(self, attrname): # Delegate everything else to self.tmp_file return getattr(self.tmp_file, attrname) def close(self): try: # Close tmp file. self.tmp_file.close() # Determine if different. same = False try: same = filecmp.cmp(self.tmp_path, filename, False) except OSError, e: if e.errno != errno.ENOENT: raise if same: # The new file is identical to the old one, just get rid of the new # one. os.unlink(self.tmp_path) else: # The new file is different from the old one, or there is no old one. # Rename the new file to the permanent name. # # tempfile.mkstemp uses an overly restrictive mode, resulting in a # file that can only be read by the owner, regardless of the umask. # There's no reason to not respect the umask here, which means that # an extra hoop is required to fetch it and reset the new file's mode. # # No way to get the umask without setting a new one? Set a safe one # and then set it back to the old value. umask = os.umask(077) os.umask(umask) os.chmod(self.tmp_path, 0666 & ~umask) if sys.platform == 'win32' and os.path.exists(filename): # NOTE: on windows (but not cygwin) rename will not replace an # existing file, so it must be preceded with a remove. Sadly there # is no way to make the switch atomic. os.remove(filename) os.rename(self.tmp_path, filename) except Exception: # Don't leave turds behind. 
os.unlink(self.tmp_path) raise return Writer() def EnsureDirExists(path): """Make sure the directory for |path| exists.""" try: os.makedirs(os.path.dirname(path)) except OSError: pass def GetFlavor(params): """Returns |params.flavor| if it's set, the system's default flavor else.""" flavors = { 'cygwin': 'win', 'win32': 'win', 'darwin': 'mac', } if 'flavor' in params: return params['flavor'] if sys.platform in flavors: return flavors[sys.platform] if sys.platform.startswith('sunos'): return 'solaris' if sys.platform.startswith('freebsd'): return 'freebsd' if sys.platform.startswith('openbsd'): return 'openbsd' if sys.platform.startswith('aix'): return 'aix' return 'linux' def CopyTool(flavor, out_path): """Finds (flock|mac|win)_tool.gyp in the gyp directory and copies it to |out_path|.""" # aix and solaris just need flock emulation. mac and win use more complicated # support scripts. prefix = { 'aix': 'flock', 'solaris': 'flock', 'mac': 'mac', 'win': 'win' }.get(flavor, None) if not prefix: return # Slurp input file. source_path = os.path.join( os.path.dirname(os.path.abspath(__file__)), '%s_tool.py' % prefix) with open(source_path) as source_file: source = source_file.readlines() # Add header and write it out. tool_path = os.path.join(out_path, 'gyp-%s-tool' % prefix) with open(tool_path, 'w') as tool_file: tool_file.write( ''.join([source[0], '# Generated by gyp. Do not edit.\n'] + source[1:])) # Make file executable. os.chmod(tool_path, 0755) # From Alex Martelli, # http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52560 # ASPN: Python Cookbook: Remove duplicates from a sequence # First comment, dated 2001/10/13. # (Also in the printed Python Cookbook.) def uniquer(seq, idfun=None): if idfun is None: idfun = lambda x: x seen = {} result = [] for item in seq: marker = idfun(item) if marker in seen: continue seen[marker] = 1 result.append(item) return result class CycleError(Exception): """An exception raised when an unexpected cycle is detected.""" def __init__(self, nodes): self.nodes = nodes def __str__(self): return 'CycleError: cycle involving: ' + str(self.nodes) def TopologicallySorted(graph, get_edges): """Topologically sort based on a user provided edge definition. Args: graph: A list of node names. get_edges: A function mapping from node name to a hashable collection of node names which this node has outgoing edges to. Returns: A list containing all of the node in graph in topological order. It is assumed that calling get_edges once for each node and caching is cheaper than repeatedly calling get_edges. Raises: CycleError in the event of a cycle. 
Example: graph = {'a': '$(b) $(c)', 'b': 'hi', 'c': '$(b)'} def GetEdges(node): return re.findall(r'\$\(([^))]\)', graph[node]) print TopologicallySorted(graph.keys(), GetEdges) ==> ['a', 'c', b'] """ get_edges = memoize(get_edges) visited = set() visiting = set() ordered_nodes = [] def Visit(node): if node in visiting: raise CycleError(visiting) if node in visited: return visited.add(node) visiting.add(node) for neighbor in get_edges(node): Visit(neighbor) visiting.remove(node) ordered_nodes.insert(0, node) for node in sorted(graph): Visit(node) return ordered_nodes ��iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/common_test.py������000755 �000766 �000024 �00000003662 12455173731 032531� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Unit tests for the common.py file.""" import gyp.common import unittest import sys class TestTopologicallySorted(unittest.TestCase): def test_Valid(self): """Test that sorting works on a valid graph with one possible order.""" graph = { 'a': ['b', 'c'], 'b': [], 'c': ['d'], 'd': ['b'], } def GetEdge(node): return tuple(graph[node]) self.assertEqual( gyp.common.TopologicallySorted(graph.keys(), GetEdge), ['a', 'c', 'd', 'b']) def test_Cycle(self): """Test that an exception is thrown on a cyclic graph.""" graph = { 'a': ['b'], 'b': ['c'], 'c': ['d'], 'd': ['a'], } def GetEdge(node): return tuple(graph[node]) self.assertRaises( gyp.common.CycleError, gyp.common.TopologicallySorted, graph.keys(), GetEdge) class TestGetFlavor(unittest.TestCase): """Test that gyp.common.GetFlavor works as intended""" original_platform = '' def setUp(self): self.original_platform = sys.platform def tearDown(self): sys.platform = self.original_platform def assertFlavor(self, expected, argument, param): sys.platform = argument self.assertEqual(expected, gyp.common.GetFlavor(param)) def test_platform_default(self): self.assertFlavor('freebsd', 'freebsd9' , {}) self.assertFlavor('freebsd', 'freebsd10', {}) self.assertFlavor('openbsd', 'openbsd5' , {}) self.assertFlavor('solaris', 'sunos5' , {}); self.assertFlavor('solaris', 'sunos' , {}); self.assertFlavor('linux' , 'linux2' , {}); self.assertFlavor('linux' , 'linux3' , {}); def test_param(self): self.assertFlavor('foobar', 'linux2' , {'flavor': 'foobar'}) if __name__ == '__main__': unittest.main() ������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py���������000644 �000766 �000024 �00000011303 12455173731 032007� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2011 Google Inc. All rights reserved. 
# Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import re import os def XmlToString(content, encoding='utf-8', pretty=False): """ Writes the XML content to disk, touching the file only if it has changed. Visual Studio files have a lot of pre-defined structures. This function makes it easy to represent these structures as Python data structures, instead of having to create a lot of function calls. Each XML element of the content is represented as a list composed of: 1. The name of the element, a string, 2. The attributes of the element, a dictionary (optional), and 3+. The content of the element, if any. Strings are simple text nodes and lists are child elements. Example 1: <test/> becomes ['test'] Example 2: <myelement a='value1' b='value2'> <childtype>This is</childtype> <childtype>it!</childtype> </myelement> becomes ['myelement', {'a':'value1', 'b':'value2'}, ['childtype', 'This is'], ['childtype', 'it!'], ] Args: content: The structured content to be converted. encoding: The encoding to report on the first XML line. pretty: True if we want pretty printing with indents and new lines. Returns: The XML content as a string. """ # We create a huge list of all the elements of the file. xml_parts = ['<?xml version="1.0" encoding="%s"?>' % encoding] if pretty: xml_parts.append('\n') _ConstructContentList(xml_parts, content, pretty) # Convert it to a string return ''.join(xml_parts) def _ConstructContentList(xml_parts, specification, pretty, level=0): """ Appends the XML parts corresponding to the specification. Args: xml_parts: A list of XML parts to be appended to. specification: The specification of the element. See EasyXml docs. pretty: True if we want pretty printing with indents and new lines. level: Indentation level. """ # The first item in a specification is the name of the element. if pretty: indentation = ' ' * level new_line = '\n' else: indentation = '' new_line = '' name = specification[0] if not isinstance(name, str): raise Exception('The first item of an EasyXml specification should be ' 'a string. Specification was ' + str(specification)) xml_parts.append(indentation + '<' + name) # Optionally in second position is a dictionary of the attributes. rest = specification[1:] if rest and isinstance(rest[0], dict): for at, val in sorted(rest[0].iteritems()): xml_parts.append(' %s="%s"' % (at, _XmlEscape(val, attr=True))) rest = rest[1:] if rest: xml_parts.append('>') all_strings = reduce(lambda x, y: x and isinstance(y, str), rest, True) multi_line = not all_strings if multi_line and new_line: xml_parts.append(new_line) for child_spec in rest: # If it's a string, append a text node. # Otherwise recurse over that child definition if isinstance(child_spec, str): xml_parts.append(_XmlEscape(child_spec)) else: _ConstructContentList(xml_parts, child_spec, pretty, level + 1) if multi_line and indentation: xml_parts.append(indentation) xml_parts.append('</%s>%s' % (name, new_line)) else: xml_parts.append('/>%s' % new_line) def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False, win32=False): """ Writes the XML content to disk, touching the file only if it has changed. Args: content: The structured content to be written. path: Location of the file. encoding: The encoding to report on the first line of the XML file. pretty: True if we want pretty printing with indents and new lines. 
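win32: True if the file should be written with Windows (CRLF) line endings. (Parameter description added here; inferred from the implementation below.)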
""" xml_string = XmlToString(content, encoding, pretty) if win32 and os.linesep != '\r\n': xml_string = xml_string.replace('\n', '\r\n') # Get the old content try: f = open(path, 'r') existing = f.read() f.close() except: existing = None # It has changed, write it if existing != xml_string: f = open(path, 'w') f.write(xml_string) f.close() _xml_escape_map = { '"': '"', "'": ''', '<': '<', '>': '>', '&': '&', '\n': ' ', '\r': ' ', } _xml_escape_re = re.compile( "(%s)" % "|".join(map(re.escape, _xml_escape_map.keys()))) def _XmlEscape(value, attr=False): """ Escape a string for inclusion in XML.""" def replace(match): m = match.string[match.start() : match.end()] # don't replace single quotes in attrs if attr and m == "'": return m return _xml_escape_map[m] return _xml_escape_re.sub(replace, value) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py����000755 �000766 �000024 �00000006306 12455173731 033060� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the easy_xml.py file. 
""" import gyp.easy_xml as easy_xml import unittest import StringIO class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def test_EasyXml_simple(self): self.assertEqual( easy_xml.XmlToString(['test']), '<?xml version="1.0" encoding="utf-8"?><test/>') self.assertEqual( easy_xml.XmlToString(['test'], encoding='Windows-1252'), '<?xml version="1.0" encoding="Windows-1252"?><test/>') def test_EasyXml_simple_with_attributes(self): self.assertEqual( easy_xml.XmlToString(['test2', {'a': 'value1', 'b': 'value2'}]), '<?xml version="1.0" encoding="utf-8"?><test2 a="value1" b="value2"/>') def test_EasyXml_escaping(self): original = '<test>\'"\r&\nfoo' converted = '<test>\'" & foo' converted_apos = converted.replace("'", ''') self.assertEqual( easy_xml.XmlToString(['test3', {'a': original}, original]), '<?xml version="1.0" encoding="utf-8"?><test3 a="%s">%s</test3>' % (converted, converted_apos)) def test_EasyXml_pretty(self): self.assertEqual( easy_xml.XmlToString( ['test3', ['GrandParent', ['Parent1', ['Child'] ], ['Parent2'] ] ], pretty=True), '<?xml version="1.0" encoding="utf-8"?>\n' '<test3>\n' ' <GrandParent>\n' ' <Parent1>\n' ' <Child/>\n' ' </Parent1>\n' ' <Parent2/>\n' ' </GrandParent>\n' '</test3>\n') def test_EasyXml_complex(self): # We want to create: target = ( '<?xml version="1.0" encoding="utf-8"?>' '<Project>' '<PropertyGroup Label="Globals">' '<ProjectGuid>{D2250C20-3A94-4FB9-AF73-11BC5B73884B}</ProjectGuid>' '<Keyword>Win32Proj</Keyword>' '<RootNamespace>automated_ui_tests</RootNamespace>' '</PropertyGroup>' '<Import Project="$(VCTargetsPath)\\Microsoft.Cpp.props"/>' '<PropertyGroup ' 'Condition="\'$(Configuration)|$(Platform)\'==' '\'Debug|Win32\'" Label="Configuration">' '<ConfigurationType>Application</ConfigurationType>' '<CharacterSet>Unicode</CharacterSet>' '</PropertyGroup>' '</Project>') xml = easy_xml.XmlToString( ['Project', ['PropertyGroup', {'Label': 'Globals'}, ['ProjectGuid', '{D2250C20-3A94-4FB9-AF73-11BC5B73884B}'], ['Keyword', 'Win32Proj'], ['RootNamespace', 'automated_ui_tests'] ], ['Import', {'Project': '$(VCTargetsPath)\\Microsoft.Cpp.props'}], ['PropertyGroup', {'Condition': "'$(Configuration)|$(Platform)'=='Debug|Win32'", 'Label': 'Configuration'}, ['ConfigurationType', 'Application'], ['CharacterSet', 'Unicode'] ] ]) self.assertEqual(xml, target) if __name__ == '__main__': unittest.main() ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/flock_tool.py�������000755 �000766 �000024 �00000002775 12455173731 032341� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """These functions are executed via gyp-flock-tool when using the Makefile generator. 
Used on systems that don't have a built-in flock.""" import fcntl import os import struct import subprocess import sys def main(args): executor = FlockTool() executor.Dispatch(args) class FlockTool(object): """This class emulates the 'flock' command.""" def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like copy-info-plist to CopyInfoPlist""" return name_string.title().replace('-', '') def ExecFlock(self, lockfile, *cmd_list): """Emulates the most basic behavior of Linux's flock(1).""" # Rely on exception handling to report errors. # Note that the stock python on SunOS has a bug # where fcntl.flock(fd, LOCK_EX) always fails # with EBADF, that's why we use this F_SETLK # hack instead. fd = os.open(lockfile, os.O_WRONLY|os.O_NOCTTY|os.O_CREAT, 0666) op = struct.pack('hhllhhl', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0) fcntl.fcntl(fd, fcntl.F_SETLK, op) return subprocess.call(cmd_list) if __name__ == '__main__': sys.exit(main(sys.argv[1:])) ���iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/����������000755 �000766 �000024 �00000000000 12456115117 031577� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/input.py������������000644 �000766 �000024 �00000334273 12455173731 031343� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. from compiler.ast import Const from compiler.ast import Dict from compiler.ast import Discard from compiler.ast import List from compiler.ast import Module from compiler.ast import Node from compiler.ast import Stmt import compiler import copy import gyp.common import multiprocessing import optparse import os.path import re import shlex import signal import subprocess import sys import threading import time import traceback from gyp.common import GypError # A list of types that are treated as linkable. linkable_types = ['executable', 'shared_library', 'loadable_module'] # A list of sections that contain links to other targets. dependency_sections = ['dependencies', 'export_dependent_settings'] # base_path_sections is a list of sections defined by GYP that contain # pathnames. The generators can provide more keys, the two lists are merged # into path_sections, but you should call IsPathSection instead of using either # list directly. 
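# Illustrative examples of the IsPathSection() behavior defined below: a key
# named 'include_dirs' or 'include_dirs+' is treated as a path section (the
# trailing merge operator is stripped and the '_dirs' suffix matches), while a
# key like 'defines' is not.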
base_path_sections = [ 'destination', 'files', 'include_dirs', 'inputs', 'libraries', 'outputs', 'sources', ] path_sections = [] is_path_section_charset = set('=+?!') is_path_section_match_re = re.compile('_(dir|file|path)s?$') def IsPathSection(section): # If section ends in one of these characters, it's applied to a section # without the trailing characters. '/' is notably absent from this list, # because there's no way for a regular expression to be treated as a path. while section[-1:] in is_path_section_charset: section = section[:-1] return section in path_sections or is_path_section_match_re.search(section) # base_non_configuration_keys is a list of key names that belong in the target # itself and should not be propagated into its configurations. It is merged # with a list that can come from the generator to # create non_configuration_keys. base_non_configuration_keys = [ # Sections that must exist inside targets and not configurations. 'actions', 'configurations', 'copies', 'default_configuration', 'dependencies', 'dependencies_original', 'libraries', 'postbuilds', 'product_dir', 'product_extension', 'product_name', 'product_prefix', 'rules', 'run_as', 'sources', 'standalone_static_library', 'suppress_wildcard', 'target_name', 'toolset', 'toolsets', 'type', # Sections that can be found inside targets or configurations, but that # should not be propagated from targets into their configurations. 'variables', ] non_configuration_keys = [] # Keys that do not belong inside a configuration dictionary. invalid_configuration_keys = [ 'actions', 'all_dependent_settings', 'configurations', 'dependencies', 'direct_dependent_settings', 'libraries', 'link_settings', 'sources', 'standalone_static_library', 'target_name', 'type', ] # Controls whether or not the generator supports multiple toolsets. multiple_toolsets = False # Paths for converting filelist paths to output paths: { # toplevel, # qualified_output_dir, # } generator_filelist_paths = None def GetIncludedBuildFiles(build_file_path, aux_data, included=None): """Return a list of all build files included into build_file_path. The returned list will contain build_file_path as well as all other files that it included, either directly or indirectly. Note that the list may contain files that were included into a conditional section that evaluated to false and was not merged into build_file_path's dict. aux_data is a dict containing a key for each build file or included build file. Those keys provide access to dicts whose "included" keys contain lists of all other files included by the build file. included should be left at its default None value by external callers. It is used for recursion. The returned list will not contain any duplicate entries. Each build file in the list will be relative to the current directory. """ if included == None: included = [] if build_file_path in included: return included included.append(build_file_path) for included_build_file in aux_data[build_file_path].get('included', []): GetIncludedBuildFiles(included_build_file, aux_data, included) return included def CheckedEval(file_contents): """Return the eval of a gyp file. The gyp file is restricted to dictionaries and lists only, and repeated keys are not allowed. Note that this is slower than eval() is. 
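As an illustrative example, file contents such as "{'foo': 1, 'foo': 2}" are rejected with a GypError for the repeated key, where a bare eval() would silently keep only the last value.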
""" ast = compiler.parse(file_contents) assert isinstance(ast, Module) c1 = ast.getChildren() assert c1[0] is None assert isinstance(c1[1], Stmt) c2 = c1[1].getChildren() assert isinstance(c2[0], Discard) c3 = c2[0].getChildren() assert len(c3) == 1 return CheckNode(c3[0], []) def CheckNode(node, keypath): if isinstance(node, Dict): c = node.getChildren() dict = {} for n in range(0, len(c), 2): assert isinstance(c[n], Const) key = c[n].getChildren()[0] if key in dict: raise GypError("Key '" + key + "' repeated at level " + repr(len(keypath) + 1) + " with key path '" + '.'.join(keypath) + "'") kp = list(keypath) # Make a copy of the list for descending this node. kp.append(key) dict[key] = CheckNode(c[n + 1], kp) return dict elif isinstance(node, List): c = node.getChildren() children = [] for index, child in enumerate(c): kp = list(keypath) # Copy list. kp.append(repr(index)) children.append(CheckNode(child, kp)) return children elif isinstance(node, Const): return node.getChildren()[0] else: raise TypeError, "Unknown AST node at key path '" + '.'.join(keypath) + \ "': " + repr(node) def LoadOneBuildFile(build_file_path, data, aux_data, variables, includes, is_target, check): if build_file_path in data: return data[build_file_path] if os.path.exists(build_file_path): build_file_contents = open(build_file_path).read() else: raise GypError("%s not found (cwd: %s)" % (build_file_path, os.getcwd())) build_file_data = None try: if check: build_file_data = CheckedEval(build_file_contents) else: build_file_data = eval(build_file_contents, {'__builtins__': None}, None) except SyntaxError, e: e.filename = build_file_path raise except Exception, e: gyp.common.ExceptionAppend(e, 'while reading ' + build_file_path) raise if not isinstance(build_file_data, dict): raise GypError("%s does not evaluate to a dictionary." % build_file_path) data[build_file_path] = build_file_data aux_data[build_file_path] = {} # Scan for includes and merge them in. if ('skip_includes' not in build_file_data or not build_file_data['skip_includes']): try: if is_target: LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data, aux_data, variables, includes, check) else: LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data, aux_data, variables, None, check) except Exception, e: gyp.common.ExceptionAppend(e, 'while reading includes of ' + build_file_path) raise return build_file_data def LoadBuildFileIncludesIntoDict(subdict, subdict_path, data, aux_data, variables, includes, check): includes_list = [] if includes != None: includes_list.extend(includes) if 'includes' in subdict: for include in subdict['includes']: # "include" is specified relative to subdict_path, so compute the real # path to include by appending the provided "include" to the directory # in which subdict_path resides. relative_include = \ os.path.normpath(os.path.join(os.path.dirname(subdict_path), include)) includes_list.append(relative_include) # Unhook the includes list, it's no longer needed. del subdict['includes'] # Merge in the included files. for include in includes_list: if not 'included' in aux_data[subdict_path]: aux_data[subdict_path]['included'] = [] aux_data[subdict_path]['included'].append(include) gyp.DebugOutput(gyp.DEBUG_INCLUDES, "Loading Included File: '%s'", include) MergeDicts(subdict, LoadOneBuildFile(include, data, aux_data, variables, None, False, check), subdict_path, include) # Recurse into subdictionaries. 
for k, v in subdict.iteritems(): if v.__class__ == dict: LoadBuildFileIncludesIntoDict(v, subdict_path, data, aux_data, variables, None, check) elif v.__class__ == list: LoadBuildFileIncludesIntoList(v, subdict_path, data, aux_data, variables, check) # This recurses into lists so that it can look for dicts. def LoadBuildFileIncludesIntoList(sublist, sublist_path, data, aux_data, variables, check): for item in sublist: if item.__class__ == dict: LoadBuildFileIncludesIntoDict(item, sublist_path, data, aux_data, variables, None, check) elif item.__class__ == list: LoadBuildFileIncludesIntoList(item, sublist_path, data, aux_data, variables, check) # Processes toolsets in all the targets. This recurses into condition entries # since they can contain toolsets as well. def ProcessToolsetsInDict(data): if 'targets' in data: target_list = data['targets'] new_target_list = [] for target in target_list: # If this target already has an explicit 'toolset', and no 'toolsets' # list, don't modify it further. if 'toolset' in target and 'toolsets' not in target: new_target_list.append(target) continue if multiple_toolsets: toolsets = target.get('toolsets', ['target']) else: toolsets = ['target'] # Make sure this 'toolsets' definition is only processed once. if 'toolsets' in target: del target['toolsets'] if len(toolsets) > 0: # Optimization: only do copies if more than one toolset is specified. for build in toolsets[1:]: new_target = copy.deepcopy(target) new_target['toolset'] = build new_target_list.append(new_target) target['toolset'] = toolsets[0] new_target_list.append(target) data['targets'] = new_target_list if 'conditions' in data: for condition in data['conditions']: if isinstance(condition, list): for condition_dict in condition[1:]: ProcessToolsetsInDict(condition_dict) # TODO(mark): I don't love this name. It just means that it's going to load # a build file that contains targets and is expected to provide a targets dict # that contains the targets... def LoadTargetBuildFile(build_file_path, data, aux_data, variables, includes, depth, check, load_dependencies): # If depth is set, predefine the DEPTH variable to be a relative path from # this build file's directory to the directory identified by depth. if depth: # TODO(dglazkov) The backslash/forward-slash replacement at the end is a # temporary measure. This should really be addressed by keeping all paths # in POSIX until actual project generation. d = gyp.common.RelativePath(depth, os.path.dirname(build_file_path)) if d == '': variables['DEPTH'] = '.' else: variables['DEPTH'] = d.replace('\\', '/') if build_file_path in data['target_build_files']: # Already loaded. return False data['target_build_files'].add(build_file_path) gyp.DebugOutput(gyp.DEBUG_INCLUDES, "Loading Target Build File '%s'", build_file_path) build_file_data = LoadOneBuildFile(build_file_path, data, aux_data, variables, includes, True, check) # Store DEPTH for later use in generators. build_file_data['_DEPTH'] = depth # Set up the included_files key indicating which .gyp files contributed to # this target dict. if 'included_files' in build_file_data: raise GypError(build_file_path + ' must not contain included_files key') included = GetIncludedBuildFiles(build_file_path, aux_data) build_file_data['included_files'] = [] for included_file in included: # included_file is relative to the current directory, but it needs to # be made relative to build_file_path's directory. 
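# Illustrative example (paths are hypothetical): included_file
# 'build/common.gypi' combined with build_file_path 'src/app/app.gyp' yields
# '../../build/common.gypi'.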
included_relative = \ gyp.common.RelativePath(included_file, os.path.dirname(build_file_path)) build_file_data['included_files'].append(included_relative) # Do a first round of toolsets expansion so that conditions can be defined # per toolset. ProcessToolsetsInDict(build_file_data) # Apply "pre"/"early" variable expansions and condition evaluations. ProcessVariablesAndConditionsInDict( build_file_data, PHASE_EARLY, variables, build_file_path) # Since some toolsets might have been defined conditionally, perform # a second round of toolsets expansion now. ProcessToolsetsInDict(build_file_data) # Look at each project's target_defaults dict, and merge settings into # targets. if 'target_defaults' in build_file_data: if 'targets' not in build_file_data: raise GypError("Unable to find targets in build file %s" % build_file_path) index = 0 while index < len(build_file_data['targets']): # This procedure needs to give the impression that target_defaults is # used as defaults, and the individual targets inherit from that. # The individual targets need to be merged into the defaults. Make # a deep copy of the defaults for each target, merge the target dict # as found in the input file into that copy, and then hook up the # copy with the target-specific data merged into it as the replacement # target dict. old_target_dict = build_file_data['targets'][index] new_target_dict = copy.deepcopy(build_file_data['target_defaults']) MergeDicts(new_target_dict, old_target_dict, build_file_path, build_file_path) build_file_data['targets'][index] = new_target_dict index += 1 # No longer needed. del build_file_data['target_defaults'] # Look for dependencies. This means that dependency resolution occurs # after "pre" conditionals and variable expansion, but before "post" - # in other words, you can't put a "dependencies" section inside a "post" # conditional within a target. dependencies = [] if 'targets' in build_file_data: for target_dict in build_file_data['targets']: if 'dependencies' not in target_dict: continue for dependency in target_dict['dependencies']: dependencies.append( gyp.common.ResolveTarget(build_file_path, dependency, None)[0]) if load_dependencies: for dependency in dependencies: try: LoadTargetBuildFile(dependency, data, aux_data, variables, includes, depth, check, load_dependencies) except Exception, e: gyp.common.ExceptionAppend( e, 'while loading dependencies of %s' % build_file_path) raise else: return (build_file_path, dependencies) def CallLoadTargetBuildFile(global_flags, build_file_path, data, aux_data, variables, includes, depth, check, generator_input_info): """Wrapper around LoadTargetBuildFile for parallel processing. This wrapper is used when LoadTargetBuildFile is executed in a worker process. """ try: signal.signal(signal.SIGINT, signal.SIG_IGN) # Apply globals so that the worker process behaves the same. for key, value in global_flags.iteritems(): globals()[key] = value # Save the keys so we can return data that changed. data_keys = set(data) aux_data_keys = set(aux_data) SetGeneratorGlobals(generator_input_info) result = LoadTargetBuildFile(build_file_path, data, aux_data, variables, includes, depth, check, False) if not result: return result (build_file_path, dependencies) = result data_out = {} for key in data: if key == 'target_build_files': continue if key not in data_keys: data_out[key] = data[key] aux_data_out = {} for key in aux_data: if key not in aux_data_keys: aux_data_out[key] = aux_data[key] # This gets serialized and sent back to the main process via a pipe. 
# It's handled in LoadTargetBuildFileCallback. return (build_file_path, data_out, aux_data_out, dependencies) except GypError, e: sys.stderr.write("gyp: %s\n" % e) return None except Exception, e: print >>sys.stderr, 'Exception:', e print >>sys.stderr, traceback.format_exc() return None class ParallelProcessingError(Exception): pass class ParallelState(object): """Class to keep track of state when processing input files in parallel. If build files are loaded in parallel, use this to keep track of state during farming out and processing parallel jobs. It's stored in a global so that the callback function can have access to it. """ def __init__(self): # The multiprocessing pool. self.pool = None # The condition variable used to protect this object and notify # the main loop when there might be more data to process. self.condition = None # The "data" dict that was passed to LoadTargetBuildFileParallel self.data = None # The "aux_data" dict that was passed to LoadTargetBuildFileParallel self.aux_data = None # The number of parallel calls outstanding; decremented when a response # was received. self.pending = 0 # The set of all build files that have been scheduled, so we don't # schedule the same one twice. self.scheduled = set() # A list of dependency build file paths that haven't been scheduled yet. self.dependencies = [] # Flag to indicate if there was an error in a child process. self.error = False def LoadTargetBuildFileCallback(self, result): """Handle the results of running LoadTargetBuildFile in another process. """ self.condition.acquire() if not result: self.error = True self.condition.notify() self.condition.release() return (build_file_path0, data0, aux_data0, dependencies0) = result self.data['target_build_files'].add(build_file_path0) for key in data0: self.data[key] = data0[key] for key in aux_data0: self.aux_data[key] = aux_data0[key] for new_dependency in dependencies0: if new_dependency not in self.scheduled: self.scheduled.add(new_dependency) self.dependencies.append(new_dependency) self.pending -= 1 self.condition.notify() self.condition.release() def LoadTargetBuildFilesParallel(build_files, data, aux_data, variables, includes, depth, check, generator_input_info): parallel_state = ParallelState() parallel_state.condition = threading.Condition() # Make copies of the build_files argument that we can modify while working. 
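# (list() and set() below copy build_files so that popping scheduled work never
# mutates the caller's argument. The loop that follows is a simple
# producer/consumer: the main thread hands build files to a multiprocessing
# pool and waits on the condition variable until LoadTargetBuildFileCallback
# reports results or newly discovered dependencies.)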
parallel_state.dependencies = list(build_files) parallel_state.scheduled = set(build_files) parallel_state.pending = 0 parallel_state.data = data parallel_state.aux_data = aux_data try: parallel_state.condition.acquire() while parallel_state.dependencies or parallel_state.pending: if parallel_state.error: break if not parallel_state.dependencies: parallel_state.condition.wait() continue dependency = parallel_state.dependencies.pop() parallel_state.pending += 1 data_in = {} data_in['target_build_files'] = data['target_build_files'] aux_data_in = {} global_flags = { 'path_sections': globals()['path_sections'], 'non_configuration_keys': globals()['non_configuration_keys'], 'multiple_toolsets': globals()['multiple_toolsets']} if not parallel_state.pool: parallel_state.pool = multiprocessing.Pool(8) parallel_state.pool.apply_async( CallLoadTargetBuildFile, args = (global_flags, dependency, data_in, aux_data_in, variables, includes, depth, check, generator_input_info), callback = parallel_state.LoadTargetBuildFileCallback) except KeyboardInterrupt, e: parallel_state.pool.terminate() raise e parallel_state.condition.release() parallel_state.pool.close() parallel_state.pool.join() parallel_state.pool = None if parallel_state.error: sys.exit(1) # Look for the bracket that matches the first bracket seen in a # string, and return the start and end as a tuple. For example, if # the input is something like "<(foo <(bar)) blah", then it would # return (1, 13), indicating the entire string except for the leading # "<" and trailing " blah". LBRACKETS= set('{[(') BRACKETS = {'}': '{', ']': '[', ')': '('} def FindEnclosingBracketGroup(input_str): stack = [] start = -1 for index, char in enumerate(input_str): if char in LBRACKETS: stack.append(char) if start == -1: start = index elif char in BRACKETS: if not stack: return (-1, -1) if stack.pop() != BRACKETS[char]: return (-1, -1) if not stack: return (start, index + 1) return (-1, -1) canonical_int_re = re.compile('(0|-?[1-9][0-9]*)$') def IsStrCanonicalInt(string): """Returns True if |string| is in its canonical integer form. The canonical form is such that str(int(string)) == string. """ return isinstance(string, str) and canonical_int_re.match(string) # This matches things like "<(asdf)", "<!(cmd)", "<!@(cmd)", "<|(list)", # "<!interpreter(arguments)", "<([list])", and even "<([)" and "<(<())". # In the last case, the inner "<()" is captured in match['content']. early_variable_re = re.compile( '(?P<replace>(?P<type><(?:(?:!?@?)|\|)?)' '(?P<command_string>[-a-zA-Z0-9_.]+)?' '\((?P<is_array>\s*\[?)' '(?P<content>.*?)(\]?)\))') # This matches the same as early_variable_re, but with '>' instead of '<'. late_variable_re = re.compile( '(?P<replace>(?P<type>>(?:(?:!?@?)|\|)?)' '(?P<command_string>[-a-zA-Z0-9_.]+)?' '\((?P<is_array>\s*\[?)' '(?P<content>.*?)(\]?)\))') # This matches the same as early_variable_re, but with '^' instead of '<'. latelate_variable_re = re.compile( '(?P<replace>(?P<type>[\^](?:(?:!?@?)|\|)?)' '(?P<command_string>[-a-zA-Z0-9_.]+)?' '\((?P<is_array>\s*\[?)' '(?P<content>.*?)(\]?)\))') # Global cache of results from running commands so they don't have to be run # more then once. 
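# Illustrative example (the command is hypothetical): if two targets both use
# <!(python -c "print 42"), the command runs once; the second expansion is
# served from this dict, keyed on the command string alone.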
cached_command_results = {} def FixupPlatformCommand(cmd): if sys.platform == 'win32': if type(cmd) == list: cmd = [re.sub('^cat ', 'type ', cmd[0])] + cmd[1:] else: cmd = re.sub('^cat ', 'type ', cmd) return cmd PHASE_EARLY = 0 PHASE_LATE = 1 PHASE_LATELATE = 2 def ExpandVariables(input, phase, variables, build_file): # Look for the pattern that gets expanded into variables if phase == PHASE_EARLY: variable_re = early_variable_re expansion_symbol = '<' elif phase == PHASE_LATE: variable_re = late_variable_re expansion_symbol = '>' elif phase == PHASE_LATELATE: variable_re = latelate_variable_re expansion_symbol = '^' else: assert False input_str = str(input) if IsStrCanonicalInt(input_str): return int(input_str) # Do a quick scan to determine if an expensive regex search is warranted. if expansion_symbol not in input_str: return input_str # Get the entire list of matches as a list of MatchObject instances. # (using findall here would return strings instead of MatchObjects). matches = list(variable_re.finditer(input_str)) if not matches: return input_str output = input_str # Reverse the list of matches so that replacements are done right-to-left. # That ensures that earlier replacements won't mess up the string in a # way that causes later calls to find the earlier substituted text instead # of what's intended for replacement. matches.reverse() for match_group in matches: match = match_group.groupdict() gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Matches: %r", match) # match['replace'] is the substring to look for, match['type'] # is the character code for the replacement type (< > <! >! <| >| <@ # >@ <!@ >!@), match['is_array'] contains a '[' for command # arrays, and match['content'] is the name of the variable (< >) # or command to run (<! >!). match['command_string'] is an optional # command string. Currently, only 'pymod_do_main' is supported. # run_command is true if a ! variant is used. run_command = '!' in match['type'] command_string = match['command_string'] # file_list is true if a | variant is used. file_list = '|' in match['type'] # Capture these now so we can adjust them later. replace_start = match_group.start('replace') replace_end = match_group.end('replace') # Find the ending paren, and re-evaluate the contained string. (c_start, c_end) = FindEnclosingBracketGroup(input_str[replace_start:]) # Adjust the replacement range to match the entire command # found by FindEnclosingBracketGroup (since the variable_re # probably doesn't match the entire command if it contained # nested variables). replace_end = replace_start + c_end # Find the "real" replacement, matching the appropriate closing # paren, and adjust the replacement start and end. replacement = input_str[replace_start:replace_end] # Figure out what the contents of the variable parens are. contents_start = replace_start + c_start + 1 contents_end = replace_end - 1 contents = input_str[contents_start:contents_end] # Do filter substitution now for <|(). # Admittedly, this is different than the evaluation order in other # contexts. However, since filtration has no chance to run on <|(), # this seems like the only obvious way to give them access to filters. 
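# Illustrative example (names are hypothetical): an action input written as
# '<|(objs.list <@(_inputs))' writes each entry of _inputs to objs.list, one
# per line, at gyp time, and the whole expression is replaced by the path to
# objs.list relative to the .gyp file -- a workaround for command lines that
# would otherwise exceed length limits.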
if file_list: processed_variables = copy.deepcopy(variables) ProcessListFiltersInDict(contents, processed_variables) # Recurse to expand variables in the contents contents = ExpandVariables(contents, phase, processed_variables, build_file) else: # Recurse to expand variables in the contents contents = ExpandVariables(contents, phase, variables, build_file) # Strip off leading/trailing whitespace so that variable matches are # simpler below (and because they are rarely needed). contents = contents.strip() # expand_to_list is true if an @ variant is used. In that case, # the expansion should result in a list. Note that the caller # is to be expecting a list in return, and not all callers do # because not all are working in list context. Also, for list # expansions, there can be no other text besides the variable # expansion in the input string. expand_to_list = '@' in match['type'] and input_str == replacement if run_command or file_list: # Find the build file's directory, so commands can be run or file lists # generated relative to it. build_file_dir = os.path.dirname(build_file) if build_file_dir == '' and not file_list: # If build_file is just a leaf filename indicating a file in the # current directory, build_file_dir might be an empty string. Set # it to None to signal to subprocess.Popen that it should run the # command in the current directory. build_file_dir = None # Support <|(listfile.txt ...) which generates a file # containing items from a gyp list, generated at gyp time. # This works around actions/rules which have more inputs than will # fit on the command line. if file_list: if type(contents) == list: contents_list = contents else: contents_list = contents.split(' ') replacement = contents_list[0] if os.path.isabs(replacement): raise GypError('| cannot handle absolute paths, got "%s"' % replacement) if not generator_filelist_paths: path = os.path.join(build_file_dir, replacement) else: if os.path.isabs(build_file_dir): toplevel = generator_filelist_paths['toplevel'] rel_build_file_dir = gyp.common.RelativePath(build_file_dir, toplevel) else: rel_build_file_dir = build_file_dir qualified_out_dir = generator_filelist_paths['qualified_out_dir'] path = os.path.join(qualified_out_dir, rel_build_file_dir, replacement) gyp.common.EnsureDirExists(path) replacement = gyp.common.RelativePath(path, build_file_dir) f = gyp.common.WriteOnDiff(path) for i in contents_list[1:]: f.write('%s\n' % i) f.close() elif run_command: use_shell = True if match['is_array']: contents = eval(contents) use_shell = False # Check for a cached value to avoid executing commands, or generating # file lists more than once. # TODO(http://code.google.com/p/gyp/issues/detail?id=112): It is # possible that the command being invoked depends on the current # directory. For that case the syntax needs to be extended so that the # directory is also used in cache_key (it becomes a tuple). # TODO(http://code.google.com/p/gyp/issues/detail?id=111): In theory, # someone could author a set of GYP files where each time the command # is invoked it produces different output by design. When the need # arises, the syntax should be extended to support no caching off a # command's output so it is run every time. 
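# Illustrative consequence of the TODO above (hypothetical command): two
# <!(pwd) expansions appearing in .gyp files that live in different
# directories currently share a single cached result, because only the command
# text forms the key.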
cache_key = str(contents) cached_value = cached_command_results.get(cache_key, None) if cached_value is None: gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Executing command '%s' in directory '%s'", contents, build_file_dir) replacement = '' if command_string == 'pymod_do_main': # <!pymod_do_main(modulename param eters) loads |modulename| as a # python module and then calls that module's DoMain() function, # passing ["param", "eters"] as a single list argument. For modules # that don't load quickly, this can be faster than # <!(python modulename param eters). Do this in |build_file_dir|. oldwd = os.getcwd() # Python doesn't like os.open('.'): no fchdir. if build_file_dir: # build_file_dir may be None (see above). os.chdir(build_file_dir) try: parsed_contents = shlex.split(contents) try: py_module = __import__(parsed_contents[0]) except ImportError as e: raise GypError("Error importing pymod_do_main" "module (%s): %s" % (parsed_contents[0], e)) replacement = str(py_module.DoMain(parsed_contents[1:])).rstrip() finally: os.chdir(oldwd) assert replacement != None elif command_string: raise GypError("Unknown command string '%s' in '%s'." % (command_string, contents)) else: # Fix up command with platform specific workarounds. contents = FixupPlatformCommand(contents) p = subprocess.Popen(contents, shell=use_shell, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE, cwd=build_file_dir) p_stdout, p_stderr = p.communicate('') if p.wait() != 0 or p_stderr: sys.stderr.write(p_stderr) # Simulate check_call behavior, since check_call only exists # in python 2.5 and later. raise GypError("Call to '%s' returned exit status %d." % (contents, p.returncode)) replacement = p_stdout.rstrip() cached_command_results[cache_key] = replacement else: gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Had cache value for command '%s' in directory '%s'", contents,build_file_dir) replacement = cached_value else: if not contents in variables: if contents[-1] in ['!', '/']: # In order to allow cross-compiles (nacl) to happen more naturally, # we will allow references to >(sources/) etc. to resolve to # and empty list if undefined. This allows actions to: # 'action!': [ # '>@(_sources!)', # ], # 'action/': [ # '>@(_sources/)', # ], replacement = [] else: raise GypError('Undefined variable ' + contents + ' in ' + build_file) else: replacement = variables[contents] if isinstance(replacement, list): for item in replacement: if (not contents[-1] == '/' and not isinstance(item, str) and not isinstance(item, int)): raise GypError('Variable ' + contents + ' must expand to a string or list of strings; ' + 'list contains a ' + item.__class__.__name__) # Run through the list and handle variable expansions in it. Since # the list is guaranteed not to contain dicts, this won't do anything # with conditions sections. ProcessVariablesAndConditionsInList(replacement, phase, variables, build_file) elif not isinstance(replacement, str) and \ not isinstance(replacement, int): raise GypError('Variable ' + contents + ' must expand to a string or list of strings; ' + 'found a ' + replacement.__class__.__name__) if expand_to_list: # Expanding in list context. It's guaranteed that there's only one # replacement to do in |input_str| and that it's this replacement. See # above. if isinstance(replacement, list): # If it's already a list, make a copy. output = replacement[:] else: # Split it the same way sh would split arguments. output = shlex.split(str(replacement)) else: # Expanding in string context. 
encoded_replacement = '' if isinstance(replacement, list): # When expanding a list into string context, turn the list items # into a string in a way that will work with a subprocess call. # # TODO(mark): This isn't completely correct. This should # call a generator-provided function that observes the # proper list-to-argument quoting rules on a specific # platform instead of just calling the POSIX encoding # routine. encoded_replacement = gyp.common.EncodePOSIXShellList(replacement) else: encoded_replacement = replacement output = output[:replace_start] + str(encoded_replacement) + \ output[replace_end:] # Prepare for the next match iteration. input_str = output # Look for more matches now that we've replaced some, to deal with # expanding local variables (variables defined in the same # variables block as this one). gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Found output %r, recursing.", output) if isinstance(output, list): if output and isinstance(output[0], list): # Leave output alone if it's a list of lists. # We don't want such lists to be stringified. pass else: new_output = [] for item in output: new_output.append( ExpandVariables(item, phase, variables, build_file)) output = new_output else: output = ExpandVariables(output, phase, variables, build_file) # Convert all strings that are canonically-represented integers into integers. if isinstance(output, list): for index in xrange(0, len(output)): if IsStrCanonicalInt(output[index]): output[index] = int(output[index]) elif IsStrCanonicalInt(output): output = int(output) return output def ProcessConditionsInDict(the_dict, phase, variables, build_file): # Process a 'conditions' or 'target_conditions' section in the_dict, # depending on phase. # early -> conditions # late -> target_conditions # latelate -> no conditions # # Each item in a conditions list consists of cond_expr, a string expression # evaluated as the condition, and true_dict, a dict that will be merged into # the_dict if cond_expr evaluates to true. Optionally, a third item, # false_dict, may be present. false_dict is merged into the_dict if # cond_expr evaluates to false. # # Any dict merged into the_dict will be recursively processed for nested # conditionals and other expansions, also according to phase, immediately # prior to being merged. if phase == PHASE_EARLY: conditions_key = 'conditions' elif phase == PHASE_LATE: conditions_key = 'target_conditions' elif phase == PHASE_LATELATE: return else: assert False if not conditions_key in the_dict: return conditions_list = the_dict[conditions_key] # Unhook the conditions list, it's no longer needed. del the_dict[conditions_key] for condition in conditions_list: if not isinstance(condition, list): raise GypError(conditions_key + ' must be a list') if len(condition) != 2 and len(condition) != 3: # It's possible that condition[0] won't work in which case this # attempt will raise its own IndexError. That's probably fine. raise GypError(conditions_key + ' ' + condition[0] + ' must be length 2 or 3, not ' + str(len(condition))) [cond_expr, true_dict] = condition[0:2] false_dict = None if len(condition) == 3: false_dict = condition[2] # Do expansions on the condition itself. Since the conditon can naturally # contain variable references without needing to resort to GYP expansion # syntax, this is of dubious value for variables, but someone might want to # use a command expansion directly inside a condition. 
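    # Illustrative sketch (added comment, not part of the original gyp
    # source; names are hypothetical): a conditions entry of the shape
    # described above,
    #
    #   'conditions': [
    #     ['OS=="linux"',
    #      {'defines': ['USE_EPOLL']},    # true_dict
    #      {'defines': ['NO_EPOLL']}],    # optional false_dict
    #   ],
    #
    # is handled below by compiling the expression 'OS=="linux"' and
    # evaluating it against the current variables dict, then recursively
    # processing and merging whichever dict was selected.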
    cond_expr_expanded = ExpandVariables(cond_expr, phase, variables,
                                         build_file)
    if not isinstance(cond_expr_expanded, str) and \
       not isinstance(cond_expr_expanded, int):
      raise ValueError, \
            'Variable expansion in this context permits str and int ' + \
            'only, found ' + cond_expr_expanded.__class__.__name__

    try:
      ast_code = compile(cond_expr_expanded, '<string>', 'eval')

      if eval(ast_code, {'__builtins__': None}, variables):
        merge_dict = true_dict
      else:
        merge_dict = false_dict
    except SyntaxError, e:
      syntax_error = SyntaxError('%s while evaluating condition \'%s\' in %s '
                                 'at character %d.' %
                                 (str(e.args[0]), e.text, build_file, e.offset),
                                 e.filename, e.lineno, e.offset, e.text)
      raise syntax_error
    except NameError, e:
      gyp.common.ExceptionAppend(e, 'while evaluating condition \'%s\' in %s' %
                                 (cond_expr_expanded, build_file))
      raise GypError(e)

    if merge_dict != None:
      # Expand variables and nested conditionals in the merge_dict before
      # merging it.
      ProcessVariablesAndConditionsInDict(merge_dict, phase,
                                          variables, build_file)

      MergeDicts(the_dict, merge_dict, build_file, build_file)


def LoadAutomaticVariablesFromDict(variables, the_dict):
  # Any keys with plain string values in the_dict become automatic variables.
  # The variable name is the key name with a "_" character prepended.
  for key, value in the_dict.iteritems():
    if isinstance(value, str) or isinstance(value, int) or \
       isinstance(value, list):
      variables['_' + key] = value


def LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key):
  # Any keys in the_dict's "variables" dict, if it has one, become variables.
  # The variable name is the key name in the "variables" dict.  Variables
  # that end with the % character are set only if they are unset in the
  # variables dict.  the_dict_key is the name of the key that accesses
  # the_dict in the_dict's parent dict.  If the_dict's parent is not a dict
  # (it could be a list or it could be parentless because it is a root dict),
  # the_dict_key will be None.
  for key, value in the_dict.get('variables', {}).iteritems():
    if not isinstance(value, str) and not isinstance(value, int) and \
       not isinstance(value, list):
      continue

    if key.endswith('%'):
      variable_name = key[:-1]
      if variable_name in variables:
        # If the variable is already set, don't set it.
        continue
      if the_dict_key == 'variables' and variable_name in the_dict:
        # If the variable is set without a % in the_dict, and the_dict is a
        # variables dict (making |variables| a variables sub-dict of a
        # variables dict), use the_dict's definition.
        value = the_dict[variable_name]
    else:
      variable_name = key

    variables[variable_name] = value


def ProcessVariablesAndConditionsInDict(the_dict, phase, variables_in,
                                        build_file, the_dict_key=None):
  """Handle all variable and command expansion and conditional evaluation.

  This function is the public entry point for all variable expansions and
  conditional evaluations.  The variables_in dictionary will not be modified
  by this function.
  """

  # Make a copy of the variables_in dict that can be modified during the
  # loading of automatics and the loading of the variables dict.
  variables = variables_in.copy()
  LoadAutomaticVariablesFromDict(variables, the_dict)

  if 'variables' in the_dict:
    # Make sure all the local variables are added to the variables
    # list before we process them so that you can reference one
    # variable from another.  They will be fully expanded by recursion
    # in ExpandVariables.
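    # Illustrative sketch (added comment, not part of the original gyp
    # source; names are hypothetical): for a scope such as
    #
    #   'variables': {
    #     'base_name': 'foo',
    #     'lib_name': 'lib<(base_name).a',   # references a sibling variable
    #   },
    #
    # both keys are copied into |variables| below before the 'variables'
    # dict itself is processed, so '<(base_name)' can resolve while
    # 'lib_name' is expanded.  A key spelled 'base_name%' would instead act
    # as a default, used only when 'base_name' is not already set.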
for key, value in the_dict['variables'].iteritems(): variables[key] = value # Handle the associated variables dict first, so that any variable # references within can be resolved prior to using them as variables. # Pass a copy of the variables dict to avoid having it be tainted. # Otherwise, it would have extra automatics added for everything that # should just be an ordinary variable in this scope. ProcessVariablesAndConditionsInDict(the_dict['variables'], phase, variables, build_file, 'variables') LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key) for key, value in the_dict.iteritems(): # Skip "variables", which was already processed if present. if key != 'variables' and isinstance(value, str): expanded = ExpandVariables(value, phase, variables, build_file) if not isinstance(expanded, str) and not isinstance(expanded, int): raise ValueError, \ 'Variable expansion in this context permits str and int ' + \ 'only, found ' + expanded.__class__.__name__ + ' for ' + key the_dict[key] = expanded # Variable expansion may have resulted in changes to automatics. Reload. # TODO(mark): Optimization: only reload if no changes were made. variables = variables_in.copy() LoadAutomaticVariablesFromDict(variables, the_dict) LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key) # Process conditions in this dict. This is done after variable expansion # so that conditions may take advantage of expanded variables. For example, # if the_dict contains: # {'type': '<(library_type)', # 'conditions': [['_type=="static_library"', { ... }]]}, # _type, as used in the condition, will only be set to the value of # library_type if variable expansion is performed before condition # processing. However, condition processing should occur prior to recursion # so that variables (both automatic and "variables" dict type) may be # adjusted by conditions sections, merged into the_dict, and have the # intended impact on contained dicts. # # This arrangement means that a "conditions" section containing a "variables" # section will only have those variables effective in subdicts, not in # the_dict. The workaround is to put a "conditions" section within a # "variables" section. For example: # {'conditions': [['os=="mac"', {'variables': {'define': 'IS_MAC'}}]], # 'defines': ['<(define)'], # 'my_subdict': {'defines': ['<(define)']}}, # will not result in "IS_MAC" being appended to the "defines" list in the # current scope but would result in it being appended to the "defines" list # within "my_subdict". By comparison: # {'variables': {'conditions': [['os=="mac"', {'define': 'IS_MAC'}]]}, # 'defines': ['<(define)'], # 'my_subdict': {'defines': ['<(define)']}}, # will append "IS_MAC" to both "defines" lists. # Evaluate conditions sections, allowing variable expansions within them # as well as nested conditionals. This will process a 'conditions' or # 'target_conditions' section, perform appropriate merging and recursive # conditional and variable processing, and then remove the conditions section # from the_dict if it is present. ProcessConditionsInDict(the_dict, phase, variables, build_file) # Conditional processing may have resulted in changes to automatics or the # variables dict. Reload. variables = variables_in.copy() LoadAutomaticVariablesFromDict(variables, the_dict) LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key) # Recurse into child dicts, or process child lists which may result in # further recursion into descendant dicts. 
for key, value in the_dict.iteritems(): # Skip "variables" and string values, which were already processed if # present. if key == 'variables' or isinstance(value, str): continue if isinstance(value, dict): # Pass a copy of the variables dict so that subdicts can't influence # parents. ProcessVariablesAndConditionsInDict(value, phase, variables, build_file, key) elif isinstance(value, list): # The list itself can't influence the variables dict, and # ProcessVariablesAndConditionsInList will make copies of the variables # dict if it needs to pass it to something that can influence it. No # copy is necessary here. ProcessVariablesAndConditionsInList(value, phase, variables, build_file) elif not isinstance(value, int): raise TypeError, 'Unknown type ' + value.__class__.__name__ + \ ' for ' + key def ProcessVariablesAndConditionsInList(the_list, phase, variables, build_file): # Iterate using an index so that new values can be assigned into the_list. index = 0 while index < len(the_list): item = the_list[index] if isinstance(item, dict): # Make a copy of the variables dict so that it won't influence anything # outside of its own scope. ProcessVariablesAndConditionsInDict(item, phase, variables, build_file) elif isinstance(item, list): ProcessVariablesAndConditionsInList(item, phase, variables, build_file) elif isinstance(item, str): expanded = ExpandVariables(item, phase, variables, build_file) if isinstance(expanded, str) or isinstance(expanded, int): the_list[index] = expanded elif isinstance(expanded, list): the_list[index:index+1] = expanded index += len(expanded) # index now identifies the next item to examine. Continue right now # without falling into the index increment below. continue else: raise ValueError, \ 'Variable expansion in this context permits strings and ' + \ 'lists only, found ' + expanded.__class__.__name__ + ' at ' + \ index elif not isinstance(item, int): raise TypeError, 'Unknown type ' + item.__class__.__name__ + \ ' at index ' + index index = index + 1 def BuildTargetsDict(data): """Builds a dict mapping fully-qualified target names to their target dicts. |data| is a dict mapping loaded build files by pathname relative to the current directory. Values in |data| are build file contents. For each |data| value with a "targets" key, the value of the "targets" key is taken as a list containing target dicts. Each target's fully-qualified name is constructed from the pathname of the build file (|data| key) and its "target_name" property. These fully-qualified names are used as the keys in the returned dict. These keys provide access to the target dicts, the dicts in the "targets" lists. """ targets = {} for build_file in data['target_build_files']: for target in data[build_file].get('targets', []): target_name = gyp.common.QualifiedTarget(build_file, target['target_name'], target['toolset']) if target_name in targets: raise GypError('Duplicate target definitions for ' + target_name) targets[target_name] = target return targets def QualifyDependencies(targets): """Make dependency links fully-qualified relative to the current directory. |targets| is a dict mapping fully-qualified target names to their target dicts. For each target in this dict, keys known to contain dependency links are examined, and any dependencies referenced will be rewritten so that they are fully-qualified and relative to the current directory. All rewritten dependencies are suitable for use as keys to |targets| or a similar dict. 
""" all_dependency_sections = [dep + op for dep in dependency_sections for op in ('', '!', '/')] for target, target_dict in targets.iteritems(): target_build_file = gyp.common.BuildFile(target) toolset = target_dict['toolset'] for dependency_key in all_dependency_sections: dependencies = target_dict.get(dependency_key, []) for index in xrange(0, len(dependencies)): dep_file, dep_target, dep_toolset = gyp.common.ResolveTarget( target_build_file, dependencies[index], toolset) if not multiple_toolsets: # Ignore toolset specification in the dependency if it is specified. dep_toolset = toolset dependency = gyp.common.QualifiedTarget(dep_file, dep_target, dep_toolset) dependencies[index] = dependency # Make sure anything appearing in a list other than "dependencies" also # appears in the "dependencies" list. if dependency_key != 'dependencies' and \ dependency not in target_dict['dependencies']: raise GypError('Found ' + dependency + ' in ' + dependency_key + ' of ' + target + ', but not in dependencies') def ExpandWildcardDependencies(targets, data): """Expands dependencies specified as build_file:*. For each target in |targets|, examines sections containing links to other targets. If any such section contains a link of the form build_file:*, it is taken as a wildcard link, and is expanded to list each target in build_file. The |data| dict provides access to build file dicts. Any target that does not wish to be included by wildcard can provide an optional "suppress_wildcard" key in its target dict. When present and true, a wildcard dependency link will not include such targets. All dependency names, including the keys to |targets| and the values in each dependency list, must be qualified when this function is called. """ for target, target_dict in targets.iteritems(): toolset = target_dict['toolset'] target_build_file = gyp.common.BuildFile(target) for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) # Loop this way instead of "for dependency in" or "for index in xrange" # because the dependencies list will be modified within the loop body. index = 0 while index < len(dependencies): (dependency_build_file, dependency_target, dependency_toolset) = \ gyp.common.ParseQualifiedTarget(dependencies[index]) if dependency_target != '*' and dependency_toolset != '*': # Not a wildcard. Keep it moving. index = index + 1 continue if dependency_build_file == target_build_file: # It's an error for a target to depend on all other targets in # the same file, because a target cannot depend on itself. raise GypError('Found wildcard in ' + dependency_key + ' of ' + target + ' referring to same build file') # Take the wildcard out and adjust the index so that the next # dependency in the list will be processed the next time through the # loop. del dependencies[index] index = index - 1 # Loop through the targets in the other build file, adding them to # this target's list of dependencies in place of the removed # wildcard. 
dependency_target_dicts = data[dependency_build_file]['targets'] for dependency_target_dict in dependency_target_dicts: if int(dependency_target_dict.get('suppress_wildcard', False)): continue dependency_target_name = dependency_target_dict['target_name'] if (dependency_target != '*' and dependency_target != dependency_target_name): continue dependency_target_toolset = dependency_target_dict['toolset'] if (dependency_toolset != '*' and dependency_toolset != dependency_target_toolset): continue dependency = gyp.common.QualifiedTarget(dependency_build_file, dependency_target_name, dependency_target_toolset) index = index + 1 dependencies.insert(index, dependency) index = index + 1 def Unify(l): """Removes duplicate elements from l, keeping the first element.""" seen = {} return [seen.setdefault(e, e) for e in l if e not in seen] def RemoveDuplicateDependencies(targets): """Makes sure every dependency appears only once in all targets's dependency lists.""" for target_name, target_dict in targets.iteritems(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: target_dict[dependency_key] = Unify(dependencies) def Filter(l, item): """Removes item from l.""" res = {} return [res.setdefault(e, e) for e in l if e != item] def RemoveSelfDependencies(targets): """Remove self dependencies from targets that have the prune_self_dependency variable set.""" for target_name, target_dict in targets.iteritems(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: for t in dependencies: if t == target_name: if targets[t].get('variables', {}).get('prune_self_dependency', 0): target_dict[dependency_key] = Filter(dependencies, target_name) class DependencyGraphNode(object): """ Attributes: ref: A reference to an object that this DependencyGraphNode represents. dependencies: List of DependencyGraphNodes on which this one depends. dependents: List of DependencyGraphNodes that depend on this one. """ class CircularException(GypError): pass def __init__(self, ref): self.ref = ref self.dependencies = [] self.dependents = [] def __repr__(self): return '<DependencyGraphNode: %r>' % self.ref def FlattenToList(self): # flat_list is the sorted list of dependencies - actually, the list items # are the "ref" attributes of DependencyGraphNodes. Every target will # appear in flat_list after all of its dependencies, and before all of its # dependents. flat_list = [] # in_degree_zeros is the list of DependencyGraphNodes that have no # dependencies not in flat_list. Initially, it is a copy of the children # of this node, because when the graph was built, nodes with no # dependencies were made implicit dependents of the root node. in_degree_zeros = set(self.dependents[:]) while in_degree_zeros: # Nodes in in_degree_zeros have no dependencies not in flat_list, so they # can be appended to flat_list. Take these nodes out of in_degree_zeros # as work progresses, so that the next node to process from the list can # always be accessed at a consistent position. node = in_degree_zeros.pop() flat_list.append(node.ref) # Look at dependents of the node just added to flat_list. Some of them # may now belong in in_degree_zeros. for node_dependent in node.dependents: is_in_degree_zero = True for node_dependent_dependency in node_dependent.dependencies: if not node_dependent_dependency.ref in flat_list: # The dependent one or more dependencies not in flat_list. 
There # will be more chances to add it to flat_list when examining # it again as a dependent of those other dependencies, provided # that there are no cycles. is_in_degree_zero = False break if is_in_degree_zero: # All of the dependent's dependencies are already in flat_list. Add # it to in_degree_zeros where it will be processed in a future # iteration of the outer loop. in_degree_zeros.add(node_dependent) return flat_list def FindCycles(self, path=None): """ Returns a list of cycles in the graph, where each cycle is its own list. """ if path is None: path = [self] results = [] for node in self.dependents: if node in path: cycle = [node] for part in path: cycle.append(part) if part == node: break results.append(tuple(cycle)) else: results.extend(node.FindCycles([node] + path)) return list(set(results)) def DirectDependencies(self, dependencies=None): """Returns a list of just direct dependencies.""" if dependencies == None: dependencies = [] for dependency in self.dependencies: # Check for None, corresponding to the root node. if dependency.ref != None and dependency.ref not in dependencies: dependencies.append(dependency.ref) return dependencies def _AddImportedDependencies(self, targets, dependencies=None): """Given a list of direct dependencies, adds indirect dependencies that other dependencies have declared to export their settings. This method does not operate on self. Rather, it operates on the list of dependencies in the |dependencies| argument. For each dependency in that list, if any declares that it exports the settings of one of its own dependencies, those dependencies whose settings are "passed through" are added to the list. As new items are added to the list, they too will be processed, so it is possible to import settings through multiple levels of dependencies. This method is not terribly useful on its own, it depends on being "primed" with a list of direct dependencies such as one provided by DirectDependencies. DirectAndImportedDependencies is intended to be the public entry point. """ if dependencies == None: dependencies = [] index = 0 while index < len(dependencies): dependency = dependencies[index] dependency_dict = targets[dependency] # Add any dependencies whose settings should be imported to the list # if not already present. Newly-added items will be checked for # their own imports when the list iteration reaches them. # Rather than simply appending new items, insert them after the # dependency that exported them. This is done to more closely match # the depth-first method used by DeepDependencies. add_index = 1 for imported_dependency in \ dependency_dict.get('export_dependent_settings', []): if imported_dependency not in dependencies: dependencies.insert(index + add_index, imported_dependency) add_index = add_index + 1 index = index + 1 return dependencies def DirectAndImportedDependencies(self, targets, dependencies=None): """Returns a list of a target's direct dependencies and all indirect dependencies that a dependency has advertised settings should be exported through the dependency for. """ dependencies = self.DirectDependencies(dependencies) return self._AddImportedDependencies(targets, dependencies) def DeepDependencies(self, dependencies=None): """Returns a list of all of a target's dependencies, recursively.""" if dependencies == None: dependencies = [] for dependency in self.dependencies: # Check for None, corresponding to the root node. 
if dependency.ref != None and dependency.ref not in dependencies: dependencies.append(dependency.ref) dependency.DeepDependencies(dependencies) return dependencies def _LinkDependenciesInternal(self, targets, include_shared_libraries, dependencies=None, initial=True): """Returns a list of dependency targets that are linked into this target. This function has a split personality, depending on the setting of |initial|. Outside callers should always leave |initial| at its default setting. When adding a target to the list of dependencies, this function will recurse into itself with |initial| set to False, to collect dependencies that are linked into the linkable target for which the list is being built. If |include_shared_libraries| is False, the resulting dependencies will not include shared_library targets that are linked into this target. """ if dependencies == None: dependencies = [] # Check for None, corresponding to the root node. if self.ref == None: return dependencies # It's kind of sucky that |targets| has to be passed into this function, # but that's presently the easiest way to access the target dicts so that # this function can find target types. if 'target_name' not in targets[self.ref]: raise GypError("Missing 'target_name' field in target.") if 'type' not in targets[self.ref]: raise GypError("Missing 'type' field in target %s" % targets[self.ref]['target_name']) target_type = targets[self.ref]['type'] is_linkable = target_type in linkable_types if initial and not is_linkable: # If this is the first target being examined and it's not linkable, # return an empty list of link dependencies, because the link # dependencies are intended to apply to the target itself (initial is # True) and this target won't be linked. return dependencies # Don't traverse 'none' targets if explicitly excluded. if (target_type == 'none' and not targets[self.ref].get('dependencies_traverse', True)): if self.ref not in dependencies: dependencies.append(self.ref) return dependencies # Executables and loadable modules are already fully and finally linked. # Nothing else can be a link dependency of them, there can only be # dependencies in the sense that a dependent target might run an # executable or load the loadable_module. if not initial and target_type in ('executable', 'loadable_module'): return dependencies # Shared libraries are already fully linked. They should only be included # in |dependencies| when adjusting static library dependencies (in order to # link against the shared_library's import lib), but should not be included # in |dependencies| when propagating link_settings. # The |include_shared_libraries| flag controls which of these two cases we # are handling. if (not initial and target_type == 'shared_library' and not include_shared_libraries): return dependencies # The target is linkable, add it to the list of link dependencies. if self.ref not in dependencies: dependencies.append(self.ref) if initial or not is_linkable: # If this is a subsequent target and it's linkable, don't look any # further for linkable dependencies, as they'll already be linked into # this target linkable. Always look at dependencies of the initial # target, and always look at dependencies of non-linkables. for dependency in self.dependencies: dependency._LinkDependenciesInternal(targets, include_shared_libraries, dependencies, False) return dependencies def DependenciesForLinkSettings(self, targets): """ Returns a list of dependency targets whose link_settings should be merged into this target. 
""" # TODO(sbaig) Currently, chrome depends on the bug that shared libraries' # link_settings are propagated. So for now, we will allow it, unless the # 'allow_sharedlib_linksettings_propagation' flag is explicitly set to # False. Once chrome is fixed, we can remove this flag. include_shared_libraries = \ targets[self.ref].get('allow_sharedlib_linksettings_propagation', True) return self._LinkDependenciesInternal(targets, include_shared_libraries) def DependenciesToLinkAgainst(self, targets): """ Returns a list of dependency targets that are linked into this target. """ return self._LinkDependenciesInternal(targets, True) def BuildDependencyList(targets): # Create a DependencyGraphNode for each target. Put it into a dict for easy # access. dependency_nodes = {} for target, spec in targets.iteritems(): if target not in dependency_nodes: dependency_nodes[target] = DependencyGraphNode(target) # Set up the dependency links. Targets that have no dependencies are treated # as dependent on root_node. root_node = DependencyGraphNode(None) for target, spec in targets.iteritems(): target_node = dependency_nodes[target] target_build_file = gyp.common.BuildFile(target) dependencies = spec.get('dependencies') if not dependencies: target_node.dependencies = [root_node] root_node.dependents.append(target_node) else: for dependency in dependencies: dependency_node = dependency_nodes.get(dependency) if not dependency_node: raise GypError("Dependency '%s' not found while " "trying to load target %s" % (dependency, target)) target_node.dependencies.append(dependency_node) dependency_node.dependents.append(target_node) flat_list = root_node.FlattenToList() # If there's anything left unvisited, there must be a circular dependency # (cycle). If you need to figure out what's wrong, look for elements of # targets that are not in flat_list. if len(flat_list) != len(targets): raise DependencyGraphNode.CircularException( 'Some targets not reachable, cycle in dependency graph detected: ' + ' '.join(set(flat_list) ^ set(targets))) return [dependency_nodes, flat_list] def VerifyNoGYPFileCircularDependencies(targets): # Create a DependencyGraphNode for each gyp file containing a target. Put # it into a dict for easy access. dependency_nodes = {} for target in targets.iterkeys(): build_file = gyp.common.BuildFile(target) if not build_file in dependency_nodes: dependency_nodes[build_file] = DependencyGraphNode(build_file) # Set up the dependency links. for target, spec in targets.iteritems(): build_file = gyp.common.BuildFile(target) build_file_node = dependency_nodes[build_file] target_dependencies = spec.get('dependencies', []) for dependency in target_dependencies: try: dependency_build_file = gyp.common.BuildFile(dependency) except GypError, e: gyp.common.ExceptionAppend( e, 'while computing dependencies of .gyp file %s' % build_file) raise if dependency_build_file == build_file: # A .gyp file is allowed to refer back to itself. continue dependency_node = dependency_nodes.get(dependency_build_file) if not dependency_node: raise GypError("Dependancy '%s' not found" % dependency_build_file) if dependency_node not in build_file_node.dependencies: build_file_node.dependencies.append(dependency_node) dependency_node.dependents.append(build_file_node) # Files that have no dependencies are treated as dependent on root_node. 
root_node = DependencyGraphNode(None) for build_file_node in dependency_nodes.itervalues(): if len(build_file_node.dependencies) == 0: build_file_node.dependencies.append(root_node) root_node.dependents.append(build_file_node) flat_list = root_node.FlattenToList() # If there's anything left unvisited, there must be a circular dependency # (cycle). if len(flat_list) != len(dependency_nodes): bad_files = [] for file in dependency_nodes.iterkeys(): if not file in flat_list: bad_files.append(file) common_path_prefix = os.path.commonprefix(dependency_nodes) cycles = [] for cycle in root_node.FindCycles(): simplified_paths = [] for node in cycle: assert(node.ref.startswith(common_path_prefix)) simplified_paths.append(node.ref[len(common_path_prefix):]) cycles.append('Cycle: %s' % ' -> '.join(simplified_paths)) raise DependencyGraphNode.CircularException, \ 'Cycles in .gyp file dependency graph detected:\n' + '\n'.join(cycles) def DoDependentSettings(key, flat_list, targets, dependency_nodes): # key should be one of all_dependent_settings, direct_dependent_settings, # or link_settings. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) if key == 'all_dependent_settings': dependencies = dependency_nodes[target].DeepDependencies() elif key == 'direct_dependent_settings': dependencies = \ dependency_nodes[target].DirectAndImportedDependencies(targets) elif key == 'link_settings': dependencies = \ dependency_nodes[target].DependenciesForLinkSettings(targets) else: raise GypError("DoDependentSettings doesn't know how to determine " 'dependencies for ' + key) for dependency in dependencies: dependency_dict = targets[dependency] if not key in dependency_dict: continue dependency_build_file = gyp.common.BuildFile(dependency) MergeDicts(target_dict, dependency_dict[key], build_file, dependency_build_file) def AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes, sort_dependencies): # Recompute target "dependencies" properties. For each static library # target, remove "dependencies" entries referring to other static libraries, # unless the dependency has the "hard_dependency" attribute set. For each # linkable target, add a "dependencies" entry referring to all of the # target's computed list of link dependencies (including static libraries # if no such entry is already present. for target in flat_list: target_dict = targets[target] target_type = target_dict['type'] if target_type == 'static_library': if not 'dependencies' in target_dict: continue target_dict['dependencies_original'] = target_dict.get( 'dependencies', [])[:] # A static library should not depend on another static library unless # the dependency relationship is "hard," which should only be done when # a dependent relies on some side effect other than just the build # product, like a rule or action output. Further, if a target has a # non-hard dependency, but that dependency exports a hard dependency, # the non-hard dependency can safely be removed, but the exported hard # dependency must be added to the target to keep the same dependency # ordering. dependencies = \ dependency_nodes[target].DirectAndImportedDependencies(targets) index = 0 while index < len(dependencies): dependency = dependencies[index] dependency_dict = targets[dependency] # Remove every non-hard static library dependency and remove every # non-static library dependency that isn't a direct dependency. 
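        # Illustrative sketch (added comment, not part of the original gyp
        # source; target names are hypothetical): if static library 'base'
        # depends on static library 'icu' without 'hard_dependency': 1, the
        # check below drops 'icu' from the dependencies of 'base'; the
        # executable that links 'base' still picks 'icu' up via
        # DependenciesToLinkAgainst().  Declaring 'hard_dependency': 1 on
        # 'icu' (for example, because it runs a code generator whose output
        # 'base' compiles) keeps the direct edge so it is built first.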
if (dependency_dict['type'] == 'static_library' and \ not dependency_dict.get('hard_dependency', False)) or \ (dependency_dict['type'] != 'static_library' and \ not dependency in target_dict['dependencies']): # Take the dependency out of the list, and don't increment index # because the next dependency to analyze will shift into the index # formerly occupied by the one being removed. del dependencies[index] else: index = index + 1 # Update the dependencies. If the dependencies list is empty, it's not # needed, so unhook it. if len(dependencies) > 0: target_dict['dependencies'] = dependencies else: del target_dict['dependencies'] elif target_type in linkable_types: # Get a list of dependency targets that should be linked into this # target. Add them to the dependencies list if they're not already # present. link_dependencies = \ dependency_nodes[target].DependenciesToLinkAgainst(targets) for dependency in link_dependencies: if dependency == target: continue if not 'dependencies' in target_dict: target_dict['dependencies'] = [] if not dependency in target_dict['dependencies']: target_dict['dependencies'].append(dependency) # Sort the dependencies list in the order from dependents to dependencies. # e.g. If A and B depend on C and C depends on D, sort them in A, B, C, D. # Note: flat_list is already sorted in the order from dependencies to # dependents. if sort_dependencies and 'dependencies' in target_dict: target_dict['dependencies'] = [dep for dep in reversed(flat_list) if dep in target_dict['dependencies']] # Initialize this here to speed up MakePathRelative. exception_re = re.compile(r'''["']?[-/$<>^]''') def MakePathRelative(to_file, fro_file, item): # If item is a relative path, it's relative to the build file dict that it's # coming from. Fix it up to make it relative to the build file dict that # it's going into. # Exception: any |item| that begins with these special characters is # returned without modification. # / Used when a path is already absolute (shortcut optimization; # such paths would be returned as absolute anyway) # $ Used for build environment variables # - Used for some build environment flags (such as -lapr-1 in a # "libraries" section) # < Used for our own variable and command expansions (see ExpandVariables) # > Used for our own variable and command expansions (see ExpandVariables) # ^ Used for our own variable and command expansions (see ExpandVariables) # # "/' Used when a value is quoted. If these are present, then we # check the second character instead. # if to_file == fro_file or exception_re.match(item): return item else: # TODO(dglazkov) The backslash/forward-slash replacement at the end is a # temporary measure. This should really be addressed by keeping all paths # in POSIX until actual project generation. ret = os.path.normpath(os.path.join( gyp.common.RelativePath(os.path.dirname(fro_file), os.path.dirname(to_file)), item)).replace('\\', '/') if item[-1] == '/': ret += '/' return ret def MergeLists(to, fro, to_file, fro_file, is_paths=False, append=True): # Python documentation recommends objects which do not support hash # set this value to None. Python library objects follow this rule. is_hashable = lambda val: val.__hash__ # If x is hashable, returns whether x is in s. Else returns whether x is in l. def is_in_set_or_list(x, s, l): if is_hashable(x): return x in s return x in l prepend_index = 0 # Make membership testing of hashables in |to| (in particular, strings) # faster. 
hashable_to_set = set(x for x in to if is_hashable(x)) for item in fro: singleton = False if isinstance(item, str) or isinstance(item, int): # The cheap and easy case. if is_paths: to_item = MakePathRelative(to_file, fro_file, item) else: to_item = item if not isinstance(item, str) or not item.startswith('-'): # Any string that doesn't begin with a "-" is a singleton - it can # only appear once in a list, to be enforced by the list merge append # or prepend. singleton = True elif isinstance(item, dict): # Make a copy of the dictionary, continuing to look for paths to fix. # The other intelligent aspects of merge processing won't apply because # item is being merged into an empty dict. to_item = {} MergeDicts(to_item, item, to_file, fro_file) elif isinstance(item, list): # Recurse, making a copy of the list. If the list contains any # descendant dicts, path fixing will occur. Note that here, custom # values for is_paths and append are dropped; those are only to be # applied to |to| and |fro|, not sublists of |fro|. append shouldn't # matter anyway because the new |to_item| list is empty. to_item = [] MergeLists(to_item, item, to_file, fro_file) else: raise TypeError, \ 'Attempt to merge list item of unsupported type ' + \ item.__class__.__name__ if append: # If appending a singleton that's already in the list, don't append. # This ensures that the earliest occurrence of the item will stay put. if not singleton or not is_in_set_or_list(to_item, hashable_to_set, to): to.append(to_item) if is_hashable(to_item): hashable_to_set.add(to_item) else: # If prepending a singleton that's already in the list, remove the # existing instance and proceed with the prepend. This ensures that the # item appears at the earliest possible position in the list. while singleton and to_item in to: to.remove(to_item) # Don't just insert everything at index 0. That would prepend the new # items to the list in reverse order, which would be an unwelcome # surprise. to.insert(prepend_index, to_item) if is_hashable(to_item): hashable_to_set.add(to_item) prepend_index = prepend_index + 1 def MergeDicts(to, fro, to_file, fro_file): # I wanted to name the parameter "from" but it's a Python keyword... for k, v in fro.iteritems(): # It would be nice to do "if not k in to: to[k] = v" but that wouldn't give # copy semantics. Something else may want to merge from the |fro| dict # later, and having the same dict ref pointed to twice in the tree isn't # what anyone wants considering that the dicts may subsequently be # modified. if k in to: bad_merge = False if isinstance(v, str) or isinstance(v, int): if not (isinstance(to[k], str) or isinstance(to[k], int)): bad_merge = True elif v.__class__ != to[k].__class__: bad_merge = True if bad_merge: raise TypeError, \ 'Attempt to merge dict value of type ' + v.__class__.__name__ + \ ' into incompatible type ' + to[k].__class__.__name__ + \ ' for key ' + k if isinstance(v, str) or isinstance(v, int): # Overwrite the existing value, if any. Cheap and easy. is_path = IsPathSection(k) if is_path: to[k] = MakePathRelative(to_file, fro_file, v) else: to[k] = v elif isinstance(v, dict): # Recurse, guaranteeing copies will be made of objects that require it. if not k in to: to[k] = {} MergeDicts(to[k], v, to_file, fro_file) elif isinstance(v, list): # Lists in dicts can be merged with different policies, depending on # how the key in the "from" dict (k, the from-key) is written. # # If the from-key has ...the to-list will have this action # this character appended:... 
applied when receiving the from-list: # = replace # + prepend # ? set, only if to-list does not yet exist # (none) append # # This logic is list-specific, but since it relies on the associated # dict key, it's checked in this dict-oriented function. ext = k[-1] append = True if ext == '=': list_base = k[:-1] lists_incompatible = [list_base, list_base + '?'] to[list_base] = [] elif ext == '+': list_base = k[:-1] lists_incompatible = [list_base + '=', list_base + '?'] append = False elif ext == '?': list_base = k[:-1] lists_incompatible = [list_base, list_base + '=', list_base + '+'] else: list_base = k lists_incompatible = [list_base + '=', list_base + '?'] # Some combinations of merge policies appearing together are meaningless. # It's stupid to replace and append simultaneously, for example. Append # and prepend are the only policies that can coexist. for list_incompatible in lists_incompatible: if list_incompatible in fro: raise GypError('Incompatible list policies ' + k + ' and ' + list_incompatible) if list_base in to: if ext == '?': # If the key ends in "?", the list will only be merged if it doesn't # already exist. continue if not isinstance(to[list_base], list): # This may not have been checked above if merging in a list with an # extension character. raise TypeError, \ 'Attempt to merge dict value of type ' + v.__class__.__name__ + \ ' into incompatible type ' + to[list_base].__class__.__name__ + \ ' for key ' + list_base + '(' + k + ')' else: to[list_base] = [] # Call MergeLists, which will make copies of objects that require it. # MergeLists can recurse back into MergeDicts, although this will be # to make copies of dicts (with paths fixed), there will be no # subsequent dict "merging" once entering a list because lists are # always replaced, appended to, or prepended to. is_paths = IsPathSection(list_base) MergeLists(to[list_base], v, to_file, fro_file, is_paths, append) else: raise TypeError, \ 'Attempt to merge dict value of unsupported type ' + \ v.__class__.__name__ + ' for key ' + k def MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, configuration, visited): # Skip if previously visted. if configuration in visited: return # Look at this configuration. configuration_dict = target_dict['configurations'][configuration] # Merge in parents. for parent in configuration_dict.get('inherit_from', []): MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, parent, visited + [configuration]) # Merge it into the new config. MergeDicts(new_configuration_dict, configuration_dict, build_file, build_file) # Drop abstract. if 'abstract' in new_configuration_dict: del new_configuration_dict['abstract'] def SetUpConfigurations(target, target_dict): # key_suffixes is a list of key suffixes that might appear on key names. # These suffixes are handled in conditional evaluations (for =, +, and ?) # and rules/exclude processing (for ! and /). Keys with these suffixes # should be treated the same as keys without. key_suffixes = ['=', '+', '?', '!', '/'] build_file = gyp.common.BuildFile(target) # Provide a single configuration by default if none exists. # TODO(mark): Signal an error if default_configurations exists but # configurations does not. 
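  # Illustrative sketch (added comment, not part of the original gyp source;
  # names are hypothetical): for a target containing
  #
  #   'configurations': {
  #     'Common': {'abstract': 1, 'defines': ['COMMON']},
  #     'Debug':  {'inherit_from': ['Common'], 'defines+': ['DEBUG']},
  #   },
  #
  # the code below copies the target-level settings into the concrete
  # 'Debug' configuration, MergeConfigWithInheritance() merges 'Common'
  # first and then 'Debug' itself (the '+' suffix requests a prepend, so
  # 'defines' ends up as ['DEBUG', 'COMMON']), and the abstract 'Common'
  # configuration is dropped at the end.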
if not 'configurations' in target_dict: target_dict['configurations'] = {'Default': {}} if not 'default_configuration' in target_dict: concrete = [i for i in target_dict['configurations'].iterkeys() if not target_dict['configurations'][i].get('abstract')] target_dict['default_configuration'] = sorted(concrete)[0] for configuration in target_dict['configurations'].keys(): old_configuration_dict = target_dict['configurations'][configuration] # Skip abstract configurations (saves work only). if old_configuration_dict.get('abstract'): continue # Configurations inherit (most) settings from the enclosing target scope. # Get the inheritance relationship right by making a copy of the target # dict. new_configuration_dict = copy.deepcopy(target_dict) # Take out the bits that don't belong in a "configurations" section. # Since configuration setup is done before conditional, exclude, and rules # processing, be careful with handling of the suffix characters used in # those phases. delete_keys = [] for key in new_configuration_dict: key_ext = key[-1:] if key_ext in key_suffixes: key_base = key[:-1] else: key_base = key if key_base in non_configuration_keys: delete_keys.append(key) for key in delete_keys: del new_configuration_dict[key] # Merge in configuration (with all its parents first). MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, configuration, []) # Put the new result back into the target dict as a configuration. target_dict['configurations'][configuration] = new_configuration_dict # Now drop all the abstract ones. for configuration in target_dict['configurations'].keys(): old_configuration_dict = target_dict['configurations'][configuration] if old_configuration_dict.get('abstract'): del target_dict['configurations'][configuration] # Now that all of the target's configurations have been built, go through # the target dict's keys and remove everything that's been moved into a # "configurations" section. delete_keys = [] for key in target_dict: key_ext = key[-1:] if key_ext in key_suffixes: key_base = key[:-1] else: key_base = key if not key_base in non_configuration_keys: delete_keys.append(key) for key in delete_keys: del target_dict[key] # Check the configurations to see if they contain invalid keys. for configuration in target_dict['configurations'].keys(): configuration_dict = target_dict['configurations'][configuration] for key in configuration_dict.keys(): if key in invalid_configuration_keys: raise GypError('%s not allowed in the %s configuration, found in ' 'target %s' % (key, configuration, target)) def ProcessListFiltersInDict(name, the_dict): """Process regular expression and exclusion-based filters on lists. An exclusion list is in a dict key named with a trailing "!", like "sources!". Every item in such a list is removed from the associated main list, which in this example, would be "sources". Removed items are placed into a "sources_excluded" list in the dict. Regular expression (regex) filters are contained in dict keys named with a trailing "/", such as "sources/" to operate on the "sources" list. Regex filters in a dict take the form: 'sources/': [ ['exclude', '_(linux|mac|win)\\.cc$'], ['include', '_mac\\.cc$'] ], The first filter says to exclude all files ending in _linux.cc, _mac.cc, and _win.cc. The second filter then includes all files ending in _mac.cc that are now or were once in the "sources" list. 
Items matching an "exclude" filter are subject to the same processing as would occur if they were listed by name in an exclusion list (ending in "!"). Items matching an "include" filter are brought back into the main list if previously excluded by an exclusion list or exclusion regex filter. Subsequent matching "exclude" patterns can still cause items to be excluded after matching an "include". """ # Look through the dictionary for any lists whose keys end in "!" or "/". # These are lists that will be treated as exclude lists and regular # expression-based exclude/include lists. Collect the lists that are # needed first, looking for the lists that they operate on, and assemble # then into |lists|. This is done in a separate loop up front, because # the _included and _excluded keys need to be added to the_dict, and that # can't be done while iterating through it. lists = [] del_lists = [] for key, value in the_dict.iteritems(): operation = key[-1] if operation != '!' and operation != '/': continue if not isinstance(value, list): raise ValueError, name + ' key ' + key + ' must be list, not ' + \ value.__class__.__name__ list_key = key[:-1] if list_key not in the_dict: # This happens when there's a list like "sources!" but no corresponding # "sources" list. Since there's nothing for it to operate on, queue up # the "sources!" list for deletion now. del_lists.append(key) continue if not isinstance(the_dict[list_key], list): value = the_dict[list_key] raise ValueError, name + ' key ' + list_key + \ ' must be list, not ' + \ value.__class__.__name__ + ' when applying ' + \ {'!': 'exclusion', '/': 'regex'}[operation] if not list_key in lists: lists.append(list_key) # Delete the lists that are known to be unneeded at this point. for del_list in del_lists: del the_dict[del_list] for list_key in lists: the_list = the_dict[list_key] # Initialize the list_actions list, which is parallel to the_list. Each # item in list_actions identifies whether the corresponding item in # the_list should be excluded, unconditionally preserved (included), or # whether no exclusion or inclusion has been applied. Items for which # no exclusion or inclusion has been applied (yet) have value -1, items # excluded have value 0, and items included have value 1. Includes and # excludes override previous actions. All items in list_actions are # initialized to -1 because no excludes or includes have been processed # yet. list_actions = list((-1,) * len(the_list)) exclude_key = list_key + '!' if exclude_key in the_dict: for exclude_item in the_dict[exclude_key]: for index in xrange(0, len(the_list)): if exclude_item == the_list[index]: # This item matches the exclude_item, so set its action to 0 # (exclude). list_actions[index] = 0 # The "whatever!" list is no longer needed, dump it. del the_dict[exclude_key] regex_key = list_key + '/' if regex_key in the_dict: for regex_item in the_dict[regex_key]: [action, pattern] = regex_item pattern_re = re.compile(pattern) if action == 'exclude': # This item matches an exclude regex, so set its value to 0 (exclude). action_value = 0 elif action == 'include': # This item matches an include regex, so set its value to 1 (include). action_value = 1 else: # This is an action that doesn't make any sense. raise ValueError, 'Unrecognized action ' + action + ' in ' + name + \ ' key ' + regex_key for index in xrange(0, len(the_list)): list_item = the_list[index] if list_actions[index] == action_value: # Even if the regex matches, nothing will change so continue (regex # searches are expensive). 
continue if pattern_re.search(list_item): # Regular expression match. list_actions[index] = action_value # The "whatever/" list is no longer needed, dump it. del the_dict[regex_key] # Add excluded items to the excluded list. # # Note that exclude_key ("sources!") is different from excluded_key # ("sources_excluded"). The exclude_key list is input and it was already # processed and deleted; the excluded_key list is output and it's about # to be created. excluded_key = list_key + '_excluded' if excluded_key in the_dict: raise GypError(name + ' key ' + excluded_key + ' must not be present prior ' ' to applying exclusion/regex filters for ' + list_key) excluded_list = [] # Go backwards through the list_actions list so that as items are deleted, # the indices of items that haven't been seen yet don't shift. That means # that things need to be prepended to excluded_list to maintain them in the # same order that they existed in the_list. for index in xrange(len(list_actions) - 1, -1, -1): if list_actions[index] == 0: # Dump anything with action 0 (exclude). Keep anything with action 1 # (include) or -1 (no include or exclude seen for the item). excluded_list.insert(0, the_list[index]) del the_list[index] # If anything was excluded, put the excluded list into the_dict at # excluded_key. if len(excluded_list) > 0: the_dict[excluded_key] = excluded_list # Now recurse into subdicts and lists that may contain dicts. for key, value in the_dict.iteritems(): if isinstance(value, dict): ProcessListFiltersInDict(key, value) elif isinstance(value, list): ProcessListFiltersInList(key, value) def ProcessListFiltersInList(name, the_list): for item in the_list: if isinstance(item, dict): ProcessListFiltersInDict(name, item) elif isinstance(item, list): ProcessListFiltersInList(name, item) def ValidateTargetType(target, target_dict): """Ensures the 'type' field on the target is one of the known types. Arguments: target: string, name of target. target_dict: dict, target spec. Raises an exception on error. """ VALID_TARGET_TYPES = ('executable', 'loadable_module', 'static_library', 'shared_library', 'none') target_type = target_dict.get('type', None) if target_type not in VALID_TARGET_TYPES: raise GypError("Target %s has an invalid target type '%s'. " "Must be one of %s." % (target, target_type, '/'.join(VALID_TARGET_TYPES))) if (target_dict.get('standalone_static_library', 0) and not target_type == 'static_library'): raise GypError('Target %s has type %s but standalone_static_library flag is' ' only valid for static_library type.' % (target, target_type)) def ValidateSourcesInTarget(target, target_dict, build_file): # TODO: Check if MSVC allows this for loadable_module targets. if target_dict.get('type', None) not in ('static_library', 'shared_library'): return sources = target_dict.get('sources', []) basenames = {} for source in sources: name, ext = os.path.splitext(source) is_compiled_file = ext in [ '.c', '.cc', '.cpp', '.cxx', '.m', '.mm', '.s', '.S'] if not is_compiled_file: continue basename = os.path.basename(name) # Don't include extension. basenames.setdefault(basename, []).append(source) error = '' for basename, files in basenames.iteritems(): if len(files) > 1: error += ' %s: %s\n' % (basename, ' '.join(files)) if error: print('static library %s has several files with the same basename:\n' % target + error + 'Some build systems, e.g. 
MSVC08, ' 'cannot handle that.') raise GypError('Duplicate basenames in sources section, see list above') def ValidateRulesInTarget(target, target_dict, extra_sources_for_rules): """Ensures that the rules sections in target_dict are valid and consistent, and determines which sources they apply to. Arguments: target: string, name of target. target_dict: dict, target spec containing "rules" and "sources" lists. extra_sources_for_rules: a list of keys to scan for rule matches in addition to 'sources'. """ # Dicts to map between values found in rules' 'rule_name' and 'extension' # keys and the rule dicts themselves. rule_names = {} rule_extensions = {} rules = target_dict.get('rules', []) for rule in rules: # Make sure that there's no conflict among rule names and extensions. rule_name = rule['rule_name'] if rule_name in rule_names: raise GypError('rule %s exists in duplicate, target %s' % (rule_name, target)) rule_names[rule_name] = rule rule_extension = rule['extension'] if rule_extension.startswith('.'): rule_extension = rule_extension[1:] if rule_extension in rule_extensions: raise GypError(('extension %s associated with multiple rules, ' + 'target %s rules %s and %s') % (rule_extension, target, rule_extensions[rule_extension]['rule_name'], rule_name)) rule_extensions[rule_extension] = rule # Make sure rule_sources isn't already there. It's going to be # created below if needed. if 'rule_sources' in rule: raise GypError( 'rule_sources must not exist in input, target %s rule %s' % (target, rule_name)) rule_sources = [] source_keys = ['sources'] source_keys.extend(extra_sources_for_rules) for source_key in source_keys: for source in target_dict.get(source_key, []): (source_root, source_extension) = os.path.splitext(source) if source_extension.startswith('.'): source_extension = source_extension[1:] if source_extension == rule_extension: rule_sources.append(source) if len(rule_sources) > 0: rule['rule_sources'] = rule_sources def ValidateRunAsInTarget(target, target_dict, build_file): target_name = target_dict.get('target_name') run_as = target_dict.get('run_as') if not run_as: return if not isinstance(run_as, dict): raise GypError("The 'run_as' in target %s from file %s should be a " "dictionary." % (target_name, build_file)) action = run_as.get('action') if not action: raise GypError("The 'run_as' in target %s from file %s must have an " "'action' section." % (target_name, build_file)) if not isinstance(action, list): raise GypError("The 'action' for 'run_as' in target %s from file %s " "must be a list." % (target_name, build_file)) working_directory = run_as.get('working_directory') if working_directory and not isinstance(working_directory, str): raise GypError("The 'working_directory' for 'run_as' in target %s " "in file %s should be a string." % (target_name, build_file)) environment = run_as.get('environment') if environment and not isinstance(environment, dict): raise GypError("The 'environment' for 'run_as' in target %s " "in file %s should be a dictionary." % (target_name, build_file)) def ValidateActionsInTarget(target, target_dict, build_file): '''Validates the inputs to the actions in a target.''' target_name = target_dict.get('target_name') actions = target_dict.get('actions', []) for action in actions: action_name = action.get('action_name') if not action_name: raise GypError("Anonymous action in target %s. " "An action must have an 'action_name' field." % target_name) inputs = action.get('inputs', None) if inputs is None: raise GypError('Action in target %s has no inputs.' 
% target_name) action_command = action.get('action') if action_command and not action_command[0]: raise GypError("Empty action as command in target %s." % target_name) def TurnIntIntoStrInDict(the_dict): """Given dict the_dict, recursively converts all integers into strings. """ # Use items instead of iteritems because there's no need to try to look at # reinserted keys and their associated values. for k, v in the_dict.items(): if isinstance(v, int): v = str(v) the_dict[k] = v elif isinstance(v, dict): TurnIntIntoStrInDict(v) elif isinstance(v, list): TurnIntIntoStrInList(v) if isinstance(k, int): the_dict[str(k)] = v del the_dict[k] def TurnIntIntoStrInList(the_list): """Given list the_list, recursively converts all integers into strings. """ for index in xrange(0, len(the_list)): item = the_list[index] if isinstance(item, int): the_list[index] = str(item) elif isinstance(item, dict): TurnIntIntoStrInDict(item) elif isinstance(item, list): TurnIntIntoStrInList(item) def PruneUnwantedTargets(targets, flat_list, dependency_nodes, root_targets, data): """Return only the targets that are deep dependencies of |root_targets|.""" qualified_root_targets = [] for target in root_targets: target = target.strip() qualified_targets = gyp.common.FindQualifiedTargets(target, flat_list) if not qualified_targets: raise GypError("Could not find target %s" % target) qualified_root_targets.extend(qualified_targets) wanted_targets = {} for target in qualified_root_targets: wanted_targets[target] = targets[target] for dependency in dependency_nodes[target].DeepDependencies(): wanted_targets[dependency] = targets[dependency] wanted_flat_list = [t for t in flat_list if t in wanted_targets] # Prune unwanted targets from each build_file's data dict. for build_file in data['target_build_files']: if not 'targets' in data[build_file]: continue new_targets = [] for target in data[build_file]['targets']: qualified_name = gyp.common.QualifiedTarget(build_file, target['target_name'], target['toolset']) if qualified_name in wanted_targets: new_targets.append(target) data[build_file]['targets'] = new_targets return wanted_targets, wanted_flat_list def VerifyNoCollidingTargets(targets): """Verify that no two targets in the same directory share the same name. Arguments: targets: A list of targets in the form 'path/to/file.gyp:target_name'. """ # Keep a dict going from 'subdirectory:target_name' to 'foo.gyp'. used = {} for target in targets: # Separate out 'path/to/file.gyp, 'target_name' from # 'path/to/file.gyp:target_name'. path, name = target.rsplit(':', 1) # Separate out 'path/to', 'file.gyp' from 'path/to/file.gyp'. subdir, gyp = os.path.split(path) # Use '.' for the current directory '', so that the error messages make # more sense. if not subdir: subdir = '.' # Prepare a key like 'path/to:target_name'. key = subdir + ':' + name if key in used: # Complain if this target is already used. raise GypError('Duplicate target name "%s" in directory "%s" used both ' 'in "%s" and "%s".' % (name, subdir, gyp, used[key])) used[key] = gyp def SetGeneratorGlobals(generator_input_info): # Set up path_sections and non_configuration_keys with the default data plus # the generator-specific data. 
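# Illustrative sketch (not part of gyp): what TurnIntIntoStrInDict above does
# to a hypothetical dict -- integer values and integer keys become strings,
# recursively, including inside nested lists and dicts.
#
#   d = {'msvs_disabled_warnings': [4018, 4244], 1: {'inner': 2}}
#   TurnIntIntoStrInDict(d)
#   # d == {'msvs_disabled_warnings': ['4018', '4244'], '1': {'inner': '2'}}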
global path_sections path_sections = base_path_sections[:] path_sections.extend(generator_input_info['path_sections']) global non_configuration_keys non_configuration_keys = base_non_configuration_keys[:] non_configuration_keys.extend(generator_input_info['non_configuration_keys']) global multiple_toolsets multiple_toolsets = generator_input_info[ 'generator_supports_multiple_toolsets'] global generator_filelist_paths generator_filelist_paths = generator_input_info['generator_filelist_paths'] def Load(build_files, variables, includes, depth, generator_input_info, check, circular_check, parallel, root_targets): SetGeneratorGlobals(generator_input_info) # A generator can have other lists (in addition to sources) be processed # for rules. extra_sources_for_rules = generator_input_info['extra_sources_for_rules'] # Load build files. This loads every target-containing build file into # the |data| dictionary such that the keys to |data| are build file names, # and the values are the entire build file contents after "early" or "pre" # processing has been done and includes have been resolved. # NOTE: data contains both "target" files (.gyp) and "includes" (.gypi), as # well as meta-data (e.g. 'included_files' key). 'target_build_files' keeps # track of the keys corresponding to "target" files. data = {'target_build_files': set()} aux_data = {} # Normalize paths everywhere. This is important because paths will be # used as keys to the data dict and for references between input files. build_files = set(map(os.path.normpath, build_files)) if parallel: LoadTargetBuildFilesParallel(build_files, data, aux_data, variables, includes, depth, check, generator_input_info) else: for build_file in build_files: try: LoadTargetBuildFile(build_file, data, aux_data, variables, includes, depth, check, True) except Exception, e: gyp.common.ExceptionAppend(e, 'while trying to load %s' % build_file) raise # Build a dict to access each target's subdict by qualified name. targets = BuildTargetsDict(data) # Fully qualify all dependency links. QualifyDependencies(targets) # Remove self-dependencies from targets that have 'prune_self_dependencies' # set to 1. RemoveSelfDependencies(targets) # Expand dependencies specified as build_file:*. ExpandWildcardDependencies(targets, data) # Apply exclude (!) and regex (/) list filters only for dependency_sections. for target_name, target_dict in targets.iteritems(): tmp_dict = {} for key_base in dependency_sections: for op in ('', '!', '/'): key = key_base + op if key in target_dict: tmp_dict[key] = target_dict[key] del target_dict[key] ProcessListFiltersInDict(target_name, tmp_dict) # Write the results back to |target_dict|. for key in tmp_dict: target_dict[key] = tmp_dict[key] # Make sure every dependency appears at most once. RemoveDuplicateDependencies(targets) if circular_check: # Make sure that any targets in a.gyp don't contain dependencies in other # .gyp files that further depend on a.gyp. VerifyNoGYPFileCircularDependencies(targets) [dependency_nodes, flat_list] = BuildDependencyList(targets) if root_targets: # Remove, from |targets| and |flat_list|, the targets that are not deep # dependencies of the targets specified in |root_targets|. targets, flat_list = PruneUnwantedTargets( targets, flat_list, dependency_nodes, root_targets, data) # Check that no two targets in the same directory have the same name. VerifyNoCollidingTargets(flat_list) # Handle dependent settings of various types. 
for settings_type in ['all_dependent_settings', 'direct_dependent_settings', 'link_settings']: DoDependentSettings(settings_type, flat_list, targets, dependency_nodes) # Take out the dependent settings now that they've been published to all # of the targets that require them. for target in flat_list: if settings_type in targets[target]: del targets[target][settings_type] # Make sure static libraries don't declare dependencies on other static # libraries, but that linkables depend on all unlinked static libraries # that they need so that their link steps will be correct. gii = generator_input_info if gii['generator_wants_static_library_dependencies_adjusted']: AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes, gii['generator_wants_sorted_dependencies']) # Apply "post"/"late"/"target" variable expansions and condition evaluations. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ProcessVariablesAndConditionsInDict( target_dict, PHASE_LATE, variables, build_file) # Move everything that can go into a "configurations" section into one. for target in flat_list: target_dict = targets[target] SetUpConfigurations(target, target_dict) # Apply exclude (!) and regex (/) list filters. for target in flat_list: target_dict = targets[target] ProcessListFiltersInDict(target, target_dict) # Apply "latelate" variable expansions and condition evaluations. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ProcessVariablesAndConditionsInDict( target_dict, PHASE_LATELATE, variables, build_file) # Make sure that the rules make sense, and build up rule_sources lists as # needed. Not all generators will need to use the rule_sources lists, but # some may, and it seems best to build the list in a common spot. # Also validate actions and run_as elements in targets. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ValidateTargetType(target, target_dict) # TODO(thakis): Get vpx_scale/arm/scalesystemdependent.c to be renamed to # scalesystemdependent_arm_additions.c or similar. if 'arm' not in variables.get('target_arch', ''): ValidateSourcesInTarget(target, target_dict, build_file) ValidateRulesInTarget(target, target_dict, extra_sources_for_rules) ValidateRunAsInTarget(target, target_dict, build_file) ValidateActionsInTarget(target, target_dict, build_file) # Generators might not expect ints. Turn them into strs. TurnIntIntoStrInDict(data) # TODO(mark): Return |data| for now because the generator needs a list of # build files that came in. In the future, maybe it should just accept # a list, and not the whole data dict. 
return [flat_list, targets, data] �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/input_test.py�������000755 �000766 �000024 �00000006207 12455173731 032376� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Unit tests for the input.py file.""" import gyp.input import unittest import sys class TestFindCycles(unittest.TestCase): def setUp(self): self.nodes = {} for x in ('a', 'b', 'c', 'd', 'e'): self.nodes[x] = gyp.input.DependencyGraphNode(x) def _create_dependency(self, dependent, dependency): dependent.dependencies.append(dependency) dependency.dependents.append(dependent) def test_no_cycle_empty_graph(self): for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_no_cycle_line(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['d']) for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_no_cycle_dag(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['a'], self.nodes['c']) self._create_dependency(self.nodes['b'], self.nodes['c']) for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_cycle_self_reference(self): self._create_dependency(self.nodes['a'], self.nodes['a']) self.assertEquals([(self.nodes['a'], self.nodes['a'])], self.nodes['a'].FindCycles()) def test_cycle_two_nodes(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['a']) self.assertEquals([(self.nodes['a'], self.nodes['b'], self.nodes['a'])], self.nodes['a'].FindCycles()) self.assertEquals([(self.nodes['b'], self.nodes['a'], self.nodes['b'])], self.nodes['b'].FindCycles()) def test_two_cycles(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['a']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['b']) cycles = self.nodes['a'].FindCycles() self.assertTrue( (self.nodes['a'], self.nodes['b'], self.nodes['a']) in cycles) self.assertTrue( (self.nodes['b'], self.nodes['c'], self.nodes['b']) in cycles) self.assertEquals(2, len(cycles)) def test_big_cycle(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['d']) self._create_dependency(self.nodes['d'], self.nodes['e']) self._create_dependency(self.nodes['e'], self.nodes['a']) self.assertEquals([(self.nodes['a'], self.nodes['b'], self.nodes['c'], self.nodes['d'], 
self.nodes['e'], self.nodes['a'])], self.nodes['a'].FindCycles()) if __name__ == '__main__': unittest.main() �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py���������000755 �000766 �000024 �00000045555 12455173731 032006� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions to perform Xcode-style build steps. These functions are executed via gyp-mac-tool when using the Makefile generator. """ import fcntl import fnmatch import glob import json import os import plistlib import re import shutil import string import subprocess import sys import tempfile def main(args): executor = MacTool() exit_code = executor.Dispatch(args) if exit_code is not None: sys.exit(exit_code) class MacTool(object): """This class performs all the Mac tooling steps. The methods can either be executed directly, or dispatched from an argument list.""" def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) return getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like copy-info-plist to CopyInfoPlist""" return name_string.title().replace('-', '') def ExecCopyBundleResource(self, source, dest): """Copies a resource file to the bundle/Resources directory, performing any necessary compilation on each resource.""" extension = os.path.splitext(source)[1].lower() if os.path.isdir(source): # Copy tree. # TODO(thakis): This copies file attributes like mtime, while the # single-file branch below doesn't. This should probably be changed to # be consistent with the single-file branch. if os.path.exists(dest): shutil.rmtree(dest) shutil.copytree(source, dest) elif extension == '.xib': return self._CopyXIBFile(source, dest) elif extension == '.storyboard': return self._CopyXIBFile(source, dest) elif extension == '.strings': self._CopyStringsFile(source, dest) else: shutil.copy(source, dest) def _CopyXIBFile(self, source, dest): """Compiles a XIB file with ibtool into a binary plist in the bundle.""" # ibtool sometimes crashes with relative paths. See crbug.com/314728. 
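# Illustrative sketch (not part of gyp): how MacTool.Dispatch above maps a
# gyp-mac-tool command name onto an Exec* method via _CommandifyName.
#
#   MacTool()._CommandifyName('copy-bundle-resource')  # -> 'CopyBundleResource'
#   # so Dispatch(['copy-bundle-resource', source, dest]) ends up calling
#   # ExecCopyBundleResource(source, dest).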
base = os.path.dirname(os.path.realpath(__file__)) if os.path.relpath(source): source = os.path.join(base, source) if os.path.relpath(dest): dest = os.path.join(base, dest) args = ['xcrun', 'ibtool', '--errors', '--warnings', '--notices', '--output-format', 'human-readable-text', '--compile', dest, source] ibtool_section_re = re.compile(r'/\*.*\*/') ibtool_re = re.compile(r'.*note:.*is clipping its content') ibtoolout = subprocess.Popen(args, stdout=subprocess.PIPE) current_section_header = None for line in ibtoolout.stdout: if ibtool_section_re.match(line): current_section_header = line elif not ibtool_re.match(line): if current_section_header: sys.stdout.write(current_section_header) current_section_header = None sys.stdout.write(line) return ibtoolout.returncode def _CopyStringsFile(self, source, dest): """Copies a .strings file using iconv to reconvert the input into UTF-16.""" input_code = self._DetectInputEncoding(source) or "UTF-8" # Xcode's CpyCopyStringsFile / builtin-copyStrings seems to call # CFPropertyListCreateFromXMLData() behind the scenes; at least it prints # CFPropertyListCreateFromXMLData(): Old-style plist parser: missing # semicolon in dictionary. # on invalid files. Do the same kind of validation. import CoreFoundation s = open(source, 'rb').read() d = CoreFoundation.CFDataCreate(None, s, len(s)) _, error = CoreFoundation.CFPropertyListCreateFromXMLData(None, d, 0, None) if error: return fp = open(dest, 'wb') fp.write(s.decode(input_code).encode('UTF-16')) fp.close() def _DetectInputEncoding(self, file_name): """Reads the first few bytes from file_name and tries to guess the text encoding. Returns None as a guess if it can't detect it.""" fp = open(file_name, 'rb') try: header = fp.read(3) except e: fp.close() return None fp.close() if header.startswith("\xFE\xFF"): return "UTF-16" elif header.startswith("\xFF\xFE"): return "UTF-16" elif header.startswith("\xEF\xBB\xBF"): return "UTF-8" else: return None def ExecCopyInfoPlist(self, source, dest, *keys): """Copies the |source| Info.plist to the destination directory |dest|.""" # Read the source Info.plist into memory. fd = open(source, 'r') lines = fd.read() fd.close() # Insert synthesized key/value pairs (e.g. BuildMachineOSBuild). plist = plistlib.readPlistFromString(lines) if keys: plist = dict(plist.items() + json.loads(keys[0]).items()) lines = plistlib.writePlistToString(plist) # Go through all the environment variables and replace them as variables in # the file. IDENT_RE = re.compile('[/\s]') for key in os.environ: if key.startswith('_'): continue evar = '${%s}' % key evalue = os.environ[key] lines = string.replace(lines, evar, evalue) # Xcode supports various suffices on environment variables, which are # all undocumented. :rfc1034identifier is used in the standard project # template these days, and :identifier was used earlier. They are used to # convert non-url characters into things that look like valid urls -- # except that the replacement character for :identifier, '_' isn't valid # in a URL either -- oops, hence :rfc1034identifier was born. evar = '${%s:identifier}' % key evalue = IDENT_RE.sub('_', os.environ[key]) lines = string.replace(lines, evar, evalue) evar = '${%s:rfc1034identifier}' % key evalue = IDENT_RE.sub('-', os.environ[key]) lines = string.replace(lines, evar, evalue) # Remove any keys with values that haven't been replaced. 
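# Illustrative sketch (not part of gyp): the ${VAR} expansion performed by
# ExecCopyInfoPlist above, using a hypothetical PRODUCT_NAME value.  The
# :identifier and :rfc1034identifier suffixes replace '/' and whitespace
# with '_' and '-' respectively.
#
#   os.environ['PRODUCT_NAME'] = 'My App/Beta'
#   # '${PRODUCT_NAME}'                   -> 'My App/Beta'
#   # '${PRODUCT_NAME:identifier}'        -> 'My_App_Beta'
#   # '${PRODUCT_NAME:rfc1034identifier}' -> 'My-App-Beta'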
lines = lines.split('\n') for i in range(len(lines)): if lines[i].strip().startswith("<string>${"): lines[i] = None lines[i - 1] = None lines = '\n'.join(filter(lambda x: x is not None, lines)) # Write out the file with variables replaced. fd = open(dest, 'w') fd.write(lines) fd.close() # Now write out PkgInfo file now that the Info.plist file has been # "compiled". self._WritePkgInfo(dest) def _WritePkgInfo(self, info_plist): """This writes the PkgInfo file from the data stored in Info.plist.""" plist = plistlib.readPlist(info_plist) if not plist: return # Only create PkgInfo for executable types. package_type = plist['CFBundlePackageType'] if package_type != 'APPL': return # The format of PkgInfo is eight characters, representing the bundle type # and bundle signature, each four characters. If that is missing, four # '?' characters are used instead. signature_code = plist.get('CFBundleSignature', '????') if len(signature_code) != 4: # Wrong length resets everything, too. signature_code = '?' * 4 dest = os.path.join(os.path.dirname(info_plist), 'PkgInfo') fp = open(dest, 'w') fp.write('%s%s' % (package_type, signature_code)) fp.close() def ExecFlock(self, lockfile, *cmd_list): """Emulates the most basic behavior of Linux's flock(1).""" # Rely on exception handling to report errors. fd = os.open(lockfile, os.O_RDONLY|os.O_NOCTTY|os.O_CREAT, 0o666) fcntl.flock(fd, fcntl.LOCK_EX) return subprocess.call(cmd_list) def ExecFilterLibtool(self, *cmd_list): """Calls libtool and filters out '/path/to/libtool: file: foo.o has no symbols'.""" libtool_re = re.compile(r'^.*libtool: file: .* has no symbols$') libtoolout = subprocess.Popen(cmd_list, stderr=subprocess.PIPE) _, err = libtoolout.communicate() for line in err.splitlines(): if not libtool_re.match(line): print >>sys.stderr, line return libtoolout.returncode def ExecPackageFramework(self, framework, version): """Takes a path to Something.framework and the Current version of that and sets up all the symlinks.""" # Find the name of the binary based on the part before the ".framework". binary = os.path.basename(framework).split('.')[0] CURRENT = 'Current' RESOURCES = 'Resources' VERSIONS = 'Versions' if not os.path.exists(os.path.join(framework, VERSIONS, version, binary)): # Binary-less frameworks don't seem to contain symlinks (see e.g. # chromium's out/Debug/org.chromium.Chromium.manifest/ bundle). return # Move into the framework directory to set the symlinks correctly. pwd = os.getcwd() os.chdir(framework) # Set up the Current version. self._Relink(version, os.path.join(VERSIONS, CURRENT)) # Set up the root symlinks. self._Relink(os.path.join(VERSIONS, CURRENT, binary), binary) self._Relink(os.path.join(VERSIONS, CURRENT, RESOURCES), RESOURCES) # Back to where we were before! os.chdir(pwd) def _Relink(self, dest, link): """Creates a symlink to |dest| named |link|. If |link| already exists, it is overwritten.""" if os.path.lexists(link): os.remove(link) os.symlink(dest, link) def ExecCodeSignBundle(self, key, resource_rules, entitlements, provisioning): """Code sign a bundle. This function tries to code sign an iOS bundle, following the same algorithm as Xcode: 1. copy ResourceRules.plist from the user or the SDK into the bundle, 2. pick the provisioning profile that best match the bundle identifier, and copy it into the bundle as embedded.mobileprovision, 3. copy Entitlements.plist from user or SDK next to the bundle, 4. code sign the bundle. 
""" resource_rules_path = self._InstallResourceRules(resource_rules) substitutions, overrides = self._InstallProvisioningProfile( provisioning, self._GetCFBundleIdentifier()) entitlements_path = self._InstallEntitlements( entitlements, substitutions, overrides) subprocess.check_call([ 'codesign', '--force', '--sign', key, '--resource-rules', resource_rules_path, '--entitlements', entitlements_path, os.path.join( os.environ['TARGET_BUILD_DIR'], os.environ['FULL_PRODUCT_NAME'])]) def _InstallResourceRules(self, resource_rules): """Installs ResourceRules.plist from user or SDK into the bundle. Args: resource_rules: string, optional, path to the ResourceRules.plist file to use, default to "${SDKROOT}/ResourceRules.plist" Returns: Path to the copy of ResourceRules.plist into the bundle. """ source_path = resource_rules target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['CONTENTS_FOLDER_PATH'], 'ResourceRules.plist') if not source_path: source_path = os.path.join( os.environ['SDKROOT'], 'ResourceRules.plist') shutil.copy2(source_path, target_path) return target_path def _InstallProvisioningProfile(self, profile, bundle_identifier): """Installs embedded.mobileprovision into the bundle. Args: profile: string, optional, short name of the .mobileprovision file to use, if empty or the file is missing, the best file installed will be used bundle_identifier: string, value of CFBundleIdentifier from Info.plist Returns: A tuple containing two dictionary: variables substitutions and values to overrides when generating the entitlements file. """ source_path, provisioning_data, team_id = self._FindProvisioningProfile( profile, bundle_identifier) target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['CONTENTS_FOLDER_PATH'], 'embedded.mobileprovision') shutil.copy2(source_path, target_path) substitutions = self._GetSubstitutions(bundle_identifier, team_id + '.') return substitutions, provisioning_data['Entitlements'] def _FindProvisioningProfile(self, profile, bundle_identifier): """Finds the .mobileprovision file to use for signing the bundle. Checks all the installed provisioning profiles (or if the user specified the PROVISIONING_PROFILE variable, only consult it) and select the most specific that correspond to the bundle identifier. Args: profile: string, optional, short name of the .mobileprovision file to use, if empty or the file is missing, the best file installed will be used bundle_identifier: string, value of CFBundleIdentifier from Info.plist Returns: A tuple of the path to the selected provisioning profile, the data of the embedded plist in the provisioning profile and the team identifier to use for code signing. Raises: SystemExit: if no .mobileprovision can be used to sign the bundle. 
""" profiles_dir = os.path.join( os.environ['HOME'], 'Library', 'MobileDevice', 'Provisioning Profiles') if not os.path.isdir(profiles_dir): print >>sys.stderr, ( 'cannot find mobile provisioning for %s' % bundle_identifier) sys.exit(1) provisioning_profiles = None if profile: profile_path = os.path.join(profiles_dir, profile + '.mobileprovision') if os.path.exists(profile_path): provisioning_profiles = [profile_path] if not provisioning_profiles: provisioning_profiles = glob.glob( os.path.join(profiles_dir, '*.mobileprovision')) valid_provisioning_profiles = {} for profile_path in provisioning_profiles: profile_data = self._LoadProvisioningProfile(profile_path) app_id_pattern = profile_data.get( 'Entitlements', {}).get('application-identifier', '') for team_identifier in profile_data.get('TeamIdentifier', []): app_id = '%s.%s' % (team_identifier, bundle_identifier) if fnmatch.fnmatch(app_id, app_id_pattern): valid_provisioning_profiles[app_id_pattern] = ( profile_path, profile_data, team_identifier) if not valid_provisioning_profiles: print >>sys.stderr, ( 'cannot find mobile provisioning for %s' % bundle_identifier) sys.exit(1) # If the user has multiple provisioning profiles installed that can be # used for ${bundle_identifier}, pick the most specific one (ie. the # provisioning profile whose pattern is the longest). selected_key = max(valid_provisioning_profiles, key=lambda v: len(v)) return valid_provisioning_profiles[selected_key] def _LoadProvisioningProfile(self, profile_path): """Extracts the plist embedded in a provisioning profile. Args: profile_path: string, path to the .mobileprovision file Returns: Content of the plist embedded in the provisioning profile as a dictionary. """ with tempfile.NamedTemporaryFile() as temp: subprocess.check_call([ 'security', 'cms', '-D', '-i', profile_path, '-o', temp.name]) return self._LoadPlistMaybeBinary(temp.name) def _LoadPlistMaybeBinary(self, plist_path): """Loads into a memory a plist possibly encoded in binary format. This is a wrapper around plistlib.readPlist that tries to convert the plist to the XML format if it can't be parsed (assuming that it is in the binary format). Args: plist_path: string, path to a plist file, in XML or binary format Returns: Content of the plist as a dictionary. """ try: # First, try to read the file using plistlib that only supports XML, # and if an exception is raised, convert a temporary copy to XML and # load that copy. return plistlib.readPlist(plist_path) except: pass with tempfile.NamedTemporaryFile() as temp: shutil.copy2(plist_path, temp.name) subprocess.check_call(['plutil', '-convert', 'xml1', temp.name]) return plistlib.readPlist(temp.name) def _GetSubstitutions(self, bundle_identifier, app_identifier_prefix): """Constructs a dictionary of variable substitutions for Entitlements.plist. Args: bundle_identifier: string, value of CFBundleIdentifier from Info.plist app_identifier_prefix: string, value for AppIdentifierPrefix Returns: Dictionary of substitutions to apply when generating Entitlements.plist. """ return { 'CFBundleIdentifier': bundle_identifier, 'AppIdentifierPrefix': app_identifier_prefix, } def _GetCFBundleIdentifier(self): """Extracts CFBundleIdentifier value from Info.plist in the bundle. Returns: Value of CFBundleIdentifier in the Info.plist located in the bundle. 
""" info_plist_path = os.path.join( os.environ['TARGET_BUILD_DIR'], os.environ['INFOPLIST_PATH']) info_plist_data = self._LoadPlistMaybeBinary(info_plist_path) return info_plist_data['CFBundleIdentifier'] def _InstallEntitlements(self, entitlements, substitutions, overrides): """Generates and install the ${BundleName}.xcent entitlements file. Expands variables "$(variable)" pattern in the source entitlements file, add extra entitlements defined in the .mobileprovision file and the copy the generated plist to "${BundlePath}.xcent". Args: entitlements: string, optional, path to the Entitlements.plist template to use, defaults to "${SDKROOT}/Entitlements.plist" substitutions: dictionary, variable substitutions overrides: dictionary, values to add to the entitlements Returns: Path to the generated entitlements file. """ source_path = entitlements target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['PRODUCT_NAME'] + '.xcent') if not source_path: source_path = os.path.join( os.environ['SDKROOT'], 'Entitlements.plist') shutil.copy2(source_path, target_path) data = self._LoadPlistMaybeBinary(target_path) data = self._ExpandVariables(data, substitutions) if overrides: for key in overrides: if key not in data: data[key] = overrides[key] plistlib.writePlist(data, target_path) return target_path def _ExpandVariables(self, data, substitutions): """Expands variables "$(variable)" in data. Args: data: object, can be either string, list or dictionary substitutions: dictionary, variable substitutions to perform Returns: Copy of data where each references to "$(variable)" has been replaced by the corresponding value found in substitutions, or left intact if the key was not found. """ if isinstance(data, str): for key, value in substitutions.iteritems(): data = data.replace('$(%s)' % key, value) return data if isinstance(data, list): return [self._ExpandVariables(v, substitutions) for v in data] if isinstance(data, dict): return dict((k, self._ExpandVariables(data[k], substitutions)) for k in data) return data if __name__ == '__main__': sys.exit(main(sys.argv[1:])) ���������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py���000644 �000766 �000024 �00000123217 12455173731 033243� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ This module helps emulate Visual Studio 2008 behavior on top of other build systems, primarily ninja. """ import os import re import subprocess import sys import gyp.MSVSVersion windows_quoter_regex = re.compile(r'(\\*)"') def QuoteForRspFile(arg): """Quote a command line argument so that it appears as one argument when processed via cmd.exe and parsed by CommandLineToArgvW (as is typical for Windows programs).""" # See http://goo.gl/cuFbX and http://goo.gl/dhPnp including the comment # threads. This is actually the quoting rules for CommandLineToArgvW, not # for the shell, because the shell doesn't do anything in Windows. 
This # works more or less because most programs (including the compiler, etc.) # use that function to handle command line arguments. # For a literal quote, CommandLineToArgvW requires 2n+1 backslashes # preceding it, and results in n backslashes + the quote. So we substitute # in 2* what we match, +1 more, plus the quote. arg = windows_quoter_regex.sub(lambda mo: 2 * mo.group(1) + '\\"', arg) # %'s also need to be doubled otherwise they're interpreted as batch # positional arguments. Also make sure to escape the % so that they're # passed literally through escaping so they can be singled to just the # original %. Otherwise, trying to pass the literal representation that # looks like an environment variable to the shell (e.g. %PATH%) would fail. arg = arg.replace('%', '%%') # These commands are used in rsp files, so no escaping for the shell (via ^) # is necessary. # Finally, wrap the whole thing in quotes so that the above quote rule # applies and whitespace isn't a word break. return '"' + arg + '"' def EncodeRspFileList(args): """Process a list of arguments using QuoteCmdExeArgument.""" # Note that the first argument is assumed to be the command. Don't add # quotes around it because then built-ins like 'echo', etc. won't work. # Take care to normpath only the path in the case of 'call ../x.bat' because # otherwise the whole thing is incorrectly interpreted as a path and not # normalized correctly. if not args: return '' if args[0].startswith('call '): call, program = args[0].split(' ', 1) program = call + ' ' + os.path.normpath(program) else: program = os.path.normpath(args[0]) return program + ' ' + ' '.join(QuoteForRspFile(arg) for arg in args[1:]) def _GenericRetrieve(root, default, path): """Given a list of dictionary keys |path| and a tree of dicts |root|, find value at path, or return |default| if any of the path doesn't exist.""" if not root: return default if not path: return root return _GenericRetrieve(root.get(path[0]), default, path[1:]) def _AddPrefix(element, prefix): """Add |prefix| to |element| or each subelement if element is iterable.""" if element is None: return element # Note, not Iterable because we don't want to handle strings like that. if isinstance(element, list) or isinstance(element, tuple): return [prefix + e for e in element] else: return prefix + element def _DoRemapping(element, map): """If |element| then remap it through |map|. If |element| is iterable then each item will be remapped. Any elements not found will be removed.""" if map is not None and element is not None: if not callable(map): map = map.get # Assume it's a dict, otherwise a callable to do the remap. if isinstance(element, list) or isinstance(element, tuple): element = filter(None, [map(elem) for elem in element]) else: element = map(element) return element def _AppendOrReturn(append, element): """If |append| is None, simply return |element|. If |append| is not None, then add |element| to it, adding each item in |element| if it's a list or tuple.""" if append is not None and element is not None: if isinstance(element, list) or isinstance(element, tuple): append.extend(element) else: append.append(element) else: return element def _FindDirectXInstallation(): """Try to find an installation location for the DirectX SDK. Check for the standard environment variable, and if that doesn't exist, try to find via the registry. 
May return None if not found in either location.""" # Return previously calculated value, if there is one if hasattr(_FindDirectXInstallation, 'dxsdk_dir'): return _FindDirectXInstallation.dxsdk_dir dxsdk_dir = os.environ.get('DXSDK_DIR') if not dxsdk_dir: # Setup params to pass to and attempt to launch reg.exe. cmd = ['reg.exe', 'query', r'HKLM\Software\Microsoft\DirectX', '/s'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) for line in p.communicate()[0].splitlines(): if 'InstallPath' in line: dxsdk_dir = line.split(' ')[3] + "\\" # Cache return value _FindDirectXInstallation.dxsdk_dir = dxsdk_dir return dxsdk_dir class MsvsSettings(object): """A class that understands the gyp 'msvs_...' values (especially the msvs_settings field). They largely correpond to the VS2008 IDE DOM. This class helps map those settings to command line options.""" def __init__(self, spec, generator_flags): self.spec = spec self.vs_version = GetVSVersion(generator_flags) self.dxsdk_dir = _FindDirectXInstallation() # Try to find an installation location for the Windows DDK by checking # the WDK_DIR environment variable, may be None. self.wdk_dir = os.environ.get('WDK_DIR') supported_fields = [ ('msvs_configuration_attributes', dict), ('msvs_settings', dict), ('msvs_system_include_dirs', list), ('msvs_disabled_warnings', list), ('msvs_precompiled_header', str), ('msvs_precompiled_source', str), ('msvs_configuration_platform', str), ('msvs_target_platform', str), ] configs = spec['configurations'] for field, default in supported_fields: setattr(self, field, {}) for configname, config in configs.iteritems(): getattr(self, field)[configname] = config.get(field, default()) self.msvs_cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.']) def GetVSMacroEnv(self, base_to_build=None, config=None): """Get a dict of variables mapping internal VS macro names to their gyp equivalents.""" target_platform = 'Win32' if self.GetArch(config) == 'x86' else 'x64' target_name = self.spec.get('product_prefix', '') + \ self.spec.get('product_name', self.spec['target_name']) target_dir = base_to_build + '\\' if base_to_build else '' replacements = { '$(OutDir)\\': target_dir, '$(TargetDir)\\': target_dir, '$(IntDir)': '$!INTERMEDIATE_DIR', '$(InputPath)': '${source}', '$(InputName)': '${root}', '$(ProjectName)': self.spec['target_name'], '$(TargetName)': target_name, '$(PlatformName)': target_platform, '$(ProjectDir)\\': '', } # '$(VSInstallDir)' and '$(VCInstallDir)' are available when and only when # Visual Studio is actually installed. if self.vs_version.Path(): replacements['$(VSInstallDir)'] = self.vs_version.Path() replacements['$(VCInstallDir)'] = os.path.join(self.vs_version.Path(), 'VC') + '\\' # Chromium uses DXSDK_DIR in include/lib paths, but it may or may not be # set. This happens when the SDK is sync'd via src-internal, rather than # by typical end-user installation of the SDK. If it's not set, we don't # want to leave the unexpanded variable in the path, so simply strip it. 
replacements['$(DXSDK_DIR)'] = self.dxsdk_dir if self.dxsdk_dir else '' replacements['$(WDK_DIR)'] = self.wdk_dir if self.wdk_dir else '' return replacements def ConvertVSMacros(self, s, base_to_build=None, config=None): """Convert from VS macro names to something equivalent.""" env = self.GetVSMacroEnv(base_to_build, config=config) return ExpandMacros(s, env) def AdjustLibraries(self, libraries): """Strip -l from library if it's specified with that.""" libs = [lib[2:] if lib.startswith('-l') else lib for lib in libraries] return [lib + '.lib' if not lib.endswith('.lib') else lib for lib in libs] def _GetAndMunge(self, field, path, default, prefix, append, map): """Retrieve a value from |field| at |path| or return |default|. If |append| is specified, and the item is found, it will be appended to that object instead of returned. If |map| is specified, results will be remapped through |map| before being returned or appended.""" result = _GenericRetrieve(field, default, path) result = _DoRemapping(result, map) result = _AddPrefix(result, prefix) return _AppendOrReturn(append, result) class _GetWrapper(object): def __init__(self, parent, field, base_path, append=None): self.parent = parent self.field = field self.base_path = [base_path] self.append = append def __call__(self, name, map=None, prefix='', default=None): return self.parent._GetAndMunge(self.field, self.base_path + [name], default=default, prefix=prefix, append=self.append, map=map) def GetArch(self, config): """Get architecture based on msvs_configuration_platform and msvs_target_platform. Returns either 'x86' or 'x64'.""" configuration_platform = self.msvs_configuration_platform.get(config, '') platform = self.msvs_target_platform.get(config, '') if not platform: # If no specific override, use the configuration's. platform = configuration_platform # Map from platform to architecture. return {'Win32': 'x86', 'x64': 'x64'}.get(platform, 'x86') def _TargetConfig(self, config): """Returns the target-specific configuration.""" # There's two levels of architecture/platform specification in VS. The # first level is globally for the configuration (this is what we consider # "the" config at the gyp level, which will be something like 'Debug' or # 'Release_x64'), and a second target-specific configuration, which is an # override for the global one. |config| is remapped here to take into # account the local target-specific overrides to the global configuration. 
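# Illustrative sketch (not part of gyp): the config remapping described in
# the comment above, for a hypothetical target.
#
#   # GetArch(config) == 'x64', config == 'Debug'     -> 'Debug_x64'
#   # GetArch(config) == 'x86', config == 'Debug_x64' -> 'Debug'
#   # GetArch(config) == 'x64', config == 'Debug_x64' -> 'Debug_x64'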
arch = self.GetArch(config) if arch == 'x64' and not config.endswith('_x64'): config += '_x64' if arch == 'x86' and config.endswith('_x64'): config = config.rsplit('_', 1)[0] return config def _Setting(self, path, config, default=None, prefix='', append=None, map=None): """_GetAndMunge for msvs_settings.""" return self._GetAndMunge( self.msvs_settings[config], path, default, prefix, append, map) def _ConfigAttrib(self, path, config, default=None, prefix='', append=None, map=None): """_GetAndMunge for msvs_configuration_attributes.""" return self._GetAndMunge( self.msvs_configuration_attributes[config], path, default, prefix, append, map) def AdjustIncludeDirs(self, include_dirs, config): """Updates include_dirs to expand VS specific paths, and adds the system include dirs used for platform SDK and similar.""" config = self._TargetConfig(config) includes = include_dirs + self.msvs_system_include_dirs[config] includes.extend(self._Setting( ('VCCLCompilerTool', 'AdditionalIncludeDirectories'), config, default=[])) return [self.ConvertVSMacros(p, config=config) for p in includes] def GetComputedDefines(self, config): """Returns the set of defines that are injected to the defines list based on other VS settings.""" config = self._TargetConfig(config) defines = [] if self._ConfigAttrib(['CharacterSet'], config) == '1': defines.extend(('_UNICODE', 'UNICODE')) if self._ConfigAttrib(['CharacterSet'], config) == '2': defines.append('_MBCS') defines.extend(self._Setting( ('VCCLCompilerTool', 'PreprocessorDefinitions'), config, default=[])) return defines def GetCompilerPdbName(self, config, expand_special): """Get the pdb file name that should be used for compiler invocations, or None if there's no explicit name specified.""" config = self._TargetConfig(config) pdbname = self._Setting( ('VCCLCompilerTool', 'ProgramDataBaseFileName'), config) if pdbname: pdbname = expand_special(self.ConvertVSMacros(pdbname)) return pdbname def GetMapFileName(self, config, expand_special): """Gets the explicitly overriden map file name for a target or returns None if it's not set.""" config = self._TargetConfig(config) map_file = self._Setting(('VCLinkerTool', 'MapFileName'), config) if map_file: map_file = expand_special(self.ConvertVSMacros(map_file, config=config)) return map_file def GetOutputName(self, config, expand_special): """Gets the explicitly overridden output name for a target or returns None if it's not overridden.""" config = self._TargetConfig(config) type = self.spec['type'] root = 'VCLibrarianTool' if type == 'static_library' else 'VCLinkerTool' # TODO(scottmg): Handle OutputDirectory without OutputFile. 
output_file = self._Setting((root, 'OutputFile'), config) if output_file: output_file = expand_special(self.ConvertVSMacros( output_file, config=config)) return output_file def GetPDBName(self, config, expand_special, default): """Gets the explicitly overridden pdb name for a target or returns default if it's not overridden, or if no pdb will be generated.""" config = self._TargetConfig(config) output_file = self._Setting(('VCLinkerTool', 'ProgramDatabaseFile'), config) generate_debug_info = self._Setting( ('VCLinkerTool', 'GenerateDebugInformation'), config) if generate_debug_info: if output_file: return expand_special(self.ConvertVSMacros(output_file, config=config)) else: return default else: return None def GetCflags(self, config): """Returns the flags that need to be added to .c and .cc compilations.""" config = self._TargetConfig(config) cflags = [] cflags.extend(['/wd' + w for w in self.msvs_disabled_warnings[config]]) cl = self._GetWrapper(self, self.msvs_settings[config], 'VCCLCompilerTool', append=cflags) cl('Optimization', map={'0': 'd', '1': '1', '2': '2', '3': 'x'}, prefix='/O', default='2') cl('InlineFunctionExpansion', prefix='/Ob') cl('DisableSpecificWarnings', prefix='/wd') cl('StringPooling', map={'true': '/GF'}) cl('EnableFiberSafeOptimizations', map={'true': '/GT'}) cl('OmitFramePointers', map={'false': '-', 'true': ''}, prefix='/Oy') cl('EnableIntrinsicFunctions', map={'false': '-', 'true': ''}, prefix='/Oi') cl('FavorSizeOrSpeed', map={'1': 't', '2': 's'}, prefix='/O') cl('WholeProgramOptimization', map={'true': '/GL'}) cl('WarningLevel', prefix='/W') cl('WarnAsError', map={'true': '/WX'}) cl('DebugInformationFormat', map={'1': '7', '3': 'i', '4': 'I'}, prefix='/Z') cl('RuntimeTypeInfo', map={'true': '/GR', 'false': '/GR-'}) cl('EnableFunctionLevelLinking', map={'true': '/Gy', 'false': '/Gy-'}) cl('MinimalRebuild', map={'true': '/Gm'}) cl('BufferSecurityCheck', map={'true': '/GS', 'false': '/GS-'}) cl('BasicRuntimeChecks', map={'1': 's', '2': 'u', '3': '1'}, prefix='/RTC') cl('RuntimeLibrary', map={'0': 'T', '1': 'Td', '2': 'D', '3': 'Dd'}, prefix='/M') cl('ExceptionHandling', map={'1': 'sc','2': 'a'}, prefix='/EH') cl('DefaultCharIsUnsigned', map={'true': '/J'}) cl('TreatWChar_tAsBuiltInType', map={'false': '-', 'true': ''}, prefix='/Zc:wchar_t') cl('EnablePREfast', map={'true': '/analyze'}) cl('AdditionalOptions', prefix='') cflags.extend(['/FI' + f for f in self._Setting( ('VCCLCompilerTool', 'ForcedIncludeFiles'), config, default=[])]) if self.vs_version.short_name in ('2013', '2013e'): # New flag required in 2013 to maintain previous PDB behavior. cflags.append('/FS') # ninja handles parallelism by itself, don't have the compiler do it too. cflags = filter(lambda x: not x.startswith('/MP'), cflags) return cflags def GetPrecompiledHeader(self, config, gyp_to_build_path): """Returns an object that handles the generation of precompiled header build steps.""" config = self._TargetConfig(config) return _PchHelper(self, config, gyp_to_build_path) def _GetPchFlags(self, config, extension): """Get the flags to be added to the cflags for precompiled header support. """ config = self._TargetConfig(config) # The PCH is only built once by a particular source file. Usage of PCH must # only be for the same language (i.e. C vs. C++), so only include the pch # flags when the language matches. 
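# Illustrative sketch (not part of gyp): how the cl(...) calls in GetCflags
# above translate hypothetical VCCLCompilerTool settings into cl.exe flags.
#
#   'VCCLCompilerTool': {
#       'Optimization': '0',     # map '0' -> 'd', prefix '/O'   -> /Od
#       'RuntimeLibrary': '1',   # map '1' -> 'Td', prefix '/M'  -> /MTd
#       'WarningLevel': '4',     # prefix '/W'                   -> /W4
#       'WarnAsError': 'true',   # map 'true' -> '/WX'           -> /WX
#   }
#   # GetCflags('Debug') would then include '/Od', '/MTd', '/W4' and '/WX'.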
if self.msvs_precompiled_header[config]: source_ext = os.path.splitext(self.msvs_precompiled_source[config])[1] if _LanguageMatchesForPch(source_ext, extension): pch = os.path.split(self.msvs_precompiled_header[config])[1] return ['/Yu' + pch, '/FI' + pch, '/Fp${pchprefix}.' + pch + '.pch'] return [] def GetCflagsC(self, config): """Returns the flags that need to be added to .c compilations.""" config = self._TargetConfig(config) return self._GetPchFlags(config, '.c') def GetCflagsCC(self, config): """Returns the flags that need to be added to .cc compilations.""" config = self._TargetConfig(config) return ['/TP'] + self._GetPchFlags(config, '.cc') def _GetAdditionalLibraryDirectories(self, root, config, gyp_to_build_path): """Get and normalize the list of paths in AdditionalLibraryDirectories setting.""" config = self._TargetConfig(config) libpaths = self._Setting((root, 'AdditionalLibraryDirectories'), config, default=[]) libpaths = [os.path.normpath( gyp_to_build_path(self.ConvertVSMacros(p, config=config))) for p in libpaths] return ['/LIBPATH:"' + p + '"' for p in libpaths] def GetLibFlags(self, config, gyp_to_build_path): """Returns the flags that need to be added to lib commands.""" config = self._TargetConfig(config) libflags = [] lib = self._GetWrapper(self, self.msvs_settings[config], 'VCLibrarianTool', append=libflags) libflags.extend(self._GetAdditionalLibraryDirectories( 'VCLibrarianTool', config, gyp_to_build_path)) lib('LinkTimeCodeGeneration', map={'true': '/LTCG'}) lib('TargetMachine', map={'1': 'X86', '17': 'X64'}, prefix='/MACHINE:') lib('AdditionalOptions') return libflags def GetDefFile(self, gyp_to_build_path): """Returns the .def file from sources, if any. Otherwise returns None.""" spec = self.spec if spec['type'] in ('shared_library', 'loadable_module', 'executable'): def_files = [s for s in spec.get('sources', []) if s.endswith('.def')] if len(def_files) == 1: return gyp_to_build_path(def_files[0]) elif len(def_files) > 1: raise Exception("Multiple .def files") return None def _GetDefFileAsLdflags(self, ldflags, gyp_to_build_path): """.def files get implicitly converted to a ModuleDefinitionFile for the linker in the VS generator. 
Emulate that behaviour here.""" def_file = self.GetDefFile(gyp_to_build_path) if def_file: ldflags.append('/DEF:"%s"' % def_file) def GetPGDName(self, config, expand_special): """Gets the explicitly overridden pgd name for a target or returns None if it's not overridden.""" config = self._TargetConfig(config) output_file = self._Setting( ('VCLinkerTool', 'ProfileGuidedDatabase'), config) if output_file: output_file = expand_special(self.ConvertVSMacros( output_file, config=config)) return output_file def GetLdflags(self, config, gyp_to_build_path, expand_special, manifest_base_name, output_name, is_executable, build_dir): """Returns the flags that need to be added to link commands, and the manifest files.""" config = self._TargetConfig(config) ldflags = [] ld = self._GetWrapper(self, self.msvs_settings[config], 'VCLinkerTool', append=ldflags) self._GetDefFileAsLdflags(ldflags, gyp_to_build_path) ld('GenerateDebugInformation', map={'true': '/DEBUG'}) ld('TargetMachine', map={'1': 'X86', '17': 'X64'}, prefix='/MACHINE:') ldflags.extend(self._GetAdditionalLibraryDirectories( 'VCLinkerTool', config, gyp_to_build_path)) ld('DelayLoadDLLs', prefix='/DELAYLOAD:') ld('TreatLinkerWarningAsErrors', prefix='/WX', map={'true': '', 'false': ':NO'}) out = self.GetOutputName(config, expand_special) if out: ldflags.append('/OUT:' + out) pdb = self.GetPDBName(config, expand_special, output_name + '.pdb') if pdb: ldflags.append('/PDB:' + pdb) pgd = self.GetPGDName(config, expand_special) if pgd: ldflags.append('/PGD:' + pgd) map_file = self.GetMapFileName(config, expand_special) ld('GenerateMapFile', map={'true': '/MAP:' + map_file if map_file else '/MAP'}) ld('MapExports', map={'true': '/MAPINFO:EXPORTS'}) ld('AdditionalOptions', prefix='') minimum_required_version = self._Setting( ('VCLinkerTool', 'MinimumRequiredVersion'), config, default='') if minimum_required_version: minimum_required_version = ',' + minimum_required_version ld('SubSystem', map={'1': 'CONSOLE%s' % minimum_required_version, '2': 'WINDOWS%s' % minimum_required_version}, prefix='/SUBSYSTEM:') ld('TerminalServerAware', map={'1': ':NO', '2': ''}, prefix='/TSAWARE') ld('LinkIncremental', map={'1': ':NO', '2': ''}, prefix='/INCREMENTAL') ld('BaseAddress', prefix='/BASE:') ld('FixedBaseAddress', map={'1': ':NO', '2': ''}, prefix='/FIXED') ld('RandomizedBaseAddress', map={'1': ':NO', '2': ''}, prefix='/DYNAMICBASE') ld('DataExecutionPrevention', map={'1': ':NO', '2': ''}, prefix='/NXCOMPAT') ld('OptimizeReferences', map={'1': 'NOREF', '2': 'REF'}, prefix='/OPT:') ld('ForceSymbolReferences', prefix='/INCLUDE:') ld('EnableCOMDATFolding', map={'1': 'NOICF', '2': 'ICF'}, prefix='/OPT:') ld('LinkTimeCodeGeneration', map={'1': '', '2': ':PGINSTRUMENT', '3': ':PGOPTIMIZE', '4': ':PGUPDATE'}, prefix='/LTCG') ld('IgnoreDefaultLibraryNames', prefix='/NODEFAULTLIB:') ld('ResourceOnlyDLL', map={'true': '/NOENTRY'}) ld('EntryPointSymbol', prefix='/ENTRY:') ld('Profile', map={'true': '/PROFILE'}) ld('LargeAddressAware', map={'1': ':NO', '2': ''}, prefix='/LARGEADDRESSAWARE') # TODO(scottmg): This should sort of be somewhere else (not really a flag). ld('AdditionalDependencies', prefix='') # If the base address is not specifically controlled, DYNAMICBASE should # be on by default. base_flags = filter(lambda x: 'DYNAMICBASE' in x or x == '/FIXED', ldflags) if not base_flags: ldflags.append('/DYNAMICBASE') # If the NXCOMPAT flag has not been specified, default to on. 
Despite the # documentation that says this only defaults to on when the subsystem is # Vista or greater (which applies to the linker), the IDE defaults it on # unless it's explicitly off. if not filter(lambda x: 'NXCOMPAT' in x, ldflags): ldflags.append('/NXCOMPAT') have_def_file = filter(lambda x: x.startswith('/DEF:'), ldflags) manifest_flags, intermediate_manifest, manifest_files = \ self._GetLdManifestFlags(config, manifest_base_name, gyp_to_build_path, is_executable and not have_def_file, build_dir) ldflags.extend(manifest_flags) return ldflags, intermediate_manifest, manifest_files def _GetLdManifestFlags(self, config, name, gyp_to_build_path, allow_isolation, build_dir): """Returns a 3-tuple: - the set of flags that need to be added to the link to generate a default manifest - the intermediate manifest that the linker will generate that should be used to assert it doesn't add anything to the merged one. - the list of all the manifest files to be merged by the manifest tool and included into the link.""" generate_manifest = self._Setting(('VCLinkerTool', 'GenerateManifest'), config, default='true') if generate_manifest != 'true': # This means not only that the linker should not generate the intermediate # manifest but also that the manifest tool should do nothing even when # additional manifests are specified. return ['/MANIFEST:NO'], [], [] output_name = name + '.intermediate.manifest' flags = [ '/MANIFEST', '/ManifestFile:' + output_name, ] # Instead of using the MANIFESTUAC flags, we generate a .manifest to # include into the list of manifests. This allows us to avoid the need to # do two passes during linking. The /MANIFEST flag and /ManifestFile are # still used, and the intermediate manifest is used to assert that the # final manifest we get from merging all the additional manifest files # (plus the one we generate here) isn't modified by merging the # intermediate into it. # Always NO, because we generate a manifest file that has what we want. flags.append('/MANIFESTUAC:NO') config = self._TargetConfig(config) enable_uac = self._Setting(('VCLinkerTool', 'EnableUAC'), config, default='true') manifest_files = [] generated_manifest_outer = \ "<?xml version='1.0' encoding='UTF-8' standalone='yes'?>" \ "<assembly xmlns='urn:schemas-microsoft-com:asm.v1' manifestVersion='1.0'>%s" \ "</assembly>" if enable_uac == 'true': execution_level = self._Setting(('VCLinkerTool', 'UACExecutionLevel'), config, default='0') execution_level_map = { '0': 'asInvoker', '1': 'highestAvailable', '2': 'requireAdministrator' } ui_access = self._Setting(('VCLinkerTool', 'UACUIAccess'), config, default='false') inner = ''' <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3"> <security> <requestedPrivileges> <requestedExecutionLevel level='%s' uiAccess='%s' /> </requestedPrivileges> </security> </trustInfo>''' % (execution_level_map[execution_level], ui_access) else: inner = '' generated_manifest_contents = generated_manifest_outer % inner generated_name = name + '.generated.manifest' # Need to join with the build_dir here as we're writing it during # generation time, but we return the un-joined version because the build # will occur in that directory. We only write the file if the contents # have changed so that simply regenerating the project files doesn't # cause a relink. 
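# Illustrative sketch (not part of gyp): for a hypothetical manifest base
# name 'foo', with GenerateManifest and EnableUAC left at their 'true'
# defaults, the manifest handling above (completed just below) yields
# roughly:
#
#   flags          == ['/MANIFEST', '/ManifestFile:foo.intermediate.manifest',
#                      '/MANIFESTUAC:NO']
#   manifest_files == ['foo.generated.manifest']
#   # foo.generated.manifest is written under build_dir and requests an
#   # execution level of 'asInvoker'.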
build_dir_generated_name = os.path.join(build_dir, generated_name) gyp.common.EnsureDirExists(build_dir_generated_name) f = gyp.common.WriteOnDiff(build_dir_generated_name) f.write(generated_manifest_contents) f.close() manifest_files = [generated_name] if allow_isolation: flags.append('/ALLOWISOLATION') manifest_files += self._GetAdditionalManifestFiles(config, gyp_to_build_path) return flags, output_name, manifest_files def _GetAdditionalManifestFiles(self, config, gyp_to_build_path): """Gets additional manifest files that are added to the default one generated by the linker.""" files = self._Setting(('VCManifestTool', 'AdditionalManifestFiles'), config, default=[]) if isinstance(files, str): files = files.split(';') return [os.path.normpath( gyp_to_build_path(self.ConvertVSMacros(f, config=config))) for f in files] def IsUseLibraryDependencyInputs(self, config): """Returns whether the target should be linked via Use Library Dependency Inputs (using component .objs of a given .lib).""" config = self._TargetConfig(config) uldi = self._Setting(('VCLinkerTool', 'UseLibraryDependencyInputs'), config) return uldi == 'true' def IsEmbedManifest(self, config): """Returns whether manifest should be linked into binary.""" config = self._TargetConfig(config) embed = self._Setting(('VCManifestTool', 'EmbedManifest'), config, default='true') return embed == 'true' def IsLinkIncremental(self, config): """Returns whether the target should be linked incrementally.""" config = self._TargetConfig(config) link_inc = self._Setting(('VCLinkerTool', 'LinkIncremental'), config) return link_inc != '1' def GetRcflags(self, config, gyp_to_ninja_path): """Returns the flags that need to be added to invocations of the resource compiler.""" config = self._TargetConfig(config) rcflags = [] rc = self._GetWrapper(self, self.msvs_settings[config], 'VCResourceCompilerTool', append=rcflags) rc('AdditionalIncludeDirectories', map=gyp_to_ninja_path, prefix='/I') rcflags.append('/I' + gyp_to_ninja_path('.')) rc('PreprocessorDefinitions', prefix='/d') # /l arg must be in hex without leading '0x' rc('Culture', prefix='/l', map=lambda x: hex(int(x))[2:]) return rcflags def BuildCygwinBashCommandLine(self, args, path_to_base): """Build a command line that runs args via cygwin bash. We assume that all incoming paths are in Windows normpath'd form, so they need to be converted to posix style for the part of the command line that's passed to bash. We also have to do some Visual Studio macro emulation here because various rules use magic VS names for things. Also note that rules that contain ninja variables cannot be fixed here (for example ${source}), so the outer generator needs to make sure that the paths that are written out are in posix style, if the command line will be used here.""" cygwin_dir = os.path.normpath( os.path.join(path_to_base, self.msvs_cygwin_dirs[0])) cd = ('cd %s' % path_to_base).replace('\\', '/') args = [a.replace('\\', '/').replace('"', '\\"') for a in args] args = ["'%s'" % a.replace("'", "'\\''") for a in args] bash_cmd = ' '.join(args) cmd = ( 'call "%s\\setup_env.bat" && set CYGWIN=nontsec && ' % cygwin_dir + 'bash -c "%s ; %s"' % (cd, bash_cmd)) return cmd def IsRuleRunUnderCygwin(self, rule): """Determine if an action should be run under cygwin. 
If the variable is unset, or set to 1 we use cygwin.""" return int(rule.get('msvs_cygwin_shell', self.spec.get('msvs_cygwin_shell', 1))) != 0 def _HasExplicitRuleForExtension(self, spec, extension): """Determine if there's an explicit rule for a particular extension.""" for rule in spec.get('rules', []): if rule['extension'] == extension: return True return False def HasExplicitIdlRules(self, spec): """Determine if there's an explicit rule for idl files. When there isn't we need to generate implicit rules to build MIDL .idl files.""" return self._HasExplicitRuleForExtension(spec, 'idl') def HasExplicitAsmRules(self, spec): """Determine if there's an explicit rule for asm files. When there isn't we need to generate implicit rules to assemble .asm files.""" return self._HasExplicitRuleForExtension(spec, 'asm') def GetIdlBuildData(self, source, config): """Determine the implicit outputs for an idl file. Returns output directory, outputs, and variables and flags that are required.""" config = self._TargetConfig(config) midl_get = self._GetWrapper(self, self.msvs_settings[config], 'VCMIDLTool') def midl(name, default=None): return self.ConvertVSMacros(midl_get(name, default=default), config=config) tlb = midl('TypeLibraryName', default='${root}.tlb') header = midl('HeaderFileName', default='${root}.h') dlldata = midl('DLLDataFileName', default='dlldata.c') iid = midl('InterfaceIdentifierFileName', default='${root}_i.c') proxy = midl('ProxyFileName', default='${root}_p.c') # Note that .tlb is not included in the outputs as it is not always # generated depending on the content of the input idl file. outdir = midl('OutputDirectory', default='') output = [header, dlldata, iid, proxy] variables = [('tlb', tlb), ('h', header), ('dlldata', dlldata), ('iid', iid), ('proxy', proxy)] # TODO(scottmg): Are there configuration settings to set these flags? target_platform = 'win32' if self.GetArch(config) == 'x86' else 'x64' flags = ['/char', 'signed', '/env', target_platform, '/Oicf'] return outdir, output, variables, flags def _LanguageMatchesForPch(source_ext, pch_source_ext): c_exts = ('.c',) cc_exts = ('.cc', '.cxx', '.cpp') return ((source_ext in c_exts and pch_source_ext in c_exts) or (source_ext in cc_exts and pch_source_ext in cc_exts)) class PrecompiledHeader(object): """Helper to generate dependencies and build rules to handle generation of precompiled headers. Interface matches the GCH handler in xcode_emulation.py. """ def __init__( self, settings, config, gyp_to_build_path, gyp_to_unique_output, obj_ext): self.settings = settings self.config = config pch_source = self.settings.msvs_precompiled_source[self.config] self.pch_source = gyp_to_build_path(pch_source) filename, _ = os.path.splitext(pch_source) self.output_obj = gyp_to_unique_output(filename + obj_ext).lower() def _PchHeader(self): """Get the header that will appear in an #include line for all source files.""" return os.path.split(self.settings.msvs_precompiled_header[self.config])[1] def GetObjDependencies(self, sources, objs, arch): """Given a list of sources files and the corresponding object files, returns a list of the pch files that should be depended upon. 
The additional wrapping in the return value is for interface compatibility with make.py on Mac, and xcode_emulation.py.""" assert arch is None if not self._PchHeader(): return [] pch_ext = os.path.splitext(self.pch_source)[1] for source in sources: if _LanguageMatchesForPch(os.path.splitext(source)[1], pch_ext): return [(None, None, self.output_obj)] return [] def GetPchBuildCommands(self, arch): """Not used on Windows as there are no additional build steps required (instead, existing steps are modified in GetFlagsModifications below).""" return [] def GetFlagsModifications(self, input, output, implicit, command, cflags_c, cflags_cc, expand_special): """Get the modified cflags and implicit dependencies that should be used for the pch compilation step.""" if input == self.pch_source: pch_output = ['/Yc' + self._PchHeader()] if command == 'cxx': return ([('cflags_cc', map(expand_special, cflags_cc + pch_output))], self.output_obj, []) elif command == 'cc': return ([('cflags_c', map(expand_special, cflags_c + pch_output))], self.output_obj, []) return [], output, implicit vs_version = None def GetVSVersion(generator_flags): global vs_version if not vs_version: vs_version = gyp.MSVSVersion.SelectVisualStudioVersion( generator_flags.get('msvs_version', 'auto')) return vs_version def _GetVsvarsSetupArgs(generator_flags, arch): vs = GetVSVersion(generator_flags) return vs.SetupScript() def ExpandMacros(string, expansions): """Expand $(Variable) per expansions dict. See MsvsSettings.GetVSMacroEnv for the canonical way to retrieve a suitable dict.""" if '$' in string: for old, new in expansions.iteritems(): assert '$(' not in new, new string = string.replace(old, new) return string def _ExtractImportantEnvironment(output_of_set): """Extracts environment variables required for the toolchain to run from a textual dump output by the cmd.exe 'set' command.""" envvars_to_save = ( 'goma_.*', # TODO(scottmg): This is ugly, but needed for goma. 'include', 'lib', 'libpath', 'path', 'pathext', 'systemroot', 'temp', 'tmp', ) env = {} for line in output_of_set.splitlines(): for envvar in envvars_to_save: if re.match(envvar + '=', line.lower()): var, setting = line.split('=', 1) if envvar == 'path': # Our own rules (for running gyp-win-tool) and other actions in # Chromium rely on python being in the path. Add the path to this # python here so that if it's not in the path when ninja is run # later, python will still be found. setting = os.path.dirname(sys.executable) + os.pathsep + setting env[var.upper()] = setting break for required in ('SYSTEMROOT', 'TEMP', 'TMP'): if required not in env: raise Exception('Environment variable "%s" ' 'required to be set to valid path' % required) return env def _FormatAsEnvironmentBlock(envvar_dict): """Format as an 'environment block' directly suitable for CreateProcess. Briefly this is a list of key=value\0, terminated by an additional \0. See CreateProcess documentation for more details.""" block = '' nul = '\0' for key, value in envvar_dict.iteritems(): block += key + '=' + value + nul block += nul return block def _ExtractCLPath(output_of_where): """Gets the path to cl.exe based on the output of calling the environment setup batch file, followed by the equivalent of `where`.""" # Take the first line, as that's the first found in the PATH.
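# The text handed to this function is whatever the setup batch file prints,
# followed by the output of the `for %i in (cl.exe) do @echo LOC:%~$PATH:i`
# loop built in GenerateEnvironmentFiles below, so the interesting lines look
# like LOC:<absolute path to cl.exe> (for example "LOC:C:\path\to\bin\cl.exe";
# the path itself is illustrative only). The first such line corresponds to
# the first cl.exe on PATH, which is the one we want.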
for line in output_of_where.strip().splitlines(): if line.startswith('LOC:'): return line[len('LOC:'):].strip() def GenerateEnvironmentFiles(toplevel_build_dir, generator_flags, open_out): """It's not sufficient to have the absolute path to the compiler, linker, etc. on Windows, as those tools rely on .dlls being in the PATH. We also need to support both x86 and x64 compilers within the same build (to support msvs_target_platform hackery). Different architectures require a different compiler binary, and different supporting environment variables (INCLUDE, LIB, LIBPATH). So, we extract the environment here, wrap all invocations of compiler tools (cl, link, lib, rc, midl, etc.) via win_tool.py which sets up the environment, and then we do not prefix the compiler with an absolute path, instead preferring something like "cl.exe" in the rule which will then run whichever the environment setup has put in the path. When the following procedure to generate environment files does not meet your requirement (e.g. for custom toolchains), you can pass "-G ninja_use_custom_environment_files" to the gyp to suppress file generation and use custom environment files prepared by yourself.""" archs = ('x86', 'x64') if generator_flags.get('ninja_use_custom_environment_files', 0): cl_paths = {} for arch in archs: cl_paths[arch] = 'cl.exe' return cl_paths vs = GetVSVersion(generator_flags) cl_paths = {} for arch in archs: # Extract environment variables for subprocesses. args = vs.SetupScript(arch) args.extend(('&&', 'set')) popen = subprocess.Popen( args, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) variables, _ = popen.communicate() env = _ExtractImportantEnvironment(variables) env_block = _FormatAsEnvironmentBlock(env) f = open_out(os.path.join(toplevel_build_dir, 'environment.' + arch), 'wb') f.write(env_block) f.close() # Find cl.exe location for this architecture. args = vs.SetupScript(arch) args.extend(('&&', 'for', '%i', 'in', '(cl.exe)', 'do', '@echo', 'LOC:%~$PATH:i')) popen = subprocess.Popen(args, shell=True, stdout=subprocess.PIPE) output, _ = popen.communicate() cl_paths[arch] = _ExtractCLPath(output) return cl_paths def VerifyMissingSources(sources, build_dir, generator_flags, gyp_to_ninja): """Emulate behavior of msvs_error_on_missing_sources present in the msvs generator: Check that all regular source files, i.e. not created at run time, exist on disk. Missing files cause needless recompilation when building via VS, and we want this check to match for people/bots that build using ninja, so they're not surprised when the VS build fails.""" if int(generator_flags.get('msvs_error_on_missing_sources', 0)): no_specials = filter(lambda x: '$' not in x, sources) relative = [os.path.join(build_dir, gyp_to_ninja(s)) for s in no_specials] missing = filter(lambda x: not os.path.exists(x), relative) if missing: # They'll look like out\Release\..\..\stuff\things.cc, so normalize the # path for a slightly less crazy looking output. cleaned_up = [os.path.normpath(x) for x in missing] raise Exception('Missing input files:\n%s' % '\n'.join(cleaned_up)) # Sets some values in default_variables, which are required for many # generators, run on Windows. def CalculateCommonVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) # Set a variable so conditions can be based on msvs_version. 
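# For instance, a .gyp file can then branch on the toolchain with a condition
# such as ['MSVS_VERSION=="2013"', { ... }] or ['MSVS_OS_BITS==64', { ... }]
# (hypothetical snippets, shown only to illustrate how these
# default_variables are typically consumed).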
msvs_version = gyp.msvs_emulation.GetVSVersion(generator_flags) default_variables['MSVS_VERSION'] = msvs_version.ShortName() # To determine processor word size on Windows, in addition to checking # PROCESSOR_ARCHITECTURE (which reflects the word size of the current # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which # contains the actual word size of the system when running thru WOW64). if ('64' in os.environ.get('PROCESSOR_ARCHITECTURE', '') or '64' in os.environ.get('PROCESSOR_ARCHITEW6432', '')): default_variables['MSVS_OS_BITS'] = 64 else: default_variables['MSVS_OS_BITS'] = 32 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py000644 000766 000024 00000027524 12455173731 031444 0ustar00iojsstaff000000 000000 # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """New implementation of Visual Studio project generation.""" import os import random import gyp.common # hashlib is supplied as of Python 2.5 as the replacement interface for md5 # and other secure hashes. In 2.6, md5 is deprecated. Import hashlib if # available, avoiding a deprecation warning under 2.6. Import md5 otherwise, # preserving 2.4 compatibility. try: import hashlib _new_md5 = hashlib.md5 except ImportError: import md5 _new_md5 = md5.new # Initialize random number generator random.seed() # GUIDs for project types ENTRY_TYPE_GUIDS = { 'project': '{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}', 'folder': '{2150E333-8FDC-42A3-9474-1A3956D46DE8}', } #------------------------------------------------------------------------------ # Helper functions def MakeGuid(name, seed='msvs_new'): """Returns a GUID for the specified target name. Args: name: Target name. seed: Seed for MD5 hash. Returns: A GUID-like string calculated from the name and seed. This generates something which looks like a GUID, but depends only on the name and seed. This means the same name/seed will always generate the same GUID, so that projects and solutions which refer to each other can determine the GUID to refer to explicitly. It also means that the GUID will not change when the project for a target is rebuilt. """ # Calculate a MD5 signature for the seed and name. d = _new_md5(str(seed) + str(name)).hexdigest().upper() # Convert most of the signature to GUID form (discard the rest) guid = ('{' + d[:8] + '-' + d[8:12] + '-' + d[12:16] + '-' + d[16:20] + '-' + d[20:32] + '}') return guid #------------------------------------------------------------------------------ class MSVSSolutionEntry(object): def __cmp__(self, other): # Sort by name then guid (so things are in order on vs2008).
return cmp((self.name, self.get_guid()), (other.name, other.get_guid())) class MSVSFolder(MSVSSolutionEntry): """Folder in a Visual Studio project or solution.""" def __init__(self, path, name = None, entries = None, guid = None, items = None): """Initializes the folder. Args: path: Full path to the folder. name: Name of the folder. entries: List of folder entries to nest inside this folder. May contain Folder or Project objects. May be None, if the folder is empty. guid: GUID to use for folder, if not None. items: List of solution items to include in the folder project. May be None, if the folder does not directly contain items. """ if name: self.name = name else: # Use last layer. self.name = os.path.basename(path) self.path = path self.guid = guid # Copy passed lists (or set to empty lists) self.entries = sorted(list(entries or [])) self.items = list(items or []) self.entry_type_guid = ENTRY_TYPE_GUIDS['folder'] def get_guid(self): if self.guid is None: # Use consistent guids for folders (so things don't regenerate). self.guid = MakeGuid(self.path, seed='msvs_folder') return self.guid #------------------------------------------------------------------------------ class MSVSProject(MSVSSolutionEntry): """Visual Studio project.""" def __init__(self, path, name = None, dependencies = None, guid = None, spec = None, build_file = None, config_platform_overrides = None, fixpath_prefix = None): """Initializes the project. Args: path: Absolute path to the project file. name: Name of project. If None, the name will be the same as the base name of the project file. dependencies: List of other Project objects this project is dependent upon, if not None. guid: GUID to use for project, if not None. spec: Dictionary specifying how to build this project. build_file: Filename of the .gyp file that the vcproj file comes from. config_platform_overrides: optional dict of configuration platforms to be used in place of the default for this target. fixpath_prefix: the path used to adjust the behavior of _fixpath """ self.path = path self.guid = guid self.spec = spec self.build_file = build_file # Use project filename if name not specified self.name = name or os.path.splitext(os.path.basename(path))[0] # Copy passed lists (or set to empty lists) self.dependencies = list(dependencies or []) self.entry_type_guid = ENTRY_TYPE_GUIDS['project'] if config_platform_overrides: self.config_platform_overrides = config_platform_overrides else: self.config_platform_overrides = {} self.fixpath_prefix = fixpath_prefix self.msbuild_toolset = None def set_dependencies(self, dependencies): self.dependencies = list(dependencies or []) def get_guid(self): if self.guid is None: # Set GUID from path # TODO(rspangler): This is fragile. # 1. We can't just use the project filename sans path, since there could # be multiple projects with the same base name (for example, # foo/unittest.vcproj and bar/unittest.vcproj). # 2. The path needs to be relative to $SOURCE_ROOT, so that the project # GUID is the same whether it's included from base/base.sln or # foo/bar/baz/baz.sln. # 3. The GUID needs to be the same each time this builder is invoked, so # that we don't need to rebuild the solution when the project changes. # 4. We should be able to handle pre-built project files by reading the # GUID from the files.
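# Note that MakeGuid is deterministic: the same name/seed pair always hashes
# to the same GUID-shaped string, so repeated generator runs keep the GUID
# stable, e.g. (illustrative only, actual hex digits not shown):
#   MakeGuid('base') == MakeGuid('base')                      -> always True
#   MakeGuid('base') == MakeGuid('base', seed='msvs_folder')  -> False in practice
#     (different seed, so a different MD5 and thus a different GUID)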
self.guid = MakeGuid(self.name) return self.guid def set_msbuild_toolset(self, msbuild_toolset): self.msbuild_toolset = msbuild_toolset #------------------------------------------------------------------------------ class MSVSSolution: """Visual Studio solution.""" def __init__(self, path, version, entries=None, variants=None, websiteProperties=True): """Initializes the solution. Args: path: Path to solution file. version: Format version to emit. entries: List of entries in solution. May contain Folder or Project objects. May be None, if the folder is empty. variants: List of build variant strings. If none, a default list will be used. websiteProperties: Flag to decide if the website properties section is generated. """ self.path = path self.websiteProperties = websiteProperties self.version = version # Copy passed lists (or set to empty lists) self.entries = list(entries or []) if variants: # Copy passed list self.variants = variants[:] else: # Use default self.variants = ['Debug|Win32', 'Release|Win32'] # TODO(rspangler): Need to be able to handle a mapping of solution config # to project config. Should we be able to handle variants being a dict, # or add a separate variant_map variable? If it's a dict, we can't # guarantee the order of variants since dict keys aren't ordered. # TODO(rspangler): Automatically write to disk for now; should delay until # node-evaluation time. self.Write() def Write(self, writer=gyp.common.WriteOnDiff): """Writes the solution file to disk. Raises: IndexError: An entry appears multiple times. """ # Walk the entry tree and collect all the folders and projects. all_entries = set() entries_to_check = self.entries[:] while entries_to_check: e = entries_to_check.pop(0) # If this entry has been visited, nothing to do. if e in all_entries: continue all_entries.add(e) # If this is a folder, check its entries too. if isinstance(e, MSVSFolder): entries_to_check += e.entries all_entries = sorted(all_entries) # Open file and print header f = writer(self.path) f.write('Microsoft Visual Studio Solution File, ' 'Format Version %s\r\n' % self.version.SolutionVersion()) f.write('# %s\r\n' % self.version.Description()) # Project entries sln_root = os.path.split(self.path)[0] for e in all_entries: relative_path = gyp.common.RelativePath(e.path, sln_root) # msbuild does not accept an empty folder_name. # use '.' in case relative_path is empty. folder_name = relative_path.replace('/', '\\') or '.' 
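# The write below emits one solution entry header per project/folder, e.g.
# (name, path and the trailing GUID are hypothetical; the leading GUID is
# ENTRY_TYPE_GUIDS['project']):
#   Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "base", "base\base.vcproj", "{01234567-89AB-CDEF-0123-456789ABCDEF}"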
f.write('Project("%s") = "%s", "%s", "%s"\r\n' % ( e.entry_type_guid, # Entry type GUID e.name, # Folder name folder_name, # Folder name (again) e.get_guid(), # Entry GUID )) # TODO(rspangler): Need a way to configure this stuff if self.websiteProperties: f.write('\tProjectSection(WebsiteProperties) = preProject\r\n' '\t\tDebug.AspNetCompiler.Debug = "True"\r\n' '\t\tRelease.AspNetCompiler.Debug = "False"\r\n' '\tEndProjectSection\r\n') if isinstance(e, MSVSFolder): if e.items: f.write('\tProjectSection(SolutionItems) = preProject\r\n') for i in e.items: f.write('\t\t%s = %s\r\n' % (i, i)) f.write('\tEndProjectSection\r\n') if isinstance(e, MSVSProject): if e.dependencies: f.write('\tProjectSection(ProjectDependencies) = postProject\r\n') for d in e.dependencies: f.write('\t\t%s = %s\r\n' % (d.get_guid(), d.get_guid())) f.write('\tEndProjectSection\r\n') f.write('EndProject\r\n') # Global section f.write('Global\r\n') # Configurations (variants) f.write('\tGlobalSection(SolutionConfigurationPlatforms) = preSolution\r\n') for v in self.variants: f.write('\t\t%s = %s\r\n' % (v, v)) f.write('\tEndGlobalSection\r\n') # Sort config guids for easier diffing of solution changes. config_guids = [] config_guids_overrides = {} for e in all_entries: if isinstance(e, MSVSProject): config_guids.append(e.get_guid()) config_guids_overrides[e.get_guid()] = e.config_platform_overrides config_guids.sort() f.write('\tGlobalSection(ProjectConfigurationPlatforms) = postSolution\r\n') for g in config_guids: for v in self.variants: nv = config_guids_overrides[g].get(v, v) # Pick which project configuration to build for this solution # configuration. f.write('\t\t%s.%s.ActiveCfg = %s\r\n' % ( g, # Project GUID v, # Solution build configuration nv, # Project build config for that solution config )) # Enable project in this solution configuration. f.write('\t\t%s.%s.Build.0 = %s\r\n' % ( g, # Project GUID v, # Solution build configuration nv, # Project build config for that solution config )) f.write('\tEndGlobalSection\r\n') # TODO(rspangler): Should be able to configure this stuff too (though I've # never seen this be any different) f.write('\tGlobalSection(SolutionProperties) = preSolution\r\n') f.write('\t\tHideSolutionNode = FALSE\r\n') f.write('\tEndGlobalSection\r\n') # Folder mappings # Omit this section if there are no folders if any([e.entries for e in all_entries if isinstance(e, MSVSFolder)]): f.write('\tGlobalSection(NestedProjects) = preSolution\r\n') for e in all_entries: if not isinstance(e, MSVSFolder): continue # Does not apply to projects, only folders for subentry in e.entries: f.write('\t\t%s = %s\r\n' % (subentry.get_guid(), e.get_guid())) f.write('\tEndGlobalSection\r\n') f.write('EndGlobal\r\n') f.close() ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py������000644 �000766 �000024 �00000014363 12455173731 032316� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. 
# Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio project reader/writer.""" import gyp.common import gyp.easy_xml as easy_xml #------------------------------------------------------------------------------ class Tool(object): """Visual Studio tool.""" def __init__(self, name, attrs=None): """Initializes the tool. Args: name: Tool name. attrs: Dict of tool attributes; may be None. """ self._attrs = attrs or {} self._attrs['Name'] = name def _GetSpecification(self): """Creates an element for the tool. Returns: A new xml.dom.Element for the tool. """ return ['Tool', self._attrs] class Filter(object): """Visual Studio filter - that is, a virtual folder.""" def __init__(self, name, contents=None): """Initializes the folder. Args: name: Filter (folder) name. contents: List of filenames and/or Filter objects contained. """ self.name = name self.contents = list(contents or []) #------------------------------------------------------------------------------ class Writer(object): """Visual Studio XML project writer.""" def __init__(self, project_path, version, name, guid=None, platforms=None): """Initializes the project. Args: project_path: Path to the project file. version: Format version to emit. name: Name of the project. guid: GUID to use for project, if not None. platforms: Array of string, the supported platforms. If null, ['Win32'] """ self.project_path = project_path self.version = version self.name = name self.guid = guid # Default to Win32 for platforms. if not platforms: platforms = ['Win32'] # Initialize the specifications of the various sections. self.platform_section = ['Platforms'] for platform in platforms: self.platform_section.append(['Platform', {'Name': platform}]) self.tool_files_section = ['ToolFiles'] self.configurations_section = ['Configurations'] self.files_section = ['Files'] # Keep a dict keyed on filename to speed up access. self.files_dict = dict() def AddToolFile(self, path): """Adds a tool file to the project. Args: path: Relative path from project to tool file. """ self.tool_files_section.append(['ToolFile', {'RelativePath': path}]) def _GetSpecForConfiguration(self, config_type, config_name, attrs, tools): """Returns the specification for a configuration. Args: config_type: Type of configuration node. config_name: Configuration name. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. Returns: """ # Handle defaults if not attrs: attrs = {} if not tools: tools = [] # Add configuration node and its attributes node_attrs = attrs.copy() node_attrs['Name'] = config_name specification = [config_type, node_attrs] # Add tool nodes and their attributes if tools: for t in tools: if isinstance(t, Tool): specification.append(t._GetSpecification()) else: specification.append(Tool(t)._GetSpecification()) return specification def AddConfig(self, name, attrs=None, tools=None): """Adds a configuration to the project. Args: name: Configuration name. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. """ spec = self._GetSpecForConfiguration('Configuration', name, attrs, tools) self.configurations_section.append(spec) def _AddFilesToNode(self, parent, files): """Adds files and/or filters to the parent node. Args: parent: Destination node files: A list of Filter objects and/or relative paths to files. Will call itself recursively, if the files list contains Filter objects. 
""" for f in files: if isinstance(f, Filter): node = ['Filter', {'Name': f.name}] self._AddFilesToNode(node, f.contents) else: node = ['File', {'RelativePath': f}] self.files_dict[f] = node parent.append(node) def AddFiles(self, files): """Adds files to the project. Args: files: A list of Filter objects and/or relative paths to files. This makes a copy of the file/filter tree at the time of this call. If you later add files to a Filter object which was passed into a previous call to AddFiles(), it will not be reflected in this project. """ self._AddFilesToNode(self.files_section, files) # TODO(rspangler) This also doesn't handle adding files to an existing # filter. That is, it doesn't merge the trees. def AddFileConfig(self, path, config, attrs=None, tools=None): """Adds a configuration to a file. Args: path: Relative path to the file. config: Name of configuration to add. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. Raises: ValueError: Relative path does not match any file added via AddFiles(). """ # Find the file node with the right relative path parent = self.files_dict.get(path) if not parent: raise ValueError('AddFileConfig: file "%s" not in project.' % path) # Add the config to the file node spec = self._GetSpecForConfiguration('FileConfiguration', config, attrs, tools) parent.append(spec) def WriteIfChanged(self): """Writes the project file.""" # First create XML content definition content = [ 'VisualStudioProject', {'ProjectType': 'Visual C++', 'Version': self.version.ProjectVersion(), 'Name': self.name, 'ProjectGUID': self.guid, 'RootNamespace': self.name, 'Keyword': 'Win32Proj' }, self.platform_section, self.tool_files_section, self.configurations_section, ['References'], # empty section self.files_section, ['Globals'] # empty section ] easy_xml.WriteXmlIfChanged(content, self.project_path, encoding="Windows-1252") �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py�����000644 �000766 �000024 �00000125126 12455173731 032510� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Code to validate and convert settings of the Microsoft build tools. This file contains code to validate and convert settings of the Microsoft build tools. The function ConvertToMSBuildSettings(), ValidateMSVSSettings(), and ValidateMSBuildSettings() are the entry points. This file was created by comparing the projects created by Visual Studio 2008 and Visual Studio 2010 for all available settings through the user interface. The MSBuild schemas were also considered. They are typically found in the MSBuild install directory, e.g. c:\Program Files (x86)\MSBuild """ import sys import re # Dictionaries of settings validators. 
The key is the tool name, the value is # a dictionary mapping setting names to validation functions. _msvs_validators = {} _msbuild_validators = {} # A dictionary of settings converters. The key is the tool name, the value is # a dictionary mapping setting names to conversion functions. _msvs_to_msbuild_converters = {} # Tool name mapping from MSVS to MSBuild. _msbuild_name_of_tool = {} class _Tool(object): """Represents a tool used by MSVS or MSBuild. Attributes: msvs_name: The name of the tool in MSVS. msbuild_name: The name of the tool in MSBuild. """ def __init__(self, msvs_name, msbuild_name): self.msvs_name = msvs_name self.msbuild_name = msbuild_name def _AddTool(tool): """Adds a tool to the four dictionaries used to process settings. This only defines the tool. Each setting also needs to be added. Args: tool: The _Tool object to be added. """ _msvs_validators[tool.msvs_name] = {} _msbuild_validators[tool.msbuild_name] = {} _msvs_to_msbuild_converters[tool.msvs_name] = {} _msbuild_name_of_tool[tool.msvs_name] = tool.msbuild_name def _GetMSBuildToolSettings(msbuild_settings, tool): """Returns an MSBuild tool dictionary. Creates it if needed.""" return msbuild_settings.setdefault(tool.msbuild_name, {}) class _Type(object): """Type of settings (Base class).""" def ValidateMSVS(self, value): """Verifies that the value is legal for MSVS. Args: value: the value to check for this type. Raises: ValueError if value is not valid for MSVS. """ def ValidateMSBuild(self, value): """Verifies that the value is legal for MSBuild. Args: value: the value to check for this type. Raises: ValueError if value is not valid for MSBuild. """ def ConvertToMSBuild(self, value): """Returns the MSBuild equivalent of the MSVS value given. Args: value: the MSVS value to convert. Returns: the MSBuild equivalent. Raises: ValueError if value is not valid. """ return value class _String(_Type): """A setting that's just a string.""" def ValidateMSVS(self, value): if not isinstance(value, basestring): raise ValueError('expected string; got %r' % value) def ValidateMSBuild(self, value): if not isinstance(value, basestring): raise ValueError('expected string; got %r' % value) def ConvertToMSBuild(self, value): # Convert the macros return ConvertVCMacrosToMSBuild(value) class _StringList(_Type): """A settings that's a list of strings.""" def ValidateMSVS(self, value): if not isinstance(value, basestring) and not isinstance(value, list): raise ValueError('expected string list; got %r' % value) def ValidateMSBuild(self, value): if not isinstance(value, basestring) and not isinstance(value, list): raise ValueError('expected string list; got %r' % value) def ConvertToMSBuild(self, value): # Convert the macros if isinstance(value, list): return [ConvertVCMacrosToMSBuild(i) for i in value] else: return ConvertVCMacrosToMSBuild(value) class _Boolean(_Type): """Boolean settings, can have the values 'false' or 'true'.""" def _Validate(self, value): if value != 'true' and value != 'false': raise ValueError('expected bool; got %r' % value) def ValidateMSVS(self, value): self._Validate(value) def ValidateMSBuild(self, value): self._Validate(value) def ConvertToMSBuild(self, value): self._Validate(value) return value class _Integer(_Type): """Integer settings.""" def __init__(self, msbuild_base=10): _Type.__init__(self) self._msbuild_base = msbuild_base def ValidateMSVS(self, value): # Try to convert, this will raise ValueError if invalid. 
self.ConvertToMSBuild(value) def ValidateMSBuild(self, value): # Try to convert, this will raise ValueError if invalid. int(value, self._msbuild_base) def ConvertToMSBuild(self, value): msbuild_format = (self._msbuild_base == 10) and '%d' or '0x%04x' return msbuild_format % int(value) class _Enumeration(_Type): """Type of settings that is an enumeration. In MSVS, the values are indexes like '0', '1', and '2'. MSBuild uses text labels that are more representative, like 'Win32'. Constructor args: label_list: an array of MSBuild labels that correspond to the MSVS index. In the rare cases where MSVS has skipped an index value, None is used in the array to indicate the unused spot. new: an array of labels that are new to MSBuild. """ def __init__(self, label_list, new=None): _Type.__init__(self) self._label_list = label_list self._msbuild_values = set(value for value in label_list if value is not None) if new is not None: self._msbuild_values.update(new) def ValidateMSVS(self, value): # Try to convert. It will raise an exception if not valid. self.ConvertToMSBuild(value) def ValidateMSBuild(self, value): if value not in self._msbuild_values: raise ValueError('unrecognized enumerated value %s' % value) def ConvertToMSBuild(self, value): index = int(value) if index < 0 or index >= len(self._label_list): raise ValueError('index value (%d) not in expected range [0, %d)' % (index, len(self._label_list))) label = self._label_list[index] if label is None: raise ValueError('converted value for %s not specified.' % value) return label # Instantiate the various generic types. _boolean = _Boolean() _integer = _Integer() # For now, we don't do any special validation on these types: _string = _String() _file_name = _String() _folder_name = _String() _file_list = _StringList() _folder_list = _StringList() _string_list = _StringList() # Some boolean settings went from numerical values to boolean. The # mapping is 0: default, 1: false, 2: true. _newly_boolean = _Enumeration(['', 'false', 'true']) def _Same(tool, name, setting_type): """Defines a setting that has the same name in MSVS and MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ _Renamed(tool, name, name, setting_type) def _Renamed(tool, msvs_name, msbuild_name, setting_type): """Defines a setting for which the name has changed. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_name: the name of the MSVS setting. msbuild_name: the name of the MSBuild setting. setting_type: the type of this setting. """ def _Translate(value, msbuild_settings): msbuild_tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) msbuild_tool_settings[msbuild_name] = setting_type.ConvertToMSBuild(value) _msvs_validators[tool.msvs_name][msvs_name] = setting_type.ValidateMSVS _msbuild_validators[tool.msbuild_name][msbuild_name] = ( setting_type.ValidateMSBuild) _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate def _Moved(tool, settings_name, msbuild_tool_name, setting_type): _MovedAndRenamed(tool, settings_name, msbuild_tool_name, settings_name, setting_type) def _MovedAndRenamed(tool, msvs_settings_name, msbuild_tool_name, msbuild_settings_name, setting_type): """Defines a setting that may have moved to a new section. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_settings_name: the MSVS name of the setting. 
msbuild_tool_name: the name of the MSBuild tool to place the setting under. msbuild_settings_name: the MSBuild name of the setting. setting_type: the type of this setting. """ def _Translate(value, msbuild_settings): tool_settings = msbuild_settings.setdefault(msbuild_tool_name, {}) tool_settings[msbuild_settings_name] = setting_type.ConvertToMSBuild(value) _msvs_validators[tool.msvs_name][msvs_settings_name] = ( setting_type.ValidateMSVS) validator = setting_type.ValidateMSBuild _msbuild_validators[msbuild_tool_name][msbuild_settings_name] = validator _msvs_to_msbuild_converters[tool.msvs_name][msvs_settings_name] = _Translate def _MSVSOnly(tool, name, setting_type): """Defines a setting that is only found in MSVS. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ def _Translate(unused_value, unused_msbuild_settings): # Since this is for MSVS only settings, no translation will happen. pass _msvs_validators[tool.msvs_name][name] = setting_type.ValidateMSVS _msvs_to_msbuild_converters[tool.msvs_name][name] = _Translate def _MSBuildOnly(tool, name, setting_type): """Defines a setting that is only found in MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ _msbuild_validators[tool.msbuild_name][name] = setting_type.ValidateMSBuild def _ConvertedToAdditionalOption(tool, msvs_name, flag): """Defines a setting that's handled via a command line option in MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_name: the name of the MSVS setting that if 'true' becomes a flag flag: the flag to insert at the end of the AdditionalOptions """ def _Translate(value, msbuild_settings): if value == 'true': tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) if 'AdditionalOptions' in tool_settings: new_flags = '%s %s' % (tool_settings['AdditionalOptions'], flag) else: new_flags = flag tool_settings['AdditionalOptions'] = new_flags _msvs_validators[tool.msvs_name][msvs_name] = _boolean.ValidateMSVS _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate def _CustomGeneratePreprocessedFile(tool, msvs_name): def _Translate(value, msbuild_settings): tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) if value == '0': tool_settings['PreprocessToFile'] = 'false' tool_settings['PreprocessSuppressLineNumbers'] = 'false' elif value == '1': # /P tool_settings['PreprocessToFile'] = 'true' tool_settings['PreprocessSuppressLineNumbers'] = 'false' elif value == '2': # /EP /P tool_settings['PreprocessToFile'] = 'true' tool_settings['PreprocessSuppressLineNumbers'] = 'true' else: raise ValueError('value must be one of [0, 1, 2]; got %s' % value) # Create a bogus validator that looks for '0', '1', or '2' msvs_validator = _Enumeration(['a', 'b', 'c']).ValidateMSVS _msvs_validators[tool.msvs_name][msvs_name] = msvs_validator msbuild_validator = _boolean.ValidateMSBuild msbuild_tool_validators = _msbuild_validators[tool.msbuild_name] msbuild_tool_validators['PreprocessToFile'] = msbuild_validator msbuild_tool_validators['PreprocessSuppressLineNumbers'] = msbuild_validator _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate fix_vc_macro_slashes_regex_list = ('IntDir', 'OutDir') fix_vc_macro_slashes_regex = re.compile( r'(\$\((?:%s)\))(?:[\\/]+)' % "|".join(fix_vc_macro_slashes_regex_list) ) def FixVCMacroSlashes(s): 
"""Replace macros which have excessive following slashes. These macros are known to have a built-in trailing slash. Furthermore, many scripts hiccup on processing paths with extra slashes in the middle. This list is probably not exhaustive. Add as needed. """ if '$' in s: s = fix_vc_macro_slashes_regex.sub(r'\1', s) return s def ConvertVCMacrosToMSBuild(s): """Convert the the MSVS macros found in the string to the MSBuild equivalent. This list is probably not exhaustive. Add as needed. """ if '$' in s: replace_map = { '$(ConfigurationName)': '$(Configuration)', '$(InputDir)': '%(RootDir)%(Directory)', '$(InputExt)': '%(Extension)', '$(InputFileName)': '%(Filename)%(Extension)', '$(InputName)': '%(Filename)', '$(InputPath)': '%(FullPath)', '$(ParentName)': '$(ProjectFileName)', '$(PlatformName)': '$(Platform)', '$(SafeInputName)': '%(Filename)', } for old, new in replace_map.iteritems(): s = s.replace(old, new) s = FixVCMacroSlashes(s) return s def ConvertToMSBuildSettings(msvs_settings, stderr=sys.stderr): """Converts MSVS settings (VS2008 and earlier) to MSBuild settings (VS2010+). Args: msvs_settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. Returns: A dictionary of MSBuild settings. The key is either the MSBuild tool name or the empty string (for the global settings). The values are themselves dictionaries of settings and their values. """ msbuild_settings = {} for msvs_tool_name, msvs_tool_settings in msvs_settings.iteritems(): if msvs_tool_name in _msvs_to_msbuild_converters: msvs_tool = _msvs_to_msbuild_converters[msvs_tool_name] for msvs_setting, msvs_value in msvs_tool_settings.iteritems(): if msvs_setting in msvs_tool: # Invoke the translation function. try: msvs_tool[msvs_setting](msvs_value, msbuild_settings) except ValueError, e: print >> stderr, ('Warning: while converting %s/%s to MSBuild, ' '%s' % (msvs_tool_name, msvs_setting, e)) else: # We don't know this setting. Give a warning. print >> stderr, ('Warning: unrecognized setting %s/%s ' 'while converting to MSBuild.' % (msvs_tool_name, msvs_setting)) else: print >> stderr, ('Warning: unrecognized tool %s while converting to ' 'MSBuild.' % msvs_tool_name) return msbuild_settings def ValidateMSVSSettings(settings, stderr=sys.stderr): """Validates that the names of the settings are valid for MSVS. Args: settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. """ _ValidateSettings(_msvs_validators, settings, stderr) def ValidateMSBuildSettings(settings, stderr=sys.stderr): """Validates that the names of the settings are valid for MSBuild. Args: settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. """ _ValidateSettings(_msbuild_validators, settings, stderr) def _ValidateSettings(validators, settings, stderr): """Validates that the settings are valid for MSBuild or MSVS. We currently only validate the names of the settings, not their values. Args: validators: A dictionary of tools and their validators. settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. 
""" for tool_name in settings: if tool_name in validators: tool_validators = validators[tool_name] for setting, value in settings[tool_name].iteritems(): if setting in tool_validators: try: tool_validators[setting](value) except ValueError, e: print >> stderr, ('Warning: for %s/%s, %s' % (tool_name, setting, e)) else: print >> stderr, ('Warning: unrecognized setting %s/%s' % (tool_name, setting)) else: print >> stderr, ('Warning: unrecognized tool %s' % tool_name) # MSVS and MBuild names of the tools. _compile = _Tool('VCCLCompilerTool', 'ClCompile') _link = _Tool('VCLinkerTool', 'Link') _midl = _Tool('VCMIDLTool', 'Midl') _rc = _Tool('VCResourceCompilerTool', 'ResourceCompile') _lib = _Tool('VCLibrarianTool', 'Lib') _manifest = _Tool('VCManifestTool', 'Manifest') _AddTool(_compile) _AddTool(_link) _AddTool(_midl) _AddTool(_rc) _AddTool(_lib) _AddTool(_manifest) # Add sections only found in the MSBuild settings. _msbuild_validators[''] = {} _msbuild_validators['ProjectReference'] = {} _msbuild_validators['ManifestResourceCompile'] = {} # Descriptions of the compiler options, i.e. VCCLCompilerTool in MSVS and # ClCompile in MSBuild. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\cl.xml" for # the schema of the MSBuild ClCompile settings. # Options that have the same name in MSVS and MSBuild _Same(_compile, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_compile, 'AdditionalOptions', _string_list) _Same(_compile, 'AdditionalUsingDirectories', _folder_list) # /AI _Same(_compile, 'AssemblerListingLocation', _file_name) # /Fa _Same(_compile, 'BrowseInformationFile', _file_name) _Same(_compile, 'BufferSecurityCheck', _boolean) # /GS _Same(_compile, 'DisableLanguageExtensions', _boolean) # /Za _Same(_compile, 'DisableSpecificWarnings', _string_list) # /wd _Same(_compile, 'EnableFiberSafeOptimizations', _boolean) # /GT _Same(_compile, 'EnablePREfast', _boolean) # /analyze Visible='false' _Same(_compile, 'ExpandAttributedSource', _boolean) # /Fx _Same(_compile, 'FloatingPointExceptions', _boolean) # /fp:except _Same(_compile, 'ForceConformanceInForLoopScope', _boolean) # /Zc:forScope _Same(_compile, 'ForcedIncludeFiles', _file_list) # /FI _Same(_compile, 'ForcedUsingFiles', _file_list) # /FU _Same(_compile, 'GenerateXMLDocumentationFiles', _boolean) # /doc _Same(_compile, 'IgnoreStandardIncludePath', _boolean) # /X _Same(_compile, 'MinimalRebuild', _boolean) # /Gm _Same(_compile, 'OmitDefaultLibName', _boolean) # /Zl _Same(_compile, 'OmitFramePointers', _boolean) # /Oy _Same(_compile, 'PreprocessorDefinitions', _string_list) # /D _Same(_compile, 'ProgramDataBaseFileName', _file_name) # /Fd _Same(_compile, 'RuntimeTypeInfo', _boolean) # /GR _Same(_compile, 'ShowIncludes', _boolean) # /showIncludes _Same(_compile, 'SmallerTypeCheck', _boolean) # /RTCc _Same(_compile, 'StringPooling', _boolean) # /GF _Same(_compile, 'SuppressStartupBanner', _boolean) # /nologo _Same(_compile, 'TreatWChar_tAsBuiltInType', _boolean) # /Zc:wchar_t _Same(_compile, 'UndefineAllPreprocessorDefinitions', _boolean) # /u _Same(_compile, 'UndefinePreprocessorDefinitions', _string_list) # /U _Same(_compile, 'UseFullPaths', _boolean) # /FC _Same(_compile, 'WholeProgramOptimization', _boolean) # /GL _Same(_compile, 'XMLDocumentationFileName', _file_name) _Same(_compile, 'AssemblerOutput', _Enumeration(['NoListing', 'AssemblyCode', # /FA 'All', # /FAcs 'AssemblyAndMachineCode', # /FAc 'AssemblyAndSourceCode'])) # /FAs _Same(_compile, 'BasicRuntimeChecks', _Enumeration(['Default', 'StackFrameRuntimeCheck', # 
/RTCs 'UninitializedLocalUsageCheck', # /RTCu 'EnableFastChecks'])) # /RTC1 _Same(_compile, 'BrowseInformation', _Enumeration(['false', 'true', # /FR 'true'])) # /Fr _Same(_compile, 'CallingConvention', _Enumeration(['Cdecl', # /Gd 'FastCall', # /Gr 'StdCall'])) # /Gz _Same(_compile, 'CompileAs', _Enumeration(['Default', 'CompileAsC', # /TC 'CompileAsCpp'])) # /TP _Same(_compile, 'DebugInformationFormat', _Enumeration(['', # Disabled 'OldStyle', # /Z7 None, 'ProgramDatabase', # /Zi 'EditAndContinue'])) # /ZI _Same(_compile, 'EnableEnhancedInstructionSet', _Enumeration(['NotSet', 'StreamingSIMDExtensions', # /arch:SSE 'StreamingSIMDExtensions2'])) # /arch:SSE2 _Same(_compile, 'ErrorReporting', _Enumeration(['None', # /errorReport:none 'Prompt', # /errorReport:prompt 'Queue'], # /errorReport:queue new=['Send'])) # /errorReport:send" _Same(_compile, 'ExceptionHandling', _Enumeration(['false', 'Sync', # /EHsc 'Async'], # /EHa new=['SyncCThrow'])) # /EHs _Same(_compile, 'FavorSizeOrSpeed', _Enumeration(['Neither', 'Speed', # /Ot 'Size'])) # /Os _Same(_compile, 'FloatingPointModel', _Enumeration(['Precise', # /fp:precise 'Strict', # /fp:strict 'Fast'])) # /fp:fast _Same(_compile, 'InlineFunctionExpansion', _Enumeration(['Default', 'OnlyExplicitInline', # /Ob1 'AnySuitable'], # /Ob2 new=['Disabled'])) # /Ob0 _Same(_compile, 'Optimization', _Enumeration(['Disabled', # /Od 'MinSpace', # /O1 'MaxSpeed', # /O2 'Full'])) # /Ox _Same(_compile, 'RuntimeLibrary', _Enumeration(['MultiThreaded', # /MT 'MultiThreadedDebug', # /MTd 'MultiThreadedDLL', # /MD 'MultiThreadedDebugDLL'])) # /MDd _Same(_compile, 'StructMemberAlignment', _Enumeration(['Default', '1Byte', # /Zp1 '2Bytes', # /Zp2 '4Bytes', # /Zp4 '8Bytes', # /Zp8 '16Bytes'])) # /Zp16 _Same(_compile, 'WarningLevel', _Enumeration(['TurnOffAllWarnings', # /W0 'Level1', # /W1 'Level2', # /W2 'Level3', # /W3 'Level4'], # /W4 new=['EnableAllWarnings'])) # /Wall # Options found in MSVS that have been renamed in MSBuild. _Renamed(_compile, 'EnableFunctionLevelLinking', 'FunctionLevelLinking', _boolean) # /Gy _Renamed(_compile, 'EnableIntrinsicFunctions', 'IntrinsicFunctions', _boolean) # /Oi _Renamed(_compile, 'KeepComments', 'PreprocessKeepComments', _boolean) # /C _Renamed(_compile, 'ObjectFile', 'ObjectFileName', _file_name) # /Fo _Renamed(_compile, 'OpenMP', 'OpenMPSupport', _boolean) # /openmp _Renamed(_compile, 'PrecompiledHeaderThrough', 'PrecompiledHeaderFile', _file_name) # Used with /Yc and /Yu _Renamed(_compile, 'PrecompiledHeaderFile', 'PrecompiledHeaderOutputFile', _file_name) # /Fp _Renamed(_compile, 'UsePrecompiledHeader', 'PrecompiledHeader', _Enumeration(['NotUsing', # VS recognized '' for this value too. 'Create', # /Yc 'Use'])) # /Yu _Renamed(_compile, 'WarnAsError', 'TreatWarningAsError', _boolean) # /WX _ConvertedToAdditionalOption(_compile, 'DefaultCharIsUnsigned', '/J') # MSVS options not found in MSBuild. _MSVSOnly(_compile, 'Detect64BitPortabilityProblems', _boolean) _MSVSOnly(_compile, 'UseUnicodeResponseFiles', _boolean) # MSBuild options not found in MSVS. 
_MSBuildOnly(_compile, 'BuildingInIDE', _boolean) _MSBuildOnly(_compile, 'CompileAsManaged', _Enumeration([], new=['false', 'true', # /clr 'Pure', # /clr:pure 'Safe', # /clr:safe 'OldSyntax'])) # /clr:oldSyntax _MSBuildOnly(_compile, 'CreateHotpatchableImage', _boolean) # /hotpatch _MSBuildOnly(_compile, 'MultiProcessorCompilation', _boolean) # /MP _MSBuildOnly(_compile, 'PreprocessOutputPath', _string) # /Fi _MSBuildOnly(_compile, 'ProcessorNumber', _integer) # the number of processors _MSBuildOnly(_compile, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_compile, 'TreatSpecificWarningsAsErrors', _string_list) # /we _MSBuildOnly(_compile, 'UseUnicodeForAssemblerListing', _boolean) # /FAu # Defines a setting that needs very customized processing _CustomGeneratePreprocessedFile(_compile, 'GeneratePreprocessedFile') # Directives for converting MSVS VCLinkerTool to MSBuild Link. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\link.xml" for # the schema of the MSBuild Link settings. # Options that have the same name in MSVS and MSBuild _Same(_link, 'AdditionalDependencies', _file_list) _Same(_link, 'AdditionalLibraryDirectories', _folder_list) # /LIBPATH # /MANIFESTDEPENDENCY: _Same(_link, 'AdditionalManifestDependencies', _file_list) _Same(_link, 'AdditionalOptions', _string_list) _Same(_link, 'AddModuleNamesToAssembly', _file_list) # /ASSEMBLYMODULE _Same(_link, 'AllowIsolation', _boolean) # /ALLOWISOLATION _Same(_link, 'AssemblyLinkResource', _file_list) # /ASSEMBLYLINKRESOURCE _Same(_link, 'BaseAddress', _string) # /BASE _Same(_link, 'CLRUnmanagedCodeCheck', _boolean) # /CLRUNMANAGEDCODECHECK _Same(_link, 'DelayLoadDLLs', _file_list) # /DELAYLOAD _Same(_link, 'DelaySign', _boolean) # /DELAYSIGN _Same(_link, 'EmbedManagedResourceFile', _file_list) # /ASSEMBLYRESOURCE _Same(_link, 'EnableUAC', _boolean) # /MANIFESTUAC _Same(_link, 'EntryPointSymbol', _string) # /ENTRY _Same(_link, 'ForceSymbolReferences', _file_list) # /INCLUDE _Same(_link, 'FunctionOrder', _file_name) # /ORDER _Same(_link, 'GenerateDebugInformation', _boolean) # /DEBUG _Same(_link, 'GenerateMapFile', _boolean) # /MAP _Same(_link, 'HeapCommitSize', _string) _Same(_link, 'HeapReserveSize', _string) # /HEAP _Same(_link, 'IgnoreAllDefaultLibraries', _boolean) # /NODEFAULTLIB _Same(_link, 'IgnoreEmbeddedIDL', _boolean) # /IGNOREIDL _Same(_link, 'ImportLibrary', _file_name) # /IMPLIB _Same(_link, 'KeyContainer', _file_name) # /KEYCONTAINER _Same(_link, 'KeyFile', _file_name) # /KEYFILE _Same(_link, 'ManifestFile', _file_name) # /ManifestFile _Same(_link, 'MapExports', _boolean) # /MAPINFO:EXPORTS _Same(_link, 'MapFileName', _file_name) _Same(_link, 'MergedIDLBaseFileName', _file_name) # /IDLOUT _Same(_link, 'MergeSections', _string) # /MERGE _Same(_link, 'MidlCommandFile', _file_name) # /MIDL _Same(_link, 'ModuleDefinitionFile', _file_name) # /DEF _Same(_link, 'OutputFile', _file_name) # /OUT _Same(_link, 'PerUserRedirection', _boolean) _Same(_link, 'Profile', _boolean) # /PROFILE _Same(_link, 'ProfileGuidedDatabase', _file_name) # /PGD _Same(_link, 'ProgramDatabaseFile', _file_name) # /PDB _Same(_link, 'RegisterOutput', _boolean) _Same(_link, 'SetChecksum', _boolean) # /RELEASE _Same(_link, 'StackCommitSize', _string) _Same(_link, 'StackReserveSize', _string) # /STACK _Same(_link, 'StripPrivateSymbols', _file_name) # /PDBSTRIPPED _Same(_link, 'SupportUnloadOfDelayLoadedDLL', _boolean) # /DELAY:UNLOAD _Same(_link, 'SuppressStartupBanner', _boolean) # /NOLOGO _Same(_link, 'SwapRunFromCD', _boolean) # /SWAPRUN:CD 
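# A minimal sketch of what the directives in this file add up to, assuming the
# module has finished loading. The helper below is illustrative only and is
# never called from gyp itself; it simply exercises ConvertToMSBuildSettings
# with a few of the _Same/_Renamed/_Moved directives defined above and below.
def _SketchConvertToMSBuildSettings():
  """Illustrative only: shows how ConvertToMSBuildSettings applies the
  conversion directives registered in this file."""
  msvs = {
      'VCCLCompilerTool': {'Optimization': '2',        # index 2 -> 'MaxSpeed'
                           'WarnAsError': 'true'},     # renamed setting
      'VCLinkerTool': {'GenerateDebugInformation': 'true',  # same name
                       'LinkIncremental': '2'},        # moved to global scope
  }
  # Expected result (per the directives in this file):
  #   {'ClCompile': {'Optimization': 'MaxSpeed', 'TreatWarningAsError': 'true'},
  #    'Link': {'GenerateDebugInformation': 'true'},
  #    '': {'LinkIncremental': 'true'}}
  return ConvertToMSBuildSettings(msvs)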
_Same(_link, 'TurnOffAssemblyGeneration', _boolean) # /NOASSEMBLY _Same(_link, 'TypeLibraryFile', _file_name) # /TLBOUT _Same(_link, 'TypeLibraryResourceID', _integer) # /TLBID _Same(_link, 'UACUIAccess', _boolean) # /uiAccess='true' _Same(_link, 'Version', _string) # /VERSION _Same(_link, 'EnableCOMDATFolding', _newly_boolean) # /OPT:ICF _Same(_link, 'FixedBaseAddress', _newly_boolean) # /FIXED _Same(_link, 'LargeAddressAware', _newly_boolean) # /LARGEADDRESSAWARE _Same(_link, 'OptimizeReferences', _newly_boolean) # /OPT:REF _Same(_link, 'RandomizedBaseAddress', _newly_boolean) # /DYNAMICBASE _Same(_link, 'TerminalServerAware', _newly_boolean) # /TSAWARE _subsystem_enumeration = _Enumeration( ['NotSet', 'Console', # /SUBSYSTEM:CONSOLE 'Windows', # /SUBSYSTEM:WINDOWS 'Native', # /SUBSYSTEM:NATIVE 'EFI Application', # /SUBSYSTEM:EFI_APPLICATION 'EFI Boot Service Driver', # /SUBSYSTEM:EFI_BOOT_SERVICE_DRIVER 'EFI ROM', # /SUBSYSTEM:EFI_ROM 'EFI Runtime', # /SUBSYSTEM:EFI_RUNTIME_DRIVER 'WindowsCE'], # /SUBSYSTEM:WINDOWSCE new=['POSIX']) # /SUBSYSTEM:POSIX _target_machine_enumeration = _Enumeration( ['NotSet', 'MachineX86', # /MACHINE:X86 None, 'MachineARM', # /MACHINE:ARM 'MachineEBC', # /MACHINE:EBC 'MachineIA64', # /MACHINE:IA64 None, 'MachineMIPS', # /MACHINE:MIPS 'MachineMIPS16', # /MACHINE:MIPS16 'MachineMIPSFPU', # /MACHINE:MIPSFPU 'MachineMIPSFPU16', # /MACHINE:MIPSFPU16 None, None, None, 'MachineSH4', # /MACHINE:SH4 None, 'MachineTHUMB', # /MACHINE:THUMB 'MachineX64']) # /MACHINE:X64 _Same(_link, 'AssemblyDebug', _Enumeration(['', 'true', # /ASSEMBLYDEBUG 'false'])) # /ASSEMBLYDEBUG:DISABLE _Same(_link, 'CLRImageType', _Enumeration(['Default', 'ForceIJWImage', # /CLRIMAGETYPE:IJW 'ForcePureILImage', # /Switch="CLRIMAGETYPE:PURE 'ForceSafeILImage'])) # /Switch="CLRIMAGETYPE:SAFE _Same(_link, 'CLRThreadAttribute', _Enumeration(['DefaultThreadingAttribute', # /CLRTHREADATTRIBUTE:NONE 'MTAThreadingAttribute', # /CLRTHREADATTRIBUTE:MTA 'STAThreadingAttribute'])) # /CLRTHREADATTRIBUTE:STA _Same(_link, 'DataExecutionPrevention', _Enumeration(['', 'false', # /NXCOMPAT:NO 'true'])) # /NXCOMPAT _Same(_link, 'Driver', _Enumeration(['NotSet', 'Driver', # /Driver 'UpOnly', # /DRIVER:UPONLY 'WDM'])) # /DRIVER:WDM _Same(_link, 'LinkTimeCodeGeneration', _Enumeration(['Default', 'UseLinkTimeCodeGeneration', # /LTCG 'PGInstrument', # /LTCG:PGInstrument 'PGOptimization', # /LTCG:PGOptimize 'PGUpdate'])) # /LTCG:PGUpdate _Same(_link, 'ShowProgress', _Enumeration(['NotSet', 'LinkVerbose', # /VERBOSE 'LinkVerboseLib'], # /VERBOSE:Lib new=['LinkVerboseICF', # /VERBOSE:ICF 'LinkVerboseREF', # /VERBOSE:REF 'LinkVerboseSAFESEH', # /VERBOSE:SAFESEH 'LinkVerboseCLR'])) # /VERBOSE:CLR _Same(_link, 'SubSystem', _subsystem_enumeration) _Same(_link, 'TargetMachine', _target_machine_enumeration) _Same(_link, 'UACExecutionLevel', _Enumeration(['AsInvoker', # /level='asInvoker' 'HighestAvailable', # /level='highestAvailable' 'RequireAdministrator'])) # /level='requireAdministrator' _Same(_link, 'MinimumRequiredVersion', _string) _Same(_link, 'TreatLinkerWarningAsErrors', _boolean) # /WX # Options found in MSVS that have been renamed in MSBuild. 
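# NOTE: The helper below is an illustrative sketch only (a hypothetical,
# editor-added example; not part of the original gyp module).  It exercises the
# VCLinkerTool -> Link mappings registered above: _newly_boolean settings map
# index 0 to '', 1 to 'false' and 2 to 'true', while the subsystem and
# target-machine enumerations turn numeric MSVS indices into named MSBuild
# values.  Expected values mirror MSVSSettings_test.py.
def _example_link_conversion():
  """Illustrative sketch; call by hand, e.g. from an interactive session."""
  import StringIO
  msvs = {'VCLinkerTool': {'SubSystem': '1',                    # /SUBSYSTEM:CONSOLE
                           'TargetMachine': '1',                # /MACHINE:X86
                           'RandomizedBaseAddress': '2',        # /DYNAMICBASE
                           'LinkTimeCodeGeneration': '1',       # /LTCG
                           'GenerateDebugInformation': 'true'}}  # /DEBUG
  # Expected:
  #   {'Link': {'SubSystem': 'Console',
  #             'TargetMachine': 'MachineX86',
  #             'RandomizedBaseAddress': 'true',
  #             'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration',
  #             'GenerateDebugInformation': 'true'}}
  return ConvertToMSBuildSettings(msvs, StringIO.StringIO())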
_Renamed(_link, 'ErrorReporting', 'LinkErrorReporting', _Enumeration(['NoErrorReport', # /ERRORREPORT:NONE 'PromptImmediately', # /ERRORREPORT:PROMPT 'QueueForNextLogin'], # /ERRORREPORT:QUEUE new=['SendErrorReport'])) # /ERRORREPORT:SEND _Renamed(_link, 'IgnoreDefaultLibraryNames', 'IgnoreSpecificDefaultLibraries', _file_list) # /NODEFAULTLIB _Renamed(_link, 'ResourceOnlyDLL', 'NoEntryPoint', _boolean) # /NOENTRY _Renamed(_link, 'SwapRunFromNet', 'SwapRunFromNET', _boolean) # /SWAPRUN:NET _Moved(_link, 'GenerateManifest', '', _boolean) _Moved(_link, 'IgnoreImportLibrary', '', _boolean) _Moved(_link, 'LinkIncremental', '', _newly_boolean) _Moved(_link, 'LinkLibraryDependencies', 'ProjectReference', _boolean) _Moved(_link, 'UseLibraryDependencyInputs', 'ProjectReference', _boolean) # MSVS options not found in MSBuild. _MSVSOnly(_link, 'OptimizeForWindows98', _newly_boolean) _MSVSOnly(_link, 'UseUnicodeResponseFiles', _boolean) # These settings generate correctly in the MSVS output files when using # e.g. DelayLoadDLLs! or AdditionalDependencies! to exclude files from # configuration entries, but result in spurious artifacts which can be # safely ignored here. See crbug.com/246570 _MSVSOnly(_link, 'AdditionalLibraryDirectories_excluded', _folder_list) _MSVSOnly(_link, 'DelayLoadDLLs_excluded', _file_list) _MSVSOnly(_link, 'AdditionalDependencies_excluded', _file_list) # MSBuild options not found in MSVS. _MSBuildOnly(_link, 'BuildingInIDE', _boolean) _MSBuildOnly(_link, 'ImageHasSafeExceptionHandlers', _boolean) # /SAFESEH _MSBuildOnly(_link, 'LinkDLL', _boolean) # /DLL Visible='false' _MSBuildOnly(_link, 'LinkStatus', _boolean) # /LTCG:STATUS _MSBuildOnly(_link, 'PreventDllBinding', _boolean) # /ALLOWBIND _MSBuildOnly(_link, 'SupportNobindOfDelayLoadedDLL', _boolean) # /DELAY:NOBIND _MSBuildOnly(_link, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_link, 'MSDOSStubFileName', _file_name) # /STUB Visible='false' _MSBuildOnly(_link, 'SectionAlignment', _integer) # /ALIGN _MSBuildOnly(_link, 'SpecifySectionAttributes', _string) # /SECTION _MSBuildOnly(_link, 'ForceFileOutput', _Enumeration([], new=['Enabled', # /FORCE # /FORCE:MULTIPLE 'MultiplyDefinedSymbolOnly', 'UndefinedSymbolOnly'])) # /FORCE:UNRESOLVED _MSBuildOnly(_link, 'CreateHotPatchableImage', _Enumeration([], new=['Enabled', # /FUNCTIONPADMIN 'X86Image', # /FUNCTIONPADMIN:5 'X64Image', # /FUNCTIONPADMIN:6 'ItaniumImage'])) # /FUNCTIONPADMIN:16 _MSBuildOnly(_link, 'CLRSupportLastError', _Enumeration([], new=['Enabled', # /CLRSupportLastError 'Disabled', # /CLRSupportLastError:NO # /CLRSupportLastError:SYSTEMDLL 'SystemDlls'])) # Directives for converting VCResourceCompilerTool to ResourceCompile. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\rc.xml" for # the schema of the MSBuild ResourceCompile settings. _Same(_rc, 'AdditionalOptions', _string_list) _Same(_rc, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_rc, 'Culture', _Integer(msbuild_base=16)) _Same(_rc, 'IgnoreStandardIncludePath', _boolean) # /X _Same(_rc, 'PreprocessorDefinitions', _string_list) # /D _Same(_rc, 'ResourceOutputFileName', _string) # /fo _Same(_rc, 'ShowProgress', _boolean) # /v # There is no UI in VisualStudio 2008 to set the following properties. # However they are found in CL and other tools. Include them here for # completeness, as they are very likely to have the same usage pattern. 
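# NOTE: The helper below is an illustrative sketch only (a hypothetical,
# editor-added example; not part of the original gyp module).  It highlights two
# behaviours registered above that are easy to miss: _Moved settings leave the
# Link tool entirely (GenerateManifest lands under the project-level '' key,
# LinkLibraryDependencies under 'ProjectReference'), and the resource compiler's
# Culture value is re-encoded from decimal to hex via _Integer(msbuild_base=16).
# Expected values mirror MSVSSettings_test.py.
def _example_moved_and_culture_conversion():
  """Illustrative sketch; call by hand, e.g. from an interactive session."""
  import StringIO
  msvs = {'VCLinkerTool': {'GenerateMapFile': 'true',           # /MAP
                           'GenerateManifest': 'true',
                           'LinkLibraryDependencies': 'true'},
          'VCResourceCompilerTool': {'Culture': '1003'}}
  # Expected:
  #   {'Link': {'GenerateMapFile': 'true'},
  #    '': {'GenerateManifest': 'true'},
  #    'ProjectReference': {'LinkLibraryDependencies': 'true'},
  #    'ResourceCompile': {'Culture': '0x03eb'}}    # 1003 == 0x3eb
  return ConvertToMSBuildSettings(msvs, StringIO.StringIO())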
_Same(_rc, 'SuppressStartupBanner', _boolean) # /nologo _Same(_rc, 'UndefinePreprocessorDefinitions', _string_list) # /u # MSBuild options not found in MSVS. _MSBuildOnly(_rc, 'NullTerminateStrings', _boolean) # /n _MSBuildOnly(_rc, 'TrackerLogDirectory', _folder_name) # Directives for converting VCMIDLTool to Midl. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\midl.xml" for # the schema of the MSBuild Midl settings. _Same(_midl, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_midl, 'AdditionalOptions', _string_list) _Same(_midl, 'CPreprocessOptions', _string) # /cpp_opt _Same(_midl, 'ErrorCheckAllocations', _boolean) # /error allocation _Same(_midl, 'ErrorCheckBounds', _boolean) # /error bounds_check _Same(_midl, 'ErrorCheckEnumRange', _boolean) # /error enum _Same(_midl, 'ErrorCheckRefPointers', _boolean) # /error ref _Same(_midl, 'ErrorCheckStubData', _boolean) # /error stub_data _Same(_midl, 'GenerateStublessProxies', _boolean) # /Oicf _Same(_midl, 'GenerateTypeLibrary', _boolean) _Same(_midl, 'HeaderFileName', _file_name) # /h _Same(_midl, 'IgnoreStandardIncludePath', _boolean) # /no_def_idir _Same(_midl, 'InterfaceIdentifierFileName', _file_name) # /iid _Same(_midl, 'MkTypLibCompatible', _boolean) # /mktyplib203 _Same(_midl, 'OutputDirectory', _string) # /out _Same(_midl, 'PreprocessorDefinitions', _string_list) # /D _Same(_midl, 'ProxyFileName', _file_name) # /proxy _Same(_midl, 'RedirectOutputAndErrors', _file_name) # /o _Same(_midl, 'SuppressStartupBanner', _boolean) # /nologo _Same(_midl, 'TypeLibraryName', _file_name) # /tlb _Same(_midl, 'UndefinePreprocessorDefinitions', _string_list) # /U _Same(_midl, 'WarnAsError', _boolean) # /WX _Same(_midl, 'DefaultCharType', _Enumeration(['Unsigned', # /char unsigned 'Signed', # /char signed 'Ascii'])) # /char ascii7 _Same(_midl, 'TargetEnvironment', _Enumeration(['NotSet', 'Win32', # /env win32 'Itanium', # /env ia64 'X64'])) # /env x64 _Same(_midl, 'EnableErrorChecks', _Enumeration(['EnableCustom', 'None', # /error none 'All'])) # /error all _Same(_midl, 'StructMemberAlignment', _Enumeration(['NotSet', '1', # Zp1 '2', # Zp2 '4', # Zp4 '8'])) # Zp8 _Same(_midl, 'WarningLevel', _Enumeration(['0', # /W0 '1', # /W1 '2', # /W2 '3', # /W3 '4'])) # /W4 _Renamed(_midl, 'DLLDataFileName', 'DllDataFileName', _file_name) # /dlldata _Renamed(_midl, 'ValidateParameters', 'ValidateAllParameters', _boolean) # /robust # MSBuild options not found in MSVS. _MSBuildOnly(_midl, 'ApplicationConfigurationMode', _boolean) # /app_config _MSBuildOnly(_midl, 'ClientStubFile', _file_name) # /cstub _MSBuildOnly(_midl, 'GenerateClientFiles', _Enumeration([], new=['Stub', # /client stub 'None'])) # /client none _MSBuildOnly(_midl, 'GenerateServerFiles', _Enumeration([], new=['Stub', # /client stub 'None'])) # /client none _MSBuildOnly(_midl, 'LocaleID', _integer) # /lcid DECIMAL _MSBuildOnly(_midl, 'ServerStubFile', _file_name) # /sstub _MSBuildOnly(_midl, 'SuppressCompilerWarnings', _boolean) # /no_warn _MSBuildOnly(_midl, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_midl, 'TypeLibFormat', _Enumeration([], new=['NewFormat', # /newtlb 'OldFormat'])) # /oldtlb # Directives for converting VCLibrarianTool to Lib. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\lib.xml" for # the schema of the MSBuild Lib settings. 
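# NOTE: The helper below is an illustrative sketch only (a hypothetical,
# editor-added example; not part of the original gyp module).  It exercises the
# VCMIDLTool -> Midl mappings registered above: numeric enumeration indices
# become named values and ValidateParameters is renamed to
# ValidateAllParameters.  Expected values mirror MSVSSettings_test.py.
def _example_midl_conversion():
  """Illustrative sketch; call by hand, e.g. from an interactive session."""
  import StringIO
  msvs = {'VCMIDLTool': {'TargetEnvironment': '1',       # /env win32
                         'DefaultCharType': '0',         # /char unsigned
                         'ValidateParameters': 'true'}}  # /robust
  # Expected:
  #   {'Midl': {'TargetEnvironment': 'Win32',
  #             'DefaultCharType': 'Unsigned',
  #             'ValidateAllParameters': 'true'}}
  return ConvertToMSBuildSettings(msvs, StringIO.StringIO())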
_Same(_lib, 'AdditionalDependencies', _file_list) _Same(_lib, 'AdditionalLibraryDirectories', _folder_list) # /LIBPATH _Same(_lib, 'AdditionalOptions', _string_list) _Same(_lib, 'ExportNamedFunctions', _string_list) # /EXPORT _Same(_lib, 'ForceSymbolReferences', _string) # /INCLUDE _Same(_lib, 'IgnoreAllDefaultLibraries', _boolean) # /NODEFAULTLIB _Same(_lib, 'IgnoreSpecificDefaultLibraries', _file_list) # /NODEFAULTLIB _Same(_lib, 'ModuleDefinitionFile', _file_name) # /DEF _Same(_lib, 'OutputFile', _file_name) # /OUT _Same(_lib, 'SuppressStartupBanner', _boolean) # /NOLOGO _Same(_lib, 'UseUnicodeResponseFiles', _boolean) _Same(_lib, 'LinkTimeCodeGeneration', _boolean) # /LTCG _Same(_lib, 'TargetMachine', _target_machine_enumeration) # TODO(jeanluc) _link defines the same value that gets moved to # ProjectReference. We may want to validate that they are consistent. _Moved(_lib, 'LinkLibraryDependencies', 'ProjectReference', _boolean) # TODO(jeanluc) I don't think these are genuine settings but byproducts of Gyp. _MSVSOnly(_lib, 'AdditionalLibraryDirectories_excluded', _folder_list) _MSBuildOnly(_lib, 'DisplayLibrary', _string) # /LIST Visible='false' _MSBuildOnly(_lib, 'ErrorReporting', _Enumeration([], new=['PromptImmediately', # /ERRORREPORT:PROMPT 'QueueForNextLogin', # /ERRORREPORT:QUEUE 'SendErrorReport', # /ERRORREPORT:SEND 'NoErrorReport'])) # /ERRORREPORT:NONE _MSBuildOnly(_lib, 'MinimumRequiredVersion', _string) _MSBuildOnly(_lib, 'Name', _file_name) # /NAME _MSBuildOnly(_lib, 'RemoveObjects', _file_list) # /REMOVE _MSBuildOnly(_lib, 'SubSystem', _subsystem_enumeration) _MSBuildOnly(_lib, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_lib, 'TreatLibWarningAsErrors', _boolean) # /WX _MSBuildOnly(_lib, 'Verbose', _boolean) # Directives for converting VCManifestTool to Mt. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\mt.xml" for # the schema of the MSBuild Lib settings. # Options that have the same name in MSVS and MSBuild _Same(_manifest, 'AdditionalManifestFiles', _file_list) # /manifest _Same(_manifest, 'AdditionalOptions', _string_list) _Same(_manifest, 'AssemblyIdentity', _string) # /identity: _Same(_manifest, 'ComponentFileName', _file_name) # /dll _Same(_manifest, 'GenerateCatalogFiles', _boolean) # /makecdfs _Same(_manifest, 'InputResourceManifests', _string) # /inputresource _Same(_manifest, 'OutputManifestFile', _file_name) # /out _Same(_manifest, 'RegistrarScriptFile', _file_name) # /rgs _Same(_manifest, 'ReplacementsFile', _file_name) # /replacements _Same(_manifest, 'SuppressStartupBanner', _boolean) # /nologo _Same(_manifest, 'TypeLibraryFile', _file_name) # /tlb: _Same(_manifest, 'UpdateFileHashes', _boolean) # /hashupdate _Same(_manifest, 'UpdateFileHashesSearchPath', _file_name) _Same(_manifest, 'VerboseOutput', _boolean) # /verbose # Options that have moved location. _MovedAndRenamed(_manifest, 'ManifestResourceFile', 'ManifestResourceCompile', 'ResourceOutputFileName', _file_name) _Moved(_manifest, 'EmbedManifest', '', _boolean) # MSVS options not found in MSBuild. _MSVSOnly(_manifest, 'DependencyInformationFile', _file_name) _MSVSOnly(_manifest, 'UseFAT32Workaround', _boolean) _MSVSOnly(_manifest, 'UseUnicodeResponseFiles', _boolean) # MSBuild options not found in MSVS. 
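# NOTE: The helper below is an illustrative sketch only (a hypothetical,
# editor-added example; not part of the original gyp module).  It shows the two
# manifest-tool directive kinds registered above that relocate settings:
# _MovedAndRenamed sends ManifestResourceFile to the separate
# ManifestResourceCompile tool as ResourceOutputFileName, and
# _Moved(..., 'EmbedManifest', '', ...) hoists EmbedManifest to the project
# level.  Expected values mirror MSVSSettings_test.py.
def _example_manifest_conversion():
  """Illustrative sketch; call by hand, e.g. from an interactive session."""
  import StringIO
  msvs = {'VCManifestTool': {'EmbedManifest': 'true',
                             'ManifestResourceFile': 'my_name',
                             'VerboseOutput': 'true'}}   # /verbose
  # Expected:
  #   {'Manifest': {'VerboseOutput': 'true'},
  #    'ManifestResourceCompile': {'ResourceOutputFileName': 'my_name'},
  #    '': {'EmbedManifest': 'true'}}
  return ConvertToMSBuildSettings(msvs, StringIO.StringIO())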
_MSBuildOnly(_manifest, 'EnableDPIAwareness', _boolean) _MSBuildOnly(_manifest, 'GenerateCategoryTags', _boolean) # /category _MSBuildOnly(_manifest, 'ManifestFromManagedAssembly', _file_name) # /managedassemblyname _MSBuildOnly(_manifest, 'OutputResourceManifests', _string) # /outputresource _MSBuildOnly(_manifest, 'SuppressDependencyElement', _boolean) # /nodependency _MSBuildOnly(_manifest, 'TrackerLogDirectory', _folder_name) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py000755 �000766 �000024 �00000200530 12455173731 033543� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Unit tests for the MSVSSettings.py file.""" import StringIO import unittest import gyp.MSVSSettings as MSVSSettings class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def _ExpectedWarnings(self, expected): """Compares recorded lines to expected warnings.""" self.stderr.seek(0) actual = self.stderr.read().split('\n') actual = [line for line in actual if line] self.assertEqual(sorted(expected), sorted(actual)) def testValidateMSVSSettings_tool_names(self): """Tests that only MSVS tool names are allowed.""" MSVSSettings.ValidateMSVSSettings( {'VCCLCompilerTool': {}, 'VCLinkerTool': {}, 'VCMIDLTool': {}, 'foo': {}, 'VCResourceCompilerTool': {}, 'VCLibrarianTool': {}, 'VCManifestTool': {}, 'ClCompile': {}}, self.stderr) self._ExpectedWarnings([ 'Warning: unrecognized tool foo', 'Warning: unrecognized tool ClCompile']) def testValidateMSVSSettings_settings(self): """Tests that for invalid MSVS settings.""" MSVSSettings.ValidateMSVSSettings( {'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': ['string1', 'string2'], 'AdditionalUsingDirectories': 'folder1;folder2', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': '0', 'BasicRuntimeChecks': '5', 'BrowseInformation': 'fdkslj', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': '-1', 'CompileAs': '1', 'DebugInformationFormat': '2', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'string1;string2', 'EnableEnhancedInstructionSet': '1', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'Enableprefast': 'bogus', 'ErrorReporting': '1', 'ExceptionHandling': '1', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '1', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 
'true', 'ForcedIncludeFiles': 'file1;file2', 'ForcedUsingFiles': 'file1;file2', 'GeneratePreprocessedFile': '1', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '1', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '1', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderThrough': 'a_file_name', 'PreprocessorDefinitions': 'string1;string2', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': '1', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '1', 'UseUnicodeResponseFiles': 'true', 'WarnAsError': 'true', 'WarningLevel': '1', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name', 'ZZXYZ': 'bogus'}, 'VCLinkerTool': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalManifestDependencies': 'file1;file2', 'AdditionalOptions': 'a string1', 'AddModuleNamesToAssembly': 'file1;file2', 'AllowIsolation': 'true', 'AssemblyDebug': '2', 'AssemblyLinkResource': 'file1;file2', 'BaseAddress': 'a string1', 'CLRImageType': '2', 'CLRThreadAttribute': '2', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '2', 'DelayLoadDLLs': 'file1;file2', 'DelaySign': 'true', 'Driver': '2', 'EmbedManagedResourceFile': 'file1;file2', 'EnableCOMDATFolding': '2', 'EnableUAC': 'true', 'EntryPointSymbol': 'a string1', 'ErrorReporting': '2', 'FixedBaseAddress': '2', 'ForceSymbolReferences': 'file1;file2', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a string1', 'HeapReserveSize': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'file1;file2', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': '2', 'LinkIncremental': '2', 'LinkLibraryDependencies': 'true', 'LinkTimeCodeGeneration': '2', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a string1', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'OptimizeForWindows98': '1', 'OptimizeReferences': '2', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': '2', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'true', 'ShowProgress': '2', 'StackCommitSize': 'a string1', 'StackReserveSize': 'a string1', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': '2', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '2', 'TerminalServerAware': '2', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': '2', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'true', 'UseUnicodeResponseFiles': 'true', 'Version': 'a string1'}, 'VCMIDLTool': { 
'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'CPreprocessOptions': 'a string1', 'DefaultCharType': '1', 'DLLDataFileName': 'a_file_name', 'EnableErrorChecks': '1', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'MkTypLibCompatible': 'true', 'notgood': 'bogus', 'OutputDirectory': 'a string1', 'PreprocessorDefinitions': 'string1;string2', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 'TargetEnvironment': '1', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'string1;string2', 'ValidateParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '1'}, 'VCResourceCompilerTool': { 'AdditionalOptions': 'a string1', 'AdditionalIncludeDirectories': 'folder1;folder2', 'Culture': '1003', 'IgnoreStandardIncludePath': 'true', 'notgood2': 'bogus', 'PreprocessorDefinitions': 'string1;string2', 'ResourceOutputFileName': 'a string1', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2'}, 'VCLibrarianTool': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'ExportNamedFunctions': 'string1;string2', 'ForceSymbolReferences': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2', 'LinkLibraryDependencies': 'true', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'file1;file2', 'AdditionalOptions': 'a string1', 'AssemblyIdentity': 'a string1', 'ComponentFileName': 'a_file_name', 'DependencyInformationFile': 'a_file_name', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a string1', 'ManifestResourceFile': 'a_file_name', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'truel', 'UpdateFileHashesSearchPath': 'a_file_name', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'true', 'VerboseOutput': 'true'}}, self.stderr) self._ExpectedWarnings([ 'Warning: for VCCLCompilerTool/BasicRuntimeChecks, ' 'index value (5) not in expected range [0, 4)', 'Warning: for VCCLCompilerTool/BrowseInformation, ' "invalid literal for int() with base 10: 'fdkslj'", 'Warning: for VCCLCompilerTool/CallingConvention, ' 'index value (-1) not in expected range [0, 3)', 'Warning: for VCCLCompilerTool/DebugInformationFormat, ' 'converted value for 2 not specified.', 'Warning: unrecognized setting VCCLCompilerTool/Enableprefast', 'Warning: unrecognized setting VCCLCompilerTool/ZZXYZ', 'Warning: for VCLinkerTool/TargetMachine, ' 'converted value for 2 not specified.', 'Warning: unrecognized setting VCMIDLTool/notgood', 'Warning: unrecognized setting VCResourceCompilerTool/notgood2', 'Warning: for VCManifestTool/UpdateFileHashes, ' "expected bool; got 'truel'" '']) def testValidateMSBuildSettings_settings(self): """Tests that for invalid MSBuild settings.""" MSVSSettings.ValidateMSBuildSettings( {'ClCompile': { 
'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': ['string1', 'string2'], 'AdditionalUsingDirectories': 'folder1;folder2', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': 'NoListing', 'BasicRuntimeChecks': 'StackFrameRuntimeCheck', 'BrowseInformation': 'false', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'BuildingInIDE': 'true', 'CallingConvention': 'Cdecl', 'CompileAs': 'CompileAsC', 'CompileAsManaged': 'Pure', 'CreateHotpatchableImage': 'true', 'DebugInformationFormat': 'ProgramDatabase', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'string1;string2', 'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'Enableprefast': 'bogus', 'ErrorReporting': 'Prompt', 'ExceptionHandling': 'SyncCThrow', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Neither', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Precise', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2', 'ForcedUsingFiles': 'file1;file2', 'FunctionLevelLinking': 'false', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'OnlyExplicitInline', 'IntrinsicFunctions': 'false', 'MinimalRebuild': 'true', 'MultiProcessorCompilation': 'true', 'ObjectFileName': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Disabled', 'PrecompiledHeader': 'NotUsing', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderOutputFile': 'a_file_name', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'string1;string2', 'PreprocessOutputPath': 'a string1', 'PreprocessSuppressLineNumbers': 'false', 'PreprocessToFile': 'false', 'ProcessorNumber': '33', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': 'MultiThreaded', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1Byte', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'TreatSpecificWarningsAsErrors': 'string1;string2', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2', 'UseFullPaths': 'true', 'UseUnicodeForAssemblerListing': 'true', 'WarningLevel': 'TurnOffAllWarnings', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name', 'ZZXYZ': 'bogus'}, 'Link': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalManifestDependencies': 'file1;file2', 'AdditionalOptions': 'a string1', 'AddModuleNamesToAssembly': 'file1;file2', 'AllowIsolation': 'true', 'AssemblyDebug': '', 'AssemblyLinkResource': 'file1;file2', 'BaseAddress': 'a string1', 'BuildingInIDE': 'true', 'CLRImageType': 'ForceIJWImage', 'CLRSupportLastError': 'Enabled', 'CLRThreadAttribute': 'MTAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'CreateHotPatchableImage': 'X86Image', 'DataExecutionPrevention': 'false', 'DelayLoadDLLs': 'file1;file2', 'DelaySign': 'true', 'Driver': 'NotSet', 'EmbedManagedResourceFile': 'file1;file2', 'EnableCOMDATFolding': 'false', 'EnableUAC': 'true', 'EntryPointSymbol': 'a string1', 'FixedBaseAddress': 'false', 'ForceFileOutput': 'Enabled', 'ForceSymbolReferences': 'file1;file2', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a string1', 
'HeapReserveSize': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'a_file_list', 'ImageHasSafeExceptionHandlers': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': 'false', 'LinkDLL': 'true', 'LinkErrorReporting': 'SendErrorReport', 'LinkStatus': 'true', 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a string1', 'MidlCommandFile': 'a_file_name', 'MinimumRequiredVersion': 'a string1', 'ModuleDefinitionFile': 'a_file_name', 'MSDOSStubFileName': 'a_file_name', 'NoEntryPoint': 'true', 'OptimizeReferences': 'false', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'PreventDllBinding': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SectionAlignment': '33', 'SetChecksum': 'true', 'ShowProgress': 'LinkVerboseREF', 'SpecifySectionAttributes': 'a string1', 'StackCommitSize': 'a string1', 'StackReserveSize': 'a string1', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': 'Console', 'SupportNobindOfDelayLoadedDLL': 'true', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineX86', 'TerminalServerAware': 'false', 'TrackerLogDirectory': 'a_folder', 'TreatLinkerWarningAsErrors': 'true', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': 'AsInvoker', 'UACUIAccess': 'true', 'Version': 'a string1'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'Culture': '0x236', 'IgnoreStandardIncludePath': 'true', 'NullTerminateStrings': 'true', 'PreprocessorDefinitions': 'string1;string2', 'ResourceOutputFileName': 'a string1', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'UndefinePreprocessorDefinitions': 'string1;string2'}, 'Midl': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'ApplicationConfigurationMode': 'true', 'ClientStubFile': 'a_file_name', 'CPreprocessOptions': 'a string1', 'DefaultCharType': 'Signed', 'DllDataFileName': 'a_file_name', 'EnableErrorChecks': 'EnableCustom', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateClientFiles': 'Stub', 'GenerateServerFiles': 'None', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'LocaleID': '33', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a string1', 'PreprocessorDefinitions': 'string1;string2', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'ServerStubFile': 'a_file_name', 'StructMemberAlignment': 'NotSet', 'SuppressCompilerWarnings': 'true', 'SuppressStartupBanner': 'true', 'TargetEnvironment': 'Itanium', 'TrackerLogDirectory': 'a_folder', 'TypeLibFormat': 'NewFormat', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'string1;string2', 'ValidateAllParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '1'}, 'Lib': { 'AdditionalDependencies': 
'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'DisplayLibrary': 'a string1', 'ErrorReporting': 'PromptImmediately', 'ExportNamedFunctions': 'string1;string2', 'ForceSymbolReferences': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2', 'LinkTimeCodeGeneration': 'true', 'MinimumRequiredVersion': 'a string1', 'ModuleDefinitionFile': 'a_file_name', 'Name': 'a_file_name', 'OutputFile': 'a_file_name', 'RemoveObjects': 'file1;file2', 'SubSystem': 'Console', 'SuppressStartupBanner': 'true', 'TargetMachine': 'MachineX86i', 'TrackerLogDirectory': 'a_folder', 'TreatLibWarningAsErrors': 'true', 'UseUnicodeResponseFiles': 'true', 'Verbose': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'file1;file2', 'AdditionalOptions': 'a string1', 'AssemblyIdentity': 'a string1', 'ComponentFileName': 'a_file_name', 'EnableDPIAwareness': 'fal', 'GenerateCatalogFiles': 'truel', 'GenerateCategoryTags': 'true', 'InputResourceManifests': 'a string1', 'ManifestFromManagedAssembly': 'a_file_name', 'notgood3': 'bogus', 'OutputManifestFile': 'a_file_name', 'OutputResourceManifests': 'a string1', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressDependencyElement': 'true', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'VerboseOutput': 'true'}, 'ProjectReference': { 'LinkLibraryDependencies': 'true', 'UseLibraryDependencyInputs': 'true'}, 'ManifestResourceCompile': { 'ResourceOutputFileName': 'a_file_name'}, '': { 'EmbedManifest': 'true', 'GenerateManifest': 'true', 'IgnoreImportLibrary': 'true', 'LinkIncremental': 'false'}}, self.stderr) self._ExpectedWarnings([ 'Warning: unrecognized setting ClCompile/Enableprefast', 'Warning: unrecognized setting ClCompile/ZZXYZ', 'Warning: unrecognized setting Manifest/notgood3', 'Warning: for Manifest/GenerateCatalogFiles, ' "expected bool; got 'truel'", 'Warning: for Lib/TargetMachine, unrecognized enumerated value ' 'MachineX86i', "Warning: for Manifest/EnableDPIAwareness, expected bool; got 'fal'"]) def testConvertToMSBuildSettings_empty(self): """Tests an empty conversion.""" msvs_settings = {} expected_msbuild_settings = {} actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_minimal(self): """Tests a minimal conversion.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/foo', 'BasicRuntimeChecks': '0', }, 'VCLinkerTool': { 'LinkTimeCodeGeneration': '1', 'ErrorReporting': '1', 'DataExecutionPrevention': '2', }, } expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/foo', 'BasicRuntimeChecks': 'Default', }, 'Link': { 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'LinkErrorReporting': 'PromptImmediately', 'DataExecutionPrevention': 'true', }, } actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_warnings(self): """Tests conversion that generates warnings.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': '1', 'AdditionalOptions': '2', # These are 
incorrect values: 'BasicRuntimeChecks': '12', 'BrowseInformation': '21', 'UsePrecompiledHeader': '13', 'GeneratePreprocessedFile': '14'}, 'VCLinkerTool': { # These are incorrect values: 'Driver': '10', 'LinkTimeCodeGeneration': '31', 'ErrorReporting': '21', 'FixedBaseAddress': '6'}, 'VCResourceCompilerTool': { # Custom 'Culture': '1003'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': '1', 'AdditionalOptions': '2'}, 'Link': {}, 'ResourceCompile': { # Custom 'Culture': '0x03eb'}} actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([ 'Warning: while converting VCCLCompilerTool/BasicRuntimeChecks to ' 'MSBuild, index value (12) not in expected range [0, 4)', 'Warning: while converting VCCLCompilerTool/BrowseInformation to ' 'MSBuild, index value (21) not in expected range [0, 3)', 'Warning: while converting VCCLCompilerTool/UsePrecompiledHeader to ' 'MSBuild, index value (13) not in expected range [0, 3)', 'Warning: while converting VCCLCompilerTool/GeneratePreprocessedFile to ' 'MSBuild, value must be one of [0, 1, 2]; got 14', 'Warning: while converting VCLinkerTool/Driver to ' 'MSBuild, index value (10) not in expected range [0, 4)', 'Warning: while converting VCLinkerTool/LinkTimeCodeGeneration to ' 'MSBuild, index value (31) not in expected range [0, 5)', 'Warning: while converting VCLinkerTool/ErrorReporting to ' 'MSBuild, index value (21) not in expected range [0, 3)', 'Warning: while converting VCLinkerTool/FixedBaseAddress to ' 'MSBuild, index value (6) not in expected range [0, 3)', ]) def testConvertToMSBuildSettings_full_synthetic(self): """Tests conversion of all the MSBuild settings.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'AdditionalUsingDirectories': 'folder1;folder2;folder3', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': '0', 'BasicRuntimeChecks': '1', 'BrowseInformation': '2', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': '0', 'CompileAs': '1', 'DebugInformationFormat': '4', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'd1;d2;d3', 'EnableEnhancedInstructionSet': '0', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'ErrorReporting': '1', 'ExceptionHandling': '2', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '0', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2;file3', 'ForcedUsingFiles': 'file1;file2;file3', 'GeneratePreprocessedFile': '1', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '2', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '3', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderThrough': 'a_file_name', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': '0', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 
'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '1', 'UseUnicodeResponseFiles': 'true', 'WarnAsError': 'true', 'WarningLevel': '2', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name'}, 'VCLinkerTool': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalLibraryDirectories_excluded': 'folder1;folder2;folder3', 'AdditionalManifestDependencies': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AddModuleNamesToAssembly': 'file1;file2;file3', 'AllowIsolation': 'true', 'AssemblyDebug': '0', 'AssemblyLinkResource': 'file1;file2;file3', 'BaseAddress': 'a_string', 'CLRImageType': '1', 'CLRThreadAttribute': '2', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '0', 'DelayLoadDLLs': 'file1;file2;file3', 'DelaySign': 'true', 'Driver': '1', 'EmbedManagedResourceFile': 'file1;file2;file3', 'EnableCOMDATFolding': '0', 'EnableUAC': 'true', 'EntryPointSymbol': 'a_string', 'ErrorReporting': '0', 'FixedBaseAddress': '1', 'ForceSymbolReferences': 'file1;file2;file3', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a_string', 'HeapReserveSize': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'file1;file2;file3', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': '2', 'LinkIncremental': '1', 'LinkLibraryDependencies': 'true', 'LinkTimeCodeGeneration': '2', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a_string', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'OptimizeForWindows98': '1', 'OptimizeReferences': '0', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': '1', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'true', 'ShowProgress': '0', 'StackCommitSize': 'a_string', 'StackReserveSize': 'a_string', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': '2', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '3', 'TerminalServerAware': '2', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': '1', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'false', 'UseUnicodeResponseFiles': 'true', 'Version': 'a_string'}, 'VCResourceCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'Culture': '1003', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'ResourceOutputFileName': 'a_string', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3'}, 'VCMIDLTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'CPreprocessOptions': 'a_string', 'DefaultCharType': '0', 'DLLDataFileName': 'a_file_name', 'EnableErrorChecks': '2', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 
'ErrorCheckStubData': 'true', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a_string', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '3', 'SuppressStartupBanner': 'true', 'TargetEnvironment': '1', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'ValidateParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '4'}, 'VCLibrarianTool': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalLibraryDirectories_excluded': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'ExportNamedFunctions': 'd1;d2;d3', 'ForceSymbolReferences': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'LinkLibraryDependencies': 'true', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AssemblyIdentity': 'a_string', 'ComponentFileName': 'a_file_name', 'DependencyInformationFile': 'a_file_name', 'EmbedManifest': 'true', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a_string', 'ManifestResourceFile': 'my_name', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'true', 'VerboseOutput': 'true'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string /J', 'AdditionalUsingDirectories': 'folder1;folder2;folder3', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': 'NoListing', 'BasicRuntimeChecks': 'StackFrameRuntimeCheck', 'BrowseInformation': 'true', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': 'Cdecl', 'CompileAs': 'CompileAsC', 'DebugInformationFormat': 'EditAndContinue', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'd1;d2;d3', 'EnableEnhancedInstructionSet': 'NotSet', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'ErrorReporting': 'Prompt', 'ExceptionHandling': 'Async', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Neither', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Strict', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2;file3', 'ForcedUsingFiles': 'file1;file2;file3', 'FunctionLevelLinking': 'true', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'AnySuitable', 'IntrinsicFunctions': 'true', 'MinimalRebuild': 'true', 'ObjectFileName': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Full', 'PrecompiledHeader': 'Create', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderOutputFile': 'a_file_name', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'PreprocessSuppressLineNumbers': 'false', 'PreprocessToFile': 'true', 'ProgramDataBaseFileName': 'a_file_name', 
'RuntimeLibrary': 'MultiThreaded', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1Byte', 'SuppressStartupBanner': 'true', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'UseFullPaths': 'true', 'WarningLevel': 'Level2', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name'}, 'Link': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalManifestDependencies': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AddModuleNamesToAssembly': 'file1;file2;file3', 'AllowIsolation': 'true', 'AssemblyDebug': '', 'AssemblyLinkResource': 'file1;file2;file3', 'BaseAddress': 'a_string', 'CLRImageType': 'ForceIJWImage', 'CLRThreadAttribute': 'STAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '', 'DelayLoadDLLs': 'file1;file2;file3', 'DelaySign': 'true', 'Driver': 'Driver', 'EmbedManagedResourceFile': 'file1;file2;file3', 'EnableCOMDATFolding': '', 'EnableUAC': 'true', 'EntryPointSymbol': 'a_string', 'FixedBaseAddress': 'false', 'ForceSymbolReferences': 'file1;file2;file3', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a_string', 'HeapReserveSize': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': 'true', 'LinkErrorReporting': 'NoErrorReport', 'LinkTimeCodeGeneration': 'PGInstrument', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a_string', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'NoEntryPoint': 'true', 'OptimizeReferences': '', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SetChecksum': 'true', 'ShowProgress': 'NotSet', 'StackCommitSize': 'a_string', 'StackReserveSize': 'a_string', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': 'Windows', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineARM', 'TerminalServerAware': 'true', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': 'HighestAvailable', 'UACUIAccess': 'true', 'Version': 'a_string'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'Culture': '0x03eb', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'ResourceOutputFileName': 'a_string', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3'}, 'Midl': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'CPreprocessOptions': 'a_string', 'DefaultCharType': 'Unsigned', 'DllDataFileName': 'a_file_name', 'EnableErrorChecks': 'All', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 
'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a_string', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '4', 'SuppressStartupBanner': 'true', 'TargetEnvironment': 'Win32', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'ValidateAllParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '4'}, 'Lib': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'ExportNamedFunctions': 'd1;d2;d3', 'ForceSymbolReferences': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AssemblyIdentity': 'a_string', 'ComponentFileName': 'a_file_name', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a_string', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'VerboseOutput': 'true'}, 'ManifestResourceCompile': { 'ResourceOutputFileName': 'my_name'}, 'ProjectReference': { 'LinkLibraryDependencies': 'true', 'UseLibraryDependencyInputs': 'false'}, '': { 'EmbedManifest': 'true', 'GenerateManifest': 'true', 'IgnoreImportLibrary': 'true', 'LinkIncremental': 'false'}} actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_actual(self): """Tests the conversion of an actual project. A VS2008 project with most of the options defined was created through the VS2008 IDE. It was then converted to VS2010. The tool settings found in the .vcproj and .vcxproj files were converted to the two dictionaries msvs_settings and expected_msbuild_settings. Note that for many settings, the VS2010 converter adds macros like %(AdditionalIncludeDirectories) to make sure than inherited values are included. Since the Gyp projects we generate do not use inheritance, we removed these macros. 
They were: ClCompile: AdditionalIncludeDirectories: ';%(AdditionalIncludeDirectories)' AdditionalOptions: ' %(AdditionalOptions)' AdditionalUsingDirectories: ';%(AdditionalUsingDirectories)' DisableSpecificWarnings: ';%(DisableSpecificWarnings)', ForcedIncludeFiles: ';%(ForcedIncludeFiles)', ForcedUsingFiles: ';%(ForcedUsingFiles)', PreprocessorDefinitions: ';%(PreprocessorDefinitions)', UndefinePreprocessorDefinitions: ';%(UndefinePreprocessorDefinitions)', Link: AdditionalDependencies: ';%(AdditionalDependencies)', AdditionalLibraryDirectories: ';%(AdditionalLibraryDirectories)', AdditionalManifestDependencies: ';%(AdditionalManifestDependencies)', AdditionalOptions: ' %(AdditionalOptions)', AddModuleNamesToAssembly: ';%(AddModuleNamesToAssembly)', AssemblyLinkResource: ';%(AssemblyLinkResource)', DelayLoadDLLs: ';%(DelayLoadDLLs)', EmbedManagedResourceFile: ';%(EmbedManagedResourceFile)', ForceSymbolReferences: ';%(ForceSymbolReferences)', IgnoreSpecificDefaultLibraries: ';%(IgnoreSpecificDefaultLibraries)', ResourceCompile: AdditionalIncludeDirectories: ';%(AdditionalIncludeDirectories)', AdditionalOptions: ' %(AdditionalOptions)', PreprocessorDefinitions: ';%(PreprocessorDefinitions)', Manifest: AdditionalManifestFiles: ';%(AdditionalManifestFiles)', AdditionalOptions: ' %(AdditionalOptions)', InputResourceManifests: ';%(InputResourceManifests)', """ msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/more', 'AdditionalUsingDirectories': 'test', 'AssemblerListingLocation': '$(IntDir)\\a', 'AssemblerOutput': '1', 'BasicRuntimeChecks': '3', 'BrowseInformation': '1', 'BrowseInformationFile': '$(IntDir)\\e', 'BufferSecurityCheck': 'false', 'CallingConvention': '1', 'CompileAs': '1', 'DebugInformationFormat': '4', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'abc', 'EnableEnhancedInstructionSet': '1', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'ErrorReporting': '2', 'ExceptionHandling': '2', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '2', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 'false', 'ForcedIncludeFiles': 'def', 'ForcedUsingFiles': 'ge', 'GeneratePreprocessedFile': '2', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '1', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': '$(IntDir)\\b', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '3', 'PrecompiledHeaderFile': '$(IntDir)\\$(TargetName).pche', 'PrecompiledHeaderThrough': 'StdAfx.hd', 'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE', 'ProgramDataBaseFileName': '$(IntDir)\\vc90b.pdb', 'RuntimeLibrary': '3', 'RuntimeTypeInfo': 'false', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '3', 'SuppressStartupBanner': 'false', 'TreatWChar_tAsBuiltInType': 'false', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'wer', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '0', 'UseUnicodeResponseFiles': 'false', 'WarnAsError': 'true', 'WarningLevel': '3', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': '$(IntDir)\\c'}, 'VCLinkerTool': { 'AdditionalDependencies': 'zx', 'AdditionalLibraryDirectories': 'asd', 
'AdditionalManifestDependencies': 's2', 'AdditionalOptions': '/mor2', 'AddModuleNamesToAssembly': 'd1', 'AllowIsolation': 'false', 'AssemblyDebug': '1', 'AssemblyLinkResource': 'd5', 'BaseAddress': '23423', 'CLRImageType': '3', 'CLRThreadAttribute': '1', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '0', 'DelayLoadDLLs': 'd4', 'DelaySign': 'true', 'Driver': '2', 'EmbedManagedResourceFile': 'd2', 'EnableCOMDATFolding': '1', 'EnableUAC': 'false', 'EntryPointSymbol': 'f5', 'ErrorReporting': '2', 'FixedBaseAddress': '1', 'ForceSymbolReferences': 'd3', 'FunctionOrder': 'fssdfsd', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'false', 'GenerateMapFile': 'true', 'HeapCommitSize': '13', 'HeapReserveSize': '12', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'flob;flok', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'f4', 'KeyContainer': 'f7', 'KeyFile': 'f6', 'LargeAddressAware': '2', 'LinkIncremental': '0', 'LinkLibraryDependencies': 'false', 'LinkTimeCodeGeneration': '1', 'ManifestFile': '$(IntDir)\\$(TargetFileName).2intermediate.manifest', 'MapExports': 'true', 'MapFileName': 'd5', 'MergedIDLBaseFileName': 'f2', 'MergeSections': 'f5', 'MidlCommandFile': 'f1', 'ModuleDefinitionFile': 'sdsd', 'OptimizeForWindows98': '2', 'OptimizeReferences': '2', 'OutputFile': '$(OutDir)\\$(ProjectName)2.exe', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd', 'ProgramDatabaseFile': 'Flob.pdb', 'RandomizedBaseAddress': '1', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'false', 'ShowProgress': '1', 'StackCommitSize': '15', 'StackReserveSize': '14', 'StripPrivateSymbols': 'd3', 'SubSystem': '1', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'false', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '1', 'TerminalServerAware': '1', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'f3', 'TypeLibraryResourceID': '12', 'UACExecutionLevel': '2', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'true', 'UseUnicodeResponseFiles': 'false', 'Version': '333'}, 'VCResourceCompilerTool': { 'AdditionalIncludeDirectories': 'f3', 'AdditionalOptions': '/more3', 'Culture': '3084', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': '_UNICODE;UNICODE2', 'ResourceOutputFileName': '$(IntDir)/$(InputName)3.res', 'ShowProgress': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'sfsdfsd', 'AdditionalOptions': 'afdsdafsd', 'AssemblyIdentity': 'sddfdsadfsa', 'ComponentFileName': 'fsdfds', 'DependencyInformationFile': '$(IntDir)\\mt.depdfd', 'EmbedManifest': 'false', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'asfsfdafs', 'ManifestResourceFile': '$(IntDir)\\$(TargetFileName).embed.manifest.resfdsf', 'OutputManifestFile': '$(TargetPath).manifestdfs', 'RegistrarScriptFile': 'sdfsfd', 'ReplacementsFile': 'sdffsd', 'SuppressStartupBanner': 'false', 'TypeLibraryFile': 'sfsd', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'sfsd', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'false', 'VerboseOutput': 'true'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/more /J', 'AdditionalUsingDirectories': 'test', 'AssemblerListingLocation': '$(IntDir)a', 'AssemblerOutput': 'AssemblyCode', 'BasicRuntimeChecks': 'EnableFastChecks', 'BrowseInformation': 'true', 'BrowseInformationFile': '$(IntDir)e', 'BufferSecurityCheck': 'false', 
'CallingConvention': 'FastCall', 'CompileAs': 'CompileAsC', 'DebugInformationFormat': 'EditAndContinue', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'abc', 'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'ErrorReporting': 'Queue', 'ExceptionHandling': 'Async', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Size', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Strict', 'ForceConformanceInForLoopScope': 'false', 'ForcedIncludeFiles': 'def', 'ForcedUsingFiles': 'ge', 'FunctionLevelLinking': 'true', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'OnlyExplicitInline', 'IntrinsicFunctions': 'true', 'MinimalRebuild': 'true', 'ObjectFileName': '$(IntDir)b', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Full', 'PrecompiledHeader': 'NotUsing', # Actual conversion gives '' 'PrecompiledHeaderFile': 'StdAfx.hd', 'PrecompiledHeaderOutputFile': '$(IntDir)$(TargetName).pche', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE', 'PreprocessSuppressLineNumbers': 'true', 'PreprocessToFile': 'true', 'ProgramDataBaseFileName': '$(IntDir)vc90b.pdb', 'RuntimeLibrary': 'MultiThreadedDebugDLL', 'RuntimeTypeInfo': 'false', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '4Bytes', 'SuppressStartupBanner': 'false', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'false', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'wer', 'UseFullPaths': 'true', 'WarningLevel': 'Level3', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': '$(IntDir)c'}, 'Link': { 'AdditionalDependencies': 'zx', 'AdditionalLibraryDirectories': 'asd', 'AdditionalManifestDependencies': 's2', 'AdditionalOptions': '/mor2', 'AddModuleNamesToAssembly': 'd1', 'AllowIsolation': 'false', 'AssemblyDebug': 'true', 'AssemblyLinkResource': 'd5', 'BaseAddress': '23423', 'CLRImageType': 'ForceSafeILImage', 'CLRThreadAttribute': 'MTAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '', 'DelayLoadDLLs': 'd4', 'DelaySign': 'true', 'Driver': 'UpOnly', 'EmbedManagedResourceFile': 'd2', 'EnableCOMDATFolding': 'false', 'EnableUAC': 'false', 'EntryPointSymbol': 'f5', 'FixedBaseAddress': 'false', 'ForceSymbolReferences': 'd3', 'FunctionOrder': 'fssdfsd', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': '13', 'HeapReserveSize': '12', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'flob;flok', 'ImportLibrary': 'f4', 'KeyContainer': 'f7', 'KeyFile': 'f6', 'LargeAddressAware': 'true', 'LinkErrorReporting': 'QueueForNextLogin', 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'ManifestFile': '$(IntDir)$(TargetFileName).2intermediate.manifest', 'MapExports': 'true', 'MapFileName': 'd5', 'MergedIDLBaseFileName': 'f2', 'MergeSections': 'f5', 'MidlCommandFile': 'f1', 'ModuleDefinitionFile': 'sdsd', 'NoEntryPoint': 'true', 'OptimizeReferences': 'true', 'OutputFile': '$(OutDir)$(ProjectName)2.exe', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd', 'ProgramDatabaseFile': 'Flob.pdb', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SetChecksum': 'false', 'ShowProgress': 'LinkVerbose', 'StackCommitSize': '15', 'StackReserveSize': 
'14', 'StripPrivateSymbols': 'd3', 'SubSystem': 'Console', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'false', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineX86', 'TerminalServerAware': 'false', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'f3', 'TypeLibraryResourceID': '12', 'UACExecutionLevel': 'RequireAdministrator', 'UACUIAccess': 'true', 'Version': '333'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'f3', 'AdditionalOptions': '/more3', 'Culture': '0x0c0c', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': '_UNICODE;UNICODE2', 'ResourceOutputFileName': '$(IntDir)%(Filename)3.res', 'ShowProgress': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'sfsdfsd', 'AdditionalOptions': 'afdsdafsd', 'AssemblyIdentity': 'sddfdsadfsa', 'ComponentFileName': 'fsdfds', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'asfsfdafs', 'OutputManifestFile': '$(TargetPath).manifestdfs', 'RegistrarScriptFile': 'sdfsfd', 'ReplacementsFile': 'sdffsd', 'SuppressStartupBanner': 'false', 'TypeLibraryFile': 'sfsd', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'sfsd', 'VerboseOutput': 'true'}, 'ProjectReference': { 'LinkLibraryDependencies': 'false', 'UseLibraryDependencyInputs': 'true'}, '': { 'EmbedManifest': 'false', 'GenerateManifest': 'false', 'IgnoreImportLibrary': 'true', 'LinkIncremental': '' }, 'ManifestResourceCompile': { 'ResourceOutputFileName': '$(IntDir)$(TargetFileName).embed.manifest.resfdsf'} } actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) if __name__ == '__main__': unittest.main() ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py�����000644 �000766 �000024 �00000003414 12455173731 032420� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio project reader/writer.""" import gyp.common import gyp.easy_xml as easy_xml class Writer(object): """Visual Studio XML tool file writer.""" def __init__(self, tool_file_path, name): """Initializes the tool file. Args: tool_file_path: Path to the tool file. name: Name of the tool file. """ self.tool_file_path = tool_file_path self.name = name self.rules_section = ['Rules'] def AddCustomBuildRule(self, name, cmd, description, additional_dependencies, outputs, extensions): """Adds a rule to the tool file. Args: name: Name of the rule. description: Description of the rule. cmd: Command line of the rule. additional_dependencies: other files which may trigger the rule. outputs: outputs of the rule. extensions: extensions handled by the rule. 
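    Illustrative example (not part of the original gyp sources; the file name,
    rule name and command shown here are hypothetical):

        writer = Writer('example.rules', 'example')
        writer.AddCustomBuildRule(
            name='idl',
            cmd='midl.exe "$(InputPath)"',
            description='Compiling IDL',
            additional_dependencies=[],
            outputs=['$(InputName).h'],
            extensions=['idl'])
        writer.WriteIfChanged()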
""" rule = ['CustomBuildRule', {'Name': name, 'ExecutionDescription': description, 'CommandLine': cmd, 'Outputs': ';'.join(outputs), 'FileExtensions': ';'.join(extensions), 'AdditionalDependencies': ';'.join(additional_dependencies) }] self.rules_section.append(rule) def WriteIfChanged(self): """Writes the tool file.""" content = ['VisualStudioToolFile', {'Version': '8.00', 'Name': self.name }, self.rules_section ] easy_xml.WriteXmlIfChanged(content, self.tool_file_path, encoding="Windows-1252") ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py�����000644 �000766 �000024 �00000011746 12455173731 032430� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio user preferences file writer.""" import os import re import socket # for gethostname import gyp.common import gyp.easy_xml as easy_xml #------------------------------------------------------------------------------ def _FindCommandInPath(command): """If there are no slashes in the command given, this function searches the PATH env to find the given command, and converts it to an absolute path. We have to do this because MSVS is looking for an actual file to launch a debugger on, not just a command line. Note that this happens at GYP time, so anything needing to be built needs to have a full path.""" if '/' in command or '\\' in command: # If the command already has path elements (either relative or # absolute), then assume it is constructed properly. return command else: # Search through the path list and find an existing file that # we can access. paths = os.environ.get('PATH','').split(os.pathsep) for path in paths: item = os.path.join(path, command) if os.path.isfile(item) and os.access(item, os.X_OK): return item return command def _QuoteWin32CommandLineArgs(args): new_args = [] for arg in args: # Replace all double-quotes with double-double-quotes to escape # them for cmd shell, and then quote the whole thing if there # are any. if arg.find('"') != -1: arg = '""'.join(arg.split('"')) arg = '"%s"' % arg # Otherwise, if there are any spaces, quote the whole arg. elif re.search(r'[ \t\n]', arg): arg = '"%s"' % arg new_args.append(arg) return new_args class Writer(object): """Visual Studio XML user user file writer.""" def __init__(self, user_file_path, version, name): """Initializes the user file. Args: user_file_path: Path to the user file. version: Version info. name: Name of the user file. """ self.user_file_path = user_file_path self.version = version self.name = name self.configurations = {} def AddConfig(self, name): """Adds a configuration to the project. Args: name: Configuration name. 
""" self.configurations[name] = ['Configuration', {'Name': name}] def AddDebugSettings(self, config_name, command, environment = {}, working_directory=""): """Adds a DebugSettings node to the user file for a particular config. Args: command: command line to run. First element in the list is the executable. All elements of the command will be quoted if necessary. working_directory: other files which may trigger the rule. (optional) """ command = _QuoteWin32CommandLineArgs(command) abs_command = _FindCommandInPath(command[0]) if environment and isinstance(environment, dict): env_list = ['%s="%s"' % (key, val) for (key,val) in environment.iteritems()] environment = ' '.join(env_list) else: environment = '' n_cmd = ['DebugSettings', {'Command': abs_command, 'WorkingDirectory': working_directory, 'CommandArguments': " ".join(command[1:]), 'RemoteMachine': socket.gethostname(), 'Environment': environment, 'EnvironmentMerge': 'true', # Currently these are all "dummy" values that we're just setting # in the default manner that MSVS does it. We could use some of # these to add additional capabilities, I suppose, but they might # not have parity with other platforms then. 'Attach': 'false', 'DebuggerType': '3', # 'auto' debugger 'Remote': '1', 'RemoteCommand': '', 'HttpUrl': '', 'PDBPath': '', 'SQLDebugging': '', 'DebuggerFlavor': '0', 'MPIRunCommand': '', 'MPIRunArguments': '', 'MPIRunWorkingDirectory': '', 'ApplicationCommand': '', 'ApplicationArguments': '', 'ShimCommand': '', 'MPIAcceptMode': '', 'MPIAcceptFilter': '' }] # Find the config, and add it if it doesn't exist. if config_name not in self.configurations: self.AddConfig(config_name) # Add the DebugSettings onto the appropriate config. self.configurations[config_name].append(n_cmd) def WriteIfChanged(self): """Writes the user file.""" configs = ['Configurations'] for config, spec in sorted(self.configurations.iteritems()): configs.append(spec) content = ['VisualStudioUserFile', {'Version': self.version.ProjectVersion(), 'Name': self.name }, configs] easy_xml.WriteXmlIfChanged(content, self.user_file_path, encoding="Windows-1252") ��������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py���������000644 �000766 �000024 �00000022252 12455173731 031621� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions shared amongst the Windows generators.""" import copy import os _TARGET_TYPE_EXT = { 'executable': '.exe', 'loadable_module': '.dll', 'shared_library': '.dll', } def _GetLargePdbShimCcPath(): """Returns the path of the large_pdb_shim.cc file.""" this_dir = os.path.abspath(os.path.dirname(__file__)) src_dir = os.path.abspath(os.path.join(this_dir, '..', '..')) win_data_dir = os.path.join(src_dir, 'data', 'win') large_pdb_shim_cc = os.path.join(win_data_dir, 'large-pdb-shim.cc') return large_pdb_shim_cc def _DeepCopySomeKeys(in_dict, keys): """Performs a partial deep-copy on |in_dict|, only copying the keys in |keys|. Arguments: in_dict: The dictionary to copy. keys: The keys to be copied. 
If a key is in this list and doesn't exist in |in_dict| this is not an error. Returns: The partially deep-copied dictionary. """ d = {} for key in keys: if key not in in_dict: continue d[key] = copy.deepcopy(in_dict[key]) return d def _SuffixName(name, suffix): """Add a suffix to the end of a target. Arguments: name: name of the target (foo#target) suffix: the suffix to be added Returns: Target name with suffix added (foo_suffix#target) """ parts = name.rsplit('#', 1) parts[0] = '%s_%s' % (parts[0], suffix) return '#'.join(parts) def _ShardName(name, number): """Add a shard number to the end of a target. Arguments: name: name of the target (foo#target) number: shard number Returns: Target name with shard added (foo_1#target) """ return _SuffixName(name, str(number)) def ShardTargets(target_list, target_dicts): """Shard some targets apart to work around the linkers limits. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. Returns: Tuple of the new sharded versions of the inputs. """ # Gather the targets to shard, and how many pieces. targets_to_shard = {} for t in target_dicts: shards = int(target_dicts[t].get('msvs_shard', 0)) if shards: targets_to_shard[t] = shards # Shard target_list. new_target_list = [] for t in target_list: if t in targets_to_shard: for i in range(targets_to_shard[t]): new_target_list.append(_ShardName(t, i)) else: new_target_list.append(t) # Shard target_dict. new_target_dicts = {} for t in target_dicts: if t in targets_to_shard: for i in range(targets_to_shard[t]): name = _ShardName(t, i) new_target_dicts[name] = copy.copy(target_dicts[t]) new_target_dicts[name]['target_name'] = _ShardName( new_target_dicts[name]['target_name'], i) sources = new_target_dicts[name].get('sources', []) new_sources = [] for pos in range(i, len(sources), targets_to_shard[t]): new_sources.append(sources[pos]) new_target_dicts[name]['sources'] = new_sources else: new_target_dicts[t] = target_dicts[t] # Shard dependencies. for t in new_target_dicts: dependencies = copy.copy(new_target_dicts[t].get('dependencies', [])) new_dependencies = [] for d in dependencies: if d in targets_to_shard: for i in range(targets_to_shard[d]): new_dependencies.append(_ShardName(d, i)) else: new_dependencies.append(d) new_target_dicts[t]['dependencies'] = new_dependencies return (new_target_list, new_target_dicts) def _GetPdbPath(target_dict, config_name, vars): """Returns the path to the PDB file that will be generated by a given configuration. The lookup proceeds as follows: - Look for an explicit path in the VCLinkerTool configuration block. - Look for an 'msvs_large_pdb_path' variable. - Use '<(PRODUCT_DIR)/<(product_name).(exe|dll).pdb' if 'product_name' is specified. - Use '<(PRODUCT_DIR)/<(target_name).(exe|dll).pdb'. Arguments: target_dict: The target dictionary to be searched. config_name: The name of the configuration of interest. vars: A dictionary of common GYP variables with generator-specific values. Returns: The path of the corresponding PDB file. 
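    Illustrative example (target and variable names are hypothetical): for a
    target dict with 'type': 'shared_library' and 'product_name': 'foo', no
    explicit VCLinkerTool 'ProgramDatabaseFile' setting, no
    'msvs_large_pdb_path' variable, and vars = {'PRODUCT_DIR': 'out/Release'},
    the lookup falls through to the last rule and returns
    'out/Release/foo.dll.pdb'.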
""" config = target_dict['configurations'][config_name] msvs = config.setdefault('msvs_settings', {}) linker = msvs.get('VCLinkerTool', {}) pdb_path = linker.get('ProgramDatabaseFile') if pdb_path: return pdb_path variables = target_dict.get('variables', {}) pdb_path = variables.get('msvs_large_pdb_path', None) if pdb_path: return pdb_path pdb_base = target_dict.get('product_name', target_dict['target_name']) pdb_base = '%s%s.pdb' % (pdb_base, _TARGET_TYPE_EXT[target_dict['type']]) pdb_path = vars['PRODUCT_DIR'] + '/' + pdb_base return pdb_path def InsertLargePdbShims(target_list, target_dicts, vars): """Insert a shim target that forces the linker to use 4KB pagesize PDBs. This is a workaround for targets with PDBs greater than 1GB in size, the limit for the 1KB pagesize PDBs created by the linker by default. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. vars: A dictionary of common GYP variables with generator-specific values. Returns: Tuple of the shimmed version of the inputs. """ # Determine which targets need shimming. targets_to_shim = [] for t in target_dicts: target_dict = target_dicts[t] # We only want to shim targets that have msvs_large_pdb enabled. if not int(target_dict.get('msvs_large_pdb', 0)): continue # This is intended for executable, shared_library and loadable_module # targets where every configuration is set up to produce a PDB output. # If any of these conditions is not true then the shim logic will fail # below. targets_to_shim.append(t) large_pdb_shim_cc = _GetLargePdbShimCcPath() for t in targets_to_shim: target_dict = target_dicts[t] target_name = target_dict.get('target_name') base_dict = _DeepCopySomeKeys(target_dict, ['configurations', 'default_configuration', 'toolset']) # This is the dict for copying the source file (part of the GYP tree) # to the intermediate directory of the project. This is necessary because # we can't always build a relative path to the shim source file (on Windows # GYP and the project may be on different drives), and Ninja hates absolute # paths (it ends up generating the .obj and .obj.d alongside the source # file, polluting GYPs tree). copy_suffix = 'large_pdb_copy' copy_target_name = target_name + '_' + copy_suffix full_copy_target_name = _SuffixName(t, copy_suffix) shim_cc_basename = os.path.basename(large_pdb_shim_cc) shim_cc_dir = vars['SHARED_INTERMEDIATE_DIR'] + '/' + copy_target_name shim_cc_path = shim_cc_dir + '/' + shim_cc_basename copy_dict = copy.deepcopy(base_dict) copy_dict['target_name'] = copy_target_name copy_dict['type'] = 'none' copy_dict['sources'] = [ large_pdb_shim_cc ] copy_dict['copies'] = [{ 'destination': shim_cc_dir, 'files': [ large_pdb_shim_cc ] }] # This is the dict for the PDB generating shim target. It depends on the # copy target. shim_suffix = 'large_pdb_shim' shim_target_name = target_name + '_' + shim_suffix full_shim_target_name = _SuffixName(t, shim_suffix) shim_dict = copy.deepcopy(base_dict) shim_dict['target_name'] = shim_target_name shim_dict['type'] = 'static_library' shim_dict['sources'] = [ shim_cc_path ] shim_dict['dependencies'] = [ full_copy_target_name ] # Set up the shim to output its PDB to the same location as the final linker # target. for config_name, config in shim_dict.get('configurations').iteritems(): pdb_path = _GetPdbPath(target_dict, config_name, vars) # A few keys that we don't want to propagate. 
for key in ['msvs_precompiled_header', 'msvs_precompiled_source', 'test']: config.pop(key, None) msvs = config.setdefault('msvs_settings', {}) # Update the compiler directives in the shim target. compiler = msvs.setdefault('VCCLCompilerTool', {}) compiler['DebugInformationFormat'] = '3' compiler['ProgramDataBaseFileName'] = pdb_path # Set the explicit PDB path in the appropriate configuration of the # original target. config = target_dict['configurations'][config_name] msvs = config.setdefault('msvs_settings', {}) linker = msvs.setdefault('VCLinkerTool', {}) linker['GenerateDebugInformation'] = 'true' linker['ProgramDatabaseFile'] = pdb_path # Add the new targets. They must go to the beginning of the list so that # the dependency generation works as expected in ninja. target_list.insert(0, full_copy_target_name) target_list.insert(0, full_shim_target_name) target_dicts[full_copy_target_name] = copy_dict target_dicts[full_shim_target_name] = shim_dict # Update the original target to depend on the shim target. target_dict.setdefault('dependencies', []).append(full_shim_target_name) return (target_list, target_dicts)������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py������000644 �000766 �000024 �00000036263 12455173731 032340� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
"""Handle version information related to Visual Stuio.""" import errno import os import re import subprocess import sys import gyp import glob class VisualStudioVersion(object): """Information regarding a version of Visual Studio.""" def __init__(self, short_name, description, solution_version, project_version, flat_sln, uses_vcxproj, path, sdk_based, default_toolset=None): self.short_name = short_name self.description = description self.solution_version = solution_version self.project_version = project_version self.flat_sln = flat_sln self.uses_vcxproj = uses_vcxproj self.path = path self.sdk_based = sdk_based self.default_toolset = default_toolset def ShortName(self): return self.short_name def Description(self): """Get the full description of the version.""" return self.description def SolutionVersion(self): """Get the version number of the sln files.""" return self.solution_version def ProjectVersion(self): """Get the version number of the vcproj or vcxproj files.""" return self.project_version def FlatSolution(self): return self.flat_sln def UsesVcxproj(self): """Returns true if this version uses a vcxproj file.""" return self.uses_vcxproj def ProjectExtension(self): """Returns the file extension for the project.""" return self.uses_vcxproj and '.vcxproj' or '.vcproj' def Path(self): """Returns the path to Visual Studio installation.""" return self.path def ToolPath(self, tool): """Returns the path to a given compiler tool. """ return os.path.normpath(os.path.join(self.path, "VC/bin", tool)) def DefaultToolset(self): """Returns the msbuild toolset version that will be used in the absence of a user override.""" return self.default_toolset def SetupScript(self, target_arch): """Returns a command (with arguments) to be used to set up the environment.""" # Check if we are running in the SDK command line environment and use # the setup script from the SDK if so. |target_arch| should be either # 'x86' or 'x64'. assert target_arch in ('x86', 'x64') sdk_dir = os.environ.get('WindowsSDKDir') if self.sdk_based and sdk_dir: return [os.path.normpath(os.path.join(sdk_dir, 'Bin/SetEnv.Cmd')), '/' + target_arch] else: # We don't use VC/vcvarsall.bat for x86 because vcvarsall calls # vcvars32, which it can only find if VS??COMNTOOLS is set, which it # isn't always. if target_arch == 'x86': if self.short_name == '2013' and ( os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'): # VS2013 non-Express has a x64-x86 cross that we want to prefer. return [os.path.normpath( os.path.join(self.path, 'VC/vcvarsall.bat')), 'amd64_x86'] # Otherwise, the standard x86 compiler. return [os.path.normpath( os.path.join(self.path, 'Common7/Tools/vsvars32.bat'))] else: assert target_arch == 'x64' arg = 'x86_amd64' # Use the 64-on-64 compiler if we're not using an express # edition and we're running on a 64bit OS. if self.short_name[-1] != 'e' and ( os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'): arg = 'amd64' return [os.path.normpath( os.path.join(self.path, 'VC/vcvarsall.bat')), arg] def _RegistryQueryBase(sysdir, key, value): """Use reg.exe to read a particular key. While ideally we might use the win32 module, we would like gyp to be python neutral, so for instance cygwin python lacks this module. Arguments: sysdir: The system subdirectory to attempt to launch reg.exe from. key: The registry key to read from. value: The particular value to read. Return: stdout from reg.exe, or None for failure. 
""" # Skip if not on Windows or Python Win32 setup issue if sys.platform not in ('win32', 'cygwin'): return None # Setup params to pass to and attempt to launch reg.exe cmd = [os.path.join(os.environ.get('WINDIR', ''), sysdir, 'reg.exe'), 'query', key] if value: cmd.extend(['/v', value]) p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) # Obtain the stdout from reg.exe, reading to the end so p.returncode is valid # Note that the error text may be in [1] in some cases text = p.communicate()[0] # Check return code from reg.exe; officially 0==success and 1==error if p.returncode: return None return text def _RegistryQuery(key, value=None): """Use reg.exe to read a particular key through _RegistryQueryBase. First tries to launch from %WinDir%\Sysnative to avoid WoW64 redirection. If that fails, it falls back to System32. Sysnative is available on Vista and up and available on Windows Server 2003 and XP through KB patch 942589. Note that Sysnative will always fail if using 64-bit python due to it being a virtual directory and System32 will work correctly in the first place. KB 942589 - http://support.microsoft.com/kb/942589/en-us. Arguments: key: The registry key. value: The particular registry value to read (optional). Return: stdout from reg.exe, or None for failure. """ text = None try: text = _RegistryQueryBase('Sysnative', key, value) except OSError, e: if e.errno == errno.ENOENT: text = _RegistryQueryBase('System32', key, value) else: raise return text def _RegistryGetValue(key, value): """Use reg.exe to obtain the value of a registry key. Args: key: The registry key. value: The particular registry value to read. Return: contents of the registry key's value, or None on failure. """ text = _RegistryQuery(key, value) if not text: return None # Extract value. match = re.search(r'REG_\w+\s+([^\r]+)\r\n', text) if not match: return None return match.group(1) def _RegistryKeyExists(key): """Use reg.exe to see if a key exists. Args: key: The registry key to check. Return: True if the key exists """ if not _RegistryQuery(key): return False return True def _CreateVersion(name, path, sdk_based=False): """Sets up MSVS project generation. Setup is based off the GYP_MSVS_VERSION environment variable or whatever is autodetected if GYP_MSVS_VERSION is not explicitly specified. If a version is passed in that doesn't match a value in versions python will throw a error. 
""" if path: path = os.path.normpath(path) versions = { '2013': VisualStudioVersion('2013', 'Visual Studio 2013', solution_version='13.00', project_version='12.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v120'), '2013e': VisualStudioVersion('2013e', 'Visual Studio 2013', solution_version='13.00', project_version='12.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v120'), '2012': VisualStudioVersion('2012', 'Visual Studio 2012', solution_version='12.00', project_version='4.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v110'), '2012e': VisualStudioVersion('2012e', 'Visual Studio 2012', solution_version='12.00', project_version='4.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v110'), '2010': VisualStudioVersion('2010', 'Visual Studio 2010', solution_version='11.00', project_version='4.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based), '2010e': VisualStudioVersion('2010e', 'Visual C++ Express 2010', solution_version='11.00', project_version='4.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based), '2008': VisualStudioVersion('2008', 'Visual Studio 2008', solution_version='10.00', project_version='9.00', flat_sln=False, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2008e': VisualStudioVersion('2008e', 'Visual Studio 2008', solution_version='10.00', project_version='9.00', flat_sln=True, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2005': VisualStudioVersion('2005', 'Visual Studio 2005', solution_version='9.00', project_version='8.00', flat_sln=False, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2005e': VisualStudioVersion('2005e', 'Visual Studio 2005', solution_version='9.00', project_version='8.00', flat_sln=True, uses_vcxproj=False, path=path, sdk_based=sdk_based), } return versions[str(name)] def _ConvertToCygpath(path): """Convert to cygwin path if we are using cygwin.""" if sys.platform == 'cygwin': p = subprocess.Popen(['cygpath', path], stdout=subprocess.PIPE) path = p.communicate()[0].strip() return path def _DetectVisualStudioVersions(versions_to_check, force_express): """Collect the list of installed visual studio versions. Returns: A list of visual studio versions installed in descending order of usage preference. Base this on the registry and a quick check if devenv.exe exists. Only versions 8-10 are considered. Possibilities are: 2005(e) - Visual Studio 2005 (8) 2008(e) - Visual Studio 2008 (9) 2010(e) - Visual Studio 2010 (10) 2012(e) - Visual Studio 2012 (11) 2013(e) - Visual Studio 2013 (11) Where (e) is e for express editions of MSVS and blank otherwise. """ version_to_year = { '8.0': '2005', '9.0': '2008', '10.0': '2010', '11.0': '2012', '12.0': '2013', } versions = [] for version in versions_to_check: # Old method of searching for which VS version is installed # We don't use the 2010-encouraged-way because we also want to get the # path to the binaries, which it doesn't offer. keys = [r'HKLM\Software\Microsoft\VisualStudio\%s' % version, r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\%s' % version, r'HKLM\Software\Microsoft\VCExpress\%s' % version, r'HKLM\Software\Wow6432Node\Microsoft\VCExpress\%s' % version] for index in range(len(keys)): path = _RegistryGetValue(keys[index], 'InstallDir') if not path: continue path = _ConvertToCygpath(path) # Check for full. 
full_path = os.path.join(path, 'devenv.exe') express_path = os.path.join(path, '*express.exe') if not force_express and os.path.exists(full_path): # Add this one. versions.append(_CreateVersion(version_to_year[version], os.path.join(path, '..', '..'))) # Check for express. elif glob.glob(express_path): # Add this one. versions.append(_CreateVersion(version_to_year[version] + 'e', os.path.join(path, '..', '..'))) # The old method above does not work when only SDK is installed. keys = [r'HKLM\Software\Microsoft\VisualStudio\SxS\VC7', r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7'] for index in range(len(keys)): path = _RegistryGetValue(keys[index], version) if not path: continue path = _ConvertToCygpath(path) versions.append(_CreateVersion(version_to_year[version] + 'e', os.path.join(path, '..'), sdk_based=True)) return versions def SelectVisualStudioVersion(version='auto'): """Select which version of Visual Studio projects to generate. Arguments: version: Hook to allow caller to force a particular version (vs auto). Returns: An object representing a visual studio project format version. """ # In auto mode, check environment variable for override. if version == 'auto': version = os.environ.get('GYP_MSVS_VERSION', 'auto') version_map = { 'auto': ('10.0', '12.0', '9.0', '8.0', '11.0'), '2005': ('8.0',), '2005e': ('8.0',), '2008': ('9.0',), '2008e': ('9.0',), '2010': ('10.0',), '2010e': ('10.0',), '2012': ('11.0',), '2012e': ('11.0',), '2013': ('12.0',), '2013e': ('12.0',), } override_path = os.environ.get('GYP_MSVS_OVERRIDE_PATH') if override_path: msvs_version = os.environ.get('GYP_MSVS_VERSION') if not msvs_version: raise ValueError('GYP_MSVS_OVERRIDE_PATH requires GYP_MSVS_VERSION to be ' 'set to a particular version (e.g. 2010e).') return _CreateVersion(msvs_version, override_path, sdk_based=True) version = str(version) versions = _DetectVisualStudioVersions(version_map[version], 'e' in version) if not versions: if version == 'auto': # Default to 2005 if we couldn't find anything return _CreateVersion('2005', None) else: return _CreateVersion(version, None) return versions[0] ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py�����000644 �000766 �000024 �00000012640 12455173731 032700� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# This file comes from # https://github.com/martine/ninja/blob/master/misc/ninja_syntax.py # Do not edit! Edit the upstream one instead. """Python module for generating .ninja files. Note that this is emphatically not a required piece of Ninja; it's just a helpful utility for build-file-generation systems that already use Python. 
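Illustrative usage sketch (the output file name and the rule shown are
examples, not part of this module):

    with open('build.ninja', 'w') as f:
        w = Writer(f)
        w.comment('generated by an illustrative example')
        w.rule('cc', command='gcc -c $in -o $out', description='CC $out')
        w.build('foo.o', 'cc', inputs='foo.c')

This produces a minimal .ninja file containing one rule and one build
statement, word-wrapped at the configured width.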
""" import textwrap import re def escape_path(word): return word.replace('$ ','$$ ').replace(' ','$ ').replace(':', '$:') class Writer(object): def __init__(self, output, width=78): self.output = output self.width = width def newline(self): self.output.write('\n') def comment(self, text): for line in textwrap.wrap(text, self.width - 2): self.output.write('# ' + line + '\n') def variable(self, key, value, indent=0): if value is None: return if isinstance(value, list): value = ' '.join(filter(None, value)) # Filter out empty strings. self._line('%s = %s' % (key, value), indent) def pool(self, name, depth): self._line('pool %s' % name) self.variable('depth', depth, indent=1) def rule(self, name, command, description=None, depfile=None, generator=False, pool=None, restat=False, rspfile=None, rspfile_content=None, deps=None): self._line('rule %s' % name) self.variable('command', command, indent=1) if description: self.variable('description', description, indent=1) if depfile: self.variable('depfile', depfile, indent=1) if generator: self.variable('generator', '1', indent=1) if pool: self.variable('pool', pool, indent=1) if restat: self.variable('restat', '1', indent=1) if rspfile: self.variable('rspfile', rspfile, indent=1) if rspfile_content: self.variable('rspfile_content', rspfile_content, indent=1) if deps: self.variable('deps', deps, indent=1) def build(self, outputs, rule, inputs=None, implicit=None, order_only=None, variables=None): outputs = self._as_list(outputs) all_inputs = self._as_list(inputs)[:] out_outputs = list(map(escape_path, outputs)) all_inputs = list(map(escape_path, all_inputs)) if implicit: implicit = map(escape_path, self._as_list(implicit)) all_inputs.append('|') all_inputs.extend(implicit) if order_only: order_only = map(escape_path, self._as_list(order_only)) all_inputs.append('||') all_inputs.extend(order_only) self._line('build %s: %s' % (' '.join(out_outputs), ' '.join([rule] + all_inputs))) if variables: if isinstance(variables, dict): iterator = iter(variables.items()) else: iterator = iter(variables) for key, val in iterator: self.variable(key, val, indent=1) return outputs def include(self, path): self._line('include %s' % path) def subninja(self, path): self._line('subninja %s' % path) def default(self, paths): self._line('default %s' % ' '.join(self._as_list(paths))) def _count_dollars_before_index(self, s, i): """Returns the number of '$' characters right in front of s[i].""" dollar_count = 0 dollar_index = i - 1 while dollar_index > 0 and s[dollar_index] == '$': dollar_count += 1 dollar_index -= 1 return dollar_count def _line(self, text, indent=0): """Write 'text' word-wrapped at self.width characters.""" leading_space = ' ' * indent while len(leading_space) + len(text) > self.width: # The text is too wide; wrap if possible. # Find the rightmost space that would obey our width constraint and # that's not an escaped space. available_space = self.width - len(leading_space) - len(' $') space = available_space while True: space = text.rfind(' ', 0, space) if space < 0 or \ self._count_dollars_before_index(text, space) % 2 == 0: break if space < 0: # No such space; just use the first unescaped space we can find. space = available_space - 1 while True: space = text.find(' ', space + 1) if space < 0 or \ self._count_dollars_before_index(text, space) % 2 == 0: break if space < 0: # Give up on breaking. break self.output.write(leading_space + text[0:space] + ' $\n') text = text[space+1:] # Subsequent lines are continuations, so indent them. 
leading_space = ' ' * (indent+2) self.output.write(leading_space + text + '\n') def _as_list(self, input): if input is None: return [] if isinstance(input, list): return input return [input] def escape(string): """Escape a string such that it can be embedded into a Ninja file without further interpretation.""" assert '\n' not in string, 'Ninja syntax does not allow newlines' # We only have one special metacharacter: '$'. return string.replace('$', '$$') ������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/ordered_dict.py�����000644 �000766 �000024 �00000024176 12455173731 032631� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Unmodified from http://code.activestate.com/recipes/576693/ # other than to add MIT license header (as specified on page, but not in code). # Linked from Python documentation here: # http://docs.python.org/2/library/collections.html#collections.OrderedDict # # This should be deleted once Py2.7 is available on all bots, see # http://crbug.com/241769. # # Copyright (c) 2009 Raymond Hettinger. # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to deal # in the Software without restriction, including without limitation the rights # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # copies of the Software, and to permit persons to whom the Software is # furnished to do so, subject to the following conditions: # # The above copyright notice and this permission notice shall be included in # all copies or substantial portions of the Software. # # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN # THE SOFTWARE. # Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy. # Passes Python2.7's test suite and incorporates all the latest updates. try: from thread import get_ident as _get_ident except ImportError: from dummy_thread import get_ident as _get_ident try: from _abcoll import KeysView, ValuesView, ItemsView except ImportError: pass class OrderedDict(dict): 'Dictionary that remembers insertion order' # An inherited dict maps keys to values. # The inherited dict provides __getitem__, __len__, __contains__, and get. # The remaining methods are order-aware. # Big-O running times for all methods are the same as for regular dictionaries. # The internal self.__map dictionary maps keys to links in a doubly linked list. # The circular doubly linked list starts and ends with a sentinel element. # The sentinel element never gets deleted (this simplifies the algorithm). # Each link is stored as a list of length three: [PREV, NEXT, KEY]. def __init__(self, *args, **kwds): '''Initialize an ordered dictionary. 
Signature is the same as for regular dictionaries, but keyword arguments are not recommended because their insertion order is arbitrary. ''' if len(args) > 1: raise TypeError('expected at most 1 arguments, got %d' % len(args)) try: self.__root except AttributeError: self.__root = root = [] # sentinel node root[:] = [root, root, None] self.__map = {} self.__update(*args, **kwds) def __setitem__(self, key, value, dict_setitem=dict.__setitem__): 'od.__setitem__(i, y) <==> od[i]=y' # Setting a new item creates a new link which goes at the end of the linked # list, and the inherited dictionary is updated with the new key/value pair. if key not in self: root = self.__root last = root[0] last[1] = root[0] = self.__map[key] = [last, root, key] dict_setitem(self, key, value) def __delitem__(self, key, dict_delitem=dict.__delitem__): 'od.__delitem__(y) <==> del od[y]' # Deleting an existing item uses self.__map to find the link which is # then removed by updating the links in the predecessor and successor nodes. dict_delitem(self, key) link_prev, link_next, key = self.__map.pop(key) link_prev[1] = link_next link_next[0] = link_prev def __iter__(self): 'od.__iter__() <==> iter(od)' root = self.__root curr = root[1] while curr is not root: yield curr[2] curr = curr[1] def __reversed__(self): 'od.__reversed__() <==> reversed(od)' root = self.__root curr = root[0] while curr is not root: yield curr[2] curr = curr[0] def clear(self): 'od.clear() -> None. Remove all items from od.' try: for node in self.__map.itervalues(): del node[:] root = self.__root root[:] = [root, root, None] self.__map.clear() except AttributeError: pass dict.clear(self) def popitem(self, last=True): '''od.popitem() -> (k, v), return and remove a (key, value) pair. Pairs are returned in LIFO order if last is true or FIFO order if false. ''' if not self: raise KeyError('dictionary is empty') root = self.__root if last: link = root[0] link_prev = link[0] link_prev[1] = root root[0] = link_prev else: link = root[1] link_next = link[1] root[1] = link_next link_next[0] = root key = link[2] del self.__map[key] value = dict.pop(self, key) return key, value # -- the following methods do not depend on the internal structure -- def keys(self): 'od.keys() -> list of keys in od' return list(self) def values(self): 'od.values() -> list of values in od' return [self[key] for key in self] def items(self): 'od.items() -> list of (key, value) pairs in od' return [(key, self[key]) for key in self] def iterkeys(self): 'od.iterkeys() -> an iterator over the keys in od' return iter(self) def itervalues(self): 'od.itervalues -> an iterator over the values in od' for k in self: yield self[k] def iteritems(self): 'od.iteritems -> an iterator over the (key, value) items in od' for k in self: yield (k, self[k]) # Suppress 'OrderedDict.update: Method has no argument': # pylint: disable=E0211 def update(*args, **kwds): '''od.update(E, **F) -> None. Update od from dict/iterable E and F. 
If E is a dict instance, does: for k in E: od[k] = E[k] If E has a .keys() method, does: for k in E.keys(): od[k] = E[k] Or if E is an iterable of items, does: for k, v in E: od[k] = v In either case, this is followed by: for k, v in F.items(): od[k] = v ''' if len(args) > 2: raise TypeError('update() takes at most 2 positional ' 'arguments (%d given)' % (len(args),)) elif not args: raise TypeError('update() takes at least 1 argument (0 given)') self = args[0] # Make progressively weaker assumptions about "other" other = () if len(args) == 2: other = args[1] if isinstance(other, dict): for key in other: self[key] = other[key] elif hasattr(other, 'keys'): for key in other.keys(): self[key] = other[key] else: for key, value in other: self[key] = value for key, value in kwds.items(): self[key] = value __update = update # let subclasses override update without breaking __init__ __marker = object() def pop(self, key, default=__marker): '''od.pop(k[,d]) -> v, remove specified key and return the corresponding value. If key is not found, d is returned if given, otherwise KeyError is raised. ''' if key in self: result = self[key] del self[key] return result if default is self.__marker: raise KeyError(key) return default def setdefault(self, key, default=None): 'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od' if key in self: return self[key] self[key] = default return default def __repr__(self, _repr_running={}): 'od.__repr__() <==> repr(od)' call_key = id(self), _get_ident() if call_key in _repr_running: return '...' _repr_running[call_key] = 1 try: if not self: return '%s()' % (self.__class__.__name__,) return '%s(%r)' % (self.__class__.__name__, self.items()) finally: del _repr_running[call_key] def __reduce__(self): 'Return state information for pickling' items = [[k, self[k]] for k in self] inst_dict = vars(self).copy() for k in vars(OrderedDict()): inst_dict.pop(k, None) if inst_dict: return (self.__class__, (items,), inst_dict) return self.__class__, (items,) def copy(self): 'od.copy() -> a shallow copy of od' return self.__class__(self) @classmethod def fromkeys(cls, iterable, value=None): '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S and values equal to v (which defaults to None). ''' d = cls() for key in iterable: d[key] = value return d def __eq__(self, other): '''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive while comparison to a regular mapping is order-insensitive. 
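        Illustrative example: OrderedDict([('a', 1), ('b', 2)]) compares
        unequal to OrderedDict([('b', 2), ('a', 1)]) because the insertion
        order differs, but both compare equal to the plain dict
        {'a': 1, 'b': 2}.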
''' if isinstance(other, OrderedDict): return len(self)==len(other) and self.items() == other.items() return dict.__eq__(self, other) def __ne__(self, other): return not self == other # -- the following methods are only used in Python 2.7 -- def viewkeys(self): "od.viewkeys() -> a set-like object providing a view on od's keys" return KeysView(self) def viewvalues(self): "od.viewvalues() -> an object providing a view on od's values" return ValuesView(self) def viewitems(self): "od.viewitems() -> a set-like object providing a view on od's items" return ItemsView(self) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py���������000755 �000766 �000024 �00000026570 12455173731 032037� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions for Windows builds. These functions are executed via gyp-win-tool when using the ninja generator. """ import os import re import shutil import subprocess import string import sys BASE_DIR = os.path.dirname(os.path.abspath(__file__)) # A regex matching an argument corresponding to the output filename passed to # link.exe. _LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE) def main(args): executor = WinTool() exit_code = executor.Dispatch(args) if exit_code is not None: sys.exit(exit_code) class WinTool(object): """This class performs all the Windows tooling steps. The methods can either be executed directly, or dispatched from an argument list.""" def _UseSeparateMspdbsrv(self, env, args): """Allows to use a unique instance of mspdbsrv.exe per linker instead of a shared one.""" if len(args) < 1: raise Exception("Not enough arguments") if args[0] != 'link.exe': return # Use the output filename passed to the linker to generate an endpoint name # for mspdbsrv.exe. endpoint_name = None for arg in args: m = _LINK_EXE_OUT_ARG.match(arg) if m: endpoint_name = '%s_%d' % (m.group('out'), os.getpid()) break if endpoint_name is None: return # Adds the appropriate environment variable. This will be read by link.exe # to know which instance of mspdbsrv.exe it should connect to (if it's # not set then the default endpoint is used). 
env['_MSPDBSRV_ENDPOINT_'] = endpoint_name def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) return getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like recursive-mirror to RecursiveMirror.""" return name_string.title().replace('-', '') def _GetEnv(self, arch): """Gets the saved environment from a file for a given architecture.""" # The environment is saved as an "environment block" (see CreateProcess # and msvs_emulation for details). We convert to a dict here. # Drop last 2 NULs, one for list terminator, one for trailing vs. separator. pairs = open(arch).read()[:-2].split('\0') kvs = [item.split('=', 1) for item in pairs] return dict(kvs) def ExecStamp(self, path): """Simple stamp command.""" open(path, 'w').close() def ExecRecursiveMirror(self, source, dest): """Emulation of rm -rf out && cp -af in out.""" if os.path.exists(dest): if os.path.isdir(dest): shutil.rmtree(dest) else: os.unlink(dest) if os.path.isdir(source): shutil.copytree(source, dest) else: shutil.copy2(source, dest) def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args): """Filter diagnostic output from link that looks like: ' Creating library ui.dll.lib and object ui.dll.exp' This happens when there are exports from the dll or exe. """ env = self._GetEnv(arch) if use_separate_mspdbsrv == 'True': self._UseSeparateMspdbsrv(env, args) link = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = link.communicate() for line in out.splitlines(): if not line.startswith(' Creating library '): print line return link.returncode def ExecLinkWithManifests(self, arch, embed_manifest, out, ldcmd, resname, mt, rc, intermediate_manifest, *manifests): """A wrapper for handling creating a manifest resource and then executing a link command.""" # The 'normal' way to do manifests is to have link generate a manifest # based on gathering dependencies from the object files, then merge that # manifest with other manifests supplied as sources, convert the merged # manifest to a resource, and then *relink*, including the compiled # version of the manifest resource. This breaks incremental linking, and # is generally overly complicated. Instead, we merge all the manifests # provided (along with one that includes what would normally be in the # linker-generated one, see msvs_emulation.py), and include that into the # first and only link. We still tell link to generate a manifest, but we # only use that to assert that our simpler process did not miss anything. 
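    # Roughly, an illustrative outline of the steps implemented below (no
    # additional logic is introduced here):
    #   1. mt.exe merges all supplied manifests into <out>.manifest.
    #   2. If embedding is requested, the merged manifest is converted to an
    #      .rc file, compiled with rc.exe, and the resulting .res is appended
    #      to the link command line.
    #   3. The link command runs exactly once.
    #   4. mt.exe then merges the linker-generated intermediate manifest with
    #      ours into <out>.assert.manifest, which must match ours (ignoring
    #      whitespace) or the output is deleted and the step fails.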
variables = { 'python': sys.executable, 'arch': arch, 'out': out, 'ldcmd': ldcmd, 'resname': resname, 'mt': mt, 'rc': rc, 'intermediate_manifest': intermediate_manifest, 'manifests': ' '.join(manifests), } add_to_ld = '' if manifests: subprocess.check_call( '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo ' '-manifest %(manifests)s -out:%(out)s.manifest' % variables) if embed_manifest == 'True': subprocess.check_call( '%(python)s gyp-win-tool manifest-to-rc %(arch)s %(out)s.manifest' ' %(out)s.manifest.rc %(resname)s' % variables) subprocess.check_call( '%(python)s gyp-win-tool rc-wrapper %(arch)s %(rc)s ' '%(out)s.manifest.rc' % variables) add_to_ld = ' %(out)s.manifest.res' % variables subprocess.check_call(ldcmd + add_to_ld) # Run mt.exe on the theoretically complete manifest we generated, merging # it with the one the linker generated to confirm that the linker # generated one does not add anything. This is strictly unnecessary for # correctness, it's only to verify that e.g. /MANIFESTDEPENDENCY was not # used in a #pragma comment. if manifests: # Merge the intermediate one with ours to .assert.manifest, then check # that .assert.manifest is identical to ours. subprocess.check_call( '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo ' '-manifest %(out)s.manifest %(intermediate_manifest)s ' '-out:%(out)s.assert.manifest' % variables) assert_manifest = '%(out)s.assert.manifest' % variables our_manifest = '%(out)s.manifest' % variables # Load and normalize the manifests. mt.exe sometimes removes whitespace, # and sometimes doesn't unfortunately. with open(our_manifest, 'rb') as our_f: with open(assert_manifest, 'rb') as assert_f: our_data = our_f.read().translate(None, string.whitespace) assert_data = assert_f.read().translate(None, string.whitespace) if our_data != assert_data: os.unlink(out) def dump(filename): sys.stderr.write('%s\n-----\n' % filename) with open(filename, 'rb') as f: sys.stderr.write(f.read() + '\n-----\n') dump(intermediate_manifest) dump(our_manifest) dump(assert_manifest) sys.stderr.write( 'Linker generated manifest "%s" added to final manifest "%s" ' '(result in "%s"). ' 'Were /MANIFEST switches used in #pragma statements? ' % ( intermediate_manifest, our_manifest, assert_manifest)) return 1 def ExecManifestWrapper(self, arch, *args): """Run manifest tool with environment set. Strip out undesirable warning (some XML blocks are recognized by the OS loader, but not the manifest tool).""" env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if line and 'manifest authoring warning 81010002' not in line: print line return popen.returncode def ExecManifestToRc(self, arch, *args): """Creates a resource file pointing a SxS assembly manifest. |args| is tuple containing path to resource file, path to manifest file and resource name which can be "1" (for executables) or "2" (for DLLs).""" manifest_path, resource_path, resource_name = args with open(resource_path, 'wb') as output: output.write('#include <windows.h>\n%s RT_MANIFEST "%s"' % ( resource_name, os.path.abspath(manifest_path).replace('\\', '/'))) def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl, *flags): """Filter noisy filenames output from MIDL compile step that isn't quietable via command line flags. 
""" args = ['midl', '/nologo'] + list(flags) + [ '/out', outdir, '/tlb', tlb, '/h', h, '/dlldata', dlldata, '/iid', iid, '/proxy', proxy, idl] env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() # Filter junk out of stdout, and write filtered versions. Output we want # to filter is pairs of lines that look like this: # Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl # objidl.idl lines = out.splitlines() prefix = 'Processing ' processing = set(os.path.basename(x) for x in lines if x.startswith(prefix)) for line in lines: if not line.startswith(prefix) and line not in processing: print line return popen.returncode def ExecAsmWrapper(self, arch, *args): """Filter logo banner from invocations of asm.exe.""" env = self._GetEnv(arch) # MSVS doesn't assemble x64 asm files. if arch == 'environment.x64': return 0 popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if (not line.startswith('Copyright (C) Microsoft Corporation') and not line.startswith('Microsoft (R) Macro Assembler') and not line.startswith(' Assembling: ') and line): print line return popen.returncode def ExecRcWrapper(self, arch, *args): """Filter logo banner from invocations of rc.exe. Older versions of RC don't support the /nologo flag.""" env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and not line.startswith('Copyright (C) Microsoft Corporation') and line): print line return popen.returncode def ExecActionWrapper(self, arch, rspfile, *dir): """Runs an action command line from a response file using the environment for |arch|. If |dir| is supplied, use that as the working directory.""" env = self._GetEnv(arch) # TODO(scottmg): This is a temporary hack to get some specific variables # through to actions that are set after gyp-time. http://crbug.com/333738. for k, v in os.environ.iteritems(): if k not in env: env[k] = v args = open(rspfile).read() dir = dir[0] if dir else None return subprocess.call(args, shell=True, env=env, cwd=dir) if __name__ == '__main__': sys.exit(main(sys.argv[1:])) ����������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py��000644 �000766 �000024 �00000160020 12455173731 033346� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ This module contains classes that help to emulate xcodebuild behavior on top of other build systems, such as make and ninja. 
""" import copy import gyp.common import os import os.path import re import shlex import subprocess import sys import tempfile from gyp.common import GypError class XcodeSettings(object): """A class that understands the gyp 'xcode_settings' object.""" # Populated lazily by _SdkPath(). Shared by all XcodeSettings, so cached # at class-level for efficiency. _sdk_path_cache = {} _sdk_root_cache = {} # Populated lazily by GetExtraPlistItems(). Shared by all XcodeSettings, so # cached at class-level for efficiency. _plist_cache = {} # Populated lazily by GetIOSPostbuilds. Shared by all XcodeSettings, so # cached at class-level for efficiency. _codesigning_key_cache = {} # Populated lazily by _XcodeVersion. Shared by all XcodeSettings, so cached # at class-level for efficiency. _xcode_version_cache = () def __init__(self, spec): self.spec = spec self.isIOS = False # Per-target 'xcode_settings' are pushed down into configs earlier by gyp. # This means self.xcode_settings[config] always contains all settings # for that config -- the per-target settings as well. Settings that are # the same for all configs are implicitly per-target settings. self.xcode_settings = {} configs = spec['configurations'] for configname, config in configs.iteritems(): self.xcode_settings[configname] = config.get('xcode_settings', {}) self._ConvertConditionalKeys(configname) if self.xcode_settings[configname].get('IPHONEOS_DEPLOYMENT_TARGET', None): self.isIOS = True # This is only non-None temporarily during the execution of some methods. self.configname = None # Used by _AdjustLibrary to match .a and .dylib entries in libraries. self.library_re = re.compile(r'^lib([^/]+)\.(a|dylib)$') def _ConvertConditionalKeys(self, configname): """Converts or warns on conditional keys. Xcode supports conditional keys, such as CODE_SIGN_IDENTITY[sdk=iphoneos*]. This is a partial implementation with some keys converted while the rest force a warning.""" settings = self.xcode_settings[configname] conditional_keys = [key for key in settings if key.endswith(']')] for key in conditional_keys: # If you need more, speak up at http://crbug.com/122592 if key.endswith("[sdk=iphoneos*]"): if configname.endswith("iphoneos"): new_key = key.split("[")[0] settings[new_key] = settings[key] else: print 'Warning: Conditional keys not implemented, ignoring:', \ ' '.join(conditional_keys) del settings[key] def _Settings(self): assert self.configname return self.xcode_settings[self.configname] def _Test(self, test_key, cond_key, default): return self._Settings().get(test_key, default) == cond_key def _Appendf(self, lst, test_key, format_str, default=None): if test_key in self._Settings(): lst.append(format_str % str(self._Settings()[test_key])) elif default: lst.append(format_str % str(default)) def _WarnUnimplemented(self, test_key): if test_key in self._Settings(): print 'Warning: Ignoring not yet implemented key "%s".' % test_key def _IsBundle(self): return int(self.spec.get('mac_bundle', 0)) != 0 def GetFrameworkVersion(self): """Returns the framework version of the current target. Only valid for bundles.""" assert self._IsBundle() return self.GetPerTargetSetting('FRAMEWORK_VERSION', default='A') def GetWrapperExtension(self): """Returns the bundle extension (.app, .framework, .plugin, etc). 
Only valid for bundles.""" assert self._IsBundle() if self.spec['type'] in ('loadable_module', 'shared_library'): default_wrapper_extension = { 'loadable_module': 'bundle', 'shared_library': 'framework', }[self.spec['type']] wrapper_extension = self.GetPerTargetSetting( 'WRAPPER_EXTENSION', default=default_wrapper_extension) return '.' + self.spec.get('product_extension', wrapper_extension) elif self.spec['type'] == 'executable': return '.' + self.spec.get('product_extension', 'app') else: assert False, "Don't know extension for '%s', target '%s'" % ( self.spec['type'], self.spec['target_name']) def GetProductName(self): """Returns PRODUCT_NAME.""" return self.spec.get('product_name', self.spec['target_name']) def GetFullProductName(self): """Returns FULL_PRODUCT_NAME.""" if self._IsBundle(): return self.GetWrapperName() else: return self._GetStandaloneBinaryPath() def GetWrapperName(self): """Returns the directory name of the bundle represented by this target. Only valid for bundles.""" assert self._IsBundle() return self.GetProductName() + self.GetWrapperExtension() def GetBundleContentsFolderPath(self): """Returns the qualified path to the bundle's contents folder. E.g. Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles.""" if self.isIOS: return self.GetWrapperName() assert self._IsBundle() if self.spec['type'] == 'shared_library': return os.path.join( self.GetWrapperName(), 'Versions', self.GetFrameworkVersion()) else: # loadable_modules have a 'Contents' folder like executables. return os.path.join(self.GetWrapperName(), 'Contents') def GetBundleResourceFolder(self): """Returns the qualified path to the bundle's resource folder. E.g. Chromium.app/Contents/Resources. Only valid for bundles.""" assert self._IsBundle() if self.isIOS: return self.GetBundleContentsFolderPath() return os.path.join(self.GetBundleContentsFolderPath(), 'Resources') def GetBundlePlistPath(self): """Returns the qualified path to the bundle's plist file. E.g. Chromium.app/Contents/Info.plist. Only valid for bundles.""" assert self._IsBundle() if self.spec['type'] in ('executable', 'loadable_module'): return os.path.join(self.GetBundleContentsFolderPath(), 'Info.plist') else: return os.path.join(self.GetBundleContentsFolderPath(), 'Resources', 'Info.plist') def GetProductType(self): """Returns the PRODUCT_TYPE of this target.""" if self._IsBundle(): return { 'executable': 'com.apple.product-type.application', 'loadable_module': 'com.apple.product-type.bundle', 'shared_library': 'com.apple.product-type.framework', }[self.spec['type']] else: return { 'executable': 'com.apple.product-type.tool', 'loadable_module': 'com.apple.product-type.library.dynamic', 'shared_library': 'com.apple.product-type.library.dynamic', 'static_library': 'com.apple.product-type.library.static', }[self.spec['type']] def GetMachOType(self): """Returns the MACH_O_TYPE of this target.""" # Weird, but matches Xcode. if not self._IsBundle() and self.spec['type'] == 'executable': return '' return { 'executable': 'mh_execute', 'static_library': 'staticlib', 'shared_library': 'mh_dylib', 'loadable_module': 'mh_bundle', }[self.spec['type']] def _GetBundleBinaryPath(self): """Returns the name of the bundle binary of by this target. E.g. Chromium.app/Contents/MacOS/Chromium. 
Only valid for bundles.""" assert self._IsBundle() if self.spec['type'] in ('shared_library') or self.isIOS: path = self.GetBundleContentsFolderPath() elif self.spec['type'] in ('executable', 'loadable_module'): path = os.path.join(self.GetBundleContentsFolderPath(), 'MacOS') return os.path.join(path, self.GetExecutableName()) def _GetStandaloneExecutableSuffix(self): if 'product_extension' in self.spec: return '.' + self.spec['product_extension'] return { 'executable': '', 'static_library': '.a', 'shared_library': '.dylib', 'loadable_module': '.so', }[self.spec['type']] def _GetStandaloneExecutablePrefix(self): return self.spec.get('product_prefix', { 'executable': '', 'static_library': 'lib', 'shared_library': 'lib', # Non-bundled loadable_modules are called foo.so for some reason # (that is, .so and no prefix) with the xcode build -- match that. 'loadable_module': '', }[self.spec['type']]) def _GetStandaloneBinaryPath(self): """Returns the name of the non-bundle binary represented by this target. E.g. hello_world. Only valid for non-bundles.""" assert not self._IsBundle() assert self.spec['type'] in ( 'executable', 'shared_library', 'static_library', 'loadable_module'), ( 'Unexpected type %s' % self.spec['type']) target = self.spec['target_name'] if self.spec['type'] == 'static_library': if target[:3] == 'lib': target = target[3:] elif self.spec['type'] in ('loadable_module', 'shared_library'): if target[:3] == 'lib': target = target[3:] target_prefix = self._GetStandaloneExecutablePrefix() target = self.spec.get('product_name', target) target_ext = self._GetStandaloneExecutableSuffix() return target_prefix + target + target_ext def GetExecutableName(self): """Returns the executable name of the bundle represented by this target. E.g. Chromium.""" if self._IsBundle(): return self.spec.get('product_name', self.spec['target_name']) else: return self._GetStandaloneBinaryPath() def GetExecutablePath(self): """Returns the directory name of the bundle represented by this target. E.g. Chromium.app/Contents/MacOS/Chromium.""" if self._IsBundle(): return self._GetBundleBinaryPath() else: return self._GetStandaloneBinaryPath() def GetActiveArchs(self, configname): """Returns the architectures this target should be built for.""" # TODO: Look at VALID_ARCHS, ONLY_ACTIVE_ARCH; possibly set # CURRENT_ARCH / NATIVE_ARCH env vars? return self.xcode_settings[configname].get('ARCHS', [self._DefaultArch()]) def _GetStdout(self, cmdlist): job = subprocess.Popen(cmdlist, stdout=subprocess.PIPE) out = job.communicate()[0] if job.returncode != 0: sys.stderr.write(out + '\n') raise GypError('Error %d running %s' % (job.returncode, cmdlist[0])) return out.rstrip('\n') def _GetSdkVersionInfoItem(self, sdk, infoitem): # xcodebuild requires Xcode and can't run on Command Line Tools-only # systems from 10.7 onward. # Since the CLT has no SDK paths anyway, returning None is the # most sensible route and should still do the right thing. 
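# Illustration only (the sdk/infoitem values are whatever the caller passes,
# e.g. a hypothetical 'macosx10.8' and 'Path'): the call built below amounts to
#   xcodebuild -version -sdk macosx10.8 Path
# which prints the SDK's filesystem path on a full Xcode install; on a
# CLT-only machine it fails and the bare except returns None.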
try: return self._GetStdout(['xcodebuild', '-version', '-sdk', sdk, infoitem]) except: pass def _SdkRoot(self, configname): if configname is None: configname = self.configname return self.GetPerConfigSetting('SDKROOT', configname, default='') def _SdkPath(self, configname=None): sdk_root = self._SdkRoot(configname) if sdk_root.startswith('/'): return sdk_root return self._XcodeSdkPath(sdk_root) def _XcodeSdkPath(self, sdk_root): if sdk_root not in XcodeSettings._sdk_path_cache: sdk_path = self._GetSdkVersionInfoItem(sdk_root, 'Path') XcodeSettings._sdk_path_cache[sdk_root] = sdk_path if sdk_root: XcodeSettings._sdk_root_cache[sdk_path] = sdk_root return XcodeSettings._sdk_path_cache[sdk_root] def _AppendPlatformVersionMinFlags(self, lst): self._Appendf(lst, 'MACOSX_DEPLOYMENT_TARGET', '-mmacosx-version-min=%s') if 'IPHONEOS_DEPLOYMENT_TARGET' in self._Settings(): # TODO: Implement this better? sdk_path_basename = os.path.basename(self._SdkPath()) if sdk_path_basename.lower().startswith('iphonesimulator'): self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET', '-mios-simulator-version-min=%s') else: self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET', '-miphoneos-version-min=%s') def GetCflags(self, configname, arch=None): """Returns flags that need to be added to .c, .cc, .m, and .mm compilations.""" # This functions (and the similar ones below) do not offer complete # emulation of all xcode_settings keys. They're implemented on demand. self.configname = configname cflags = [] sdk_root = self._SdkPath() if 'SDKROOT' in self._Settings() and sdk_root: cflags.append('-isysroot %s' % sdk_root) if self._Test('CLANG_WARN_CONSTANT_CONVERSION', 'YES', default='NO'): cflags.append('-Wconstant-conversion') if self._Test('GCC_CHAR_IS_UNSIGNED_CHAR', 'YES', default='NO'): cflags.append('-funsigned-char') if self._Test('GCC_CW_ASM_SYNTAX', 'YES', default='YES'): cflags.append('-fasm-blocks') if 'GCC_DYNAMIC_NO_PIC' in self._Settings(): if self._Settings()['GCC_DYNAMIC_NO_PIC'] == 'YES': cflags.append('-mdynamic-no-pic') else: pass # TODO: In this case, it depends on the target. 
xcode passes # mdynamic-no-pic by default for executable and possibly static lib # according to mento if self._Test('GCC_ENABLE_PASCAL_STRINGS', 'YES', default='YES'): cflags.append('-mpascal-strings') self._Appendf(cflags, 'GCC_OPTIMIZATION_LEVEL', '-O%s', default='s') if self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES'): dbg_format = self._Settings().get('DEBUG_INFORMATION_FORMAT', 'dwarf') if dbg_format == 'dwarf': cflags.append('-gdwarf-2') elif dbg_format == 'stabs': raise NotImplementedError('stabs debug format is not supported yet.') elif dbg_format == 'dwarf-with-dsym': cflags.append('-gdwarf-2') else: raise NotImplementedError('Unknown debug format %s' % dbg_format) if self._Settings().get('GCC_STRICT_ALIASING') == 'YES': cflags.append('-fstrict-aliasing') elif self._Settings().get('GCC_STRICT_ALIASING') == 'NO': cflags.append('-fno-strict-aliasing') if self._Test('GCC_SYMBOLS_PRIVATE_EXTERN', 'YES', default='NO'): cflags.append('-fvisibility=hidden') if self._Test('GCC_TREAT_WARNINGS_AS_ERRORS', 'YES', default='NO'): cflags.append('-Werror') if self._Test('GCC_WARN_ABOUT_MISSING_NEWLINE', 'YES', default='NO'): cflags.append('-Wnewline-eof') self._AppendPlatformVersionMinFlags(cflags) # TODO: if self._Test('COPY_PHASE_STRIP', 'YES', default='NO'): self._WarnUnimplemented('COPY_PHASE_STRIP') self._WarnUnimplemented('GCC_DEBUGGING_SYMBOLS') self._WarnUnimplemented('GCC_ENABLE_OBJC_EXCEPTIONS') # TODO: This is exported correctly, but assigning to it is not supported. self._WarnUnimplemented('MACH_O_TYPE') self._WarnUnimplemented('PRODUCT_TYPE') if arch is not None: archs = [arch] else: archs = self._Settings().get('ARCHS', [self._DefaultArch()]) if len(archs) != 1: # TODO: Supporting fat binaries will be annoying. self._WarnUnimplemented('ARCHS') archs = ['i386'] cflags.append('-arch ' + archs[0]) if archs[0] in ('i386', 'x86_64'): if self._Test('GCC_ENABLE_SSE3_EXTENSIONS', 'YES', default='NO'): cflags.append('-msse3') if self._Test('GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS', 'YES', default='NO'): cflags.append('-mssse3') # Note 3rd 's'. if self._Test('GCC_ENABLE_SSE41_EXTENSIONS', 'YES', default='NO'): cflags.append('-msse4.1') if self._Test('GCC_ENABLE_SSE42_EXTENSIONS', 'YES', default='NO'): cflags.append('-msse4.2') cflags += self._Settings().get('WARNING_CFLAGS', []) if sdk_root: framework_root = sdk_root else: framework_root = '' config = self.spec['configurations'][self.configname] framework_dirs = config.get('mac_framework_dirs', []) for directory in framework_dirs: cflags.append('-F' + directory.replace('$(SDKROOT)', framework_root)) self.configname = None return cflags def GetCflagsC(self, configname): """Returns flags that need to be added to .c, and .m compilations.""" self.configname = configname cflags_c = [] if self._Settings().get('GCC_C_LANGUAGE_STANDARD', '') == 'ansi': cflags_c.append('-ansi') else: self._Appendf(cflags_c, 'GCC_C_LANGUAGE_STANDARD', '-std=%s') cflags_c += self._Settings().get('OTHER_CFLAGS', []) self.configname = None return cflags_c def GetCflagsCC(self, configname): """Returns flags that need to be added to .cc, and .mm compilations.""" self.configname = configname cflags_cc = [] clang_cxx_language_standard = self._Settings().get( 'CLANG_CXX_LANGUAGE_STANDARD') # Note: Don't make c++0x to c++11 so that c++0x can be used with older # clangs that don't understand c++11 yet (like Xcode 4.2's). 
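# Illustration: a hypothetical CLANG_CXX_LANGUAGE_STANDARD of 'c++11' becomes
# '-std=c++11' below, while 'c++0x' is emitted verbatim as '-std=c++0x' rather
# than being upgraded, per the note above.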
if clang_cxx_language_standard: cflags_cc.append('-std=%s' % clang_cxx_language_standard) self._Appendf(cflags_cc, 'CLANG_CXX_LIBRARY', '-stdlib=%s') if self._Test('GCC_ENABLE_CPP_RTTI', 'NO', default='YES'): cflags_cc.append('-fno-rtti') if self._Test('GCC_ENABLE_CPP_EXCEPTIONS', 'NO', default='YES'): cflags_cc.append('-fno-exceptions') if self._Test('GCC_INLINES_ARE_PRIVATE_EXTERN', 'YES', default='NO'): cflags_cc.append('-fvisibility-inlines-hidden') if self._Test('GCC_THREADSAFE_STATICS', 'NO', default='YES'): cflags_cc.append('-fno-threadsafe-statics') # Note: This flag is a no-op for clang, it only has an effect for gcc. if self._Test('GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO', 'NO', default='YES'): cflags_cc.append('-Wno-invalid-offsetof') other_ccflags = [] for flag in self._Settings().get('OTHER_CPLUSPLUSFLAGS', ['$(inherited)']): # TODO: More general variable expansion. Missing in many other places too. if flag in ('$inherited', '$(inherited)', '${inherited}'): flag = '$OTHER_CFLAGS' if flag in ('$OTHER_CFLAGS', '$(OTHER_CFLAGS)', '${OTHER_CFLAGS}'): other_ccflags += self._Settings().get('OTHER_CFLAGS', []) else: other_ccflags.append(flag) cflags_cc += other_ccflags self.configname = None return cflags_cc def _AddObjectiveCGarbageCollectionFlags(self, flags): gc_policy = self._Settings().get('GCC_ENABLE_OBJC_GC', 'unsupported') if gc_policy == 'supported': flags.append('-fobjc-gc') elif gc_policy == 'required': flags.append('-fobjc-gc-only') def _AddObjectiveCARCFlags(self, flags): if self._Test('CLANG_ENABLE_OBJC_ARC', 'YES', default='NO'): flags.append('-fobjc-arc') def _AddObjectiveCMissingPropertySynthesisFlags(self, flags): if self._Test('CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS', 'YES', default='NO'): flags.append('-Wobjc-missing-property-synthesis') def GetCflagsObjC(self, configname): """Returns flags that need to be added to .m compilations.""" self.configname = configname cflags_objc = [] self._AddObjectiveCGarbageCollectionFlags(cflags_objc) self._AddObjectiveCARCFlags(cflags_objc) self._AddObjectiveCMissingPropertySynthesisFlags(cflags_objc) self.configname = None return cflags_objc def GetCflagsObjCC(self, configname): """Returns flags that need to be added to .mm compilations.""" self.configname = configname cflags_objcc = [] self._AddObjectiveCGarbageCollectionFlags(cflags_objcc) self._AddObjectiveCARCFlags(cflags_objcc) self._AddObjectiveCMissingPropertySynthesisFlags(cflags_objcc) if self._Test('GCC_OBJC_CALL_CXX_CDTORS', 'YES', default='NO'): cflags_objcc.append('-fobjc-call-cxx-cdtors') self.configname = None return cflags_objcc def GetInstallNameBase(self): """Return DYLIB_INSTALL_NAME_BASE for this target.""" # Xcode sets this for shared_libraries, and for nonbundled loadable_modules. if (self.spec['type'] != 'shared_library' and (self.spec['type'] != 'loadable_module' or self._IsBundle())): return None install_base = self.GetPerTargetSetting( 'DYLIB_INSTALL_NAME_BASE', default='/Library/Frameworks' if self._IsBundle() else '/usr/local/lib') return install_base def _StandardizePath(self, path): """Do :standardizepath processing for path.""" # I'm not quite sure what :standardizepath does. Just call normpath(), # but don't let @executable_path/../foo collapse to foo. 
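# Rough illustration of the intent (paths are hypothetical):
#   '@executable_path/../Frameworks'    -> '@executable_path/../Frameworks' (unchanged)
#   'Library//Frameworks/../Frameworks' -> 'Library/Frameworks'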
if '/' in path: prefix, rest = '', path if path.startswith('@'): prefix, rest = path.split('/', 1) rest = os.path.normpath(rest) # :standardizepath path = os.path.join(prefix, rest) return path def GetInstallName(self): """Return LD_DYLIB_INSTALL_NAME for this target.""" # Xcode sets this for shared_libraries, and for nonbundled loadable_modules. if (self.spec['type'] != 'shared_library' and (self.spec['type'] != 'loadable_module' or self._IsBundle())): return None default_install_name = \ '$(DYLIB_INSTALL_NAME_BASE:standardizepath)/$(EXECUTABLE_PATH)' install_name = self.GetPerTargetSetting( 'LD_DYLIB_INSTALL_NAME', default=default_install_name) # Hardcode support for the variables used in chromium for now, to # unblock people using the make build. if '$' in install_name: assert install_name in ('$(DYLIB_INSTALL_NAME_BASE:standardizepath)/' '$(WRAPPER_NAME)/$(PRODUCT_NAME)', default_install_name), ( 'Variables in LD_DYLIB_INSTALL_NAME are not generally supported ' 'yet in target \'%s\' (got \'%s\')' % (self.spec['target_name'], install_name)) install_name = install_name.replace( '$(DYLIB_INSTALL_NAME_BASE:standardizepath)', self._StandardizePath(self.GetInstallNameBase())) if self._IsBundle(): # These are only valid for bundles, hence the |if|. install_name = install_name.replace( '$(WRAPPER_NAME)', self.GetWrapperName()) install_name = install_name.replace( '$(PRODUCT_NAME)', self.GetProductName()) else: assert '$(WRAPPER_NAME)' not in install_name assert '$(PRODUCT_NAME)' not in install_name install_name = install_name.replace( '$(EXECUTABLE_PATH)', self.GetExecutablePath()) return install_name def _MapLinkerFlagFilename(self, ldflag, gyp_to_build_path): """Checks if ldflag contains a filename and if so remaps it from gyp-directory-relative to build-directory-relative.""" # This list is expanded on demand. # They get matched as: # -exported_symbols_list file # -Wl,exported_symbols_list file # -Wl,exported_symbols_list,file LINKER_FILE = '(\S+)' WORD = '\S+' linker_flags = [ ['-exported_symbols_list', LINKER_FILE], # Needed for NaCl. ['-unexported_symbols_list', LINKER_FILE], ['-reexported_symbols_list', LINKER_FILE], ['-sectcreate', WORD, WORD, LINKER_FILE], # Needed for remoting. ] for flag_pattern in linker_flags: regex = re.compile('(?:-Wl,)?' + '[ ,]'.join(flag_pattern)) m = regex.match(ldflag) if m: ldflag = ldflag[:m.start(1)] + gyp_to_build_path(m.group(1)) + \ ldflag[m.end(1):] # Required for ffmpeg (no idea why they don't use LIBRARY_SEARCH_PATHS, # TODO(thakis): Update ffmpeg.gyp): if ldflag.startswith('-L'): ldflag = '-L' + gyp_to_build_path(ldflag[len('-L'):]) return ldflag def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None): """Returns flags that need to be passed to the linker. Args: configname: The name of the configuration to get ld flags for. product_dir: The directory where products such static and dynamic libraries are placed. This is added to the library search path. gyp_to_build_path: A function that converts paths relative to the current gyp file to paths relative to the build direcotry. """ self.configname = configname ldflags = [] # The xcode build is relative to a gyp file's directory, and OTHER_LDFLAGS # can contain entries that depend on this. Explicitly absolutify these. 
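# For example, a hypothetical '-exported_symbols_list foo.exp' has 'foo.exp'
# rewritten by gyp_to_build_path via _MapLinkerFlagFilename above, and a plain
# '-Lsome/dir' gets the same gyp-to-build-path treatment.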
for ldflag in self._Settings().get('OTHER_LDFLAGS', []): ldflags.append(self._MapLinkerFlagFilename(ldflag, gyp_to_build_path)) if self._Test('DEAD_CODE_STRIPPING', 'YES', default='NO'): ldflags.append('-Wl,-dead_strip') if self._Test('PREBINDING', 'YES', default='NO'): ldflags.append('-Wl,-prebind') self._Appendf( ldflags, 'DYLIB_COMPATIBILITY_VERSION', '-compatibility_version %s') self._Appendf( ldflags, 'DYLIB_CURRENT_VERSION', '-current_version %s') self._AppendPlatformVersionMinFlags(ldflags) if 'SDKROOT' in self._Settings() and self._SdkPath(): ldflags.append('-isysroot ' + self._SdkPath()) for library_path in self._Settings().get('LIBRARY_SEARCH_PATHS', []): ldflags.append('-L' + gyp_to_build_path(library_path)) if 'ORDER_FILE' in self._Settings(): ldflags.append('-Wl,-order_file ' + '-Wl,' + gyp_to_build_path( self._Settings()['ORDER_FILE'])) if arch is not None: archs = [arch] else: archs = self._Settings().get('ARCHS', [self._DefaultArch()]) if len(archs) != 1: # TODO: Supporting fat binaries will be annoying. self._WarnUnimplemented('ARCHS') archs = ['i386'] ldflags.append('-arch ' + archs[0]) # Xcode adds the product directory by default. ldflags.append('-L' + product_dir) install_name = self.GetInstallName() if install_name and self.spec['type'] != 'loadable_module': ldflags.append('-install_name ' + install_name.replace(' ', r'\ ')) for rpath in self._Settings().get('LD_RUNPATH_SEARCH_PATHS', []): ldflags.append('-Wl,-rpath,' + rpath) sdk_root = self._SdkPath() if not sdk_root: sdk_root = '' config = self.spec['configurations'][self.configname] framework_dirs = config.get('mac_framework_dirs', []) for directory in framework_dirs: ldflags.append('-F' + directory.replace('$(SDKROOT)', sdk_root)) self.configname = None return ldflags def GetLibtoolflags(self, configname): """Returns flags that need to be passed to the static linker. Args: configname: The name of the configuration to get ld flags for. """ self.configname = configname libtoolflags = [] for libtoolflag in self._Settings().get('OTHER_LDFLAGS', []): libtoolflags.append(libtoolflag) # TODO(thakis): ARCHS? self.configname = None return libtoolflags def GetPerTargetSettings(self): """Gets a list of all the per-target settings. This will only fetch keys whose values are the same across all configurations.""" first_pass = True result = {} for configname in sorted(self.xcode_settings.keys()): if first_pass: result = dict(self.xcode_settings[configname]) first_pass = False else: for key, value in self.xcode_settings[configname].iteritems(): if key not in result: continue elif result[key] != value: del result[key] return result def GetPerConfigSetting(self, setting, configname, default=None): if configname in self.xcode_settings: return self.xcode_settings[configname].get(setting, default) else: return self.GetPerTargetSetting(setting, default) def GetPerTargetSetting(self, setting, default=None): """Tries to get xcode_settings.setting from spec. 
Assumes that the setting has the same value in all configurations and throws otherwise.""" is_first_pass = True result = None for configname in sorted(self.xcode_settings.keys()): if is_first_pass: result = self.xcode_settings[configname].get(setting, None) is_first_pass = False else: assert result == self.xcode_settings[configname].get(setting, None), ( "Expected per-target setting for '%s', got per-config setting " "(target %s)" % (setting, self.spec['target_name'])) if result is None: return default return result def _GetStripPostbuilds(self, configname, output_binary, quiet): """Returns a list of shell commands that contain the shell commands neccessary to strip this target's binary. These should be run as postbuilds before the actual postbuilds run.""" self.configname = configname result = [] if (self._Test('DEPLOYMENT_POSTPROCESSING', 'YES', default='NO') and self._Test('STRIP_INSTALLED_PRODUCT', 'YES', default='NO')): default_strip_style = 'debugging' if self.spec['type'] == 'loadable_module' and self._IsBundle(): default_strip_style = 'non-global' elif self.spec['type'] == 'executable': default_strip_style = 'all' strip_style = self._Settings().get('STRIP_STYLE', default_strip_style) strip_flags = { 'all': '', 'non-global': '-x', 'debugging': '-S', }[strip_style] explicit_strip_flags = self._Settings().get('STRIPFLAGS', '') if explicit_strip_flags: strip_flags += ' ' + _NormalizeEnvVarReferences(explicit_strip_flags) if not quiet: result.append('echo STRIP\\(%s\\)' % self.spec['target_name']) result.append('strip %s %s' % (strip_flags, output_binary)) self.configname = None return result def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet): """Returns a list of shell commands that contain the shell commands neccessary to massage this target's debug information. These should be run as postbuilds before the actual postbuilds run.""" self.configname = configname # For static libraries, no dSYMs are created. result = [] if (self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES') and self._Test( 'DEBUG_INFORMATION_FORMAT', 'dwarf-with-dsym', default='dwarf') and self.spec['type'] != 'static_library'): if not quiet: result.append('echo DSYMUTIL\\(%s\\)' % self.spec['target_name']) result.append('dsymutil %s -o %s' % (output_binary, output + '.dSYM')) self.configname = None return result def _GetTargetPostbuilds(self, configname, output, output_binary, quiet=False): """Returns a list of shell commands that contain the shell commands to run as postbuilds for this target, before the actual postbuilds.""" # dSYMs need to build before stripping happens. return ( self._GetDebugInfoPostbuilds(configname, output, output_binary, quiet) + self._GetStripPostbuilds(configname, output_binary, quiet)) def _GetIOSPostbuilds(self, configname, output_binary): """Return a shell command to codesign the iOS output binary so it can be deployed to a device. This should be run as the very last step of the build.""" if not (self.isIOS and self.spec['type'] == "executable"): return [] settings = self.xcode_settings[configname] key = self._GetIOSCodeSignIdentityKey(settings) if not key: return [] # Warn for any unimplemented signing xcode keys. 
unimpl = ['OTHER_CODE_SIGN_FLAGS'] unimpl = set(unimpl) & set(self.xcode_settings[configname].keys()) if unimpl: print 'Warning: Some codesign keys not implemented, ignoring: %s' % ( ', '.join(sorted(unimpl))) return ['%s code-sign-bundle "%s" "%s" "%s" "%s"' % ( os.path.join('${TARGET_BUILD_DIR}', 'gyp-mac-tool'), key, settings.get('CODE_SIGN_RESOURCE_RULES_PATH', ''), settings.get('CODE_SIGN_ENTITLEMENTS', ''), settings.get('PROVISIONING_PROFILE', '')) ] def _GetIOSCodeSignIdentityKey(self, settings): identity = settings.get('CODE_SIGN_IDENTITY') if not identity: return None if identity not in XcodeSettings._codesigning_key_cache: output = subprocess.check_output( ['security', 'find-identity', '-p', 'codesigning', '-v']) for line in output.splitlines(): if identity in line: fingerprint = line.split()[1] cache = XcodeSettings._codesigning_key_cache assert identity not in cache or fingerprint == cache[identity], ( "Multiple codesigning fingerprints for identity: %s" % identity) XcodeSettings._codesigning_key_cache[identity] = fingerprint return XcodeSettings._codesigning_key_cache.get(identity, '') def AddImplicitPostbuilds(self, configname, output, output_binary, postbuilds=[], quiet=False): """Returns a list of shell commands that should run before and after |postbuilds|.""" assert output_binary is not None pre = self._GetTargetPostbuilds(configname, output, output_binary, quiet) post = self._GetIOSPostbuilds(configname, output_binary) return pre + postbuilds + post def _AdjustLibrary(self, library, config_name=None): if library.endswith('.framework'): l = '-framework ' + os.path.splitext(os.path.basename(library))[0] else: m = self.library_re.match(library) if m: l = '-l' + m.group(1) else: l = library sdk_root = self._SdkPath(config_name) if not sdk_root: sdk_root = '' return l.replace('$(SDKROOT)', sdk_root) def AdjustLibraries(self, libraries, config_name=None): """Transforms entries like 'Cocoa.framework' in libraries into entries like '-framework Cocoa', 'libcrypto.dylib' into '-lcrypto', etc. """ libraries = [self._AdjustLibrary(library, config_name) for library in libraries] return libraries def _BuildMachineOSBuild(self): return self._GetStdout(['sw_vers', '-buildVersion']) # This method ported from the logic in Homebrew's CLT version check def _CLTVersion(self): # pkgutil output looks like # package-id: com.apple.pkg.CLTools_Executables # version: 5.0.1.0.1.1382131676 # volume: / # location: / # install-time: 1382544035 # groups: com.apple.FindSystemFiles.pkg-group com.apple.DevToolsBoth.pkg-group com.apple.DevToolsNonRelocatableShared.pkg-group STANDALONE_PKG_ID = "com.apple.pkg.DeveloperToolsCLILeo" FROM_XCODE_PKG_ID = "com.apple.pkg.DeveloperToolsCLI" MAVERICKS_PKG_ID = "com.apple.pkg.CLTools_Executables" regex = re.compile('version: (?P<version>.+)') for key in [MAVERICKS_PKG_ID, STANDALONE_PKG_ID, FROM_XCODE_PKG_ID]: try: output = self._GetStdout(['/usr/sbin/pkgutil', '--pkg-info', key]) return re.search(regex, output).groupdict()['version'] except: continue def _XcodeVersion(self): # `xcodebuild -version` output looks like # Xcode 4.6.3 # Build version 4H1503 # or like # Xcode 3.2.6 # Component versions: DevToolsCore-1809.0; DevToolsSupport-1806.0 # BuildVersion: 10M2518 # Convert that to '0463', '4H1503'. 
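# Worked example of the conversion below: 'Xcode 4.6.3' -> last token '4.6.3'
# -> strip dots to '463' -> pad and zero-fill to '0463'; the build line's last
# token (e.g. '4H1503') is kept as-is.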
if len(XcodeSettings._xcode_version_cache) == 0: try: version_list = self._GetStdout(['xcodebuild', '-version']).splitlines() # In some circumstances xcodebuild exits 0 but doesn't return # the right results; for example, a user on 10.7 or 10.8 with # a bogus path set via xcode-select # In that case this may be a CLT-only install so fall back to # checking that version. if len(version_list) < 2: raise GypError, "xcodebuild returned unexpected results" except: version = self._CLTVersion() if version: version = re.match('(\d\.\d\.?\d*)', version).groups()[0] else: raise GypError, "No Xcode or CLT version detected!" # The CLT has no build information, so we return an empty string. version_list = [version, ''] version = version_list[0] build = version_list[-1] # Be careful to convert "4.2" to "0420": version = version.split()[-1].replace('.', '') version = (version + '0' * (3 - len(version))).zfill(4) if build: build = build.split()[-1] XcodeSettings._xcode_version_cache = (version, build) return XcodeSettings._xcode_version_cache def _XcodeIOSDeviceFamily(self, configname): family = self.xcode_settings[configname].get('TARGETED_DEVICE_FAMILY', '1') return [int(x) for x in family.split(',')] def GetExtraPlistItems(self, configname=None): """Returns a dictionary with extra items to insert into Info.plist.""" if configname not in XcodeSettings._plist_cache: cache = {} cache['BuildMachineOSBuild'] = self._BuildMachineOSBuild() xcode, xcode_build = self._XcodeVersion() cache['DTXcode'] = xcode cache['DTXcodeBuild'] = xcode_build sdk_root = self._SdkRoot(configname) if not sdk_root: sdk_root = self._DefaultSdkRoot() cache['DTSDKName'] = sdk_root if xcode >= '0430': cache['DTSDKBuild'] = self._GetSdkVersionInfoItem( sdk_root, 'ProductBuildVersion') else: cache['DTSDKBuild'] = cache['BuildMachineOSBuild'] if self.isIOS: cache['DTPlatformName'] = cache['DTSDKName'] if configname.endswith("iphoneos"): cache['DTPlatformVersion'] = self._GetSdkVersionInfoItem( sdk_root, 'ProductVersion') cache['CFBundleSupportedPlatforms'] = ['iPhoneOS'] else: cache['CFBundleSupportedPlatforms'] = ['iPhoneSimulator'] XcodeSettings._plist_cache[configname] = cache # Include extra plist items that are per-target, not per global # XcodeSettings. items = dict(XcodeSettings._plist_cache[configname]) if self.isIOS: items['UIDeviceFamily'] = self._XcodeIOSDeviceFamily(configname) return items def _DefaultSdkRoot(self): """Returns the default SDKROOT to use. Prior to version 5.0.0, if SDKROOT was not explicitly set in the Xcode project, then the environment variable was empty. Starting with this version, Xcode uses the name of the newest SDK installed. """ if self._XcodeVersion() < '0500': return '' default_sdk_path = self._XcodeSdkPath('') default_sdk_root = XcodeSettings._sdk_root_cache.get(default_sdk_path) if default_sdk_root: return default_sdk_root try: all_sdks = self._GetStdout(['xcodebuild', '-showsdks']) except: # If xcodebuild fails, there will be no valid SDKs return '' for line in all_sdks.splitlines(): items = line.split() if len(items) >= 3 and items[-2] == '-sdk': sdk_root = items[-1] sdk_path = self._XcodeSdkPath(sdk_root) if sdk_path == default_sdk_path: return sdk_root return '' def _DefaultArch(self): # For Mac projects, Xcode changed the default value used when ARCHS is not # set from "i386" to "x86_64". # # For iOS projects, if ARCHS is unset, it defaults to "armv7 armv7s" when # building for a device, and the simulator binaries are always build for # "i386". 
# # For new projects, ARCHS is set to $(ARCHS_STANDARD_INCLUDING_64_BIT), # which correspond to "armv7 armv7s arm64", and when building the simulator # the architecture is either "i386" or "x86_64" depending on the simulated # device (respectively 32-bit or 64-bit device). # # Since the value returned by this function is only used when ARCHS is not # set, then on iOS we return "i386", as the default xcode project generator # does not set ARCHS if it is not set in the .gyp file. if self.isIOS: return 'i386' version, build = self._XcodeVersion() if version >= '0500': return 'x86_64' return 'i386' class MacPrefixHeader(object): """A class that helps with emulating Xcode's GCC_PREFIX_HEADER feature. This feature consists of several pieces: * If GCC_PREFIX_HEADER is present, all compilations in that project get an additional |-include path_to_prefix_header| cflag. * If GCC_PRECOMPILE_PREFIX_HEADER is present too, then the prefix header is instead compiled, and all other compilations in the project get an additional |-include path_to_compiled_header| instead. + Compiled prefix headers have the extension gch. There is one gch file for every language used in the project (c, cc, m, mm), since gch files for different languages aren't compatible. + gch files themselves are built with the target's normal cflags, but they obviously don't get the |-include| flag. Instead, they need a -x flag that describes their language. + All o files in the target need to depend on the gch file, to make sure it's built before any o file is built. This class helps with some of these tasks, but it needs help from the build system for writing dependencies to the gch files, for writing build commands for the gch files, and for figuring out the location of the gch files. """ def __init__(self, xcode_settings, gyp_path_to_build_path, gyp_path_to_build_output): """If xcode_settings is None, all methods on this class are no-ops. Args: gyp_path_to_build_path: A function that takes a gyp-relative path, and returns a path relative to the build directory. gyp_path_to_build_output: A function that takes a gyp-relative path and a language code ('c', 'cc', 'm', or 'mm'), and that returns a path to where the output of precompiling that path for that language should be placed (without the trailing '.gch'). """ # This doesn't support per-configuration prefix headers. Good enough # for now. self.header = None self.compile_headers = False if xcode_settings: self.header = xcode_settings.GetPerTargetSetting('GCC_PREFIX_HEADER') self.compile_headers = xcode_settings.GetPerTargetSetting( 'GCC_PRECOMPILE_PREFIX_HEADER', default='NO') != 'NO' self.compiled_headers = {} if self.header: if self.compile_headers: for lang in ['c', 'cc', 'm', 'mm']: self.compiled_headers[lang] = gyp_path_to_build_output( self.header, lang) self.header = gyp_path_to_build_path(self.header) def _CompiledHeader(self, lang, arch): assert self.compile_headers h = self.compiled_headers[lang] if arch: h += '.' 
+ arch return h def GetInclude(self, lang, arch=None): """Gets the cflags to include the prefix header for language |lang|.""" if self.compile_headers and lang in self.compiled_headers: return '-include %s' % self._CompiledHeader(lang, arch) elif self.header: return '-include %s' % self.header else: return '' def _Gch(self, lang, arch): """Returns the actual file name of the prefix header for language |lang|.""" assert self.compile_headers return self._CompiledHeader(lang, arch) + '.gch' def GetObjDependencies(self, sources, objs, arch=None): """Given a list of source files and the corresponding object files, returns a list of (source, object, gch) tuples, where |gch| is the build-directory relative path to the gch file each object file depends on. |compilable[i]| has to be the source file belonging to |objs[i]|.""" if not self.header or not self.compile_headers: return [] result = [] for source, obj in zip(sources, objs): ext = os.path.splitext(source)[1] lang = { '.c': 'c', '.cpp': 'cc', '.cc': 'cc', '.cxx': 'cc', '.m': 'm', '.mm': 'mm', }.get(ext, None) if lang: result.append((source, obj, self._Gch(lang, arch))) return result def GetPchBuildCommands(self, arch=None): """Returns [(path_to_gch, language_flag, language, header)]. |path_to_gch| and |header| are relative to the build directory. """ if not self.header or not self.compile_headers: return [] return [ (self._Gch('c', arch), '-x c-header', 'c', self.header), (self._Gch('cc', arch), '-x c++-header', 'cc', self.header), (self._Gch('m', arch), '-x objective-c-header', 'm', self.header), (self._Gch('mm', arch), '-x objective-c++-header', 'mm', self.header), ] def MergeGlobalXcodeSettingsToSpec(global_dict, spec): """Merges the global xcode_settings dictionary into each configuration of the target represented by spec. For keys that are both in the global and the local xcode_settings dict, the local key gets precendence. """ # The xcode generator special-cases global xcode_settings and does something # that amounts to merging in the global xcode_settings into each local # xcode_settings dict. global_xcode_settings = global_dict.get('xcode_settings', {}) for config in spec['configurations'].values(): if 'xcode_settings' in config: new_settings = global_xcode_settings.copy() new_settings.update(config['xcode_settings']) config['xcode_settings'] = new_settings def IsMacBundle(flavor, spec): """Returns if |spec| should be treated as a bundle. Bundles are directories with a certain subdirectory structure, instead of just a single file. Bundle rules do not produce a binary but also package resources into that directory.""" is_mac_bundle = (int(spec.get('mac_bundle', 0)) != 0 and flavor == 'mac') if is_mac_bundle: assert spec['type'] != 'none', ( 'mac_bundle targets cannot have type none (target "%s")' % spec['target_name']) return is_mac_bundle def GetMacBundleResources(product_dir, xcode_settings, resources): """Yields (output, resource) pairs for every resource in |resources|. Only call this for mac bundle targets. Args: product_dir: Path to the directory containing the output bundle, relative to the build directory. xcode_settings: The XcodeSettings of the current target. resources: A list of bundle resources, relative to the build directory. """ dest = os.path.join(product_dir, xcode_settings.GetBundleResourceFolder()) for res in resources: output = dest # The make generator doesn't support it, so forbid it everywhere # to keep the generators more interchangable. 
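# Illustrative mappings produced below (resource names are hypothetical):
#   'en.lproj/InfoPlist.strings' -> '<resource dir>/en.lproj/InfoPlist.strings'
#   'MainMenu.xib'               -> '<resource dir>/MainMenu.nib'
#   'Main.storyboard'            -> '<resource dir>/Main.storyboardc'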
assert ' ' not in res, ( "Spaces in resource filenames not supported (%s)" % res) # Split into (path,file). res_parts = os.path.split(res) # Now split the path into (prefix,maybe.lproj). lproj_parts = os.path.split(res_parts[0]) # If the resource lives in a .lproj bundle, add that to the destination. if lproj_parts[1].endswith('.lproj'): output = os.path.join(output, lproj_parts[1]) output = os.path.join(output, res_parts[1]) # Compiled XIB files are referred to by .nib. if output.endswith('.xib'): output = os.path.splitext(output)[0] + '.nib' # Compiled storyboard files are referred to by .storyboardc. if output.endswith('.storyboard'): output = os.path.splitext(output)[0] + '.storyboardc' yield output, res def GetMacInfoPlist(product_dir, xcode_settings, gyp_path_to_build_path): """Returns (info_plist, dest_plist, defines, extra_env), where: * |info_plist| is the source plist path, relative to the build directory, * |dest_plist| is the destination plist path, relative to the build directory, * |defines| is a list of preprocessor defines (empty if the plist shouldn't be preprocessed, * |extra_env| is a dict of env variables that should be exported when invoking |mac_tool copy-info-plist|. Only call this for mac bundle targets. Args: product_dir: Path to the directory containing the output bundle, relative to the build directory. xcode_settings: The XcodeSettings of the current target. gyp_to_build_path: A function that converts paths relative to the current gyp file to paths relative to the build direcotry. """ info_plist = xcode_settings.GetPerTargetSetting('INFOPLIST_FILE') if not info_plist: return None, None, [], {} # The make generator doesn't support it, so forbid it everywhere # to keep the generators more interchangable. assert ' ' not in info_plist, ( "Spaces in Info.plist filenames not supported (%s)" % info_plist) info_plist = gyp_path_to_build_path(info_plist) # If explicitly set to preprocess the plist, invoke the C preprocessor and # specify any defines as -D flags. if xcode_settings.GetPerTargetSetting( 'INFOPLIST_PREPROCESS', default='NO') == 'YES': # Create an intermediate file based on the path. defines = shlex.split(xcode_settings.GetPerTargetSetting( 'INFOPLIST_PREPROCESSOR_DEFINITIONS', default='')) else: defines = [] dest_plist = os.path.join(product_dir, xcode_settings.GetBundlePlistPath()) extra_env = xcode_settings.GetPerTargetSettings() return info_plist, dest_plist, defines, extra_env def _GetXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration, additional_settings=None): """Return the environment variables that Xcode would set. See http://developer.apple.com/library/mac/#documentation/DeveloperTools/Reference/XcodeBuildSettingRef/1-Build_Setting_Reference/build_setting_ref.html#//apple_ref/doc/uid/TP40003931-CH3-SW153 for a full list. Args: xcode_settings: An XcodeSettings object. If this is None, this function returns an empty dict. built_products_dir: Absolute path to the built products dir. srcroot: Absolute path to the source root. configuration: The build configuration name. additional_settings: An optional dict with more values to add to the result. """ if not xcode_settings: return {} # This function is considered a friend of XcodeSettings, so let it reach into # its implementation details. spec = xcode_settings.spec # These are filled in on a as-needed basis. 
env = { 'BUILT_PRODUCTS_DIR' : built_products_dir, 'CONFIGURATION' : configuration, 'PRODUCT_NAME' : xcode_settings.GetProductName(), # See /Developer/Platforms/MacOSX.platform/Developer/Library/Xcode/Specifications/MacOSX\ Product\ Types.xcspec for FULL_PRODUCT_NAME 'SRCROOT' : srcroot, 'SOURCE_ROOT': '${SRCROOT}', # This is not true for static libraries, but currently the env is only # written for bundles: 'TARGET_BUILD_DIR' : built_products_dir, 'TEMP_DIR' : '${TMPDIR}', } if xcode_settings.GetPerConfigSetting('SDKROOT', configuration): env['SDKROOT'] = xcode_settings._SdkPath(configuration) else: env['SDKROOT'] = '' if spec['type'] in ( 'executable', 'static_library', 'shared_library', 'loadable_module'): env['EXECUTABLE_NAME'] = xcode_settings.GetExecutableName() env['EXECUTABLE_PATH'] = xcode_settings.GetExecutablePath() env['FULL_PRODUCT_NAME'] = xcode_settings.GetFullProductName() mach_o_type = xcode_settings.GetMachOType() if mach_o_type: env['MACH_O_TYPE'] = mach_o_type env['PRODUCT_TYPE'] = xcode_settings.GetProductType() if xcode_settings._IsBundle(): env['CONTENTS_FOLDER_PATH'] = \ xcode_settings.GetBundleContentsFolderPath() env['UNLOCALIZED_RESOURCES_FOLDER_PATH'] = \ xcode_settings.GetBundleResourceFolder() env['INFOPLIST_PATH'] = xcode_settings.GetBundlePlistPath() env['WRAPPER_NAME'] = xcode_settings.GetWrapperName() install_name = xcode_settings.GetInstallName() if install_name: env['LD_DYLIB_INSTALL_NAME'] = install_name install_name_base = xcode_settings.GetInstallNameBase() if install_name_base: env['DYLIB_INSTALL_NAME_BASE'] = install_name_base if not additional_settings: additional_settings = {} else: # Flatten lists to strings. for k in additional_settings: if not isinstance(additional_settings[k], str): additional_settings[k] = ' '.join(additional_settings[k]) additional_settings.update(env) for k in additional_settings: additional_settings[k] = _NormalizeEnvVarReferences(additional_settings[k]) return additional_settings def _NormalizeEnvVarReferences(str): """Takes a string containing variable references in the form ${FOO}, $(FOO), or $FOO, and returns a string with all variable references in the form ${FOO}. """ # $FOO -> ${FOO} str = re.sub(r'\$([a-zA-Z_][a-zA-Z0-9_]*)', r'${\1}', str) # $(FOO) -> ${FOO} matches = re.findall(r'(\$\(([a-zA-Z0-9\-_]+)\))', str) for match in matches: to_replace, variable = match assert '$(' not in match, '$($(FOO)) variables not supported: ' + match str = str.replace(to_replace, '${' + variable + '}') return str def ExpandEnvVars(string, expansions): """Expands ${VARIABLES}, $(VARIABLES), and $VARIABLES in string per the expansions list. If the variable expands to something that references another variable, this variable is expanded as well if it's in env -- until no variables present in env are left.""" for k, v in reversed(expansions): string = string.replace('${' + k + '}', v) string = string.replace('$(' + k + ')', v) string = string.replace('$' + k, v) return string def _TopologicallySortedEnvVarKeys(env): """Takes a dict |env| whose values are strings that can refer to other keys, for example env['foo'] = '$(bar) and $(baz)'. Returns a list L of all keys of env such that key2 is after key1 in L if env[key2] refers to env[key1]. Throws an Exception in case of dependency cycles. """ # Since environment variables can refer to other variables, the evaluation # order is important. Below is the logic to compute the dependency graph # and sort it. 
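# Tiny illustration with hypothetical keys: for env = {'A': 'x ${B}', 'B': 'y'},
# 'A' depends on 'B', so the order returned below is ['B', 'A'] and expansion
# can proceed front to back.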
regex = re.compile(r'\$\{([a-zA-Z0-9\-_]+)\}') def GetEdges(node): # Use a definition of edges such that user_of_variable -> used_varible. # This happens to be easier in this case, since a variable's # definition contains all variables it references in a single string. # We can then reverse the result of the topological sort at the end. # Since: reverse(topsort(DAG)) = topsort(reverse_edges(DAG)) matches = set([v for v in regex.findall(env[node]) if v in env]) for dependee in matches: assert '${' not in dependee, 'Nested variables not supported: ' + dependee return matches try: # Topologically sort, and then reverse, because we used an edge definition # that's inverted from the expected result of this function (see comment # above). order = gyp.common.TopologicallySorted(env.keys(), GetEdges) order.reverse() return order except gyp.common.CycleError, e: raise GypError( 'Xcode environment variables are cyclically dependent: ' + str(e.nodes)) def GetSortedXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration, additional_settings=None): env = _GetXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration, additional_settings) return [(key, env[key]) for key in _TopologicallySortedEnvVarKeys(env)] def GetSpecPostbuildCommands(spec, quiet=False): """Returns the list of postbuilds explicitly defined on |spec|, in a form executable by a shell.""" postbuilds = [] for postbuild in spec.get('postbuilds', []): if not quiet: postbuilds.append('echo POSTBUILD\\(%s\\) %s' % ( spec['target_name'], postbuild['postbuild_name'])) postbuilds.append(gyp.common.EncodePOSIXShellList(postbuild['action'])) return postbuilds def _HasIOSTarget(targets): """Returns true if any target contains the iOS specific key IPHONEOS_DEPLOYMENT_TARGET.""" for target_dict in targets.values(): for config in target_dict['configurations'].values(): if config.get('xcode_settings', {}).get('IPHONEOS_DEPLOYMENT_TARGET'): return True return False def _AddIOSDeviceConfigurations(targets): """Clone all targets and append -iphoneos to the name. 
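For instance, a hypothetical 'Release' configuration gains a 'Release-iphoneos' sibling whose xcode_settings force ARCHS=['armv7'] and SDKROOT='iphoneos'.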
Configure these targets to build for iOS devices.""" for target_dict in targets.values(): for config_name in target_dict['configurations'].keys(): config = target_dict['configurations'][config_name] new_config_name = config_name + '-iphoneos' new_config_dict = copy.deepcopy(config) if target_dict['toolset'] == 'target': new_config_dict['xcode_settings']['ARCHS'] = ['armv7'] new_config_dict['xcode_settings']['SDKROOT'] = 'iphoneos' target_dict['configurations'][new_config_name] = new_config_dict return targets def CloneConfigurationForDeviceAndEmulator(target_dicts): """If |target_dicts| contains any iOS targets, automatically create -iphoneos targets for iOS device builds.""" if _HasIOSTarget(target_dicts): return _AddIOSDeviceConfigurations(target_dicts) return target_dicts
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xcodeproj_file.py
# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Xcode project file generator. This module is both an Xcode project file generator and a documentation of the Xcode project file format. Knowledge of the project file format was gained based on extensive experience with Xcode, and by making changes to projects in Xcode.app and observing the resultant changes in the associated project files. XCODE PROJECT FILES The generator targets the file format as written by Xcode 3.2 (specifically, 3.2.6), but past experience has taught that the format has not changed significantly in the past several years, and future versions of Xcode are able to read older project files. Xcode project files are "bundled": the project "file" from an end-user's perspective is actually a directory with an ".xcodeproj" extension. The project file from this module's perspective is actually a file inside this directory, always named "project.pbxproj". This file contains a complete description of the project and is all that is needed to use the xcodeproj. Other files contained in the xcodeproj directory are simply used to store per-user settings, such as the state of various UI elements in the Xcode application. The project.pbxproj file is a property list, stored in a format almost identical to the NeXTstep property list format. The file is able to carry Unicode data, and is encoded in UTF-8. The root element in the property list is a dictionary that contains several properties of minimal interest, and two properties of immense interest. The most important property is a dictionary named "objects".
The entire structure of the project is represented by the children of this property. The objects dictionary is keyed by unique 96-bit values represented by 24 uppercase hexadecimal characters. Each value in the objects dictionary is itself a dictionary, describing an individual object. Each object in the dictionary is a member of a class, which is identified by the "isa" property of each object. A variety of classes are represented in a project file. Objects can refer to other objects by ID, using the 24-character hexadecimal object key. A project's objects form a tree, with a root object of class PBXProject at the root. As an example, the PBXProject object serves as parent to an XCConfigurationList object defining the build configurations used in the project, a PBXGroup object serving as a container for all files referenced in the project, and a list of target objects, each of which defines a target in the project. There are several different types of target object, such as PBXNativeTarget and PBXAggregateTarget. In this module, this relationship is expressed by having each target type derive from an abstract base named XCTarget. The project.pbxproj file's root dictionary also contains a property, sibling to the "objects" dictionary, named "rootObject". The value of rootObject is a 24-character object key referring to the root PBXProject object in the objects dictionary. In Xcode, every file used as input to a target or produced as a final product of a target must appear somewhere in the hierarchy rooted at the PBXGroup object referenced by the PBXProject's mainGroup property. A PBXGroup is generally represented as a folder in the Xcode application. PBXGroups can contain other PBXGroups as well as PBXFileReferences, which are pointers to actual files. Each XCTarget contains a list of build phases, represented in this module by the abstract base XCBuildPhase. Examples of concrete XCBuildPhase derivations are PBXSourcesBuildPhase and PBXFrameworksBuildPhase, which correspond to the "Compile Sources" and "Link Binary With Libraries" phases displayed in the Xcode application. Files used as input to these phases (for example, source files in the former case and libraries and frameworks in the latter) are represented by PBXBuildFile objects, referenced by elements of "files" lists in XCTarget objects. Each PBXBuildFile object refers to a PBXFileReference object as a "weak" reference: it does not "own" the PBXFileReference, which is owned by the root object's mainGroup or a descendant group. In most cases, the layer of indirection between an XCBuildPhase and a PBXFileReference via a PBXBuildFile appears extraneous, but there's actually one reason for this: file-specific compiler flags are added to the PBXBuildFile object so as to allow a single file to be a member of multiple targets while having distinct compiler flags for each. These flags can be modified in the Xcode application in the "Build" tab of a File Info window. When a project is open in the Xcode application, Xcode will rewrite it. As such, this module is careful to adhere to the formatting used by Xcode, to avoid insignificant changes appearing in the file when it is used in the Xcode application. This will keep version control repositories happy, and makes it possible to compare a project file used in Xcode to one generated by this module to determine if any significant changes were made in the application. Xcode has its own way of assigning 24-character identifiers to each object, which is not duplicated here.
Because the identifier only is only generated once, when an object is created, and is then left unchanged, there is no need to attempt to duplicate Xcode's behavior in this area. The generator is free to select any identifier, even at random, to refer to the objects it creates, and Xcode will retain those identifiers and use them when subsequently rewriting the project file. However, the generator would choose new random identifiers each time the project files are generated, leading to difficulties comparing "used" project files to "pristine" ones produced by this module, and causing the appearance of changes as every object identifier is changed when updated projects are checked in to a version control repository. To mitigate this problem, this module chooses identifiers in a more deterministic way, by hashing a description of each object as well as its parent and ancestor objects. This strategy should result in minimal "shift" in IDs as successive generations of project files are produced. THIS MODULE This module introduces several classes, all derived from the XCObject class. Nearly all of the "brains" are built into the XCObject class, which understands how to create and modify objects, maintain the proper tree structure, compute identifiers, and print objects. For the most part, classes derived from XCObject need only provide a _schema class object, a dictionary that expresses what properties objects of the class may contain. Given this structure, it's possible to build a minimal project file by creating objects of the appropriate types and making the proper connections: config_list = XCConfigurationList() group = PBXGroup() project = PBXProject({'buildConfigurationList': config_list, 'mainGroup': group}) With the project object set up, it can be added to an XCProjectFile object. XCProjectFile is a pseudo-class in the sense that it is a concrete XCObject subclass that does not actually correspond to a class type found in a project file. Rather, it is used to represent the project file's root dictionary. Printing an XCProjectFile will print the entire project file, including the full "objects" dictionary. project_file = XCProjectFile({'rootObject': project}) project_file.ComputeIDs() project_file.Print() Xcode project files are always encoded in UTF-8. This module will accept strings of either the str class or the unicode class. Strings of class str are assumed to already be encoded in UTF-8. Obviously, if you're just using ASCII, you won't encounter difficulties because ASCII is a UTF-8 subset. Strings of class unicode are handled properly and encoded in UTF-8 when a project file is output. """ import gyp.common import posixpath import re import struct import sys # hashlib is supplied as of Python 2.5 as the replacement interface for sha # and other secure hashes. In 2.6, sha is deprecated. Import hashlib if # available, avoiding a deprecation warning under 2.6. Import sha otherwise, # preserving 2.4 compatibility. try: import hashlib _new_sha1 = hashlib.sha1 except ImportError: import sha _new_sha1 = sha.new # See XCObject._EncodeString. This pattern is used to determine when a string # can be printed unquoted. Strings that match this pattern may be printed # unquoted. Strings that do not match must be quoted and may be further # transformed to be properly encoded. Note that this expression matches the # characters listed with "+", for 1 or more occurrences: if a string is empty, # it must not match this pattern, because it needs to be encoded as "". 
_unquoted = re.compile('^[A-Za-z0-9$./_]+$') # Strings that match this pattern are quoted regardless of what _unquoted says. # Oddly, Xcode will quote any string with a run of three or more underscores. _quoted = re.compile('___') # This pattern should match any character that needs to be escaped by # XCObject._EncodeString. See that function. _escaped = re.compile('[\\\\"]|[\x00-\x1f]') # Used by SourceTreeAndPathFromPath _path_leading_variable = re.compile('^\$\((.*?)\)(/(.*))?$') def SourceTreeAndPathFromPath(input_path): """Given input_path, returns a tuple with sourceTree and path values. Examples: input_path (source_tree, output_path) '$(VAR)/path' ('VAR', 'path') '$(VAR)' ('VAR', None) 'path' (None, 'path') """ source_group_match = _path_leading_variable.match(input_path) if source_group_match: source_tree = source_group_match.group(1) output_path = source_group_match.group(3) # This may be None. else: source_tree = None output_path = input_path return (source_tree, output_path) def ConvertVariablesToShellSyntax(input_string): return re.sub('\$\((.*?)\)', '${\\1}', input_string) class XCObject(object): """The abstract base of all class types used in Xcode project files. Class variables: _schema: A dictionary defining the properties of this class. The keys to _schema are string property keys as used in project files. Values are a list of four or five elements: [ is_list, property_type, is_strong, is_required, default ] is_list: True if the property described is a list, as opposed to a single element. property_type: The type to use as the value of the property, or if is_list is True, the type to use for each element of the value's list. property_type must be an XCObject subclass, or one of the built-in types str, int, or dict. is_strong: If property_type is an XCObject subclass, is_strong is True to assert that this class "owns," or serves as parent, to the property value (or, if is_list is True, values). is_strong must be False if property_type is not an XCObject subclass. is_required: True if the property is required for the class. Note that is_required being True does not preclude an empty string ("", in the case of property_type str) or list ([], in the case of is_list True) from being set for the property. default: Optional. If is_requried is True, default may be set to provide a default value for objects that do not supply their own value. If is_required is True and default is not provided, users of the class must supply their own value for the property. Note that although the values of the array are expressed in boolean terms, subclasses provide values as integers to conserve horizontal space. _should_print_single_line: False in XCObject. Subclasses whose objects should be written to the project file in the alternate single-line format, such as PBXFileReference and PBXBuildFile, should set this to True. _encode_transforms: Used by _EncodeString to encode unprintable characters. The index into this list is the ordinal of the character to transform; each value is a string used to represent the character in the output. XCObject provides an _encode_transforms list suitable for most XCObject subclasses. _alternate_encode_transforms: Provided for subclasses that wish to use the alternate encoding rules. Xcode seems to use these rules when printing objects in single-line format. Subclasses that desire this behavior should set _encode_transforms to _alternate_encode_transforms. _hashables: A list of XCObject subclasses that can be hashed by ComputeIDs to construct this object's ID. 
Most classes that need custom hashing behavior should do it by overriding Hashables, but in some cases an object's parent may wish to push a hashable value into its child, and it can do so by appending to _hashables. Attributes: id: The object's identifier, a 24-character uppercase hexadecimal string. Usually, objects being created should not set id until the entire project file structure is built. At that point, UpdateIDs() should be called on the root object to assign deterministic values for id to each object in the tree. parent: The object's parent. This is set by a parent XCObject when a child object is added to it. _properties: The object's property dictionary. An object's properties are described by its class' _schema variable. """ _schema = {} _should_print_single_line = False # See _EncodeString. _encode_transforms = [] i = 0 while i < ord(' '): _encode_transforms.append('\\U%04x' % i) i = i + 1 _encode_transforms[7] = '\\a' _encode_transforms[8] = '\\b' _encode_transforms[9] = '\\t' _encode_transforms[10] = '\\n' _encode_transforms[11] = '\\v' _encode_transforms[12] = '\\f' _encode_transforms[13] = '\\n' _alternate_encode_transforms = list(_encode_transforms) _alternate_encode_transforms[9] = chr(9) _alternate_encode_transforms[10] = chr(10) _alternate_encode_transforms[11] = chr(11) def __init__(self, properties=None, id=None, parent=None): self.id = id self.parent = parent self._properties = {} self._hashables = [] self._SetDefaultsFromSchema() self.UpdateProperties(properties) def __repr__(self): try: name = self.Name() except NotImplementedError: return '<%s at 0x%x>' % (self.__class__.__name__, id(self)) return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Copy(self): """Make a copy of this object. The new object will have its own copy of lists and dicts. Any XCObject objects owned by this object (marked "strong") will be copied in the new object, even those found in lists. If this object has any weak references to other XCObjects, the same references are added to the new object without making a copy. """ that = self.__class__(id=self.id, parent=self.parent) for key, value in self._properties.iteritems(): is_strong = self._schema[key][2] if isinstance(value, XCObject): if is_strong: new_value = value.Copy() new_value.parent = that that._properties[key] = new_value else: that._properties[key] = value elif isinstance(value, str) or isinstance(value, unicode) or \ isinstance(value, int): that._properties[key] = value elif isinstance(value, list): if is_strong: # If is_strong is True, each element is an XCObject, so it's safe to # call Copy. that._properties[key] = [] for item in value: new_item = item.Copy() new_item.parent = that that._properties[key].append(new_item) else: that._properties[key] = value[:] elif isinstance(value, dict): # dicts are never strong. if is_strong: raise TypeError, 'Strong dict for key ' + key + ' in ' + \ self.__class__.__name__ else: that._properties[key] = value.copy() else: raise TypeError, 'Unexpected type ' + value.__class__.__name__ + \ ' for key ' + key + ' in ' + self.__class__.__name__ return that def Name(self): """Return the name corresponding to an object. Not all objects necessarily need to be nameable, and not all that do have a "name" property. Override as needed. """ # If the schema indicates that "name" is required, try to access the # property even if it doesn't exist. 
This will result in a KeyError # being raised for the property that should be present, which seems more # appropriate than NotImplementedError in this case. if 'name' in self._properties or \ ('name' in self._schema and self._schema['name'][3]): return self._properties['name'] raise NotImplementedError, \ self.__class__.__name__ + ' must implement Name' def Comment(self): """Return a comment string for the object. Most objects just use their name as the comment, but PBXProject uses different values. The returned comment is not escaped and does not have any comment marker strings applied to it. """ return self.Name() def Hashables(self): hashables = [self.__class__.__name__] name = self.Name() if name != None: hashables.append(name) hashables.extend(self._hashables) return hashables def HashablesForChild(self): return None def ComputeIDs(self, recursive=True, overwrite=True, seed_hash=None): """Set "id" properties deterministically. An object's "id" property is set based on a hash of its class type and name, as well as the class type and name of all ancestor objects. As such, it is only advisable to call ComputeIDs once an entire project file tree is built. If recursive is True, recurse into all descendant objects and update their hashes. If overwrite is True, any existing value set in the "id" property will be replaced. """ def _HashUpdate(hash, data): """Update hash with data's length and contents. If the hash were updated only with the value of data, it would be possible for clowns to induce collisions by manipulating the names of their objects. By adding the length, it's exceedingly less likely that ID collisions will be encountered, intentionally or not. """ hash.update(struct.pack('>i', len(data))) hash.update(data) if seed_hash is None: seed_hash = _new_sha1() hash = seed_hash.copy() hashables = self.Hashables() assert len(hashables) > 0 for hashable in hashables: _HashUpdate(hash, hashable) if recursive: hashables_for_child = self.HashablesForChild() if hashables_for_child is None: child_hash = hash else: assert len(hashables_for_child) > 0 child_hash = seed_hash.copy() for hashable in hashables_for_child: _HashUpdate(child_hash, hashable) for child in self.Children(): child.ComputeIDs(recursive, overwrite, child_hash) if overwrite or self.id is None: # Xcode IDs are only 96 bits (24 hex characters), but a SHA-1 digest is # is 160 bits. Instead of throwing out 64 bits of the digest, xor them # into the portion that gets used. assert hash.digest_size % 4 == 0 digest_int_count = hash.digest_size / 4 digest_ints = struct.unpack('>' + 'I' * digest_int_count, hash.digest()) id_ints = [0, 0, 0] for index in xrange(0, digest_int_count): id_ints[index % 3] ^= digest_ints[index] self.id = '%08X%08X%08X' % tuple(id_ints) def EnsureNoIDCollisions(self): """Verifies that no two objects have the same ID. Checks all descendants. 
""" ids = {} descendants = self.Descendants() for descendant in descendants: if descendant.id in ids: other = ids[descendant.id] raise KeyError, \ 'Duplicate ID %s, objects "%s" and "%s" in "%s"' % \ (descendant.id, str(descendant._properties), str(other._properties), self._properties['rootObject'].Name()) ids[descendant.id] = descendant def Children(self): """Returns a list of all of this object's owned (strong) children.""" children = [] for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong) = attributes[0:3] if is_strong and property in self._properties: if not is_list: children.append(self._properties[property]) else: children.extend(self._properties[property]) return children def Descendants(self): """Returns a list of all of this object's descendants, including this object. """ children = self.Children() descendants = [self] for child in children: descendants.extend(child.Descendants()) return descendants def PBXProjectAncestor(self): # The base case for recursion is defined at PBXProject.PBXProjectAncestor. if self.parent: return self.parent.PBXProjectAncestor() return None def _EncodeComment(self, comment): """Encodes a comment to be placed in the project file output, mimicing Xcode behavior. """ # This mimics Xcode behavior by wrapping the comment in "/*" and "*/". If # the string already contains a "*/", it is turned into "(*)/". This keeps # the file writer from outputting something that would be treated as the # end of a comment in the middle of something intended to be entirely a # comment. return '/* ' + comment.replace('*/', '(*)/') + ' */' def _EncodeTransform(self, match): # This function works closely with _EncodeString. It will only be called # by re.sub with match.group(0) containing a character matched by the # the _escaped expression. char = match.group(0) # Backslashes (\) and quotation marks (") are always replaced with a # backslash-escaped version of the same. Everything else gets its # replacement from the class' _encode_transforms array. if char == '\\': return '\\\\' if char == '"': return '\\"' return self._encode_transforms[ord(char)] def _EncodeString(self, value): """Encodes a string to be placed in the project file output, mimicing Xcode behavior. """ # Use quotation marks when any character outside of the range A-Z, a-z, 0-9, # $ (dollar sign), . (period), and _ (underscore) is present. Also use # quotation marks to represent empty strings. # # Escape " (double-quote) and \ (backslash) by preceding them with a # backslash. # # Some characters below the printable ASCII range are encoded specially: # 7 ^G BEL is encoded as "\a" # 8 ^H BS is encoded as "\b" # 11 ^K VT is encoded as "\v" # 12 ^L NP is encoded as "\f" # 127 ^? DEL is passed through as-is without escaping # - In PBXFileReference and PBXBuildFile objects: # 9 ^I HT is passed through as-is without escaping # 10 ^J NL is passed through as-is without escaping # 13 ^M CR is passed through as-is without escaping # - In other objects: # 9 ^I HT is encoded as "\t" # 10 ^J NL is encoded as "\n" # 13 ^M CR is encoded as "\n" rendering it indistinguishable from # 10 ^J NL # All other characters within the ASCII control character range (0 through # 31 inclusive) are encoded as "\U001f" referring to the Unicode code point # in hexadecimal. For example, character 14 (^N SO) is encoded as "\U000e". # Characters above the ASCII range are passed through to the output encoded # as UTF-8 without any escaping. These mappings are contained in the # class' _encode_transforms list. 
if _unquoted.search(value) and not _quoted.search(value): return value return '"' + _escaped.sub(self._EncodeTransform, value) + '"' def _XCPrint(self, file, tabs, line): file.write('\t' * tabs + line) def _XCPrintableValue(self, tabs, value, flatten_list=False): """Returns a representation of value that may be printed in a project file, mimicing Xcode's behavior. _XCPrintableValue can handle str and int values, XCObjects (which are made printable by returning their id property), and list and dict objects composed of any of the above types. When printing a list or dict, and _should_print_single_line is False, the tabs parameter is used to determine how much to indent the lines corresponding to the items in the list or dict. If flatten_list is True, single-element lists will be transformed into strings. """ printable = '' comment = None if self._should_print_single_line: sep = ' ' element_tabs = '' end_tabs = '' else: sep = '\n' element_tabs = '\t' * (tabs + 1) end_tabs = '\t' * tabs if isinstance(value, XCObject): printable += value.id comment = value.Comment() elif isinstance(value, str): printable += self._EncodeString(value) elif isinstance(value, unicode): printable += self._EncodeString(value.encode('utf-8')) elif isinstance(value, int): printable += str(value) elif isinstance(value, list): if flatten_list and len(value) <= 1: if len(value) == 0: printable += self._EncodeString('') else: printable += self._EncodeString(value[0]) else: printable = '(' + sep for item in value: printable += element_tabs + \ self._XCPrintableValue(tabs + 1, item, flatten_list) + \ ',' + sep printable += end_tabs + ')' elif isinstance(value, dict): printable = '{' + sep for item_key, item_value in sorted(value.iteritems()): printable += element_tabs + \ self._XCPrintableValue(tabs + 1, item_key, flatten_list) + ' = ' + \ self._XCPrintableValue(tabs + 1, item_value, flatten_list) + ';' + \ sep printable += end_tabs + '}' else: raise TypeError, "Can't make " + value.__class__.__name__ + ' printable' if comment != None: printable += ' ' + self._EncodeComment(comment) return printable def _XCKVPrint(self, file, tabs, key, value): """Prints a key and value, members of an XCObject's _properties dictionary, to file. tabs is an int identifying the indentation level. If the class' _should_print_single_line variable is True, tabs is ignored and the key-value pair will be followed by a space insead of a newline. """ if self._should_print_single_line: printable = '' after_kv = ' ' else: printable = '\t' * tabs after_kv = '\n' # Xcode usually prints remoteGlobalIDString values in PBXContainerItemProxy # objects without comments. Sometimes it prints them with comments, but # the majority of the time, it doesn't. To avoid unnecessary changes to # the project file after Xcode opens it, don't write comments for # remoteGlobalIDString. This is a sucky hack and it would certainly be # cleaner to extend the schema to indicate whether or not a comment should # be printed, but since this is the only case where the problem occurs and # Xcode itself can't seem to make up its mind, the hack will suffice. # # Also see PBXContainerItemProxy._schema['remoteGlobalIDString']. if key == 'remoteGlobalIDString' and isinstance(self, PBXContainerItemProxy): value_to_print = value.id else: value_to_print = value # PBXBuildFile's settings property is represented in the output as a dict, # but a hack here has it represented as a string. Arrange to strip off the # quotes so that it shows up in the output as expected. 
if key == 'settings' and isinstance(self, PBXBuildFile): strip_value_quotes = True else: strip_value_quotes = False # In another one-off, let's set flatten_list on buildSettings properties # of XCBuildConfiguration objects, because that's how Xcode treats them. if key == 'buildSettings' and isinstance(self, XCBuildConfiguration): flatten_list = True else: flatten_list = False try: printable_key = self._XCPrintableValue(tabs, key, flatten_list) printable_value = self._XCPrintableValue(tabs, value_to_print, flatten_list) if strip_value_quotes and len(printable_value) > 1 and \ printable_value[0] == '"' and printable_value[-1] == '"': printable_value = printable_value[1:-1] printable += printable_key + ' = ' + printable_value + ';' + after_kv except TypeError, e: gyp.common.ExceptionAppend(e, 'while printing key "%s"' % key) raise self._XCPrint(file, 0, printable) def Print(self, file=sys.stdout): """Prints a reprentation of this object to file, adhering to Xcode output formatting. """ self.VerifyHasRequiredProperties() if self._should_print_single_line: # When printing an object in a single line, Xcode doesn't put any space # between the beginning of a dictionary (or presumably a list) and the # first contained item, so you wind up with snippets like # ...CDEF = {isa = PBXFileReference; fileRef = 0123... # If it were me, I would have put a space in there after the opening # curly, but I guess this is just another one of those inconsistencies # between how Xcode prints PBXFileReference and PBXBuildFile objects as # compared to other objects. Mimic Xcode's behavior here by using an # empty string for sep. sep = '' end_tabs = 0 else: sep = '\n' end_tabs = 2 # Start the object. For example, '\t\tPBXProject = {\n'. self._XCPrint(file, 2, self._XCPrintableValue(2, self) + ' = {' + sep) # "isa" isn't in the _properties dictionary, it's an intrinsic property # of the class which the object belongs to. Xcode always outputs "isa" # as the first element of an object dictionary. self._XCKVPrint(file, 3, 'isa', self.__class__.__name__) # The remaining elements of an object dictionary are sorted alphabetically. for property, value in sorted(self._properties.iteritems()): self._XCKVPrint(file, 3, property, value) # End the object. self._XCPrint(file, end_tabs, '};\n') def UpdateProperties(self, properties, do_copy=False): """Merge the supplied properties into the _properties dictionary. The input properties must adhere to the class schema or a KeyError or TypeError exception will be raised. If adding an object of an XCObject subclass and the schema indicates a strong relationship, the object's parent will be set to this object. If do_copy is True, then lists, dicts, strong-owned XCObjects, and strong-owned XCObjects in lists will be copied instead of having their references added. """ if properties is None: return for property, value in properties.iteritems(): # Make sure the property is in the schema. if not property in self._schema: raise KeyError, property + ' not in ' + self.__class__.__name__ # Make sure the property conforms to the schema. (is_list, property_type, is_strong) = self._schema[property][0:3] if is_list: if value.__class__ != list: raise TypeError, \ property + ' of ' + self.__class__.__name__ + \ ' must be list, not ' + value.__class__.__name__ for item in value: if not isinstance(item, property_type) and \ not (item.__class__ == unicode and property_type == str): # Accept unicode where str is specified. str is treated as # UTF-8-encoded. 
raise TypeError, \ 'item of ' + property + ' of ' + self.__class__.__name__ + \ ' must be ' + property_type.__name__ + ', not ' + \ item.__class__.__name__ elif not isinstance(value, property_type) and \ not (value.__class__ == unicode and property_type == str): # Accept unicode where str is specified. str is treated as # UTF-8-encoded. raise TypeError, \ property + ' of ' + self.__class__.__name__ + ' must be ' + \ property_type.__name__ + ', not ' + value.__class__.__name__ # Checks passed, perform the assignment. if do_copy: if isinstance(value, XCObject): if is_strong: self._properties[property] = value.Copy() else: self._properties[property] = value elif isinstance(value, str) or isinstance(value, unicode) or \ isinstance(value, int): self._properties[property] = value elif isinstance(value, list): if is_strong: # If is_strong is True, each element is an XCObject, so it's safe # to call Copy. self._properties[property] = [] for item in value: self._properties[property].append(item.Copy()) else: self._properties[property] = value[:] elif isinstance(value, dict): self._properties[property] = value.copy() else: raise TypeError, "Don't know how to copy a " + \ value.__class__.__name__ + ' object for ' + \ property + ' in ' + self.__class__.__name__ else: self._properties[property] = value # Set up the child's back-reference to this object. Don't use |value| # any more because it may not be right if do_copy is true. if is_strong: if not is_list: self._properties[property].parent = self else: for item in self._properties[property]: item.parent = self def HasProperty(self, key): return key in self._properties def GetProperty(self, key): return self._properties[key] def SetProperty(self, key, value): self.UpdateProperties({key: value}) def DelProperty(self, key): if key in self._properties: del self._properties[key] def AppendProperty(self, key, value): # TODO(mark): Support ExtendProperty too (and make this call that)? # Schema validation. if not key in self._schema: raise KeyError, key + ' not in ' + self.__class__.__name__ (is_list, property_type, is_strong) = self._schema[key][0:3] if not is_list: raise TypeError, key + ' of ' + self.__class__.__name__ + ' must be list' if not isinstance(value, property_type): raise TypeError, 'item of ' + key + ' of ' + self.__class__.__name__ + \ ' must be ' + property_type.__name__ + ', not ' + \ value.__class__.__name__ # If the property doesn't exist yet, create a new empty list to receive the # item. if not key in self._properties: self._properties[key] = [] # Set up the ownership link. if is_strong: value.parent = self # Store the item. self._properties[key].append(value) def VerifyHasRequiredProperties(self): """Ensure that all properties identified as required by the schema are set. """ # TODO(mark): A stronger verification mechanism is needed. Some # subclasses need to perform validation beyond what the schema can enforce. for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong, is_required) = attributes[0:4] if is_required and not property in self._properties: raise KeyError, self.__class__.__name__ + ' requires ' + property def _SetDefaultsFromSchema(self): """Assign object default values according to the schema. 
This will not overwrite properties that have already been set.""" defaults = {} for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong, is_required) = attributes[0:4] if is_required and len(attributes) >= 5 and \ not property in self._properties: default = attributes[4] defaults[property] = default if len(defaults) > 0: # Use do_copy=True so that each new object gets its own copy of strong # objects, lists, and dicts. self.UpdateProperties(defaults, do_copy=True) class XCHierarchicalElement(XCObject): """Abstract base for PBXGroup and PBXFileReference. Not represented in a project file.""" # TODO(mark): Do name and path belong here? Probably so. # If path is set and name is not, name may have a default value. Name will # be set to the basename of path, if the basename of path is different from # the full value of path. If path is already just a leaf name, name will # not be set. _schema = XCObject._schema.copy() _schema.update({ 'comments': [0, str, 0, 0], 'fileEncoding': [0, str, 0, 0], 'includeInIndex': [0, int, 0, 0], 'indentWidth': [0, int, 0, 0], 'lineEnding': [0, int, 0, 0], 'sourceTree': [0, str, 0, 1, '<group>'], 'tabWidth': [0, int, 0, 0], 'usesTabs': [0, int, 0, 0], 'wrapsLines': [0, int, 0, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCObject.__init__(self, properties, id, parent) if 'path' in self._properties and not 'name' in self._properties: path = self._properties['path'] name = posixpath.basename(path) if name != '' and path != name: self.SetProperty('name', name) if 'path' in self._properties and \ (not 'sourceTree' in self._properties or \ self._properties['sourceTree'] == '<group>'): # If the pathname begins with an Xcode variable like "$(SDKROOT)/", take # the variable out and make the path be relative to that variable by # assigning the variable name as the sourceTree. (source_tree, path) = SourceTreeAndPathFromPath(self._properties['path']) if source_tree != None: self._properties['sourceTree'] = source_tree if path != None: self._properties['path'] = path if source_tree != None and path is None and \ not 'name' in self._properties: # The path was of the form "$(SDKROOT)" with no path following it. # This object is now relative to that variable, so it has no path # attribute of its own. It does, however, keep a name. del self._properties['path'] self._properties['name'] = source_tree def Name(self): if 'name' in self._properties: return self._properties['name'] elif 'path' in self._properties: return self._properties['path'] else: # This happens in the case of the root PBXGroup. return None def Hashables(self): """Custom hashables for XCHierarchicalElements. XCHierarchicalElements are special. Generally, their hashes shouldn't change if the paths don't change. The normal XCObject implementation of Hashables adds a hashable for each object, which means that if the hierarchical structure changes (possibly due to changes caused when TakeOverOnlyChild runs and encounters slight changes in the hierarchy), the hashes will change. For example, if a project file initially contains a/b/f1 and a/b becomes collapsed into a/b, f1 will have a single parent a/b. If someone later adds a/f2 to the project file, a/b can no longer be collapsed, and f1 winds up with parent b and grandparent a. That would be sufficient to change f1's hash. To counteract this problem, hashables for all XCHierarchicalElements except for the main group (which has neither a name nor a path) are taken to be just the set of path components. 
Because hashables are inherited from parents, this provides assurance that a/b/f1 has the same set of hashables whether its parent is b or a/b. The main group is a special case. As it is permitted to have no name or path, it is permitted to use the standard XCObject hash mechanism. This is not considered a problem because there can be only one main group. """ if self == self.PBXProjectAncestor()._properties['mainGroup']: # super return XCObject.Hashables(self) hashables = [] # Put the name in first, ensuring that if TakeOverOnlyChild collapses # children into a top-level group like "Source", the name always goes # into the list of hashables without interfering with path components. if 'name' in self._properties: # Make it less likely for people to manipulate hashes by following the # pattern of always pushing an object type value onto the list first. hashables.append(self.__class__.__name__ + '.name') hashables.append(self._properties['name']) # NOTE: This still has the problem that if an absolute path is encountered, # including paths with a sourceTree, they'll still inherit their parents' # hashables, even though the paths aren't relative to their parents. This # is not expected to be much of a problem in practice. path = self.PathFromSourceTreeAndPath() if path != None: components = path.split(posixpath.sep) for component in components: hashables.append(self.__class__.__name__ + '.path') hashables.append(component) hashables.extend(self._hashables) return hashables def Compare(self, other): # Allow comparison of these types. PBXGroup has the highest sort rank; # PBXVariantGroup is treated as equal to PBXFileReference. valid_class_types = { PBXFileReference: 'file', PBXGroup: 'group', PBXVariantGroup: 'file', } self_type = valid_class_types[self.__class__] other_type = valid_class_types[other.__class__] if self_type == other_type: # If the two objects are of the same sort rank, compare their names. return cmp(self.Name(), other.Name()) # Otherwise, sort groups before everything else. if self_type == 'group': return -1 return 1 def CompareRootGroup(self, other): # This function should be used only to compare direct children of the # containing PBXProject's mainGroup. These groups should appear in the # listed order. # TODO(mark): "Build" is used by gyp.generator.xcode, perhaps the # generator should have a way of influencing this list rather than having # to hardcode for the generator here. order = ['Source', 'Intermediates', 'Projects', 'Frameworks', 'Products', 'Build'] # If the groups aren't in the listed order, do a name comparison. # Otherwise, groups in the listed order should come before those that # aren't. self_name = self.Name() other_name = other.Name() self_in = isinstance(self, PBXGroup) and self_name in order other_in = isinstance(self, PBXGroup) and other_name in order if not self_in and not other_in: return self.Compare(other) if self_name in order and not other_name in order: return -1 if other_name in order and not self_name in order: return 1 # If both groups are in the listed order, go by the defined order. self_index = order.index(self_name) other_index = order.index(other_name) if self_index < other_index: return -1 if self_index > other_index: return 1 return 0 def PathFromSourceTreeAndPath(self): # Turn the object's sourceTree and path properties into a single flat # string of a form comparable to the path parameter. If there's a # sourceTree property other than "<group>", wrap it in $(...) for the # comparison. 
components = [] if self._properties['sourceTree'] != '<group>': components.append('$(' + self._properties['sourceTree'] + ')') if 'path' in self._properties: components.append(self._properties['path']) if len(components) > 0: return posixpath.join(*components) return None def FullPath(self): # Returns a full path to self relative to the project file, or relative # to some other source tree. Start with self, and walk up the chain of # parents prepending their paths, if any, until no more parents are # available (project-relative path) or until a path relative to some # source tree is found. xche = self path = None while isinstance(xche, XCHierarchicalElement) and \ (path is None or \ (not path.startswith('/') and not path.startswith('$'))): this_path = xche.PathFromSourceTreeAndPath() if this_path != None and path != None: path = posixpath.join(this_path, path) elif this_path != None: path = this_path xche = xche.parent return path class PBXGroup(XCHierarchicalElement): """ Attributes: _children_by_path: Maps pathnames of children of this PBXGroup to the actual child XCHierarchicalElement objects. _variant_children_by_name_and_path: Maps (name, path) tuples of PBXVariantGroup children to the actual child PBXVariantGroup objects. """ _schema = XCHierarchicalElement._schema.copy() _schema.update({ 'children': [1, XCHierarchicalElement, 1, 1, []], 'name': [0, str, 0, 0], 'path': [0, str, 0, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCHierarchicalElement.__init__(self, properties, id, parent) self._children_by_path = {} self._variant_children_by_name_and_path = {} for child in self._properties.get('children', []): self._AddChildToDicts(child) def Hashables(self): # super hashables = XCHierarchicalElement.Hashables(self) # It is not sufficient to just rely on name and parent to build a unique # hashable : a node could have two child PBXGroup sharing a common name. # To add entropy the hashable is enhanced with the names of all its # children. for child in self._properties.get('children', []): child_name = child.Name() if child_name != None: hashables.append(child_name) return hashables def HashablesForChild(self): # To avoid a circular reference the hashables used to compute a child id do # not include the child names. return XCHierarchicalElement.Hashables(self) def _AddChildToDicts(self, child): # Sets up this PBXGroup object's dicts to reference the child properly. child_path = child.PathFromSourceTreeAndPath() if child_path: if child_path in self._children_by_path: raise ValueError, 'Found multiple children with path ' + child_path self._children_by_path[child_path] = child if isinstance(child, PBXVariantGroup): child_name = child._properties.get('name', None) key = (child_name, child_path) if key in self._variant_children_by_name_and_path: raise ValueError, 'Found multiple PBXVariantGroup children with ' + \ 'name ' + str(child_name) + ' and path ' + \ str(child_path) self._variant_children_by_name_and_path[key] = child def AppendChild(self, child): # Callers should use this instead of calling # AppendProperty('children', child) directly because this function # maintains the group's dicts. self.AppendProperty('children', child) self._AddChildToDicts(child) def GetChildByName(self, name): # This is not currently optimized with a dict as GetChildByPath is because # it has few callers. Most callers probably want GetChildByPath. This # function is only useful to get children that have names but no paths, # which is rare. 
The children of the main group ("Source", "Products", # etc.) is pretty much the only case where this likely to come up. # # TODO(mark): Maybe this should raise an error if more than one child is # present with the same name. if not 'children' in self._properties: return None for child in self._properties['children']: if child.Name() == name: return child return None def GetChildByPath(self, path): if not path: return None if path in self._children_by_path: return self._children_by_path[path] return None def GetChildByRemoteObject(self, remote_object): # This method is a little bit esoteric. Given a remote_object, which # should be a PBXFileReference in another project file, this method will # return this group's PBXReferenceProxy object serving as a local proxy # for the remote PBXFileReference. # # This function might benefit from a dict optimization as GetChildByPath # for some workloads, but profiling shows that it's not currently a # problem. if not 'children' in self._properties: return None for child in self._properties['children']: if not isinstance(child, PBXReferenceProxy): continue container_proxy = child._properties['remoteRef'] if container_proxy._properties['remoteGlobalIDString'] == remote_object: return child return None def AddOrGetFileByPath(self, path, hierarchical): """Returns an existing or new file reference corresponding to path. If hierarchical is True, this method will create or use the necessary hierarchical group structure corresponding to path. Otherwise, it will look in and create an item in the current group only. If an existing matching reference is found, it is returned, otherwise, a new one will be created, added to the correct group, and returned. If path identifies a directory by virtue of carrying a trailing slash, this method returns a PBXFileReference of "folder" type. If path identifies a variant, by virtue of it identifying a file inside a directory with an ".lproj" extension, this method returns a PBXVariantGroup containing the variant named by path, and possibly other variants. For all other paths, a "normal" PBXFileReference will be returned. """ # Adding or getting a directory? Directories end with a trailing slash. is_dir = False if path.endswith('/'): is_dir = True path = posixpath.normpath(path) if is_dir: path = path + '/' # Adding or getting a variant? Variants are files inside directories # with an ".lproj" extension. Xcode uses variants for localization. For # a variant path/to/Language.lproj/MainMenu.nib, put a variant group named # MainMenu.nib inside path/to, and give it a variant named Language. In # this example, grandparent would be set to path/to and parent_root would # be set to Language. variant_name = None parent = posixpath.dirname(path) grandparent = posixpath.dirname(parent) parent_basename = posixpath.basename(parent) (parent_root, parent_ext) = posixpath.splitext(parent_basename) if parent_ext == '.lproj': variant_name = parent_root if grandparent == '': grandparent = None # Putting a directory inside a variant group is not currently supported. assert not is_dir or variant_name is None path_split = path.split(posixpath.sep) if len(path_split) == 1 or \ ((is_dir or variant_name != None) and len(path_split) == 2) or \ not hierarchical: # The PBXFileReference or PBXVariantGroup will be added to or gotten from # this PBXGroup, no recursion necessary. if variant_name is None: # Add or get a PBXFileReference. 
file_ref = self.GetChildByPath(path) if file_ref != None: assert file_ref.__class__ == PBXFileReference else: file_ref = PBXFileReference({'path': path}) self.AppendChild(file_ref) else: # Add or get a PBXVariantGroup. The variant group name is the same # as the basename (MainMenu.nib in the example above). grandparent # specifies the path to the variant group itself, and path_split[-2:] # is the path of the specific variant relative to its group. variant_group_name = posixpath.basename(path) variant_group_ref = self.AddOrGetVariantGroupByNameAndPath( variant_group_name, grandparent) variant_path = posixpath.sep.join(path_split[-2:]) variant_ref = variant_group_ref.GetChildByPath(variant_path) if variant_ref != None: assert variant_ref.__class__ == PBXFileReference else: variant_ref = PBXFileReference({'name': variant_name, 'path': variant_path}) variant_group_ref.AppendChild(variant_ref) # The caller is interested in the variant group, not the specific # variant file. file_ref = variant_group_ref return file_ref else: # Hierarchical recursion. Add or get a PBXGroup corresponding to the # outermost path component, and then recurse into it, chopping off that # path component. next_dir = path_split[0] group_ref = self.GetChildByPath(next_dir) if group_ref != None: assert group_ref.__class__ == PBXGroup else: group_ref = PBXGroup({'path': next_dir}) self.AppendChild(group_ref) return group_ref.AddOrGetFileByPath(posixpath.sep.join(path_split[1:]), hierarchical) def AddOrGetVariantGroupByNameAndPath(self, name, path): """Returns an existing or new PBXVariantGroup for name and path. If a PBXVariantGroup identified by the name and path arguments is already present as a child of this object, it is returned. Otherwise, a new PBXVariantGroup with the correct properties is created, added as a child, and returned. This method will generally be called by AddOrGetFileByPath, which knows when to create a variant group based on the structure of the pathnames passed to it. """ key = (name, path) if key in self._variant_children_by_name_and_path: variant_group_ref = self._variant_children_by_name_and_path[key] assert variant_group_ref.__class__ == PBXVariantGroup return variant_group_ref variant_group_properties = {'name': name} if path != None: variant_group_properties['path'] = path variant_group_ref = PBXVariantGroup(variant_group_properties) self.AppendChild(variant_group_ref) return variant_group_ref def TakeOverOnlyChild(self, recurse=False): """If this PBXGroup has only one child and it's also a PBXGroup, take it over by making all of its children this object's children. This function will continue to take over only children when those children are groups. If there are three PBXGroups representing a, b, and c, with c inside b and b inside a, and a and b have no other children, this will result in a taking over both b and c, forming a PBXGroup for a/b/c. If recurse is True, this function will recurse into children and ask them to collapse themselves by taking over only children as well. Assuming an example hierarchy with files at a/b/c/d1, a/b/c/d2, and a/b/c/d3/e/f (d1, d2, and f are files, the rest are groups), recursion will result in a group for a/b/c containing a group for d3/e. """ # At this stage, check that child class types are PBXGroup exactly, # instead of using isinstance. The only subclass of PBXGroup, # PBXVariantGroup, should not participate in reparenting in the same way: # reparenting by merging different object types would be wrong. 
while len(self._properties['children']) == 1 and \ self._properties['children'][0].__class__ == PBXGroup: # Loop to take over the innermost only-child group possible. child = self._properties['children'][0] # Assume the child's properties, including its children. Save a copy # of this object's old properties, because they'll still be needed. # This object retains its existing id and parent attributes. old_properties = self._properties self._properties = child._properties self._children_by_path = child._children_by_path if not 'sourceTree' in self._properties or \ self._properties['sourceTree'] == '<group>': # The child was relative to its parent. Fix up the path. Note that # children with a sourceTree other than "<group>" are not relative to # their parents, so no path fix-up is needed in that case. if 'path' in old_properties: if 'path' in self._properties: # Both the original parent and child have paths set. self._properties['path'] = posixpath.join(old_properties['path'], self._properties['path']) else: # Only the original parent has a path, use it. self._properties['path'] = old_properties['path'] if 'sourceTree' in old_properties: # The original parent had a sourceTree set, use it. self._properties['sourceTree'] = old_properties['sourceTree'] # If the original parent had a name set, keep using it. If the original # parent didn't have a name but the child did, let the child's name # live on. If the name attribute seems unnecessary now, get rid of it. if 'name' in old_properties and old_properties['name'] != None and \ old_properties['name'] != self.Name(): self._properties['name'] = old_properties['name'] if 'name' in self._properties and 'path' in self._properties and \ self._properties['name'] == self._properties['path']: del self._properties['name'] # Notify all children of their new parent. for child in self._properties['children']: child.parent = self # If asked to recurse, recurse. if recurse: for child in self._properties['children']: if child.__class__ == PBXGroup: child.TakeOverOnlyChild(recurse) def SortGroup(self): self._properties['children'] = \ sorted(self._properties['children'], cmp=lambda x,y: x.Compare(y)) # Recurse. for child in self._properties['children']: if isinstance(child, PBXGroup): child.SortGroup() class XCFileLikeElement(XCHierarchicalElement): # Abstract base for objects that can be used as the fileRef property of # PBXBuildFile. def PathHashables(self): # A PBXBuildFile that refers to this object will call this method to # obtain additional hashables specific to this XCFileLikeElement. Don't # just use this object's hashables, they're not specific and unique enough # on their own (without access to the parent hashables.) Instead, provide # hashables that identify this object by path by getting its hashables as # well as the hashables of ancestor XCHierarchicalElement objects. hashables = [] xche = self while xche != None and isinstance(xche, XCHierarchicalElement): xche_hashables = xche.Hashables() for index in xrange(0, len(xche_hashables)): hashables.insert(index, xche_hashables[index]) xche = xche.parent return hashables class XCContainerPortal(XCObject): # Abstract base for objects that can be used as the containerPortal property # of PBXContainerItemProxy. pass class XCRemoteObject(XCObject): # Abstract base for objects that can be used as the remoteGlobalIDString # property of PBXContainerItemProxy. 
pass class PBXFileReference(XCFileLikeElement, XCContainerPortal, XCRemoteObject): _schema = XCFileLikeElement._schema.copy() _schema.update({ 'explicitFileType': [0, str, 0, 0], 'lastKnownFileType': [0, str, 0, 0], 'name': [0, str, 0, 0], 'path': [0, str, 0, 1], }) # Weird output rules for PBXFileReference. _should_print_single_line = True # super _encode_transforms = XCFileLikeElement._alternate_encode_transforms def __init__(self, properties=None, id=None, parent=None): # super XCFileLikeElement.__init__(self, properties, id, parent) if 'path' in self._properties and self._properties['path'].endswith('/'): self._properties['path'] = self._properties['path'][:-1] is_dir = True else: is_dir = False if 'path' in self._properties and \ not 'lastKnownFileType' in self._properties and \ not 'explicitFileType' in self._properties: # TODO(mark): This is the replacement for a replacement for a quick hack. # It is no longer incredibly sucky, but this list needs to be extended. extension_map = { 'a': 'archive.ar', 'app': 'wrapper.application', 'bdic': 'file', 'bundle': 'wrapper.cfbundle', 'c': 'sourcecode.c.c', 'cc': 'sourcecode.cpp.cpp', 'cpp': 'sourcecode.cpp.cpp', 'css': 'text.css', 'cxx': 'sourcecode.cpp.cpp', 'dart': 'sourcecode', 'dylib': 'compiled.mach-o.dylib', 'framework': 'wrapper.framework', 'gyp': 'sourcecode', 'gypi': 'sourcecode', 'h': 'sourcecode.c.h', 'hxx': 'sourcecode.cpp.h', 'icns': 'image.icns', 'java': 'sourcecode.java', 'js': 'sourcecode.javascript', 'm': 'sourcecode.c.objc', 'mm': 'sourcecode.cpp.objcpp', 'nib': 'wrapper.nib', 'o': 'compiled.mach-o.objfile', 'pdf': 'image.pdf', 'pl': 'text.script.perl', 'plist': 'text.plist.xml', 'pm': 'text.script.perl', 'png': 'image.png', 'py': 'text.script.python', 'r': 'sourcecode.rez', 'rez': 'sourcecode.rez', 's': 'sourcecode.asm', 'storyboard': 'file.storyboard', 'strings': 'text.plist.strings', 'ttf': 'file', 'xcconfig': 'text.xcconfig', 'xcdatamodel': 'wrapper.xcdatamodel', 'xib': 'file.xib', 'y': 'sourcecode.yacc', } prop_map = { 'dart': 'explicitFileType', 'gyp': 'explicitFileType', 'gypi': 'explicitFileType', } if is_dir: file_type = 'folder' prop_name = 'lastKnownFileType' else: basename = posixpath.basename(self._properties['path']) (root, ext) = posixpath.splitext(basename) # Check the map using a lowercase extension. # TODO(mark): Maybe it should try with the original case first and fall # back to lowercase, in case there are any instances where case # matters. There currently aren't. if ext != '': ext = ext[1:].lower() # TODO(mark): "text" is the default value, but "file" is appropriate # for unrecognized files not containing text. Xcode seems to choose # based on content. file_type = extension_map.get(ext, 'text') prop_name = prop_map.get(ext, 'lastKnownFileType') self._properties[prop_name] = file_type class PBXVariantGroup(PBXGroup, XCFileLikeElement): """PBXVariantGroup is used by Xcode to represent localizations.""" # No additions to the schema relative to PBXGroup. pass # PBXReferenceProxy is also an XCFileLikeElement subclass. It is defined below # because it uses PBXContainerItemProxy, defined below. 
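# A minimal, hedged sketch of how the group and file-reference classes above
# fit together.  The helper name below is hypothetical and is never called by
# this module; it only illustrates the AddOrGetFileByPath behavior documented
# above (plain files, folder references, and .lproj variants), under the
# assumption that a bare PBXGroup is an acceptable root for the example.
def _example_add_files_to_group():
  group = PBXGroup({'name': 'Source'})

  # A plain path: because hierarchical is True, an intermediate PBXGroup is
  # created for "src", and the extension map marks main.c as sourcecode.c.c.
  c_file = group.AddOrGetFileByPath('src/main.c', True)
  assert c_file.__class__ == PBXFileReference

  # A trailing slash identifies a directory, yielding a PBXFileReference whose
  # lastKnownFileType is "folder".
  folder = group.AddOrGetFileByPath('Resources/images/', False)
  assert folder._properties['lastKnownFileType'] == 'folder'

  # A file inside an ".lproj" directory is treated as a localization variant;
  # the PBXVariantGroup containing the variant is returned, not the variant
  # file itself.
  variant_group = group.AddOrGetFileByPath('ja.lproj/MainMenu.nib', True)
  assert variant_group.__class__ == PBXVariantGroup

  return group
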
class XCBuildConfiguration(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'baseConfigurationReference': [0, PBXFileReference, 0, 0], 'buildSettings': [0, dict, 0, 1, {}], 'name': [0, str, 0, 1], }) def HasBuildSetting(self, key): return key in self._properties['buildSettings'] def GetBuildSetting(self, key): return self._properties['buildSettings'][key] def SetBuildSetting(self, key, value): # TODO(mark): If a list, copy? self._properties['buildSettings'][key] = value def AppendBuildSetting(self, key, value): if not key in self._properties['buildSettings']: self._properties['buildSettings'][key] = [] self._properties['buildSettings'][key].append(value) def DelBuildSetting(self, key): if key in self._properties['buildSettings']: del self._properties['buildSettings'][key] def SetBaseConfiguration(self, value): self._properties['baseConfigurationReference'] = value class XCConfigurationList(XCObject): # _configs is the default list of configurations. _configs = [ XCBuildConfiguration({'name': 'Debug'}), XCBuildConfiguration({'name': 'Release'}) ] _schema = XCObject._schema.copy() _schema.update({ 'buildConfigurations': [1, XCBuildConfiguration, 1, 1, _configs], 'defaultConfigurationIsVisible': [0, int, 0, 1, 1], 'defaultConfigurationName': [0, str, 0, 1, 'Release'], }) def Name(self): return 'Build configuration list for ' + \ self.parent.__class__.__name__ + ' "' + self.parent.Name() + '"' def ConfigurationNamed(self, name): """Convenience accessor to obtain an XCBuildConfiguration by name.""" for configuration in self._properties['buildConfigurations']: if configuration._properties['name'] == name: return configuration raise KeyError, name def DefaultConfiguration(self): """Convenience accessor to obtain the default XCBuildConfiguration.""" return self.ConfigurationNamed(self._properties['defaultConfigurationName']) def HasBuildSetting(self, key): """Determines the state of a build setting in all XCBuildConfiguration child objects. If all child objects have key in their build settings, and the value is the same in all child objects, returns 1. If no child objects have the key in their build settings, returns 0. If some, but not all, child objects have the key in their build settings, or if any children have different values for the key, returns -1. """ has = None value = None for configuration in self._properties['buildConfigurations']: configuration_has = configuration.HasBuildSetting(key) if has is None: has = configuration_has elif has != configuration_has: return -1 if configuration_has: configuration_value = configuration.GetBuildSetting(key) if value is None: value = configuration_value elif value != configuration_value: return -1 if not has: return 0 return 1 def GetBuildSetting(self, key): """Gets the build setting for key. All child XCConfiguration objects must have the same value set for the setting, or a ValueError will be raised. """ # TODO(mark): This is wrong for build settings that are lists. The list # contents should be compared (and a list copy returned?) value = None for configuration in self._properties['buildConfigurations']: configuration_value = configuration.GetBuildSetting(key) if value is None: value = configuration_value else: if value != configuration_value: raise ValueError, 'Variant values for ' + key return value def SetBuildSetting(self, key, value): """Sets the build setting for key to value in all child XCBuildConfiguration objects. 
""" for configuration in self._properties['buildConfigurations']: configuration.SetBuildSetting(key, value) def AppendBuildSetting(self, key, value): """Appends value to the build setting for key, which is treated as a list, in all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.AppendBuildSetting(key, value) def DelBuildSetting(self, key): """Deletes the build setting key from all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.DelBuildSetting(key) def SetBaseConfiguration(self, value): """Sets the build configuration in all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.SetBaseConfiguration(value) class PBXBuildFile(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'fileRef': [0, XCFileLikeElement, 0, 1], 'settings': [0, str, 0, 0], # hack, it's a dict }) # Weird output rules for PBXBuildFile. _should_print_single_line = True _encode_transforms = XCObject._alternate_encode_transforms def Name(self): # Example: "main.cc in Sources" return self._properties['fileRef'].Name() + ' in ' + self.parent.Name() def Hashables(self): # super hashables = XCObject.Hashables(self) # It is not sufficient to just rely on Name() to get the # XCFileLikeElement's name, because that is not a complete pathname. # PathHashables returns hashables unique enough that no two # PBXBuildFiles should wind up with the same set of hashables, unless # someone adds the same file multiple times to the same target. That # would be considered invalid anyway. hashables.extend(self._properties['fileRef'].PathHashables()) return hashables class XCBuildPhase(XCObject): """Abstract base for build phase classes. Not represented in a project file. Attributes: _files_by_path: A dict mapping each path of a child in the files list by path (keys) to the corresponding PBXBuildFile children (values). _files_by_xcfilelikeelement: A dict mapping each XCFileLikeElement (keys) to the corresponding PBXBuildFile children (values). """ # TODO(mark): Some build phase types, like PBXShellScriptBuildPhase, don't # actually have a "files" list. XCBuildPhase should not have "files" but # another abstract subclass of it should provide this, and concrete build # phase types that do have "files" lists should be derived from that new # abstract subclass. XCBuildPhase should only provide buildActionMask and # runOnlyForDeploymentPostprocessing, and not files or the various # file-related methods and attributes. _schema = XCObject._schema.copy() _schema.update({ 'buildActionMask': [0, int, 0, 1, 0x7fffffff], 'files': [1, PBXBuildFile, 1, 1, []], 'runOnlyForDeploymentPostprocessing': [0, int, 0, 1, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCObject.__init__(self, properties, id, parent) self._files_by_path = {} self._files_by_xcfilelikeelement = {} for pbxbuildfile in self._properties.get('files', []): self._AddBuildFileToDicts(pbxbuildfile) def FileGroup(self, path): # Subclasses must override this by returning a two-element tuple. The # first item in the tuple should be the PBXGroup to which "path" should be # added, either as a child or deeper descendant. The second item should # be a boolean indicating whether files should be added into hierarchical # groups or one single flat group. 
raise NotImplementedError, \ self.__class__.__name__ + ' must implement FileGroup' def _AddPathToDict(self, pbxbuildfile, path): """Adds path to the dict tracking paths belonging to this build phase. If the path is already a member of this build phase, raises an exception. """ if path in self._files_by_path: raise ValueError, 'Found multiple build files with path ' + path self._files_by_path[path] = pbxbuildfile def _AddBuildFileToDicts(self, pbxbuildfile, path=None): """Maintains the _files_by_path and _files_by_xcfilelikeelement dicts. If path is specified, then it is the path that is being added to the phase, and pbxbuildfile must contain either a PBXFileReference directly referencing that path, or it must contain a PBXVariantGroup that itself contains a PBXFileReference referencing the path. If path is not specified, either the PBXFileReference's path or the paths of all children of the PBXVariantGroup are taken as being added to the phase. If the path is already present in the phase, raises an exception. If the PBXFileReference or PBXVariantGroup referenced by pbxbuildfile are already present in the phase, referenced by a different PBXBuildFile object, raises an exception. This does not raise an exception when a PBXFileReference or PBXVariantGroup reappear and are referenced by the same PBXBuildFile that has already introduced them, because in the case of PBXVariantGroup objects, they may correspond to multiple paths that are not all added simultaneously. When this situation occurs, the path needs to be added to _files_by_path, but nothing needs to change in _files_by_xcfilelikeelement, and the caller should have avoided adding the PBXBuildFile if it is already present in the list of children. """ xcfilelikeelement = pbxbuildfile._properties['fileRef'] paths = [] if path != None: # It's best when the caller provides the path. if isinstance(xcfilelikeelement, PBXVariantGroup): paths.append(path) else: # If the caller didn't provide a path, there can be either multiple # paths (PBXVariantGroup) or one. if isinstance(xcfilelikeelement, PBXVariantGroup): for variant in xcfilelikeelement._properties['children']: paths.append(variant.FullPath()) else: paths.append(xcfilelikeelement.FullPath()) # Add the paths first, because if something's going to raise, the # messages provided by _AddPathToDict are more useful owing to its # having access to a real pathname and not just an object's Name(). for a_path in paths: self._AddPathToDict(pbxbuildfile, a_path) # If another PBXBuildFile references this XCFileLikeElement, there's a # problem. if xcfilelikeelement in self._files_by_xcfilelikeelement and \ self._files_by_xcfilelikeelement[xcfilelikeelement] != pbxbuildfile: raise ValueError, 'Found multiple build files for ' + \ xcfilelikeelement.Name() self._files_by_xcfilelikeelement[xcfilelikeelement] = pbxbuildfile def AppendBuildFile(self, pbxbuildfile, path=None): # Callers should use this instead of calling # AppendProperty('files', pbxbuildfile) directly because this function # maintains the object's dicts. Better yet, callers can just call AddFile # with a pathname and not worry about building their own PBXBuildFile # objects. 
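    # Illustrative usage (hypothetical path): phase.AddFile('src/main.cc')
    # looks up or creates the PBXFileReference through FileGroup() and
    # AddOrGetFileByPath(), wraps it in a PBXBuildFile, and records it in the
    # phase's dicts, so callers never build PBXBuildFile objects by hand.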
self.AppendProperty('files', pbxbuildfile) self._AddBuildFileToDicts(pbxbuildfile, path) def AddFile(self, path, settings=None): (file_group, hierarchical) = self.FileGroup(path) file_ref = file_group.AddOrGetFileByPath(path, hierarchical) if file_ref in self._files_by_xcfilelikeelement and \ isinstance(file_ref, PBXVariantGroup): # There's already a PBXBuildFile in this phase corresponding to the # PBXVariantGroup. path just provides a new variant that belongs to # the group. Add the path to the dict. pbxbuildfile = self._files_by_xcfilelikeelement[file_ref] self._AddBuildFileToDicts(pbxbuildfile, path) else: # Add a new PBXBuildFile to get file_ref into the phase. if settings is None: pbxbuildfile = PBXBuildFile({'fileRef': file_ref}) else: pbxbuildfile = PBXBuildFile({'fileRef': file_ref, 'settings': settings}) self.AppendBuildFile(pbxbuildfile, path) class PBXHeadersBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Headers' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXResourcesBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Resources' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXSourcesBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Sources' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXFrameworksBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Frameworks' def FileGroup(self, path): (root, ext) = posixpath.splitext(path) if ext != '': ext = ext[1:].lower() if ext == 'o': # .o files are added to Xcode Frameworks phases, but conceptually aren't # frameworks, they're more like sources or intermediates. Redirect them # to show up in one of those other groups. return self.PBXProjectAncestor().RootGroupForPath(path) else: return (self.PBXProjectAncestor().FrameworksGroup(), False) class PBXShellScriptBuildPhase(XCBuildPhase): _schema = XCBuildPhase._schema.copy() _schema.update({ 'inputPaths': [1, str, 0, 1, []], 'name': [0, str, 0, 0], 'outputPaths': [1, str, 0, 1, []], 'shellPath': [0, str, 0, 1, '/bin/sh'], 'shellScript': [0, str, 0, 1], 'showEnvVarsInLog': [0, int, 0, 0], }) def Name(self): if 'name' in self._properties: return self._properties['name'] return 'ShellScript' class PBXCopyFilesBuildPhase(XCBuildPhase): _schema = XCBuildPhase._schema.copy() _schema.update({ 'dstPath': [0, str, 0, 1], 'dstSubfolderSpec': [0, int, 0, 1], 'name': [0, str, 0, 0], }) # path_tree_re matches "$(DIR)/path" or just "$(DIR)". Match group 1 is # "DIR", match group 3 is "path" or None. path_tree_re = re.compile('^\\$\\((.*)\\)(/(.*)|)$') # path_tree_to_subfolder maps names of Xcode variables to the associated # dstSubfolderSpec property value used in a PBXCopyFilesBuildPhase object. path_tree_to_subfolder = { 'BUILT_PRODUCTS_DIR': 16, # Products Directory # Other types that can be chosen via the Xcode UI. # TODO(mark): Map Xcode variable names to these. 
# : 1, # Wrapper # : 6, # Executables: 6 # : 7, # Resources # : 15, # Java Resources # : 10, # Frameworks # : 11, # Shared Frameworks # : 12, # Shared Support # : 13, # PlugIns } def Name(self): if 'name' in self._properties: return self._properties['name'] return 'CopyFiles' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) def SetDestination(self, path): """Set the dstSubfolderSpec and dstPath properties from path. path may be specified in the same notation used for XCHierarchicalElements, specifically, "$(DIR)/path". """ path_tree_match = self.path_tree_re.search(path) if path_tree_match: # Everything else needs to be relative to an Xcode variable. path_tree = path_tree_match.group(1) relative_path = path_tree_match.group(3) if path_tree in self.path_tree_to_subfolder: subfolder = self.path_tree_to_subfolder[path_tree] if relative_path is None: relative_path = '' else: # The path starts with an unrecognized Xcode variable # name like $(SRCROOT). Xcode will still handle this # as an "absolute path" that starts with the variable. subfolder = 0 relative_path = path elif path.startswith('/'): # Special case. Absolute paths are in dstSubfolderSpec 0. subfolder = 0 relative_path = path[1:] else: raise ValueError, 'Can\'t use path %s in a %s' % \ (path, self.__class__.__name__) self._properties['dstPath'] = relative_path self._properties['dstSubfolderSpec'] = subfolder class PBXBuildRule(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'compilerSpec': [0, str, 0, 1], 'filePatterns': [0, str, 0, 0], 'fileType': [0, str, 0, 1], 'isEditable': [0, int, 0, 1, 1], 'outputFiles': [1, str, 0, 1, []], 'script': [0, str, 0, 0], }) def Name(self): # Not very inspired, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.append(self._properties['fileType']) if 'filePatterns' in self._properties: hashables.append(self._properties['filePatterns']) return hashables class PBXContainerItemProxy(XCObject): # When referencing an item in this project file, containerPortal is the # PBXProject root object of this project file. When referencing an item in # another project file, containerPortal is a PBXFileReference identifying # the other project file. # # When serving as a proxy to an XCTarget (in this project file or another), # proxyType is 1. When serving as a proxy to a PBXFileReference (in another # project file), proxyType is 2. Type 2 is used for references to the # producs of the other project file's targets. # # Xcode is weird about remoteGlobalIDString. Usually, it's printed without # a comment, indicating that it's tracked internally simply as a string, but # sometimes it's printed with a comment (usually when the object is initially # created), indicating that it's tracked as a project file object at least # sometimes. This module always tracks it as an object, but contains a hack # to prevent it from printing the comment in the project file output. See # _XCKVPrint. 
_schema = XCObject._schema.copy() _schema.update({ 'containerPortal': [0, XCContainerPortal, 0, 1], 'proxyType': [0, int, 0, 1], 'remoteGlobalIDString': [0, XCRemoteObject, 0, 1], 'remoteInfo': [0, str, 0, 1], }) def __repr__(self): props = self._properties name = '%s.gyp:%s' % (props['containerPortal'].Name(), props['remoteInfo']) return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Name(self): # Admittedly not the best name, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.extend(self._properties['containerPortal'].Hashables()) hashables.extend(self._properties['remoteGlobalIDString'].Hashables()) return hashables class PBXTargetDependency(XCObject): # The "target" property accepts an XCTarget object, and obviously not # NoneType. But XCTarget is defined below, so it can't be put into the # schema yet. The definition of PBXTargetDependency can't be moved below # XCTarget because XCTarget's own schema references PBXTargetDependency. # Python doesn't deal well with this circular relationship, and doesn't have # a real way to do forward declarations. To work around, the type of # the "target" property is reset below, after XCTarget is defined. # # At least one of "name" and "target" is required. _schema = XCObject._schema.copy() _schema.update({ 'name': [0, str, 0, 0], 'target': [0, None.__class__, 0, 0], 'targetProxy': [0, PBXContainerItemProxy, 1, 1], }) def __repr__(self): name = self._properties.get('name') or self._properties['target'].Name() return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Name(self): # Admittedly not the best name, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.extend(self._properties['targetProxy'].Hashables()) return hashables class PBXReferenceProxy(XCFileLikeElement): _schema = XCFileLikeElement._schema.copy() _schema.update({ 'fileType': [0, str, 0, 1], 'path': [0, str, 0, 1], 'remoteRef': [0, PBXContainerItemProxy, 1, 1], }) class XCTarget(XCRemoteObject): # An XCTarget is really just an XCObject, the XCRemoteObject thing is just # to allow PBXProject to be used in the remoteGlobalIDString property of # PBXContainerItemProxy. # # Setting a "name" property at instantiation may also affect "productName", # which may in turn affect the "PRODUCT_NAME" build setting in children of # "buildConfigurationList". See __init__ below. _schema = XCRemoteObject._schema.copy() _schema.update({ 'buildConfigurationList': [0, XCConfigurationList, 1, 1, XCConfigurationList()], 'buildPhases': [1, XCBuildPhase, 1, 1, []], 'dependencies': [1, PBXTargetDependency, 1, 1, []], 'name': [0, str, 0, 1], 'productName': [0, str, 0, 1], }) def __init__(self, properties=None, id=None, parent=None, force_outdir=None, force_prefix=None, force_extension=None): # super XCRemoteObject.__init__(self, properties, id, parent) # Set up additional defaults not expressed in the schema. If a "name" # property was supplied, set "productName" if it is not present. Also set # the "PRODUCT_NAME" build setting in each configuration, but only if # the setting is not present in any build configuration. 
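    # For example (hypothetical name): XCTarget({'name': 'foo'}) also gets
    # productName 'foo', and PRODUCT_NAME = foo is written into every child
    # build configuration unless one of them already defines PRODUCT_NAME.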
if 'name' in self._properties: if not 'productName' in self._properties: self.SetProperty('productName', self._properties['name']) if 'productName' in self._properties: if 'buildConfigurationList' in self._properties: configs = self._properties['buildConfigurationList'] if configs.HasBuildSetting('PRODUCT_NAME') == 0: configs.SetBuildSetting('PRODUCT_NAME', self._properties['productName']) def AddDependency(self, other): pbxproject = self.PBXProjectAncestor() other_pbxproject = other.PBXProjectAncestor() if pbxproject == other_pbxproject: # Add a dependency to another target in the same project file. container = PBXContainerItemProxy({'containerPortal': pbxproject, 'proxyType': 1, 'remoteGlobalIDString': other, 'remoteInfo': other.Name()}) dependency = PBXTargetDependency({'target': other, 'targetProxy': container}) self.AppendProperty('dependencies', dependency) else: # Add a dependency to a target in a different project file. other_project_ref = \ pbxproject.AddOrGetProjectReference(other_pbxproject)[1] container = PBXContainerItemProxy({ 'containerPortal': other_project_ref, 'proxyType': 1, 'remoteGlobalIDString': other, 'remoteInfo': other.Name(), }) dependency = PBXTargetDependency({'name': other.Name(), 'targetProxy': container}) self.AppendProperty('dependencies', dependency) # Proxy all of these through to the build configuration list. def ConfigurationNamed(self, name): return self._properties['buildConfigurationList'].ConfigurationNamed(name) def DefaultConfiguration(self): return self._properties['buildConfigurationList'].DefaultConfiguration() def HasBuildSetting(self, key): return self._properties['buildConfigurationList'].HasBuildSetting(key) def GetBuildSetting(self, key): return self._properties['buildConfigurationList'].GetBuildSetting(key) def SetBuildSetting(self, key, value): return self._properties['buildConfigurationList'].SetBuildSetting(key, \ value) def AppendBuildSetting(self, key, value): return self._properties['buildConfigurationList'].AppendBuildSetting(key, \ value) def DelBuildSetting(self, key): return self._properties['buildConfigurationList'].DelBuildSetting(key) # Redefine the type of the "target" property. See PBXTargetDependency._schema # above. PBXTargetDependency._schema['target'][1] = XCTarget class PBXNativeTarget(XCTarget): # buildPhases is overridden in the schema to be able to set defaults. # # NOTE: Contrary to most objects, it is advisable to set parent when # constructing PBXNativeTarget. A parent of an XCTarget must be a PBXProject # object. A parent reference is required for a PBXNativeTarget during # construction to be able to set up the target defaults for productReference, # because a PBXBuildFile object must be created for the target and it must # be added to the PBXProject's mainGroup hierarchy. _schema = XCTarget._schema.copy() _schema.update({ 'buildPhases': [1, XCBuildPhase, 1, 1, [PBXSourcesBuildPhase(), PBXFrameworksBuildPhase()]], 'buildRules': [1, PBXBuildRule, 1, 1, []], 'productReference': [0, PBXFileReference, 0, 1], 'productType': [0, str, 0, 1], }) # Mapping from Xcode product-types to settings. 
The settings are: # filetype : used for explicitFileType in the project file # prefix : the prefix for the file name # suffix : the suffix for the filen ame _product_filetypes = { 'com.apple.product-type.application': ['wrapper.application', '', '.app'], 'com.apple.product-type.bundle': ['wrapper.cfbundle', '', '.bundle'], 'com.apple.product-type.framework': ['wrapper.framework', '', '.framework'], 'com.apple.product-type.library.dynamic': ['compiled.mach-o.dylib', 'lib', '.dylib'], 'com.apple.product-type.library.static': ['archive.ar', 'lib', '.a'], 'com.apple.product-type.tool': ['compiled.mach-o.executable', '', ''], 'com.apple.product-type.bundle.unit-test': ['wrapper.cfbundle', '', '.xctest'], 'com.googlecode.gyp.xcode.bundle': ['compiled.mach-o.dylib', '', '.so'], } def __init__(self, properties=None, id=None, parent=None, force_outdir=None, force_prefix=None, force_extension=None): # super XCTarget.__init__(self, properties, id, parent) if 'productName' in self._properties and \ 'productType' in self._properties and \ not 'productReference' in self._properties and \ self._properties['productType'] in self._product_filetypes: products_group = None pbxproject = self.PBXProjectAncestor() if pbxproject != None: products_group = pbxproject.ProductsGroup() if products_group != None: (filetype, prefix, suffix) = \ self._product_filetypes[self._properties['productType']] # Xcode does not have a distinct type for loadable modules that are # pure BSD targets (not in a bundle wrapper). GYP allows such modules # to be specified by setting a target type to loadable_module without # having mac_bundle set. These are mapped to the pseudo-product type # com.googlecode.gyp.xcode.bundle. # # By picking up this special type and converting it to a dynamic # library (com.apple.product-type.library.dynamic) with fix-ups, # single-file loadable modules can be produced. # # MACH_O_TYPE is changed to mh_bundle to produce the proper file type # (as opposed to mh_dylib). In order for linking to succeed, # DYLIB_CURRENT_VERSION and DYLIB_COMPATIBILITY_VERSION must be # cleared. They are meaningless for type mh_bundle. # # Finally, the .so extension is forcibly applied over the default # (.dylib), unless another forced extension is already selected. # .dylib is plainly wrong, and .bundle is used by loadable_modules in # bundle wrappers (com.apple.product-type.bundle). .so seems an odd # choice because it's used as the extension on many other systems that # don't distinguish between linkable shared libraries and non-linkable # loadable modules, but there's precedent: Python loadable modules on # Mac OS X use an .so extension. if self._properties['productType'] == 'com.googlecode.gyp.xcode.bundle': self._properties['productType'] = \ 'com.apple.product-type.library.dynamic' self.SetBuildSetting('MACH_O_TYPE', 'mh_bundle') self.SetBuildSetting('DYLIB_CURRENT_VERSION', '') self.SetBuildSetting('DYLIB_COMPATIBILITY_VERSION', '') if force_extension is None: force_extension = suffix[1:] if self._properties['productType'] == \ 'com.apple.product-type-bundle.unit.test': if force_extension is None: force_extension = suffix[1:] if force_extension is not None: # If it's a wrapper (bundle), set WRAPPER_EXTENSION. if filetype.startswith('wrapper.'): self.SetBuildSetting('WRAPPER_EXTENSION', force_extension) else: # Extension override. suffix = '.' 
+ force_extension self.SetBuildSetting('EXECUTABLE_EXTENSION', force_extension) if filetype.startswith('compiled.mach-o.executable'): product_name = self._properties['productName'] product_name += suffix suffix = '' self.SetProperty('productName', product_name) self.SetBuildSetting('PRODUCT_NAME', product_name) # Xcode handles most prefixes based on the target type, however there # are exceptions. If a "BSD Dynamic Library" target is added in the # Xcode UI, Xcode sets EXECUTABLE_PREFIX. This check duplicates that # behavior. if force_prefix is not None: prefix = force_prefix if filetype.startswith('wrapper.'): self.SetBuildSetting('WRAPPER_PREFIX', prefix) else: self.SetBuildSetting('EXECUTABLE_PREFIX', prefix) if force_outdir is not None: self.SetBuildSetting('TARGET_BUILD_DIR', force_outdir) # TODO(tvl): Remove the below hack. # http://code.google.com/p/gyp/issues/detail?id=122 # Some targets include the prefix in the target_name. These targets # really should just add a product_name setting that doesn't include # the prefix. For example: # target_name = 'libevent', product_name = 'event' # This check cleans up for them. product_name = self._properties['productName'] prefix_len = len(prefix) if prefix_len and (product_name[:prefix_len] == prefix): product_name = product_name[prefix_len:] self.SetProperty('productName', product_name) self.SetBuildSetting('PRODUCT_NAME', product_name) ref_props = { 'explicitFileType': filetype, 'includeInIndex': 0, 'path': prefix + product_name + suffix, 'sourceTree': 'BUILT_PRODUCTS_DIR', } file_ref = PBXFileReference(ref_props) products_group.AppendChild(file_ref) self.SetProperty('productReference', file_ref) def GetBuildPhaseByType(self, type): if not 'buildPhases' in self._properties: return None the_phase = None for phase in self._properties['buildPhases']: if isinstance(phase, type): # Some phases may be present in multiples in a well-formed project file, # but phases like PBXSourcesBuildPhase may only be present singly, and # this function is intended as an aid to GetBuildPhaseByType. Loop # over the entire list of phases and assert if more than one of the # desired type is found. assert the_phase is None the_phase = phase return the_phase def HeadersPhase(self): headers_phase = self.GetBuildPhaseByType(PBXHeadersBuildPhase) if headers_phase is None: headers_phase = PBXHeadersBuildPhase() # The headers phase should come before the resources, sources, and # frameworks phases, if any. insert_at = len(self._properties['buildPhases']) for index in xrange(0, len(self._properties['buildPhases'])): phase = self._properties['buildPhases'][index] if isinstance(phase, PBXResourcesBuildPhase) or \ isinstance(phase, PBXSourcesBuildPhase) or \ isinstance(phase, PBXFrameworksBuildPhase): insert_at = index break self._properties['buildPhases'].insert(insert_at, headers_phase) headers_phase.parent = self return headers_phase def ResourcesPhase(self): resources_phase = self.GetBuildPhaseByType(PBXResourcesBuildPhase) if resources_phase is None: resources_phase = PBXResourcesBuildPhase() # The resources phase should come before the sources and frameworks # phases, if any. 
insert_at = len(self._properties['buildPhases']) for index in xrange(0, len(self._properties['buildPhases'])): phase = self._properties['buildPhases'][index] if isinstance(phase, PBXSourcesBuildPhase) or \ isinstance(phase, PBXFrameworksBuildPhase): insert_at = index break self._properties['buildPhases'].insert(insert_at, resources_phase) resources_phase.parent = self return resources_phase def SourcesPhase(self): sources_phase = self.GetBuildPhaseByType(PBXSourcesBuildPhase) if sources_phase is None: sources_phase = PBXSourcesBuildPhase() self.AppendProperty('buildPhases', sources_phase) return sources_phase def FrameworksPhase(self): frameworks_phase = self.GetBuildPhaseByType(PBXFrameworksBuildPhase) if frameworks_phase is None: frameworks_phase = PBXFrameworksBuildPhase() self.AppendProperty('buildPhases', frameworks_phase) return frameworks_phase def AddDependency(self, other): # super XCTarget.AddDependency(self, other) static_library_type = 'com.apple.product-type.library.static' shared_library_type = 'com.apple.product-type.library.dynamic' framework_type = 'com.apple.product-type.framework' if isinstance(other, PBXNativeTarget) and \ 'productType' in self._properties and \ self._properties['productType'] != static_library_type and \ 'productType' in other._properties and \ (other._properties['productType'] == static_library_type or \ ((other._properties['productType'] == shared_library_type or \ other._properties['productType'] == framework_type) and \ ((not other.HasBuildSetting('MACH_O_TYPE')) or other.GetBuildSetting('MACH_O_TYPE') != 'mh_bundle'))): file_ref = other.GetProperty('productReference') pbxproject = self.PBXProjectAncestor() other_pbxproject = other.PBXProjectAncestor() if pbxproject != other_pbxproject: other_project_product_group = \ pbxproject.AddOrGetProjectReference(other_pbxproject)[0] file_ref = other_project_product_group.GetChildByRemoteObject(file_ref) self.FrameworksPhase().AppendProperty('files', PBXBuildFile({'fileRef': file_ref})) class PBXAggregateTarget(XCTarget): pass class PBXProject(XCContainerPortal): # A PBXProject is really just an XCObject, the XCContainerPortal thing is # just to allow PBXProject to be used in the containerPortal property of # PBXContainerItemProxy. """ Attributes: path: "sample.xcodeproj". TODO(mark) Document me! _other_pbxprojects: A dictionary, keyed by other PBXProject objects. Each value is a reference to the dict in the projectReferences list associated with the keyed PBXProject. """ _schema = XCContainerPortal._schema.copy() _schema.update({ 'attributes': [0, dict, 0, 0], 'buildConfigurationList': [0, XCConfigurationList, 1, 1, XCConfigurationList()], 'compatibilityVersion': [0, str, 0, 1, 'Xcode 3.2'], 'hasScannedForEncodings': [0, int, 0, 1, 1], 'mainGroup': [0, PBXGroup, 1, 1, PBXGroup()], 'projectDirPath': [0, str, 0, 1, ''], 'projectReferences': [1, dict, 0, 0], 'projectRoot': [0, str, 0, 1, ''], 'targets': [1, XCTarget, 1, 1, []], }) def __init__(self, properties=None, id=None, parent=None, path=None): self.path = path self._other_pbxprojects = {} # super return XCContainerPortal.__init__(self, properties, id, parent) def Name(self): name = self.path if name[-10:] == '.xcodeproj': name = name[:-10] return posixpath.basename(name) def Path(self): return self.path def Comment(self): return 'Project object' def Children(self): # super children = XCContainerPortal.Children(self) # Add children that the schema doesn't know about. 
Maybe there's a more # elegant way around this, but this is the only case where we need to own # objects in a dictionary (that is itself in a list), and three lines for # a one-off isn't that big a deal. if 'projectReferences' in self._properties: for reference in self._properties['projectReferences']: children.append(reference['ProductGroup']) return children def PBXProjectAncestor(self): return self def _GroupByName(self, name): if not 'mainGroup' in self._properties: self.SetProperty('mainGroup', PBXGroup()) main_group = self._properties['mainGroup'] group = main_group.GetChildByName(name) if group is None: group = PBXGroup({'name': name}) main_group.AppendChild(group) return group # SourceGroup and ProductsGroup are created by default in Xcode's own # templates. def SourceGroup(self): return self._GroupByName('Source') def ProductsGroup(self): return self._GroupByName('Products') # IntermediatesGroup is used to collect source-like files that are generated # by rules or script phases and are placed in intermediate directories such # as DerivedSources. def IntermediatesGroup(self): return self._GroupByName('Intermediates') # FrameworksGroup and ProjectsGroup are top-level groups used to collect # frameworks and projects. def FrameworksGroup(self): return self._GroupByName('Frameworks') def ProjectsGroup(self): return self._GroupByName('Projects') def RootGroupForPath(self, path): """Returns a PBXGroup child of this object to which path should be added. This method is intended to choose between SourceGroup and IntermediatesGroup on the basis of whether path is present in a source directory or an intermediates directory. For the purposes of this determination, any path located within a derived file directory such as PROJECT_DERIVED_FILE_DIR is treated as being in an intermediates directory. The returned value is a two-element tuple. The first element is the PBXGroup, and the second element specifies whether that group should be organized hierarchically (True) or as a single flat list (False). """ # TODO(mark): make this a class variable and bind to self on call? # Also, this list is nowhere near exhaustive. # INTERMEDIATE_DIR and SHARED_INTERMEDIATE_DIR are used by # gyp.generator.xcode. There should probably be some way for that module # to push the names in, rather than having to hard-code them here. source_tree_groups = { 'DERIVED_FILE_DIR': (self.IntermediatesGroup, True), 'INTERMEDIATE_DIR': (self.IntermediatesGroup, True), 'PROJECT_DERIVED_FILE_DIR': (self.IntermediatesGroup, True), 'SHARED_INTERMEDIATE_DIR': (self.IntermediatesGroup, True), } (source_tree, path) = SourceTreeAndPathFromPath(path) if source_tree != None and source_tree in source_tree_groups: (group_func, hierarchical) = source_tree_groups[source_tree] group = group_func() return (group, hierarchical) # TODO(mark): make additional choices based on file extension. return (self.SourceGroup(), True) def AddOrGetFileInRootGroup(self, path): """Returns a PBXFileReference corresponding to path in the correct group according to RootGroupForPath's heuristics. If an existing PBXFileReference for path exists, it will be returned. Otherwise, one will be created and returned. 
""" (group, hierarchical) = self.RootGroupForPath(path) return group.AddOrGetFileByPath(path, hierarchical) def RootGroupsTakeOverOnlyChildren(self, recurse=False): """Calls TakeOverOnlyChild for all groups in the main group.""" for group in self._properties['mainGroup']._properties['children']: if isinstance(group, PBXGroup): group.TakeOverOnlyChild(recurse) def SortGroups(self): # Sort the children of the mainGroup (like "Source" and "Products") # according to their defined order. self._properties['mainGroup']._properties['children'] = \ sorted(self._properties['mainGroup']._properties['children'], cmp=lambda x,y: x.CompareRootGroup(y)) # Sort everything else by putting group before files, and going # alphabetically by name within sections of groups and files. SortGroup # is recursive. for group in self._properties['mainGroup']._properties['children']: if not isinstance(group, PBXGroup): continue if group.Name() == 'Products': # The Products group is a special case. Instead of sorting # alphabetically, sort things in the order of the targets that # produce the products. To do this, just build up a new list of # products based on the targets. products = [] for target in self._properties['targets']: if not isinstance(target, PBXNativeTarget): continue product = target._properties['productReference'] # Make sure that the product is already in the products group. assert product in group._properties['children'] products.append(product) # Make sure that this process doesn't miss anything that was already # in the products group. assert len(products) == len(group._properties['children']) group._properties['children'] = products else: group.SortGroup() def AddOrGetProjectReference(self, other_pbxproject): """Add a reference to another project file (via PBXProject object) to this one. Returns [ProductGroup, ProjectRef]. ProductGroup is a PBXGroup object in this project file that contains a PBXReferenceProxy object for each product of each PBXNativeTarget in the other project file. ProjectRef is a PBXFileReference to the other project file. If this project file already references the other project file, the existing ProductGroup and ProjectRef are returned. The ProductGroup will still be updated if necessary. """ if not 'projectReferences' in self._properties: self._properties['projectReferences'] = [] product_group = None project_ref = None if not other_pbxproject in self._other_pbxprojects: # This project file isn't yet linked to the other one. Establish the # link. product_group = PBXGroup({'name': 'Products'}) # ProductGroup is strong. product_group.parent = self # There's nothing unique about this PBXGroup, and if left alone, it will # wind up with the same set of hashables as all other PBXGroup objects # owned by the projectReferences list. Add the hashables of the # remote PBXProject that it's related to. product_group._hashables.extend(other_pbxproject.Hashables()) # The other project reports its path as relative to the same directory # that this project's path is relative to. The other project's path # is not necessarily already relative to this project. Figure out the # pathname that this project needs to use to refer to the other one. 
this_path = posixpath.dirname(self.Path()) projectDirPath = self.GetProperty('projectDirPath') if projectDirPath: if posixpath.isabs(projectDirPath[0]): this_path = projectDirPath else: this_path = posixpath.join(this_path, projectDirPath) other_path = gyp.common.RelativePath(other_pbxproject.Path(), this_path) # ProjectRef is weak (it's owned by the mainGroup hierarchy). project_ref = PBXFileReference({ 'lastKnownFileType': 'wrapper.pb-project', 'path': other_path, 'sourceTree': 'SOURCE_ROOT', }) self.ProjectsGroup().AppendChild(project_ref) ref_dict = {'ProductGroup': product_group, 'ProjectRef': project_ref} self._other_pbxprojects[other_pbxproject] = ref_dict self.AppendProperty('projectReferences', ref_dict) # Xcode seems to sort this list case-insensitively self._properties['projectReferences'] = \ sorted(self._properties['projectReferences'], cmp=lambda x,y: cmp(x['ProjectRef'].Name().lower(), y['ProjectRef'].Name().lower())) else: # The link already exists. Pull out the relevnt data. project_ref_dict = self._other_pbxprojects[other_pbxproject] product_group = project_ref_dict['ProductGroup'] project_ref = project_ref_dict['ProjectRef'] self._SetUpProductReferences(other_pbxproject, product_group, project_ref) return [product_group, project_ref] def _SetUpProductReferences(self, other_pbxproject, product_group, project_ref): # TODO(mark): This only adds references to products in other_pbxproject # when they don't exist in this pbxproject. Perhaps it should also # remove references from this pbxproject that are no longer present in # other_pbxproject. Perhaps it should update various properties if they # change. for target in other_pbxproject._properties['targets']: if not isinstance(target, PBXNativeTarget): continue other_fileref = target._properties['productReference'] if product_group.GetChildByRemoteObject(other_fileref) is None: # Xcode sets remoteInfo to the name of the target and not the name # of its product, despite this proxy being a reference to the product. container_item = PBXContainerItemProxy({ 'containerPortal': project_ref, 'proxyType': 2, 'remoteGlobalIDString': other_fileref, 'remoteInfo': target.Name() }) # TODO(mark): Does sourceTree get copied straight over from the other # project? Can the other project ever have lastKnownFileType here # instead of explicitFileType? (Use it if so?) Can path ever be # unset? (I don't think so.) Can other_fileref have name set, and # does it impact the PBXReferenceProxy if so? These are the questions # that perhaps will be answered one day. reference_proxy = PBXReferenceProxy({ 'fileType': other_fileref._properties['explicitFileType'], 'path': other_fileref._properties['path'], 'sourceTree': other_fileref._properties['sourceTree'], 'remoteRef': container_item, }) product_group.AppendChild(reference_proxy) def SortRemoteProductReferences(self): # For each remote project file, sort the associated ProductGroup in the # same order that the targets are sorted in the remote project file. This # is the sort order used by Xcode. def CompareProducts(x, y, remote_products): # x and y are PBXReferenceProxy objects. Go through their associated # PBXContainerItem to get the remote PBXFileReference, which will be # present in the remote_products list. 
x_remote = x._properties['remoteRef']._properties['remoteGlobalIDString'] y_remote = y._properties['remoteRef']._properties['remoteGlobalIDString'] x_index = remote_products.index(x_remote) y_index = remote_products.index(y_remote) # Use the order of each remote PBXFileReference in remote_products to # determine the sort order. return cmp(x_index, y_index) for other_pbxproject, ref_dict in self._other_pbxprojects.iteritems(): # Build up a list of products in the remote project file, ordered the # same as the targets that produce them. remote_products = [] for target in other_pbxproject._properties['targets']: if not isinstance(target, PBXNativeTarget): continue remote_products.append(target._properties['productReference']) # Sort the PBXReferenceProxy children according to the list of remote # products. product_group = ref_dict['ProductGroup'] product_group._properties['children'] = sorted( product_group._properties['children'], cmp=lambda x, y: CompareProducts(x, y, remote_products)) class XCProjectFile(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'archiveVersion': [0, int, 0, 1, 1], 'classes': [0, dict, 0, 1, {}], 'objectVersion': [0, int, 0, 1, 45], 'rootObject': [0, PBXProject, 1, 1], }) def SetXcodeVersion(self, version): version_to_object_version = { '2.4': 45, '3.0': 45, '3.1': 45, '3.2': 46, } if not version in version_to_object_version: supported_str = ', '.join(sorted(version_to_object_version.keys())) raise Exception( 'Unsupported Xcode version %s (supported: %s)' % ( version, supported_str ) ) compatibility_version = 'Xcode %s' % version self._properties['rootObject'].SetProperty('compatibilityVersion', compatibility_version) self.SetProperty('objectVersion', version_to_object_version[version]); def ComputeIDs(self, recursive=True, overwrite=True, hash=None): # Although XCProjectFile is implemented here as an XCObject, it's not a # proper object in the Xcode sense, and it certainly doesn't have its own # ID. Pass through an attempt to update IDs to the real root object. if recursive: self._properties['rootObject'].ComputeIDs(recursive, overwrite, hash) def Print(self, file=sys.stdout): self.VerifyHasRequiredProperties() # Add the special "objects" property, which will be caught and handled # separately during printing. This structure allows a fairly standard # loop do the normal printing. 
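    # The printed project file then looks roughly like this (illustrative):
    #   // !$*UTF8*$!
    #   {
    #     archiveVersion = 1;
    #     classes = {};
    #     objectVersion = 45;
    #     objects = {
    #       /* Begin PBXBuildFile section */ ... /* End PBXBuildFile section */
    #     };
    #     rootObject = <id> /* Project object */;
    #   }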
self._properties['objects'] = {} self._XCPrint(file, 0, '// !$*UTF8*$!\n') if self._should_print_single_line: self._XCPrint(file, 0, '{ ') else: self._XCPrint(file, 0, '{\n') for property, value in sorted(self._properties.iteritems(), cmp=lambda x, y: cmp(x, y)): if property == 'objects': self._PrintObjects(file) else: self._XCKVPrint(file, 1, property, value) self._XCPrint(file, 0, '}\n') del self._properties['objects'] def _PrintObjects(self, file): if self._should_print_single_line: self._XCPrint(file, 0, 'objects = {') else: self._XCPrint(file, 1, 'objects = {\n') objects_by_class = {} for object in self.Descendants(): if object == self: continue class_name = object.__class__.__name__ if not class_name in objects_by_class: objects_by_class[class_name] = [] objects_by_class[class_name].append(object) for class_name in sorted(objects_by_class): self._XCPrint(file, 0, '\n') self._XCPrint(file, 0, '/* Begin ' + class_name + ' section */\n') for object in sorted(objects_by_class[class_name], cmp=lambda x, y: cmp(x.id, y.id)): object.Print(file) self._XCPrint(file, 0, '/* End ' + class_name + ' section */\n') if self._should_print_single_line: self._XCPrint(file, 0, '}; ') else: self._XCPrint(file, 1, '};\n') ������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py����������000644 �000766 �000024 �00000004176 12455173731 031646� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Applies a fix to CR LF TAB handling in xml.dom. Fixes this: http://code.google.com/p/chromium/issues/detail?id=76293 Working around this: http://bugs.python.org/issue5752 TODO(bradnelson): Consider dropping this when we drop XP support. """ import xml.dom.minidom def _Replacement_write_data(writer, data, is_attrib=False): """Writes datachars to writer.""" data = data.replace("&", "&").replace("<", "<") data = data.replace("\"", """).replace(">", ">") if is_attrib: data = data.replace( "\r", " ").replace( "\n", " ").replace( "\t", " ") writer.write(data) def _Replacement_writexml(self, writer, indent="", addindent="", newl=""): # indent = current indentation # addindent = indentation to add to higher levels # newl = newline string writer.write(indent+"<" + self.tagName) attrs = self._get_attributes() a_names = attrs.keys() a_names.sort() for a_name in a_names: writer.write(" %s=\"" % a_name) _Replacement_write_data(writer, attrs[a_name].value, is_attrib=True) writer.write("\"") if self.childNodes: writer.write(">%s" % newl) for node in self.childNodes: node.writexml(writer, indent + addindent, addindent, newl) writer.write("%s</%s>%s" % (indent, self.tagName, newl)) else: writer.write("/>%s" % newl) class XmlFix(object): """Object to manage temporary patching of xml.dom.minidom.""" def __init__(self): # Preserve current xml.dom.minidom functions. self.write_data = xml.dom.minidom._write_data self.writexml = xml.dom.minidom.Element.writexml # Inject replacement versions of a function and a method. 
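    # Typical usage is scoped (hypothetical names): fix = XmlFix();
    # doc.writexml(out); fix.Cleanup(). While the fix is installed, the
    # replacement writers escape CR, LF and TAB in attribute values instead of
    # emitting them raw.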
xml.dom.minidom._write_data = _Replacement_write_data xml.dom.minidom.Element.writexml = _Replacement_writexml def Cleanup(self): if self.write_data: xml.dom.minidom._write_data = self.write_data xml.dom.minidom.Element.writexml = self.writexml self.write_data = None def __del__(self): self.Cleanup() ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/__init__.py����������������������000644 �000766 �000024 �00000000000 12455173731 033624� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py000644 �000766 �000024 �00000124737 12455173731 033614� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. # Notes: # # This generates makefiles suitable for inclusion into the Android build system # via an Android.mk file. It is based on make.py, the standard makefile # generator. # # The code below generates a separate .mk file for each target, but # all are sourced by the top-level GypAndroid.mk. This means that all # variables in .mk-files clobber one another, and furthermore that any # variables set potentially clash with other Android build system variables. # Try to avoid setting global variables where possible. import gyp import gyp.common import gyp.generator.make as make # Reuse global functions from make backend. import os import re import subprocess generator_default_variables = { 'OS': 'android', 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'SHARED_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_SUFFIX': '.so', 'INTERMEDIATE_DIR': '$(gyp_intermediate_dir)', 'SHARED_INTERMEDIATE_DIR': '$(gyp_shared_intermediate_dir)', 'PRODUCT_DIR': '$(gyp_shared_intermediate_dir)', 'SHARED_LIB_DIR': '$(builddir)/lib.$(TOOLSET)', 'LIB_DIR': '$(obj).$(TOOLSET)', 'RULE_INPUT_ROOT': '%(INPUT_ROOT)s', # This gets expanded by Python. 'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s', # This gets expanded by Python. 'RULE_INPUT_PATH': '$(RULE_SOURCES)', 'RULE_INPUT_EXT': '$(suffix $<)', 'RULE_INPUT_NAME': '$(notdir $<)', 'CONFIGURATION_NAME': '$(GYP_CONFIGURATION)', } # Make supports multiple toolsets generator_supports_multiple_toolsets = True # Generator-specific gyp specs. generator_additional_non_configuration_keys = [ # Boolean to declare that this target does not want its name mangled. 
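  # For example (hypothetical target), a gyp target dict containing
  # 'android_unmangled_name': 1 is emitted with LOCAL_MODULE set to its plain
  # target name rather than the suffixed name ComputeAndroidModule would
  # otherwise produce.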
'android_unmangled_name', ] generator_additional_path_sections = [] generator_extra_sources_for_rules = [] SHARED_FOOTER = """\ # "gyp_all_modules" is a concatenation of the "gyp_all_modules" targets from # all the included sub-makefiles. This is just here to clarify. gyp_all_modules: """ header = """\ # This file is generated by gyp; do not edit. """ android_standard_include_paths = set([ # JNI_H_INCLUDE in build/core/binary.mk 'dalvik/libnativehelper/include/nativehelper', # from SRC_HEADERS in build/core/config.mk 'system/core/include', 'hardware/libhardware/include', 'hardware/libhardware_legacy/include', 'hardware/ril/include', 'dalvik/libnativehelper/include', 'frameworks/native/include', 'frameworks/native/opengl/include', 'frameworks/base/include', 'frameworks/base/opengl/include', 'frameworks/base/native/include', 'external/skia/include', # TARGET_C_INCLUDES in build/core/combo/TARGET_linux-arm.mk 'bionic/libc/arch-arm/include', 'bionic/libc/include', 'bionic/libstdc++/include', 'bionic/libc/kernel/common', 'bionic/libc/kernel/arch-arm', 'bionic/libm/include', 'bionic/libm/include/arm', 'bionic/libthread_db/include', ]) # Map gyp target types to Android module classes. MODULE_CLASSES = { 'static_library': 'STATIC_LIBRARIES', 'shared_library': 'SHARED_LIBRARIES', 'executable': 'EXECUTABLES', } def IsCPPExtension(ext): return make.COMPILABLE_EXTENSIONS.get(ext) == 'cxx' def Sourceify(path): """Convert a path to its source directory form. The Android backend does not support options.generator_output, so this function is a noop.""" return path # Map from qualified target to path to output. # For Android, the target of these maps is a tuple ('static', 'modulename'), # ('dynamic', 'modulename'), or ('path', 'some/path') instead of a string, # since we link by module. target_outputs = {} # Map from qualified target to any linkable output. A subset # of target_outputs. E.g. when mybinary depends on liba, we want to # include liba in the linker line; when otherbinary depends on # mybinary, we just want to build mybinary first. target_link_deps = {} class AndroidMkWriter(object): """AndroidMkWriter packages up the writing of one target-specific Android.mk. Its only real entry point is Write(), and is mostly used for namespacing. """ def __init__(self, android_top_dir): self.android_top_dir = android_top_dir def Write(self, qualified_target, relative_target, base_path, output_filename, spec, configs, part_of_all): """The main entry point: writes a .mk file for a single target. Arguments: qualified_target: target we're generating relative_target: qualified target name relative to the root base_path: path relative to source root we're building in, used to resolve target-relative paths output_filename: output .mk file name to write spec, configs: gyp info part_of_all: flag indicating this target is part of 'all' """ gyp.common.EnsureDirExists(output_filename) self.fp = open(output_filename, 'w') self.fp.write(header) self.qualified_target = qualified_target self.relative_target = relative_target self.path = base_path self.target = spec['target_name'] self.type = spec['type'] self.toolset = spec['toolset'] deps, link_deps = self.ComputeDeps(spec) # Some of the generation below can add extra output, sources, or # link dependencies. All of the out params of the functions that # follow use names like extra_foo. 
extra_outputs = [] extra_sources = [] self.android_class = MODULE_CLASSES.get(self.type, 'GYP') self.android_module = self.ComputeAndroidModule(spec) (self.android_stem, self.android_suffix) = self.ComputeOutputParts(spec) self.output = self.output_binary = self.ComputeOutput(spec) # Standard header. self.WriteLn('include $(CLEAR_VARS)\n') # Module class and name. self.WriteLn('LOCAL_MODULE_CLASS := ' + self.android_class) self.WriteLn('LOCAL_MODULE := ' + self.android_module) # Only emit LOCAL_MODULE_STEM if it's different to LOCAL_MODULE. # The library module classes fail if the stem is set. ComputeOutputParts # makes sure that stem == modulename in these cases. if self.android_stem != self.android_module: self.WriteLn('LOCAL_MODULE_STEM := ' + self.android_stem) self.WriteLn('LOCAL_MODULE_SUFFIX := ' + self.android_suffix) self.WriteLn('LOCAL_MODULE_TAGS := optional') if self.toolset == 'host': self.WriteLn('LOCAL_IS_HOST_MODULE := true') # Grab output directories; needed for Actions and Rules. self.WriteLn('gyp_intermediate_dir := $(call local-intermediates-dir)') self.WriteLn('gyp_shared_intermediate_dir := ' '$(call intermediates-dir-for,GYP,shared)') self.WriteLn() # List files this target depends on so that actions/rules/copies/sources # can depend on the list. # TODO: doesn't pull in things through transitive link deps; needed? target_dependencies = [x[1] for x in deps if x[0] == 'path'] self.WriteLn('# Make sure our deps are built first.') self.WriteList(target_dependencies, 'GYP_TARGET_DEPENDENCIES', local_pathify=True) # Actions must come first, since they can generate more OBJs for use below. if 'actions' in spec: self.WriteActions(spec['actions'], extra_sources, extra_outputs) # Rules must be early like actions. if 'rules' in spec: self.WriteRules(spec['rules'], extra_sources, extra_outputs) if 'copies' in spec: self.WriteCopies(spec['copies'], extra_outputs) # GYP generated outputs. self.WriteList(extra_outputs, 'GYP_GENERATED_OUTPUTS', local_pathify=True) # Set LOCAL_ADDITIONAL_DEPENDENCIES so that Android's build rules depend # on both our dependency targets and our generated files. self.WriteLn('# Make sure our deps and generated files are built first.') self.WriteLn('LOCAL_ADDITIONAL_DEPENDENCIES := $(GYP_TARGET_DEPENDENCIES) ' '$(GYP_GENERATED_OUTPUTS)') self.WriteLn() # Sources. if spec.get('sources', []) or extra_sources: self.WriteSources(spec, configs, extra_sources) self.WriteTarget(spec, configs, deps, link_deps, part_of_all) # Update global list of target outputs, used in dependency tracking. target_outputs[qualified_target] = ('path', self.output_binary) # Update global list of link dependencies. if self.type == 'static_library': target_link_deps[qualified_target] = ('static', self.android_module) elif self.type == 'shared_library': target_link_deps[qualified_target] = ('shared', self.android_module) self.fp.close() return self.android_module def WriteActions(self, actions, extra_sources, extra_outputs): """Write Makefile code for any 'actions' from the gyp input. extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these actions (used to make other pieces dependent on these actions) """ for action in actions: name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, action['action_name'])) self.WriteLn('### Rules for action "%s":' % action['action_name']) inputs = action['inputs'] outputs = action['outputs'] # Build up a list of outputs. 
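      # The rules written for each action have roughly this shape
      # (illustrative, hypothetical file names):
      #   $(gyp_intermediate_dir)/out.c: gyp_local_path := $(LOCAL_PATH)
      #   $(gyp_intermediate_dir)/out.c: in.py $(GYP_TARGET_DEPENDENCIES)
      #       @echo "Gyp action: ... ($@)"
      #       $(hide)cd $(gyp_local_path)/<path>; <action command>
      #   <extra output>: $(gyp_intermediate_dir)/out.c ;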
# Collect the output dirs we'll need. dirs = set() for out in outputs: if not out.startswith('$'): print ('WARNING: Action for target "%s" writes output to local path ' '"%s".' % (self.target, out)) dir = os.path.split(out)[0] if dir: dirs.add(dir) if int(action.get('process_outputs_as_sources', False)): extra_sources += outputs # Prepare the actual command. command = gyp.common.EncodePOSIXShellList(action['action']) if 'message' in action: quiet_cmd = 'Gyp action: %s ($@)' % action['message'] else: quiet_cmd = 'Gyp action: %s ($@)' % name if len(dirs) > 0: command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command cd_action = 'cd $(gyp_local_path)/%s; ' % self.path command = cd_action + command # The makefile rules are all relative to the top dir, but the gyp actions # are defined relative to their containing dir. This replaces the gyp_* # variables for the action rule with an absolute version so that the # output goes in the right place. # Only write the gyp_* rules for the "primary" output (:1); # it's superfluous for the "extra outputs", and this avoids accidentally # writing duplicate dummy rules for those outputs. main_output = make.QuoteSpaces(self.LocalPathify(outputs[0])) self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) self.WriteLn('%s: gyp_intermediate_dir := ' '$(abspath $(gyp_intermediate_dir))' % main_output) self.WriteLn('%s: gyp_shared_intermediate_dir := ' '$(abspath $(gyp_shared_intermediate_dir))' % main_output) # Android's envsetup.sh adds a number of directories to the path including # the built host binary directory. This causes actions/rules invoked by # gyp to sometimes use these instead of system versions, e.g. bison. # The built host binaries may not be suitable, and can cause errors. # So, we remove them from the PATH using the ANDROID_BUILD_PATHS variable # set by envsetup. self.WriteLn('%s: export PATH := $(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output) for input in inputs: assert ' ' not in input, ( "Spaces in action input filenames not supported (%s)" % input) for output in outputs: assert ' ' not in output, ( "Spaces in action output filenames not supported (%s)" % output) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % (main_output, ' '.join(map(self.LocalPathify, inputs)))) self.WriteLn('\t@echo "%s"' % quiet_cmd) self.WriteLn('\t$(hide)%s\n' % command) for output in outputs[1:]: # Make each output depend on the main output, with an empty command # to force make to notice that the mtime has changed. self.WriteLn('%s: %s ;' % (self.LocalPathify(output), main_output)) extra_outputs += outputs self.WriteLn() self.WriteLn() def WriteRules(self, rules, extra_sources, extra_outputs): """Write Makefile code for any 'rules' from the gyp input. 
extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these rules (used to make other pieces dependent on these rules) """ if len(rules) == 0: return rule_trigger = '%s_rule_trigger' % self.android_module did_write_rule = False for rule in rules: if len(rule.get('rule_sources', [])) == 0: continue did_write_rule = True name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, rule['rule_name'])) self.WriteLn('\n### Generated for rule "%s":' % name) self.WriteLn('# "%s":' % rule) inputs = rule.get('inputs') for rule_source in rule.get('rule_sources', []): (rule_source_dirname, rule_source_basename) = os.path.split(rule_source) (rule_source_root, rule_source_ext) = \ os.path.splitext(rule_source_basename) outputs = [self.ExpandInputRoot(out, rule_source_root, rule_source_dirname) for out in rule['outputs']] dirs = set() for out in outputs: if not out.startswith('$'): print ('WARNING: Rule for target %s writes output to local path %s' % (self.target, out)) dir = os.path.dirname(out) if dir: dirs.add(dir) extra_outputs += outputs if int(rule.get('process_outputs_as_sources', False)): extra_sources.extend(outputs) components = [] for component in rule['action']: component = self.ExpandInputRoot(component, rule_source_root, rule_source_dirname) if '$(RULE_SOURCES)' in component: component = component.replace('$(RULE_SOURCES)', rule_source) components.append(component) command = gyp.common.EncodePOSIXShellList(components) cd_action = 'cd $(gyp_local_path)/%s; ' % self.path command = cd_action + command if dirs: command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command # We set up a rule to build the first output, and then set up # a rule for each additional output to depend on the first. outputs = map(self.LocalPathify, outputs) main_output = outputs[0] self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) self.WriteLn('%s: gyp_intermediate_dir := ' '$(abspath $(gyp_intermediate_dir))' % main_output) self.WriteLn('%s: gyp_shared_intermediate_dir := ' '$(abspath $(gyp_shared_intermediate_dir))' % main_output) # See explanation in WriteActions. self.WriteLn('%s: export PATH := ' '$(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output) main_output_deps = self.LocalPathify(rule_source) if inputs: main_output_deps += ' ' main_output_deps += ' '.join([self.LocalPathify(f) for f in inputs]) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % (main_output, main_output_deps)) self.WriteLn('\t%s\n' % command) for output in outputs[1:]: # Make each output depend on the main output, with an empty command # to force make to notice that the mtime has changed. self.WriteLn('%s: %s ;' % (output, main_output)) self.WriteLn('.PHONY: %s' % (rule_trigger)) self.WriteLn('%s: %s' % (rule_trigger, main_output)) self.WriteLn('') if did_write_rule: extra_sources.append(rule_trigger) # Force all rules to run. self.WriteLn('### Finished generating for all rules') self.WriteLn('') def WriteCopies(self, copies, extra_outputs): """Write Makefile code for any 'copies' from the gyp input. extra_outputs: a list that will be filled in with any outputs of this action (used to make other pieces dependent on this action) """ self.WriteLn('### Generated for copy rule.') variable = make.StringToMakefileVariable(self.relative_target + '_copies') outputs = [] for copy in copies: for path in copy['files']: # The Android build system does not allow generation of files into the # source tree. 
The destination should start with a variable, which will # typically be $(gyp_intermediate_dir) or # $(gyp_shared_intermediate_dir). Note that we can't use an assertion # because some of the gyp tests depend on this. if not copy['destination'].startswith('$'): print ('WARNING: Copy rule for target %s writes output to ' 'local path %s' % (self.target, copy['destination'])) # LocalPathify() calls normpath, stripping trailing slashes. path = Sourceify(self.LocalPathify(path)) filename = os.path.split(path)[1] output = Sourceify(self.LocalPathify(os.path.join(copy['destination'], filename))) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES) | $(ACP)' % (output, path)) self.WriteLn('\t@echo Copying: $@') self.WriteLn('\t$(hide) mkdir -p $(dir $@)') self.WriteLn('\t$(hide) $(ACP) -rpf $< $@') self.WriteLn() outputs.append(output) self.WriteLn('%s = %s' % (variable, ' '.join(map(make.QuoteSpaces, outputs)))) extra_outputs.append('$(%s)' % variable) self.WriteLn() def WriteSourceFlags(self, spec, configs): """Write out the flags and include paths used to compile source files for the current target. Args: spec, configs: input from gyp. """ for configname, config in sorted(configs.iteritems()): extracted_includes = [] self.WriteLn('\n# Flags passed to both C and C++ files.') cflags, includes_from_cflags = self.ExtractIncludesFromCFlags( config.get('cflags', []) + config.get('cflags_c', [])) extracted_includes.extend(includes_from_cflags) self.WriteList(cflags, 'MY_CFLAGS_%s' % configname) self.WriteList(config.get('defines'), 'MY_DEFS_%s' % configname, prefix='-D', quoter=make.EscapeCppDefine) self.WriteLn('\n# Include paths placed before CFLAGS/CPPFLAGS') includes = list(config.get('include_dirs', [])) includes.extend(extracted_includes) includes = map(Sourceify, map(self.LocalPathify, includes)) includes = self.NormalizeIncludePaths(includes) self.WriteList(includes, 'LOCAL_C_INCLUDES_%s' % configname) self.WriteLn('\n# Flags passed to only C++ (and not C) files.') self.WriteList(config.get('cflags_cc'), 'LOCAL_CPPFLAGS_%s' % configname) self.WriteLn('\nLOCAL_CFLAGS := $(MY_CFLAGS_$(GYP_CONFIGURATION)) ' '$(MY_DEFS_$(GYP_CONFIGURATION))') # Undefine ANDROID for host modules # TODO: the source code should not use macro ANDROID to tell if it's host # or target module. if self.toolset == 'host': self.WriteLn('# Undefine ANDROID for host modules') self.WriteLn('LOCAL_CFLAGS += -UANDROID') self.WriteLn('LOCAL_C_INCLUDES := $(GYP_COPIED_SOURCE_ORIGIN_DIRS) ' '$(LOCAL_C_INCLUDES_$(GYP_CONFIGURATION))') self.WriteLn('LOCAL_CPPFLAGS := $(LOCAL_CPPFLAGS_$(GYP_CONFIGURATION))') def WriteSources(self, spec, configs, extra_sources): """Write Makefile code for any 'sources' from the gyp input. These are source files necessary to build the current target. We need to handle shared_intermediate directory source files as a special case by copying them to the intermediate directory and treating them as a genereated sources. Otherwise the Android build rules won't pick them up. Args: spec, configs: input from gyp. extra_sources: Sources generated from Actions or Rules. """ sources = filter(make.Compilable, spec.get('sources', [])) generated_not_sources = [x for x in extra_sources if not make.Compilable(x)] extra_sources = filter(make.Compilable, extra_sources) # Determine and output the C++ extension used by these sources. # We simply find the first C++ file and use that extension. 
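# Illustrative sketch, not part of the original gyp source; file names below
# are hypothetical. Given sources such as
#
#   ['foo.cc', 'bar.cxx', 'baz.c']
#
# the scan that follows picks '.cc' (the first C++ extension encountered) and,
# since it differs from the default '.cpp', emits 'LOCAL_CPP_EXTENSION := .cc'.
# 'bar.cxx' is then treated like a generated source and copied to
# '$(gyp_intermediate_dir)/bar.cc', because the Android build system honours
# only a single LOCAL_CPP_EXTENSION per module.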
all_sources = sources + extra_sources local_cpp_extension = '.cpp' for source in all_sources: (root, ext) = os.path.splitext(source) if IsCPPExtension(ext): local_cpp_extension = ext break if local_cpp_extension != '.cpp': self.WriteLn('LOCAL_CPP_EXTENSION := %s' % local_cpp_extension) # We need to move any non-generated sources that are coming from the # shared intermediate directory out of LOCAL_SRC_FILES and put them # into LOCAL_GENERATED_SOURCES. We also need to move over any C++ files # that don't match our local_cpp_extension, since Android will only # generate Makefile rules for a single LOCAL_CPP_EXTENSION. local_files = [] for source in sources: (root, ext) = os.path.splitext(source) if '$(gyp_shared_intermediate_dir)' in source: extra_sources.append(source) elif '$(gyp_intermediate_dir)' in source: extra_sources.append(source) elif IsCPPExtension(ext) and ext != local_cpp_extension: extra_sources.append(source) else: local_files.append(os.path.normpath(os.path.join(self.path, source))) # For any generated source, if it is coming from the shared intermediate # directory then we add a Make rule to copy them to the local intermediate # directory first. This is because the Android LOCAL_GENERATED_SOURCES # must be in the local module intermediate directory for the compile rules # to work properly. If the file has the wrong C++ extension, then we add # a rule to copy that to intermediates and use the new version. final_generated_sources = [] # If a source file gets copied, we still need to add the orginal source # directory as header search path, for GCC searches headers in the # directory that contains the source file by default. origin_src_dirs = [] for source in extra_sources: local_file = source if not '$(gyp_intermediate_dir)/' in local_file: basename = os.path.basename(local_file) local_file = '$(gyp_intermediate_dir)/' + basename (root, ext) = os.path.splitext(local_file) if IsCPPExtension(ext) and ext != local_cpp_extension: local_file = root + local_cpp_extension if local_file != source: self.WriteLn('%s: %s' % (local_file, self.LocalPathify(source))) self.WriteLn('\tmkdir -p $(@D); cp $< $@') origin_src_dirs.append(os.path.dirname(source)) final_generated_sources.append(local_file) # We add back in all of the non-compilable stuff to make sure that the # make rules have dependencies on them. final_generated_sources.extend(generated_not_sources) self.WriteList(final_generated_sources, 'LOCAL_GENERATED_SOURCES') origin_src_dirs = gyp.common.uniquer(origin_src_dirs) origin_src_dirs = map(Sourceify, map(self.LocalPathify, origin_src_dirs)) self.WriteList(origin_src_dirs, 'GYP_COPIED_SOURCE_ORIGIN_DIRS') self.WriteList(local_files, 'LOCAL_SRC_FILES') # Write out the flags used to compile the source; this must be done last # so that GYP_COPIED_SOURCE_ORIGIN_DIRS can be used as an include path. self.WriteSourceFlags(spec, configs) def ComputeAndroidModule(self, spec): """Return the Android module name used for a gyp spec. We use the complete qualified target name to avoid collisions between duplicate targets in different directories. We also add a suffix to distinguish gyp-generated module names. """ if int(spec.get('android_unmangled_name', 0)): assert self.type != 'shared_library' or self.target.startswith('lib') return self.target if self.type == 'shared_library': # For reasons of convention, the Android build system requires that all # shared library modules are named 'libfoo' when generating -l flags. 
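# Illustrative sketch, not part of the original gyp source; target and path
# names are hypothetical. Assuming make.StringToMakefileVariable maps
# non-alphanumeric characters to '_', the naming below works out roughly as:
#
#   shared_library 'foo' in 'base/bar', target toolset
#       -> 'lib_' + 'base/bar' + '_foo' + '_gyp' -> 'lib_base_bar_foo_gyp'
#   executable 'protoc' in 'third_party/protobuf', host toolset
#       -> 'third_party/protobuf' + '_protoc' + '_host_gyp'
#       -> 'third_party_protobuf_protoc_host_gyp'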
prefix = 'lib_' else: prefix = '' if spec['toolset'] == 'host': suffix = '_host_gyp' else: suffix = '_gyp' if self.path: name = '%s%s_%s%s' % (prefix, self.path, self.target, suffix) else: name = '%s%s%s' % (prefix, self.target, suffix) return make.StringToMakefileVariable(name) def ComputeOutputParts(self, spec): """Return the 'output basename' of a gyp spec, split into filename + ext. Android libraries must be named the same thing as their module name, otherwise the linker can't find them, so product_name and so on must be ignored if we are building a library, and the "lib" prepending is not done for Android. """ assert self.type != 'loadable_module' # TODO: not supported? target = spec['target_name'] target_prefix = '' target_ext = '' if self.type == 'static_library': target = self.ComputeAndroidModule(spec) target_ext = '.a' elif self.type == 'shared_library': target = self.ComputeAndroidModule(spec) target_ext = '.so' elif self.type == 'none': target_ext = '.stamp' elif self.type != 'executable': print ("ERROR: What output file should be generated?", "type", self.type, "target", target) if self.type != 'static_library' and self.type != 'shared_library': target_prefix = spec.get('product_prefix', target_prefix) target = spec.get('product_name', target) product_ext = spec.get('product_extension') if product_ext: target_ext = '.' + product_ext target_stem = target_prefix + target return (target_stem, target_ext) def ComputeOutputBasename(self, spec): """Return the 'output basename' of a gyp spec. E.g., the loadable module 'foobar' in directory 'baz' will produce 'libfoobar.so' """ return ''.join(self.ComputeOutputParts(spec)) def ComputeOutput(self, spec): """Return the 'output' (full output path) of a gyp spec. E.g., the loadable module 'foobar' in directory 'baz' will produce '$(obj)/baz/libfoobar.so' """ if self.type == 'executable' and self.toolset == 'host': # We install host executables into shared_intermediate_dir so they can be # run by gyp rules that refer to PRODUCT_DIR. path = '$(gyp_shared_intermediate_dir)' elif self.type == 'shared_library': if self.toolset == 'host': path = '$(HOST_OUT_INTERMEDIATE_LIBRARIES)' else: path = '$(TARGET_OUT_INTERMEDIATE_LIBRARIES)' else: # Other targets just get built into their intermediate dir. if self.toolset == 'host': path = '$(call intermediates-dir-for,%s,%s,true)' % (self.android_class, self.android_module) else: path = '$(call intermediates-dir-for,%s,%s)' % (self.android_class, self.android_module) assert spec.get('product_dir') is None # TODO: not supported? return os.path.join(path, self.ComputeOutputBasename(spec)) def NormalizeIncludePaths(self, include_paths): """ Normalize include_paths. Convert absolute paths to relative to the Android top directory; filter out include paths that are already brought in by the Android build system. Args: include_paths: A list of unprocessed include paths. Returns: A list of normalized include paths. """ normalized = [] for path in include_paths: if path[0] == '/': path = gyp.common.RelativePath(path, self.android_top_dir) # Filter out the Android standard search path. if path not in android_standard_include_paths: normalized.append(path) return normalized def ExtractIncludesFromCFlags(self, cflags): """Extract includes "-I..." out from cflags Args: cflags: A list of compiler flags, which may be mixed with "-I.." Returns: A tuple of lists: (clean_clfags, include_paths). "-I.." is trimmed. 
""" clean_cflags = [] include_paths = [] for flag in cflags: if flag.startswith('-I'): include_paths.append(flag[2:]) else: clean_cflags.append(flag) return (clean_cflags, include_paths) def ComputeAndroidLibraryModuleNames(self, libraries): """Compute the Android module names from libraries, ie spec.get('libraries') Args: libraries: the value of spec.get('libraries') Returns: A tuple (static_lib_modules, dynamic_lib_modules) """ static_lib_modules = [] dynamic_lib_modules = [] for libs in libraries: # Libs can have multiple words. for lib in libs.split(): # Filter the system libraries, which are added by default by the Android # build system. if (lib == '-lc' or lib == '-lstdc++' or lib == '-lm' or lib.endswith('libgcc.a')): continue match = re.search(r'([^/]+)\.a$', lib) if match: static_lib_modules.append(match.group(1)) continue match = re.search(r'([^/]+)\.so$', lib) if match: dynamic_lib_modules.append(match.group(1)) continue # "-lstlport" -> libstlport if lib.startswith('-l'): if lib.endswith('_static'): static_lib_modules.append('lib' + lib[2:]) else: dynamic_lib_modules.append('lib' + lib[2:]) return (static_lib_modules, dynamic_lib_modules) def ComputeDeps(self, spec): """Compute the dependencies of a gyp spec. Returns a tuple (deps, link_deps), where each is a list of filenames that will need to be put in front of make for either building (deps) or linking (link_deps). """ deps = [] link_deps = [] if 'dependencies' in spec: deps.extend([target_outputs[dep] for dep in spec['dependencies'] if target_outputs[dep]]) for dep in spec['dependencies']: if dep in target_link_deps: link_deps.append(target_link_deps[dep]) deps.extend(link_deps) return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps)) def WriteTargetFlags(self, spec, configs, link_deps): """Write Makefile code to specify the link flags and library dependencies. spec, configs: input from gyp. link_deps: link dependency list; see ComputeDeps() """ for configname, config in sorted(configs.iteritems()): ldflags = list(config.get('ldflags', [])) self.WriteLn('') self.WriteList(ldflags, 'LOCAL_LDFLAGS_%s' % configname) self.WriteLn('\nLOCAL_LDFLAGS := $(LOCAL_LDFLAGS_$(GYP_CONFIGURATION))') # Libraries (i.e. -lfoo) libraries = gyp.common.uniquer(spec.get('libraries', [])) static_libs, dynamic_libs = self.ComputeAndroidLibraryModuleNames( libraries) # Link dependencies (i.e. libfoo.a, libfoo.so) static_link_deps = [x[1] for x in link_deps if x[0] == 'static'] shared_link_deps = [x[1] for x in link_deps if x[0] == 'shared'] self.WriteLn('') self.WriteList(static_libs + static_link_deps, 'LOCAL_STATIC_LIBRARIES') self.WriteLn('# Enable grouping to fix circular references') self.WriteLn('LOCAL_GROUP_STATIC_LIBRARIES := true') self.WriteLn('') self.WriteList(dynamic_libs + shared_link_deps, 'LOCAL_SHARED_LIBRARIES') def WriteTarget(self, spec, configs, deps, link_deps, part_of_all): """Write Makefile code to produce the final target of the gyp spec. spec, configs: input from gyp. deps, link_deps: dependency lists; see ComputeDeps() part_of_all: flag indicating this target is part of 'all' """ self.WriteLn('### Rules for final target.') if self.type != 'none': self.WriteTargetFlags(spec, configs, link_deps) # Add to the set of targets which represent the gyp 'all' target. We use the # name 'gyp_all_modules' as the Android build system doesn't allow the use # of the Make target 'all' and because 'all_modules' is the equivalent of # the Make target 'all' on Android. 
if part_of_all: self.WriteLn('# Add target alias to "gyp_all_modules" target.') self.WriteLn('.PHONY: gyp_all_modules') self.WriteLn('gyp_all_modules: %s' % self.android_module) self.WriteLn('') # Add an alias from the gyp target name to the Android module name. This # simplifies manual builds of the target, and is required by the test # framework. if self.target != self.android_module: self.WriteLn('# Alias gyp target name.') self.WriteLn('.PHONY: %s' % self.target) self.WriteLn('%s: %s' % (self.target, self.android_module)) self.WriteLn('') # Add the command to trigger build of the target type depending # on the toolset. Ex: BUILD_STATIC_LIBRARY vs. BUILD_HOST_STATIC_LIBRARY # NOTE: This has to come last! modifier = '' if self.toolset == 'host': modifier = 'HOST_' if self.type == 'static_library': self.WriteLn('include $(BUILD_%sSTATIC_LIBRARY)' % modifier) elif self.type == 'shared_library': self.WriteLn('LOCAL_PRELINK_MODULE := false') self.WriteLn('include $(BUILD_%sSHARED_LIBRARY)' % modifier) elif self.type == 'executable': if self.toolset == 'host': self.WriteLn('LOCAL_MODULE_PATH := $(gyp_shared_intermediate_dir)') else: # Don't install target executables for now, as it results in them being # included in ROM. This can be revisited if there's a reason to install # them later. self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true') self.WriteLn('include $(BUILD_%sEXECUTABLE)' % modifier) else: self.WriteLn('LOCAL_MODULE_PATH := $(PRODUCT_OUT)/gyp_stamp') self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true') self.WriteLn() self.WriteLn('include $(BUILD_SYSTEM)/base_rules.mk') self.WriteLn() self.WriteLn('$(LOCAL_BUILT_MODULE): $(LOCAL_ADDITIONAL_DEPENDENCIES)') self.WriteLn('\t$(hide) echo "Gyp timestamp: $@"') self.WriteLn('\t$(hide) mkdir -p $(dir $@)') self.WriteLn('\t$(hide) touch $@') def WriteList(self, value_list, variable=None, prefix='', quoter=make.QuoteIfNecessary, local_pathify=False): """Write a variable definition that is a list of values. E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out foo = blaha blahb but in a pretty-printed style. """ values = '' if value_list: value_list = [quoter(prefix + l) for l in value_list] if local_pathify: value_list = [self.LocalPathify(l) for l in value_list] values = ' \\\n\t' + ' \\\n\t'.join(value_list) self.fp.write('%s :=%s\n\n' % (variable, values)) def WriteLn(self, text=''): self.fp.write(text + '\n') def LocalPathify(self, path): """Convert a subdirectory-relative path into a normalized path which starts with the make variable $(LOCAL_PATH) (i.e. the top of the project tree). Absolute paths, or paths that contain variables, are just normalized.""" if '$(' in path or os.path.isabs(path): # path is not a file in the project tree in this case, but calling # normpath is still important for trimming trailing slashes. return os.path.normpath(path) local_path = os.path.join('$(LOCAL_PATH)', self.path, path) local_path = os.path.normpath(local_path) # Check that normalizing the path didn't ../ itself out of $(LOCAL_PATH) # - i.e. that the resulting path is still inside the project tree. The # path may legitimately have ended up containing just $(LOCAL_PATH), though, # so we don't look for a slash. 
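# Illustrative sketch, not part of the original gyp source; paths are
# hypothetical and assume self.path == 'base':
#
#   LocalPathify('foo/bar.c')                   -> '$(LOCAL_PATH)/base/foo/bar.c'
#   LocalPathify('$(gyp_intermediate_dir)/x.c') -> '$(gyp_intermediate_dir)/x.c'
#   LocalPathify('../../escape.c')              -> fails the assertion below,
#       since normpath('$(LOCAL_PATH)/base/../../escape.c') == 'escape.c',
#       which no longer starts with '$(LOCAL_PATH)'.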
assert local_path.startswith('$(LOCAL_PATH)'), ( 'Path %s attempts to escape from gyp path %s !)' % (path, self.path)) return local_path def ExpandInputRoot(self, template, expansion, dirname): if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template: return template path = template % { 'INPUT_ROOT': expansion, 'INPUT_DIRNAME': dirname, } return path def PerformBuild(data, configurations, params): # The android backend only supports the default configuration. options = params['options'] makefile = os.path.abspath(os.path.join(options.toplevel_dir, 'GypAndroid.mk')) env = dict(os.environ) env['ONE_SHOT_MAKEFILE'] = makefile arguments = ['make', '-C', os.environ['ANDROID_BUILD_TOP'], 'gyp_all_modules'] print 'Building: %s' % arguments subprocess.check_call(arguments, env=env) def GenerateOutput(target_list, target_dicts, data, params): options = params['options'] generator_flags = params.get('generator_flags', {}) builddir_name = generator_flags.get('output_dir', 'out') limit_to_target_all = generator_flags.get('limit_to_target_all', False) android_top_dir = os.environ.get('ANDROID_BUILD_TOP') assert android_top_dir, '$ANDROID_BUILD_TOP not set; you need to run lunch.' def CalculateMakefilePath(build_file, base_name): """Determine where to write a Makefile for a given gyp file.""" # Paths in gyp files are relative to the .gyp file, but we want # paths relative to the source root for the master makefile. Grab # the path of the .gyp file as the base to relativize against. # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp". base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.depth) # We write the file in the base_path directory. output_file = os.path.join(options.depth, base_path, base_name) assert not options.generator_output, ( 'The Android backend does not support options.generator_output.') base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.toplevel_dir) return base_path, output_file # TODO: search for the first non-'Default' target. This can go # away when we add verification that all targets have the # necessary configurations. default_configuration = None toolsets = set([target_dicts[target]['toolset'] for target in target_list]) for target in target_list: spec = target_dicts[target] if spec['default_configuration'] != 'Default': default_configuration = spec['default_configuration'] break if not default_configuration: default_configuration = 'Default' srcdir = '.' makefile_name = 'GypAndroid' + options.suffix + '.mk' makefile_path = os.path.join(options.toplevel_dir, makefile_name) assert not options.generator_output, ( 'The Android backend does not support options.generator_output.') gyp.common.EnsureDirExists(makefile_path) root_makefile = open(makefile_path, 'w') root_makefile.write(header) # We set LOCAL_PATH just once, here, to the top of the project tree. This # allows all the other paths we use to be relative to the Android.mk file, # as the Android build system expects. root_makefile.write('\nLOCAL_PATH := $(call my-dir)\n') # Find the list of targets that derive from the gyp file(s) being built. 
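# Illustrative sketch, not part of the original gyp source; file and
# configuration names are hypothetical. The root GypAndroid.mk written here
# ends up looking roughly like:
#
#   LOCAL_PATH := $(call my-dir)
#   GYP_CONFIGURATION ?= Release
#   include $(LOCAL_PATH)/base/foo.target.mk
#   include $(LOCAL_PATH)/net/bar.host.mk
#
# where each included .mk file was produced by AndroidMkWriter.Write() for a
# single qualified gyp target.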
needed_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, build_file): needed_targets.add(target) build_files = set() include_list = set() android_modules = {} for qualified_target in target_list: build_file, target, toolset = gyp.common.ParseQualifiedTarget( qualified_target) relative_build_file = gyp.common.RelativePath(build_file, options.toplevel_dir) build_files.add(relative_build_file) included_files = data[build_file]['included_files'] for included_file in included_files: # The included_files entries are relative to the dir of the build file # that included them, so we have to undo that and then make them relative # to the root dir. relative_include_file = gyp.common.RelativePath( gyp.common.UnrelativePath(included_file, build_file), options.toplevel_dir) abs_include_file = os.path.abspath(relative_include_file) # If the include file is from the ~/.gyp dir, we should use absolute path # so that relocating the src dir doesn't break the path. if (params['home_dot_gyp'] and abs_include_file.startswith(params['home_dot_gyp'])): build_files.add(abs_include_file) else: build_files.add(relative_include_file) base_path, output_file = CalculateMakefilePath(build_file, target + '.' + toolset + options.suffix + '.mk') spec = target_dicts[qualified_target] configs = spec['configurations'] part_of_all = (qualified_target in needed_targets and not int(spec.get('suppress_wildcard', False))) if limit_to_target_all and not part_of_all: continue relative_target = gyp.common.QualifiedTarget(relative_build_file, target, toolset) writer = AndroidMkWriter(android_top_dir) android_module = writer.Write(qualified_target, relative_target, base_path, output_file, spec, configs, part_of_all=part_of_all) if android_module in android_modules: print ('ERROR: Android module names must be unique. The following ' 'targets both generate Android module name %s.\n %s\n %s' % (android_module, android_modules[android_module], qualified_target)) return android_modules[android_module] = qualified_target # Our root_makefile lives at the source root. Compute the relative path # from there to the output_file for including. mkfile_rel_path = gyp.common.RelativePath(output_file, os.path.dirname(makefile_path)) include_list.add(mkfile_rel_path) root_makefile.write('GYP_CONFIGURATION ?= %s\n' % default_configuration) # Write out the sorted list of includes. root_makefile.write('\n') for include_file in sorted(include_list): root_makefile.write('include $(LOCAL_PATH)/' + include_file + '\n') root_makefile.write('\n') root_makefile.write(SHARED_FOOTER) root_makefile.close() ���������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/cmake.py��000644 �000766 �000024 �00000121321 12455173731 033236� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """cmake output module This module is under development and should be considered experimental. This module produces cmake (2.8.8+) input as its output. One CMakeLists.txt is created for each configuration. 
This module's original purpose was to support editing in IDEs like KDevelop which use CMake for project management. It is also possible to use CMake to generate projects for other IDEs such as eclipse cdt and code::blocks. QtCreator will convert the CMakeLists.txt to a code::blocks cbp for the editor to read, but build using CMake. As a result QtCreator editor is unaware of compiler defines. The generated CMakeLists.txt can also be used to build on Linux. There is currently no support for building on platforms other than Linux. The generated CMakeLists.txt should properly compile all projects. However, there is a mismatch between gyp and cmake with regard to linking. All attempts are made to work around this, but CMake sometimes sees -Wl,--start-group as a library and incorrectly repeats it. As a result the output of this generator should not be relied on for building. When using with kdevelop, use version 4.4+. Previous versions of kdevelop will not be able to find the header file directories described in the generated CMakeLists.txt file. """ import multiprocessing import os import signal import string import subprocess import gyp.common generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_PREFIX': 'lib', 'SHARED_LIB_SUFFIX': '.so', 'SHARED_LIB_DIR': '${builddir}/lib.${TOOLSET}', 'LIB_DIR': '${obj}.${TOOLSET}', 'INTERMEDIATE_DIR': '${obj}.${TOOLSET}/${TARGET}/geni', 'SHARED_INTERMEDIATE_DIR': '${obj}/gen', 'PRODUCT_DIR': '${builddir}', 'RULE_INPUT_PATH': '${RULE_INPUT_PATH}', 'RULE_INPUT_DIRNAME': '${RULE_INPUT_DIRNAME}', 'RULE_INPUT_NAME': '${RULE_INPUT_NAME}', 'RULE_INPUT_ROOT': '${RULE_INPUT_ROOT}', 'RULE_INPUT_EXT': '${RULE_INPUT_EXT}', 'CONFIGURATION_NAME': '${configuration}', } FULL_PATH_VARS = ('${CMAKE_SOURCE_DIR}', '${builddir}', '${obj}') generator_supports_multiple_toolsets = True generator_wants_static_library_dependencies_adjusted = True COMPILABLE_EXTENSIONS = { '.c': 'cc', '.cc': 'cxx', '.cpp': 'cxx', '.cxx': 'cxx', '.s': 's', # cc '.S': 's', # cc } def RemovePrefix(a, prefix): """Returns 'a' without 'prefix' if it starts with 'prefix'.""" return a[len(prefix):] if a.startswith(prefix) else a def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" default_variables.setdefault('OS', gyp.common.GetFlavor(params)) def Compilable(filename): """Return true if the file is compilable (should be in OBJS).""" return any(filename.endswith(e) for e in COMPILABLE_EXTENSIONS) def Linkable(filename): """Return true if the file is linkable (should be on the link line).""" return filename.endswith('.o') def NormjoinPathForceCMakeSource(base_path, rel_path): """Resolves rel_path against base_path and returns the result. If rel_path is an absolute path it is returned unchanged. Otherwise it is resolved against base_path and normalized. If the result is a relative path, it is forced to be relative to the CMakeLists.txt. """ if os.path.isabs(rel_path): return rel_path if any([rel_path.startswith(var) for var in FULL_PATH_VARS]): return rel_path # TODO: do we need to check base_path for absolute variables as well? return os.path.join('${CMAKE_SOURCE_DIR}', os.path.normpath(os.path.join(base_path, rel_path))) def NormjoinPath(base_path, rel_path): """Resolves rel_path against base_path and returns the result. TODO: what is this really used for? If rel_path begins with '$' it is returned unchanged. 
Otherwise it is resolved against base_path if relative, then normalized. """ if rel_path.startswith('$') and not rel_path.startswith('${configuration}'): return rel_path return os.path.normpath(os.path.join(base_path, rel_path)) def CMakeStringEscape(a): """Escapes the string 'a' for use inside a CMake string. This means escaping '\' otherwise it may be seen as modifying the next character '"' otherwise it will end the string ';' otherwise the string becomes a list The following do not need to be escaped '#' when the lexer is in string state, this does not start a comment The following are yet unknown '$' generator variables (like ${obj}) must not be escaped, but text $ should be escaped what is wanted is to know which $ come from generator variables """ return a.replace('\\', '\\\\').replace(';', '\\;').replace('"', '\\"') def SetFileProperty(output, source_name, property_name, values, sep): """Given a set of source file, sets the given property on them.""" output.write('set_source_files_properties(') output.write(source_name) output.write(' PROPERTIES ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('")\n') def SetFilesProperty(output, source_names, property_name, values, sep): """Given a set of source files, sets the given property on them.""" output.write('set_source_files_properties(\n') for source_name in source_names: output.write(' ') output.write(source_name) output.write('\n') output.write(' PROPERTIES\n ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('"\n)\n') def SetTargetProperty(output, target_name, property_name, values, sep=''): """Given a target, sets the given property.""" output.write('set_target_properties(') output.write(target_name) output.write(' PROPERTIES ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('")\n') def SetVariable(output, variable_name, value): """Sets a CMake variable.""" output.write('set(') output.write(variable_name) output.write(' "') output.write(CMakeStringEscape(value)) output.write('")\n') def SetVariableList(output, variable_name, values): """Sets a CMake variable to a list.""" if not values: return SetVariable(output, variable_name, "") if len(values) == 1: return SetVariable(output, variable_name, values[0]) output.write('list(APPEND ') output.write(variable_name) output.write('\n "') output.write('"\n "'.join([CMakeStringEscape(value) for value in values])) output.write('")\n') def UnsetVariable(output, variable_name): """Unsets a CMake variable.""" output.write('unset(') output.write(variable_name) output.write(')\n') def WriteVariable(output, variable_name, prepend=None): if prepend: output.write(prepend) output.write('${') output.write(variable_name) output.write('}') class CMakeTargetType: def __init__(self, command, modifier, property_modifier): self.command = command self.modifier = modifier self.property_modifier = property_modifier cmake_target_type_from_gyp_target_type = { 'executable': CMakeTargetType('add_executable', None, 'RUNTIME'), 'static_library': CMakeTargetType('add_library', 'STATIC', 'ARCHIVE'), 'shared_library': CMakeTargetType('add_library', 'SHARED', 'LIBRARY'), 'loadable_module': CMakeTargetType('add_library', 'MODULE', 'LIBRARY'), 'none': CMakeTargetType('add_custom_target', 'SOURCES', None), } def StringToCMakeTargetName(a): """Converts the 
given string 'a' to a valid CMake target name. All invalid characters are replaced by '_'. Invalid for cmake: ' ', '/', '(', ')' Invalid for make: ':' Invalid for unknown reasons but cause failures: '.' """ return a.translate(string.maketrans(' /():.', '______')) def WriteActions(target_name, actions, extra_sources, extra_deps, path_to_gyp, output): """Write CMake for the 'actions' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_sources: [(<cmake_src>, <src>)] to append with generated source files. extra_deps: [<cmake_taget>] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. """ for action in actions: action_name = StringToCMakeTargetName(action['action_name']) action_target_name = '%s__%s' % (target_name, action_name) inputs = action['inputs'] inputs_name = action_target_name + '__input' SetVariableList(output, inputs_name, [NormjoinPathForceCMakeSource(path_to_gyp, dep) for dep in inputs]) outputs = action['outputs'] cmake_outputs = [NormjoinPathForceCMakeSource(path_to_gyp, out) for out in outputs] outputs_name = action_target_name + '__output' SetVariableList(output, outputs_name, cmake_outputs) # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set(dir for dir in (os.path.dirname(o) for o in outputs) if dir) if int(action.get('process_outputs_as_sources', False)): extra_sources.extend(zip(cmake_outputs, outputs)) # add_custom_command output.write('add_custom_command(OUTPUT ') WriteVariable(output, outputs_name) output.write('\n') if len(dirs) > 0: for directory in dirs: output.write(' COMMAND ${CMAKE_COMMAND} -E make_directory ') output.write(directory) output.write('\n') output.write(' COMMAND ') output.write(gyp.common.EncodePOSIXShellList(action['action'])) output.write('\n') output.write(' DEPENDS ') WriteVariable(output, inputs_name) output.write('\n') output.write(' WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/') output.write(path_to_gyp) output.write('\n') output.write(' COMMENT ') if 'message' in action: output.write(action['message']) else: output.write(action_target_name) output.write('\n') output.write(' VERBATIM\n') output.write(')\n') # add_custom_target output.write('add_custom_target(') output.write(action_target_name) output.write('\n DEPENDS ') WriteVariable(output, outputs_name) output.write('\n SOURCES ') WriteVariable(output, inputs_name) output.write('\n)\n') extra_deps.append(action_target_name) def NormjoinRulePathForceCMakeSource(base_path, rel_path, rule_source): if rel_path.startswith(("${RULE_INPUT_PATH}","${RULE_INPUT_DIRNAME}")): if any([rule_source.startswith(var) for var in FULL_PATH_VARS]): return rel_path return NormjoinPathForceCMakeSource(base_path, rel_path) def WriteRules(target_name, rules, extra_sources, extra_deps, path_to_gyp, output): """Write CMake for the 'rules' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_sources: [(<cmake_src>, <src>)] to append with generated source files. extra_deps: [<cmake_taget>] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. 
""" for rule in rules: rule_name = StringToCMakeTargetName(target_name + '__' + rule['rule_name']) inputs = rule.get('inputs', []) inputs_name = rule_name + '__input' SetVariableList(output, inputs_name, [NormjoinPathForceCMakeSource(path_to_gyp, dep) for dep in inputs]) outputs = rule['outputs'] var_outputs = [] for count, rule_source in enumerate(rule.get('rule_sources', [])): action_name = rule_name + '_' + str(count) rule_source_dirname, rule_source_basename = os.path.split(rule_source) rule_source_root, rule_source_ext = os.path.splitext(rule_source_basename) SetVariable(output, 'RULE_INPUT_PATH', rule_source) SetVariable(output, 'RULE_INPUT_DIRNAME', rule_source_dirname) SetVariable(output, 'RULE_INPUT_NAME', rule_source_basename) SetVariable(output, 'RULE_INPUT_ROOT', rule_source_root) SetVariable(output, 'RULE_INPUT_EXT', rule_source_ext) # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set(dir for dir in (os.path.dirname(o) for o in outputs) if dir) # Create variables for the output, as 'local' variable will be unset. these_outputs = [] for output_index, out in enumerate(outputs): output_name = action_name + '_' + str(output_index) SetVariable(output, output_name, NormjoinRulePathForceCMakeSource(path_to_gyp, out, rule_source)) if int(rule.get('process_outputs_as_sources', False)): extra_sources.append(('${' + output_name + '}', out)) these_outputs.append('${' + output_name + '}') var_outputs.append('${' + output_name + '}') # add_custom_command output.write('add_custom_command(OUTPUT\n') for out in these_outputs: output.write(' ') output.write(out) output.write('\n') for directory in dirs: output.write(' COMMAND ${CMAKE_COMMAND} -E make_directory ') output.write(directory) output.write('\n') output.write(' COMMAND ') output.write(gyp.common.EncodePOSIXShellList(rule['action'])) output.write('\n') output.write(' DEPENDS ') WriteVariable(output, inputs_name) output.write(' ') output.write(NormjoinPath(path_to_gyp, rule_source)) output.write('\n') # CMAKE_SOURCE_DIR is where the CMakeLists.txt lives. # The cwd is the current build directory. output.write(' WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/') output.write(path_to_gyp) output.write('\n') output.write(' COMMENT ') if 'message' in rule: output.write(rule['message']) else: output.write(action_name) output.write('\n') output.write(' VERBATIM\n') output.write(')\n') UnsetVariable(output, 'RULE_INPUT_PATH') UnsetVariable(output, 'RULE_INPUT_DIRNAME') UnsetVariable(output, 'RULE_INPUT_NAME') UnsetVariable(output, 'RULE_INPUT_ROOT') UnsetVariable(output, 'RULE_INPUT_EXT') # add_custom_target output.write('add_custom_target(') output.write(rule_name) output.write(' DEPENDS\n') for out in var_outputs: output.write(' ') output.write(out) output.write('\n') output.write('SOURCES ') WriteVariable(output, inputs_name) output.write('\n') for rule_source in rule.get('rule_sources', []): output.write(' ') output.write(NormjoinPath(path_to_gyp, rule_source)) output.write('\n') output.write(')\n') extra_deps.append(rule_name) def WriteCopies(target_name, copies, extra_deps, path_to_gyp, output): """Write CMake for the 'copies' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_deps: [<cmake_taget>] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. 
""" copy_name = target_name + '__copies' # CMake gets upset with custom targets with OUTPUT which specify no output. have_copies = any(copy['files'] for copy in copies) if not have_copies: output.write('add_custom_target(') output.write(copy_name) output.write(')\n') extra_deps.append(copy_name) return class Copy: def __init__(self, ext, command): self.cmake_inputs = [] self.cmake_outputs = [] self.gyp_inputs = [] self.gyp_outputs = [] self.ext = ext self.inputs_name = None self.outputs_name = None self.command = command file_copy = Copy('', 'copy') dir_copy = Copy('_dirs', 'copy_directory') for copy in copies: files = copy['files'] destination = copy['destination'] for src in files: path = os.path.normpath(src) basename = os.path.split(path)[1] dst = os.path.join(destination, basename) copy = file_copy if os.path.basename(src) else dir_copy copy.cmake_inputs.append(NormjoinPath(path_to_gyp, src)) copy.cmake_outputs.append(NormjoinPathForceCMakeSource(path_to_gyp, dst)) copy.gyp_inputs.append(src) copy.gyp_outputs.append(dst) for copy in (file_copy, dir_copy): if copy.cmake_inputs: copy.inputs_name = copy_name + '__input' + copy.ext SetVariableList(output, copy.inputs_name, copy.cmake_inputs) copy.outputs_name = copy_name + '__output' + copy.ext SetVariableList(output, copy.outputs_name, copy.cmake_outputs) # add_custom_command output.write('add_custom_command(\n') output.write('OUTPUT') for copy in (file_copy, dir_copy): if copy.outputs_name: WriteVariable(output, copy.outputs_name, ' ') output.write('\n') for copy in (file_copy, dir_copy): for src, dst in zip(copy.gyp_inputs, copy.gyp_outputs): # 'cmake -E copy src dst' will create the 'dst' directory if needed. output.write('COMMAND ${CMAKE_COMMAND} -E %s ' % copy.command) output.write(src) output.write(' ') output.write(dst) output.write("\n") output.write('DEPENDS') for copy in (file_copy, dir_copy): if copy.inputs_name: WriteVariable(output, copy.inputs_name, ' ') output.write('\n') output.write('WORKING_DIRECTORY ${CMAKE_SOURCE_DIR}/') output.write(path_to_gyp) output.write('\n') output.write('COMMENT Copying for ') output.write(target_name) output.write('\n') output.write('VERBATIM\n') output.write(')\n') # add_custom_target output.write('add_custom_target(') output.write(copy_name) output.write('\n DEPENDS') for copy in (file_copy, dir_copy): if copy.outputs_name: WriteVariable(output, copy.outputs_name, ' ') output.write('\n SOURCES') if file_copy.inputs_name: WriteVariable(output, file_copy.inputs_name, ' ') output.write('\n)\n') extra_deps.append(copy_name) def CreateCMakeTargetBaseName(qualified_target): """This is the name we would like the target to have.""" _, gyp_target_name, gyp_target_toolset = ( gyp.common.ParseQualifiedTarget(qualified_target)) cmake_target_base_name = gyp_target_name if gyp_target_toolset and gyp_target_toolset != 'target': cmake_target_base_name += '_' + gyp_target_toolset return StringToCMakeTargetName(cmake_target_base_name) def CreateCMakeTargetFullName(qualified_target): """An unambiguous name for the target.""" gyp_file, gyp_target_name, gyp_target_toolset = ( gyp.common.ParseQualifiedTarget(qualified_target)) cmake_target_full_name = gyp_file + ':' + gyp_target_name if gyp_target_toolset and gyp_target_toolset != 'target': cmake_target_full_name += '_' + gyp_target_toolset return StringToCMakeTargetName(cmake_target_full_name) class CMakeNamer(object): """Converts Gyp target names into CMake target names. CMake requires that target names be globally unique. 
One way to ensure this is to fully qualify the names of the targets. Unfortunatly, this ends up with all targets looking like "chrome_chrome_gyp_chrome" instead of just "chrome". If this generator were only interested in building, it would be possible to fully qualify all target names, then create unqualified target names which depend on all qualified targets which should have had that name. This is more or less what the 'make' generator does with aliases. However, one goal of this generator is to create CMake files for use with IDEs, and fully qualified names are not as user friendly. Since target name collision is rare, we do the above only when required. Toolset variants are always qualified from the base, as this is required for building. However, it also makes sense for an IDE, as it is possible for defines to be different. """ def __init__(self, target_list): self.cmake_target_base_names_conficting = set() cmake_target_base_names_seen = set() for qualified_target in target_list: cmake_target_base_name = CreateCMakeTargetBaseName(qualified_target) if cmake_target_base_name not in cmake_target_base_names_seen: cmake_target_base_names_seen.add(cmake_target_base_name) else: self.cmake_target_base_names_conficting.add(cmake_target_base_name) def CreateCMakeTargetName(self, qualified_target): base_name = CreateCMakeTargetBaseName(qualified_target) if base_name in self.cmake_target_base_names_conficting: return CreateCMakeTargetFullName(qualified_target) return base_name def WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use, options, generator_flags, all_qualified_targets, output): # The make generator does this always. # TODO: It would be nice to be able to tell CMake all dependencies. circular_libs = generator_flags.get('circular', True) if not generator_flags.get('standalone', False): output.write('\n#') output.write(qualified_target) output.write('\n') gyp_file, _, _ = gyp.common.ParseQualifiedTarget(qualified_target) rel_gyp_file = gyp.common.RelativePath(gyp_file, options.toplevel_dir) rel_gyp_dir = os.path.dirname(rel_gyp_file) # Relative path from build dir to top dir. build_to_top = gyp.common.InvertRelativePath(build_dir, options.toplevel_dir) # Relative path from build dir to gyp dir. build_to_gyp = os.path.join(build_to_top, rel_gyp_dir) path_from_cmakelists_to_gyp = build_to_gyp spec = target_dicts.get(qualified_target, {}) config = spec.get('configurations', {}).get(config_to_use, {}) target_name = spec.get('target_name', '<missing target name>') target_type = spec.get('type', '<missing target type>') target_toolset = spec.get('toolset') SetVariable(output, 'TARGET', target_name) SetVariable(output, 'TOOLSET', target_toolset) cmake_target_name = namer.CreateCMakeTargetName(qualified_target) extra_sources = [] extra_deps = [] # Actions must come first, since they can generate more OBJs for use below. if 'actions' in spec: WriteActions(cmake_target_name, spec['actions'], extra_sources, extra_deps, path_from_cmakelists_to_gyp, output) # Rules must be early like actions. if 'rules' in spec: WriteRules(cmake_target_name, spec['rules'], extra_sources, extra_deps, path_from_cmakelists_to_gyp, output) # Copies if 'copies' in spec: WriteCopies(cmake_target_name, spec['copies'], extra_deps, path_from_cmakelists_to_gyp, output) # Target and sources srcs = spec.get('sources', []) # Gyp separates the sheep from the goats based on file extensions. 
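# Illustrative sketch, not part of the original gyp source; file names are
# hypothetical. The partition helper defined below splits a list on a
# predicate, so with the COMPILABLE_EXTENSIONS table above:
#
#   srcs = ['foo.cc', 'bar.c', 'baz.h', 'README.md']
#   compilable_srcs, other_srcs = partition(srcs, Compilable)
#   # compilable_srcs == ['foo.cc', 'bar.c']
#   # other_srcs      == ['baz.h', 'README.md']
#
# The non-compilable files are still listed on the target (and marked
# HEADER_FILE_ONLY further down) so IDEs show them next to the sources.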
def partition(l, p): return reduce(lambda x, e: x[not p(e)].append(e) or x, l, ([], [])) compilable_srcs, other_srcs = partition(srcs, Compilable) # CMake gets upset when executable targets provide no sources. if target_type == 'executable' and not compilable_srcs and not extra_sources: print ('Executable %s has no complilable sources, treating as "none".' % target_name ) target_type = 'none' cmake_target_type = cmake_target_type_from_gyp_target_type.get(target_type) if cmake_target_type is None: print ('Target %s has unknown target type %s, skipping.' % ( target_name, target_type ) ) return other_srcs_name = None if other_srcs: other_srcs_name = cmake_target_name + '__other_srcs' SetVariableList(output, other_srcs_name, [NormjoinPath(path_from_cmakelists_to_gyp, src) for src in other_srcs]) # CMake is opposed to setting linker directories and considers the practice # of setting linker directories dangerous. Instead, it favors the use of # find_library and passing absolute paths to target_link_libraries. # However, CMake does provide the command link_directories, which adds # link directories to targets defined after it is called. # As a result, link_directories must come before the target definition. # CMake unfortunately has no means of removing entries from LINK_DIRECTORIES. library_dirs = config.get('library_dirs') if library_dirs is not None: output.write('link_directories(') for library_dir in library_dirs: output.write(' ') output.write(NormjoinPath(path_from_cmakelists_to_gyp, library_dir)) output.write('\n') output.write(')\n') output.write(cmake_target_type.command) output.write('(') output.write(cmake_target_name) if cmake_target_type.modifier is not None: output.write(' ') output.write(cmake_target_type.modifier) if other_srcs_name: WriteVariable(output, other_srcs_name, ' ') output.write('\n') for src in compilable_srcs: output.write(' ') output.write(NormjoinPath(path_from_cmakelists_to_gyp, src)) output.write('\n') for extra_source in extra_sources: output.write(' ') src, _ = extra_source output.write(NormjoinPath(path_from_cmakelists_to_gyp, src)) output.write('\n') output.write(')\n') # Output name and location. if target_type != 'none': # Mark uncompiled sources as uncompiled. 
if other_srcs_name: output.write('set_source_files_properties(') WriteVariable(output, other_srcs_name, '') output.write(' PROPERTIES HEADER_FILE_ONLY "TRUE")\n') # Output directory target_output_directory = spec.get('product_dir') if target_output_directory is None: if target_type in ('executable', 'loadable_module'): target_output_directory = generator_default_variables['PRODUCT_DIR'] elif target_type in ('shared_library'): target_output_directory = '${builddir}/lib.${TOOLSET}' elif spec.get('standalone_static_library', False): target_output_directory = generator_default_variables['PRODUCT_DIR'] else: base_path = gyp.common.RelativePath(os.path.dirname(gyp_file), options.toplevel_dir) target_output_directory = '${obj}.${TOOLSET}' target_output_directory = ( os.path.join(target_output_directory, base_path)) cmake_target_output_directory = NormjoinPathForceCMakeSource( path_from_cmakelists_to_gyp, target_output_directory) SetTargetProperty(output, cmake_target_name, cmake_target_type.property_modifier + '_OUTPUT_DIRECTORY', cmake_target_output_directory) # Output name default_product_prefix = '' default_product_name = target_name default_product_ext = '' if target_type == 'static_library': static_library_prefix = generator_default_variables['STATIC_LIB_PREFIX'] default_product_name = RemovePrefix(default_product_name, static_library_prefix) default_product_prefix = static_library_prefix default_product_ext = generator_default_variables['STATIC_LIB_SUFFIX'] elif target_type in ('loadable_module', 'shared_library'): shared_library_prefix = generator_default_variables['SHARED_LIB_PREFIX'] default_product_name = RemovePrefix(default_product_name, shared_library_prefix) default_product_prefix = shared_library_prefix default_product_ext = generator_default_variables['SHARED_LIB_SUFFIX'] elif target_type != 'executable': print ('ERROR: What output file should be generated?', 'type', target_type, 'target', target_name) product_prefix = spec.get('product_prefix', default_product_prefix) product_name = spec.get('product_name', default_product_name) product_ext = spec.get('product_extension') if product_ext: product_ext = '.' + product_ext else: product_ext = default_product_ext SetTargetProperty(output, cmake_target_name, 'PREFIX', product_prefix) SetTargetProperty(output, cmake_target_name, cmake_target_type.property_modifier + '_OUTPUT_NAME', product_name) SetTargetProperty(output, cmake_target_name, 'SUFFIX', product_ext) # Make the output of this target referenceable as a source. cmake_target_output_basename = product_prefix + product_name + product_ext cmake_target_output = os.path.join(cmake_target_output_directory, cmake_target_output_basename) SetFileProperty(output, cmake_target_output, 'GENERATED', ['TRUE'], '') # Let CMake know if the 'all' target should depend on this target. 
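# Illustrative sketch, not part of the original gyp source; the target name is
# hypothetical. For a shared_library 'foo' the property calls above emit CMake
# along these lines:
#
#   set_target_properties(foo PROPERTIES LIBRARY_OUTPUT_DIRECTORY "${builddir}/lib.${TOOLSET}")
#   set_target_properties(foo PROPERTIES PREFIX "lib")
#   set_target_properties(foo PROPERTIES LIBRARY_OUTPUT_NAME "foo")
#   set_target_properties(foo PROPERTIES SUFFIX ".so")
#   set_source_files_properties(${builddir}/lib.${TOOLSET}/libfoo.so PROPERTIES GENERATED "TRUE")
#
# The EXCLUDE_FROM_ALL property written just below then keeps targets that are
# not reachable from the gyp 'all' set out of CMake's implicit 'all' target.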
exclude_from_all = ('TRUE' if qualified_target not in all_qualified_targets else 'FALSE') SetTargetProperty(output, cmake_target_name, 'EXCLUDE_FROM_ALL', exclude_from_all) for extra_target_name in extra_deps: SetTargetProperty(output, extra_target_name, 'EXCLUDE_FROM_ALL', exclude_from_all) # Includes includes = config.get('include_dirs') if includes: # This (target include directories) is what requires CMake 2.8.8 includes_name = cmake_target_name + '__include_dirs' SetVariableList(output, includes_name, [NormjoinPathForceCMakeSource(path_from_cmakelists_to_gyp, include) for include in includes]) output.write('set_property(TARGET ') output.write(cmake_target_name) output.write(' APPEND PROPERTY INCLUDE_DIRECTORIES ') WriteVariable(output, includes_name, '') output.write(')\n') # Defines defines = config.get('defines') if defines is not None: SetTargetProperty(output, cmake_target_name, 'COMPILE_DEFINITIONS', defines, ';') # Compile Flags - http://www.cmake.org/Bug/view.php?id=6493 # CMake currently does not have target C and CXX flags. # So, instead of doing... # cflags_c = config.get('cflags_c') # if cflags_c is not None: # SetTargetProperty(output, cmake_target_name, # 'C_COMPILE_FLAGS', cflags_c, ' ') # cflags_cc = config.get('cflags_cc') # if cflags_cc is not None: # SetTargetProperty(output, cmake_target_name, # 'CXX_COMPILE_FLAGS', cflags_cc, ' ') # Instead we must... s_sources = [] c_sources = [] cxx_sources = [] for src in srcs: _, ext = os.path.splitext(src) src_type = COMPILABLE_EXTENSIONS.get(ext, None) if src_type == 's': s_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) if src_type == 'cc': c_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) if src_type == 'cxx': cxx_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) for extra_source in extra_sources: src, real_source = extra_source _, ext = os.path.splitext(real_source) src_type = COMPILABLE_EXTENSIONS.get(ext, None) if src_type == 's': s_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) if src_type == 'cc': c_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) if src_type == 'cxx': cxx_sources.append(NormjoinPath(path_from_cmakelists_to_gyp, src)) cflags = config.get('cflags', []) cflags_c = config.get('cflags_c', []) cflags_cxx = config.get('cflags_cc', []) if c_sources and not (s_sources or cxx_sources): flags = [] flags.extend(cflags) flags.extend(cflags_c) SetTargetProperty(output, cmake_target_name, 'COMPILE_FLAGS', flags, ' ') elif cxx_sources and not (s_sources or c_sources): flags = [] flags.extend(cflags) flags.extend(cflags_cxx) SetTargetProperty(output, cmake_target_name, 'COMPILE_FLAGS', flags, ' ') else: if s_sources and cflags: SetFilesProperty(output, s_sources, 'COMPILE_FLAGS', cflags, ' ') if c_sources and (cflags or cflags_c): flags = [] flags.extend(cflags) flags.extend(cflags_c) SetFilesProperty(output, c_sources, 'COMPILE_FLAGS', flags, ' ') if cxx_sources and (cflags or cflags_cxx): flags = [] flags.extend(cflags) flags.extend(cflags_cxx) SetFilesProperty(output, cxx_sources, 'COMPILE_FLAGS', flags, ' ') # Have assembly link as c if there are no other files if not c_sources and not cxx_sources and s_sources: SetTargetProperty(output, cmake_target_name, 'LINKER_LANGUAGE', ['C']) # Linker flags ldflags = config.get('ldflags') if ldflags is not None: SetTargetProperty(output, cmake_target_name, 'LINK_FLAGS', ldflags, ' ') # Note on Dependencies and Libraries: # CMake wants to handle link order, resolving the link line up front. 
# Gyp does not retain or enforce specifying enough information to do so. # So do as other gyp generators and use --start-group and --end-group. # Give CMake as little information as possible so that it doesn't mess it up. # Dependencies rawDeps = spec.get('dependencies', []) static_deps = [] shared_deps = [] other_deps = [] for rawDep in rawDeps: dep_cmake_name = namer.CreateCMakeTargetName(rawDep) dep_spec = target_dicts.get(rawDep, {}) dep_target_type = dep_spec.get('type', None) if dep_target_type == 'static_library': static_deps.append(dep_cmake_name) elif dep_target_type == 'shared_library': shared_deps.append(dep_cmake_name) else: other_deps.append(dep_cmake_name) # ensure all external dependencies are complete before internal dependencies # extra_deps currently only depend on their own deps, so otherwise run early if static_deps or shared_deps or other_deps: for extra_dep in extra_deps: output.write('add_dependencies(') output.write(extra_dep) output.write('\n') for deps in (static_deps, shared_deps, other_deps): for dep in gyp.common.uniquer(deps): output.write(' ') output.write(dep) output.write('\n') output.write(')\n') linkable = target_type in ('executable', 'loadable_module', 'shared_library') other_deps.extend(extra_deps) if other_deps or (not linkable and (static_deps or shared_deps)): output.write('add_dependencies(') output.write(cmake_target_name) output.write('\n') for dep in gyp.common.uniquer(other_deps): output.write(' ') output.write(dep) output.write('\n') if not linkable: for deps in (static_deps, shared_deps): for lib_dep in gyp.common.uniquer(deps): output.write(' ') output.write(lib_dep) output.write('\n') output.write(')\n') # Libraries if linkable: external_libs = [lib for lib in spec.get('libraries', []) if len(lib) > 0] if external_libs or static_deps or shared_deps: output.write('target_link_libraries(') output.write(cmake_target_name) output.write('\n') if static_deps: write_group = circular_libs and len(static_deps) > 1 if write_group: output.write('-Wl,--start-group\n') for dep in gyp.common.uniquer(static_deps): output.write(' ') output.write(dep) output.write('\n') if write_group: output.write('-Wl,--end-group\n') if shared_deps: for dep in gyp.common.uniquer(shared_deps): output.write(' ') output.write(dep) output.write('\n') if external_libs: for lib in gyp.common.uniquer(external_libs): output.write(' ') output.write(lib) output.write('\n') output.write(')\n') UnsetVariable(output, 'TOOLSET') UnsetVariable(output, 'TARGET') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_to_use): options = params['options'] generator_flags = params['generator_flags'] # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to cmake easier, cmake doesn't put anything here. # Each Gyp configuration creates a different CMakeLists.txt file # to avoid incompatibilities between Gyp and CMake configurations. generator_dir = os.path.relpath(options.generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = generator_flags.get('output_dir', 'out') # build_dir: relative path from source root to our output files. # e.g. 
"out/Debug" build_dir = os.path.normpath(os.path.join(generator_dir, output_dir, config_to_use)) toplevel_build = os.path.join(options.toplevel_dir, build_dir) output_file = os.path.join(toplevel_build, 'CMakeLists.txt') gyp.common.EnsureDirExists(output_file) output = open(output_file, 'w') output.write('cmake_minimum_required(VERSION 2.8.8 FATAL_ERROR)\n') output.write('cmake_policy(VERSION 2.8.8)\n') _, project_target, _ = gyp.common.ParseQualifiedTarget(target_list[-1]) output.write('project(') output.write(project_target) output.write(')\n') SetVariable(output, 'configuration', config_to_use) # The following appears to be as-yet undocumented. # http://public.kitware.com/Bug/view.php?id=8392 output.write('enable_language(ASM)\n') # ASM-ATT does not support .S files. # output.write('enable_language(ASM-ATT)\n') SetVariable(output, 'builddir', '${CMAKE_BINARY_DIR}') SetVariable(output, 'obj', '${builddir}/obj') output.write('\n') # TODO: Undocumented/unsupported (the CMake Java generator depends on it). # CMake by default names the object resulting from foo.c to be foo.c.o. # Gyp traditionally names the object resulting from foo.c foo.o. # This should be irrelevant, but some targets extract .o files from .a # and depend on the name of the extracted .o files. output.write('set(CMAKE_C_OUTPUT_EXTENSION_REPLACE 1)\n') output.write('set(CMAKE_CXX_OUTPUT_EXTENSION_REPLACE 1)\n') output.write('\n') namer = CMakeNamer(target_list) # The list of targets upon which the 'all' target should depend. # CMake has it's own implicit 'all' target, one is not created explicitly. all_qualified_targets = set() for build_file in params['build_files']: for qualified_target in gyp.common.AllTargets(target_list, target_dicts, os.path.normpath(build_file)): all_qualified_targets.add(qualified_target) for qualified_target in target_list: WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use, options, generator_flags, all_qualified_targets, output) output.close() def PerformBuild(data, configurations, params): options = params['options'] generator_flags = params['generator_flags'] # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to cmake easier, cmake doesn't put anything here. generator_dir = os.path.relpath(options.generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = generator_flags.get('output_dir', 'out') for config_name in configurations: # build_dir: relative path from source root to our output files. # e.g. "out/Debug" build_dir = os.path.normpath(os.path.join(generator_dir, output_dir, config_name)) arguments = ['cmake', '-G', 'Ninja'] print 'Generating [%s]: %s' % (config_name, arguments) subprocess.check_call(arguments, cwd=build_dir) arguments = ['ninja', '-C', build_dir] print 'Building [%s]: %s' % (config_name, arguments) subprocess.check_call(arguments) def CallGenerateOutputForConfig(arglist): # Ignore the interrupt signal so that the parent process catches it and # kills all multiprocessing children. 
signal.signal(signal.SIGINT, signal.SIG_IGN) target_list, target_dicts, data, params, config_name = arglist GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) def GenerateOutput(target_list, target_dicts, data, params): user_config = params.get('generator_flags', {}).get('config', None) if user_config: GenerateOutputForConfig(target_list, target_dicts, data, params, user_config) else: config_names = target_dicts[target_list[0]]['configurations'].keys() if params['parallel']: try: pool = multiprocessing.Pool(len(config_names)) arglists = [] for config_name in config_names: arglists.append((target_list, target_dicts, data, params, config_name)) pool.map(CallGenerateOutputForConfig, arglists) except KeyboardInterrupt, e: pool.terminate() raise e else: for config_name in config_names: GenerateOutputForConfig(target_list, target_dicts, data, params, config_name)
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py
# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import collections import os import gyp import gyp.common import gyp.msvs_emulation import json import sys generator_supports_multiple_toolsets = True generator_wants_static_library_dependencies_adjusted = False generator_default_variables = { } for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']: # Some gyp steps fail if these are empty(!). generator_default_variables[dirname] = 'dir' for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'CONFIGURATION_NAME']: generator_default_variables[unused] = '' def CalculateVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) for key, val in generator_flags.items(): default_variables.setdefault(key, val) default_variables.setdefault('OS', gyp.common.GetFlavor(params)) flavor = gyp.common.GetFlavor(params) if flavor == 'win': # Copy additional generator configuration data from VS, which is shared # by the Windows Ninja generator.
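    # The lookups below use getattr() with a default so that an msvs generator
    # revision which does not define these attributes simply contributes empty
    # lists instead of raising AttributeError.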
import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) def CalculateGeneratorInputInfo(params): """Calculate the generator specific info that gets fed to input (called by gyp).""" generator_flags = params.get('generator_flags', {}) if generator_flags.get('adjust_static_libraries', False): global generator_wants_static_library_dependencies_adjusted generator_wants_static_library_dependencies_adjusted = True def GenerateOutput(target_list, target_dicts, data, params): # Map of target -> list of targets it depends on. edges = {} # Queue of targets to visit. targets_to_visit = target_list[:] while len(targets_to_visit) > 0: target = targets_to_visit.pop() if target in edges: continue edges[target] = [] for dep in target_dicts[target].get('dependencies', []): edges[target].append(dep) targets_to_visit.append(dep) filename = 'dump.json' f = open(filename, 'w') json.dump(edges, f) f.close() print 'Wrote json to %s.' % filename
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py
# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """GYP backend that generates Eclipse CDT settings files. This backend DOES NOT generate Eclipse CDT projects. Instead, it generates XML files that can be imported into an Eclipse CDT project. The XML file contains a list of include paths and symbols (i.e. defines). Because a full .cproject definition is not created by this generator, it's not possible to properly define the include dirs and symbols for each file individually. Instead, one set of includes/symbols is generated for the entire project. This works fairly well (and is a vast improvement in general), but may still result in a few indexer issues here and there. This generator has no automated tests, so expect it to be broken. """ from xml.sax.saxutils import escape import os.path import subprocess import gyp import gyp.common import gyp.msvs_emulation import shlex generator_wants_static_library_dependencies_adjusted = False generator_default_variables = { } for dirname in ['INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']: # Some gyp steps fail if these are empty(!).
generator_default_variables[dirname] = 'dir' for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'CONFIGURATION_NAME']: generator_default_variables[unused] = '' # Include dirs will occasionally use the SHARED_INTERMEDIATE_DIR variable as # part of the path when dealing with generated headers. This value will be # replaced dynamically for each configuration. generator_default_variables['SHARED_INTERMEDIATE_DIR'] = \ '$SHARED_INTERMEDIATE_DIR' def CalculateVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) for key, val in generator_flags.items(): default_variables.setdefault(key, val) flavor = gyp.common.GetFlavor(params) default_variables.setdefault('OS', flavor) if flavor == 'win': # Copy additional generator configuration data from VS, which is shared # by the Eclipse generator. import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) def CalculateGeneratorInputInfo(params): """Calculate the generator specific info that gets fed to input (called by gyp).""" generator_flags = params.get('generator_flags', {}) if generator_flags.get('adjust_static_libraries', False): global generator_wants_static_library_dependencies_adjusted generator_wants_static_library_dependencies_adjusted = True def GetAllIncludeDirectories(target_list, target_dicts, shared_intermediate_dirs, config_name, params): """Calculate the set of include directories to be used. Returns: A list including all the include_dir's specified for every target followed by any include directories that were added as cflag compiler options. """ gyp_includes_set = set() compiler_includes_list = [] flavor = gyp.common.GetFlavor(params) if flavor == 'win': generator_flags = params.get('generator_flags', {}) for target_name in target_list: target = target_dicts[target_name] if config_name in target['configurations']: config = target['configurations'][config_name] # Look for any include dirs that were explicitly added via cflags. This # may be done in gyp files to force certain includes to come at the end. # TODO(jgreenwald): Change the gyp files to not abuse cflags for this, and # remove this. if flavor == 'win': msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags) cflags = msvs_settings.GetCflags(config_name) else: cflags = config['cflags'] for cflag in cflags: include_dir = '' if cflag.startswith('-I'): include_dir = cflag[2:] if include_dir and not include_dir in compiler_includes_list: compiler_includes_list.append(include_dir) # Find standard gyp include dirs. if config.has_key('include_dirs'): include_dirs = config['include_dirs'] for shared_intermediate_dir in shared_intermediate_dirs: for include_dir in include_dirs: include_dir = include_dir.replace('$SHARED_INTERMEDIATE_DIR', shared_intermediate_dir) if not os.path.isabs(include_dir): base_dir = os.path.dirname(target_name) include_dir = base_dir + '/' + include_dir include_dir = os.path.abspath(include_dir) if not include_dir in gyp_includes_set: gyp_includes_set.add(include_dir) # Generate a list that has all the include dirs. 
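  # The combined list built below keeps the sorted gyp include_dirs first and
  # then appends any -I directories harvested from cflags, preserving their
  # discovery order and skipping entries already present.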
all_includes_list = list(gyp_includes_set) all_includes_list.sort() for compiler_include in compiler_includes_list: if not compiler_include in gyp_includes_set: all_includes_list.append(compiler_include) # All done. return all_includes_list def GetCompilerPath(target_list, target_dicts, data): """Determine a command that can be used to invoke the compiler. Returns: If this is a gyp project that has explicit make settings, try to determine the compiler from that. Otherwise, see if a compiler was specified via the CC_target environment variable. """ # First, see if the compiler is configured in make's settings. build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings_dict = data[build_file].get('make_global_settings', {}) for key, value in make_global_settings_dict: if key in ['CC', 'CXX']: return value # Check to see if the compiler was specified as an environment variable. for key in ['CC_target', 'CC', 'CXX']: compiler = os.environ.get(key) if compiler: return compiler return 'gcc' def GetAllDefines(target_list, target_dicts, data, config_name, params): """Calculate the defines for a project. Returns: A dict that includes explict defines declared in gyp files along with all of the default defines that the compiler uses. """ # Get defines declared in the gyp files. all_defines = {} flavor = gyp.common.GetFlavor(params) if flavor == 'win': generator_flags = params.get('generator_flags', {}) for target_name in target_list: target = target_dicts[target_name] if flavor == 'win': msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags) extra_defines = msvs_settings.GetComputedDefines(config_name) else: extra_defines = [] if config_name in target['configurations']: config = target['configurations'][config_name] target_defines = config['defines'] else: target_defines = [] for define in target_defines + extra_defines: split_define = define.split('=', 1) if len(split_define) == 1: split_define.append('1') if split_define[0].strip() in all_defines: # Already defined continue all_defines[split_define[0].strip()] = split_define[1].strip() # Get default compiler defines (if possible). if flavor == 'win': return all_defines # Default defines already processed in the loop above. cc_target = GetCompilerPath(target_list, target_dicts, data) if cc_target: command = shlex.split(cc_target) command.extend(['-E', '-dM', '-']) cpp_proc = subprocess.Popen(args=command, cwd='.', stdin=subprocess.PIPE, stdout=subprocess.PIPE) cpp_output = cpp_proc.communicate()[0] cpp_lines = cpp_output.split('\n') for cpp_line in cpp_lines: if not cpp_line.strip(): continue cpp_line_parts = cpp_line.split(' ', 2) key = cpp_line_parts[1] if len(cpp_line_parts) >= 3: val = cpp_line_parts[2] else: val = '1' all_defines[key] = val return all_defines def WriteIncludePaths(out, eclipse_langs, include_dirs): """Write the includes section of a CDT settings export file.""" out.write(' <section name="org.eclipse.cdt.internal.ui.wizards.' \ 'settingswizards.IncludePaths">\n') out.write(' <language name="holder for library settings"></language>\n') for lang in eclipse_langs: out.write(' <language name="%s">\n' % lang) for include_dir in include_dirs: out.write(' <includepath workspace_path="false">%s</includepath>\n' % include_dir) out.write(' </language>\n') out.write(' </section>\n') def WriteMacros(out, eclipse_langs, defines): """Write the macros section of a CDT settings export file.""" out.write(' <section name="org.eclipse.cdt.internal.ui.wizards.' 
\ 'settingswizards.Macros">\n') out.write(' <language name="holder for library settings"></language>\n') for lang in eclipse_langs: out.write(' <language name="%s">\n' % lang) for key in sorted(defines.iterkeys()): out.write(' <macro><name>%s</name><value>%s</value></macro>\n' % (escape(key), escape(defines[key]))) out.write(' </language>\n') out.write(' </section>\n') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name): options = params['options'] generator_flags = params.get('generator_flags', {}) # build_dir: relative path from source root to our output files. # e.g. "out/Debug" build_dir = os.path.join(generator_flags.get('output_dir', 'out'), config_name) toplevel_build = os.path.join(options.toplevel_dir, build_dir) # Ninja uses out/Debug/gen while make uses out/Debug/obj/gen as the # SHARED_INTERMEDIATE_DIR. Include both possible locations. shared_intermediate_dirs = [os.path.join(toplevel_build, 'obj', 'gen'), os.path.join(toplevel_build, 'gen')] out_name = os.path.join(toplevel_build, 'eclipse-cdt-settings.xml') gyp.common.EnsureDirExists(out_name) out = open(out_name, 'w') out.write('<?xml version="1.0" encoding="UTF-8"?>\n') out.write('<cdtprojectproperties>\n') eclipse_langs = ['C++ Source File', 'C Source File', 'Assembly Source File', 'GNU C++', 'GNU C', 'Assembly'] include_dirs = GetAllIncludeDirectories(target_list, target_dicts, shared_intermediate_dirs, config_name, params) WriteIncludePaths(out, eclipse_langs, include_dirs) defines = GetAllDefines(target_list, target_dicts, data, config_name, params) WriteMacros(out, eclipse_langs, defines) out.write('</cdtprojectproperties>\n') out.close() def GenerateOutput(target_list, target_dicts, data, params): """Generate an XML settings file that can be imported into a CDT project.""" if params['options'].generator_output: raise NotImplementedError, "--generator_output not implemented for eclipse" user_config = params.get('generator_flags', {}).get('config', None) if user_config: GenerateOutputForConfig(target_list, target_dicts, data, params, user_config) else: config_names = target_dicts[target_list[0]]['configurations'].keys() for config_name in config_names: GenerateOutputForConfig(target_list, target_dicts, data, params, config_name)
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py
# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """gypd output module This module produces gyp input as its output. Output files are given the .gypd extension to avoid overwriting the .gyp files that they are generated from.
Internal references to .gyp files (such as those found in "dependencies" sections) are not adjusted to point to .gypd files instead; unlike other paths, which are relative to the .gyp or .gypd file, such paths are relative to the directory from which gyp was run to create the .gypd file. This generator module is intended to be a sample and a debugging aid, hence the "d" for "debug" in .gypd. It is useful to inspect the results of the various merges, expansions, and conditional evaluations performed by gyp and to see a representation of what would be fed to a generator module. It's not advisable to rename .gypd files produced by this module to .gyp, because they will have all merges, expansions, and evaluations already performed and the relevant constructs not present in the output; paths to dependencies may be wrong; and various sections that do not belong in .gyp files such as such as "included_files" and "*_excluded" will be present. Output will also be stripped of comments. This is not intended to be a general-purpose gyp pretty-printer; for that, you probably just want to run "pprint.pprint(eval(open('source.gyp').read()))", which will still strip comments but won't do all of the other things done to this module's output. The specific formatting of the output generated by this module is subject to change. """ import gyp.common import errno import os import pprint # These variables should just be spit back out as variable references. _generator_identity_variables = [ 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'INTERMEDIATE_DIR', 'PRODUCT_DIR', 'RULE_INPUT_ROOT', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'RULE_INPUT_NAME', 'RULE_INPUT_PATH', 'SHARED_INTERMEDIATE_DIR', ] # gypd doesn't define a default value for OS like many other generator # modules. Specify "-D OS=whatever" on the command line to provide a value. generator_default_variables = { } # gypd supports multiple toolsets generator_supports_multiple_toolsets = True # TODO(mark): This always uses <, which isn't right. The input module should # notify the generator to tell it which phase it is operating in, and this # module should use < for the early phase and then switch to > for the late # phase. Bonus points for carrying @ back into the output too. 
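# For illustration, the loop below just maps each name back onto a gyp
# variable reference, e.g. after it runs:
#   generator_default_variables['PRODUCT_DIR'] == '<(PRODUCT_DIR)'
#   generator_default_variables['RULE_INPUT_PATH'] == '<(RULE_INPUT_PATH)'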
for v in _generator_identity_variables: generator_default_variables[v] = '<(%s)' % v def GenerateOutput(target_list, target_dicts, data, params): output_files = {} for qualified_target in target_list: [input_file, target] = \ gyp.common.ParseQualifiedTarget(qualified_target)[0:2] if input_file[-4:] != '.gyp': continue input_file_stem = input_file[:-4] output_file = input_file_stem + params['options'].suffix + '.gypd' if not output_file in output_files: output_files[output_file] = input_file for output_file, input_file in output_files.iteritems(): output = open(output_file, 'w') pprint.pprint(data[input_file], output) output.close()
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py
# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """gypsh output module gypsh is a GYP shell. It's not really a generator per se. All it does is fire up an interactive Python session with a few local variables set to the variables passed to the generator. Like gypd, it's intended as a debugging aid, to facilitate the exploration of .gyp structures after being processed by the input module. The expected usage is "gyp -f gypsh -D OS=desired_os". """ import code import sys # All of this stuff about generator variables was lovingly ripped from gypd.py. # That module has a much better description of what's going on and why. _generator_identity_variables = [ 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'INTERMEDIATE_DIR', 'PRODUCT_DIR', 'RULE_INPUT_ROOT', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'RULE_INPUT_NAME', 'RULE_INPUT_PATH', 'SHARED_INTERMEDIATE_DIR', ] generator_default_variables = { } for v in _generator_identity_variables: generator_default_variables[v] = '<(%s)' % v def GenerateOutput(target_list, target_dicts, data, params): locals = { 'target_list': target_list, 'target_dicts': target_dicts, 'data': data, } # Use a banner that looks like the stock Python one and like what # code.interact uses by default, but tack on something to indicate what # locals are available, and identify gypsh.
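  # An interactive session started this way begins roughly as follows (the
  # exact version and platform strings depend on the local interpreter):
  #   Python 2.7.x on linux2
  #   locals.keys() = ['data', 'target_dicts', 'target_list']
  #   gypsh
  #   >>> target_list[0]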
banner='Python %s on %s\nlocals.keys() = %s\ngypsh' % \ (sys.version, sys.platform, repr(sorted(locals.keys()))) code.interact(banner, local=locals)
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py
# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. # Notes: # # This is all roughly based on the Makefile system used by the Linux # kernel, but is a non-recursive make -- we put the entire dependency # graph in front of make and let it figure it out. # # The code below generates a separate .mk file for each target, but # all are sourced by the top-level Makefile. This means that all # variables in .mk-files clobber one another. Be careful to use := # where appropriate for immediate evaluation, and similarly to watch # that you're not relying on a variable value to last between different # .mk files. # # TODOs: # # Global settings and utility functions are currently stuffed in the # toplevel Makefile. It may make sense to generate some .mk files on # the side to keep the files readable. import os import re import sys import subprocess import gyp import gyp.common import gyp.xcode_emulation from gyp.common import GetEnvironFallback generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'SHARED_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'INTERMEDIATE_DIR': '$(obj).$(TOOLSET)/$(TARGET)/geni', 'SHARED_INTERMEDIATE_DIR': '$(obj)/gen', 'PRODUCT_DIR': '$(builddir)', 'RULE_INPUT_ROOT': '%(INPUT_ROOT)s', # This gets expanded by Python. 'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s', # This gets expanded by Python. 'RULE_INPUT_PATH': '$(abspath $<)', 'RULE_INPUT_EXT': '$(suffix $<)', 'RULE_INPUT_NAME': '$(notdir $<)', 'CONFIGURATION_NAME': '$(BUILDTYPE)', } # Make supports multiple toolsets generator_supports_multiple_toolsets = True # Request sorted dependencies in the order from dependents to dependencies. generator_wants_sorted_dependencies = False # Placates pylint.
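# The hook variables below start out empty; on 'mac' CalculateVariables()
# overwrites the first three (via `global`) with values borrowed from the
# Xcode generator, and CalculateGeneratorInputInfo() fills in
# generator_filelist_paths.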
generator_additional_non_configuration_keys = [] generator_additional_path_sections = [] generator_extra_sources_for_rules = [] generator_filelist_paths = None def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" flavor = gyp.common.GetFlavor(params) if flavor == 'mac': default_variables.setdefault('OS', 'mac') default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib') default_variables.setdefault('SHARED_LIB_DIR', generator_default_variables['PRODUCT_DIR']) default_variables.setdefault('LIB_DIR', generator_default_variables['PRODUCT_DIR']) # Copy additional generator configuration data from Xcode, which is shared # by the Mac Make generator. import gyp.generator.xcode as xcode_generator global generator_additional_non_configuration_keys generator_additional_non_configuration_keys = getattr(xcode_generator, 'generator_additional_non_configuration_keys', []) global generator_additional_path_sections generator_additional_path_sections = getattr(xcode_generator, 'generator_additional_path_sections', []) global generator_extra_sources_for_rules generator_extra_sources_for_rules = getattr(xcode_generator, 'generator_extra_sources_for_rules', []) COMPILABLE_EXTENSIONS.update({'.m': 'objc', '.mm' : 'objcxx'}) else: operating_system = flavor if flavor == 'android': operating_system = 'linux' # Keep this legacy behavior for now. default_variables.setdefault('OS', operating_system) default_variables.setdefault('SHARED_LIB_SUFFIX', '.so') default_variables.setdefault('SHARED_LIB_DIR','$(builddir)/lib.$(TOOLSET)') default_variables.setdefault('LIB_DIR', '$(obj).$(TOOLSET)') def CalculateGeneratorInputInfo(params): """Calculate the generator specific info that gets fed to input (called by gyp).""" generator_flags = params.get('generator_flags', {}) android_ndk_version = generator_flags.get('android_ndk_version', None) # Android NDK requires a strict link order. if android_ndk_version: global generator_wants_sorted_dependencies generator_wants_sorted_dependencies = True output_dir = params['options'].generator_output or \ params['options'].toplevel_dir builddir_name = generator_flags.get('output_dir', 'out') qualified_out_dir = os.path.normpath(os.path.join( output_dir, builddir_name, 'gypfiles')) global generator_filelist_paths generator_filelist_paths = { 'toplevel': params['options'].toplevel_dir, 'qualified_out_dir': qualified_out_dir, } # The .d checking code below uses these functions: # wildcard, sort, foreach, shell, wordlist # wildcard can handle spaces, the rest can't. # Since I could find no way to make foreach work with spaces in filenames # correctly, the .d files have spaces replaced with another character. The .d # file for # Chromium\ Framework.framework/foo # is for example # out/Release/.deps/out/Release/Chromium?Framework.framework/foo # This is the replacement character. SPACE_REPLACEMENT = '?' LINK_COMMANDS_LINUX = """\ quiet_cmd_alink = AR($(TOOLSET)) $@ cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^) quiet_cmd_alink_thin = AR($(TOOLSET)) $@ cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^) # Due to circular dependencies between libraries :(, we wrap the # special "figure out circular dependencies" flags around the entire # input list during linking. 
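# (--start-group/--end-group make the linker rescan the grouped archives until
# no new undefined symbols can be resolved, at the cost of some extra link
# time.)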
quiet_cmd_link = LINK($(TOOLSET)) $@ cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS) # We support two kinds of shared objects (.so): # 1) shared_library, which is just bundling together many dependent libraries # into a link line. # 2) loadable_module, which is generating a module intended for dlopen(). # # They differ only slightly: # In the former case, we want to package all dependent code into the .so. # In the latter case, we want to package just the API exposed by the # outermost module. # This means shared_library uses --whole-archive, while loadable_module doesn't. # (Note that --whole-archive is incompatible with the --start-group used in # normal linking.) # Other shared-object link notes: # - Set SONAME to the library filename so our binaries don't reference # the local, absolute paths used on the link command-line. quiet_cmd_solink = SOLINK($(TOOLSET)) $@ cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS) quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@ cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS) """ LINK_COMMANDS_MAC = """\ quiet_cmd_alink = LIBTOOL-STATIC $@ cmd_alink = rm -f $@ && ./gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %.o,$^) quiet_cmd_link = LINK($(TOOLSET)) $@ cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS) quiet_cmd_solink = SOLINK($(TOOLSET)) $@ cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS) quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@ cmd_solink_module = $(LINK.$(TOOLSET)) -bundle $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS) """ LINK_COMMANDS_ANDROID = """\ quiet_cmd_alink = AR($(TOOLSET)) $@ cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^) quiet_cmd_alink_thin = AR($(TOOLSET)) $@ cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^) # Due to circular dependencies between libraries :(, we wrap the # special "figure out circular dependencies" flags around the entire # input list during linking. quiet_cmd_link = LINK($(TOOLSET)) $@ quiet_cmd_link_host = LINK($(TOOLSET)) $@ cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS) cmd_link_host = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS) # Other shared-object link notes: # - Set SONAME to the library filename so our binaries don't reference # the local, absolute paths used on the link command-line. 
quiet_cmd_solink = SOLINK($(TOOLSET)) $@ cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS) quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@ cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS) quiet_cmd_solink_module_host = SOLINK_MODULE($(TOOLSET)) $@ cmd_solink_module_host = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS) """ LINK_COMMANDS_AIX = """\ quiet_cmd_alink = AR($(TOOLSET)) $@ cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^) quiet_cmd_alink_thin = AR($(TOOLSET)) $@ cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^) quiet_cmd_link = LINK($(TOOLSET)) $@ cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS) quiet_cmd_solink = SOLINK($(TOOLSET)) $@ cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS) quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@ cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS) """ # Header of toplevel Makefile. # This should go into the build tree, but it's easier to keep it here for now. SHARED_HEADER = ("""\ # We borrow heavily from the kernel build setup, though we are simpler since # we don't have Kconfig tweaking settings on us. # The implicit make rules have it looking for RCS files, among other things. # We instead explicitly write all the rules we care about. # It's even quicker (saves ~200ms) to pass -r on the command line. MAKEFLAGS=-r # The source directory tree. srcdir := %(srcdir)s abs_srcdir := $(abspath $(srcdir)) # The name of the builddir. builddir_name ?= %(builddir)s # The V=1 flag on command line makes us verbosely print command lines. ifdef V quiet= else quiet=quiet_ endif # Specify BUILDTYPE=Release on the command line for a release build. BUILDTYPE ?= %(default_configuration)s # Directory all our build output goes into. # Note that this must be two directories beneath src/ for unit tests to pass, # as they reach into the src/ directory for data with relative paths. builddir ?= $(builddir_name)/$(BUILDTYPE) abs_builddir := $(abspath $(builddir)) depsdir := $(builddir)/.deps # Object output directory. obj := $(builddir)/obj abs_obj := $(abspath $(obj)) # We build up a list of every single one of the targets so we can slurp in the # generated dependency rule Makefiles in one pass. all_deps := %(make_global_settings)s CC.target ?= %(CC.target)s CFLAGS.target ?= $(CFLAGS) CXX.target ?= %(CXX.target)s CXXFLAGS.target ?= $(CXXFLAGS) LINK.target ?= %(LINK.target)s LDFLAGS.target ?= $(LDFLAGS) AR.target ?= $(AR) # C++ apps need to be linked with g++. # # Note: flock is used to seralize linking. Linking is a memory-intensive # process so running parallel links can often lead to thrashing. To disable # the serialization, override LINK via an envrionment variable as follows: # # export LINK=g++ # # This will allow make to invoke N linker processes as specified in -jN. LINK ?= %(flock)s $(builddir)/linker.lock $(CXX.target) # TODO(evan): move all cross-compilation logic to gyp-time so we don't need # to replicate this environment fallback in make as well. 
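# The *.host tools build helper binaries that run on the build machine itself
# (for example code generators invoked during the build), while the *.target
# tools above produce the final output; when not cross-compiling the two sets
# are normally identical.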
CC.host ?= %(CC.host)s CFLAGS.host ?= CXX.host ?= %(CXX.host)s CXXFLAGS.host ?= LINK.host ?= %(LINK.host)s LDFLAGS.host ?= AR.host ?= %(AR.host)s # Define a dir function that can handle spaces. # http://www.gnu.org/software/make/manual/make.html#Syntax-of-Functions # "leading spaces cannot appear in the text of the first argument as written. # These characters can be put into the argument value by variable substitution." empty := space := $(empty) $(empty) # http://stackoverflow.com/questions/1189781/using-make-dir-or-notdir-on-a-path-with-spaces replace_spaces = $(subst $(space),""" + SPACE_REPLACEMENT + """,$1) unreplace_spaces = $(subst """ + SPACE_REPLACEMENT + """,$(space),$1) dirx = $(call unreplace_spaces,$(dir $(call replace_spaces,$1))) # Flags to make gcc output dependency info. Note that you need to be # careful here to use the flags that ccache and distcc can understand. # We write to a dep file on the side first and then rename at the end # so we can't end up with a broken dep file. depfile = $(depsdir)/$(call replace_spaces,$@).d DEPFLAGS = -MMD -MF $(depfile).raw # We have to fixup the deps output in a few ways. # (1) the file output should mention the proper .o file. # ccache or distcc lose the path to the target, so we convert a rule of # the form: # foobar.o: DEP1 DEP2 # into # path/to/foobar.o: DEP1 DEP2 # (2) we want missing files not to cause us to fail to build. # We want to rewrite # foobar.o: DEP1 DEP2 \\ # DEP3 # to # DEP1: # DEP2: # DEP3: # so if the files are missing, they're just considered phony rules. # We have to do some pretty insane escaping to get those backslashes # and dollar signs past make, the shell, and sed at the same time. # Doesn't work with spaces, but that's fine: .d files have spaces in # their names replaced with other characters.""" r""" define fixup_dep # The depfile may not exist if the input file didn't have any #includes. touch $(depfile).raw # Fixup path as in (1). sed -e "s|^$(notdir $@)|$@|" $(depfile).raw >> $(depfile) # Add extra rules as in (2). # We remove slashes and replace spaces with new lines; # remove blank lines; # delete the first line and append a colon to the remaining lines. sed -e 's|\\||' -e 'y| |\n|' $(depfile).raw |\ grep -v '^$$' |\ sed -e 1d -e 's|$$|:|' \ >> $(depfile) rm $(depfile).raw endef """ """ # Command definitions: # - cmd_foo is the actual command to run; # - quiet_cmd_foo is the brief-output summary of the command. quiet_cmd_cc = CC($(TOOLSET)) $@ cmd_cc = $(CC.$(TOOLSET)) $(GYP_CFLAGS) $(DEPFLAGS) $(CFLAGS.$(TOOLSET)) -c -o $@ $< quiet_cmd_cxx = CXX($(TOOLSET)) $@ cmd_cxx = $(CXX.$(TOOLSET)) $(GYP_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $< %(extra_commands)s quiet_cmd_touch = TOUCH $@ cmd_touch = touch $@ quiet_cmd_copy = COPY $@ # send stderr to /dev/null to ignore messages when linking directories. cmd_copy = rm -rf "$@" && cp -af "$<" "$@" %(link_commands)s """ r""" # Define an escape_quotes function to escape single quotes. # This allows us to handle quotes properly as long as we always use # use single quotes and escape_quotes. escape_quotes = $(subst ','\'',$(1)) # This comment is here just to include a ' to unconfuse syntax highlighting. # Define an escape_vars function to escape '$' variable syntax. # This allows us to read/write command lines with shell variables (e.g. # $LD_LIBRARY_PATH), without triggering make substitution. escape_vars = $(subst $$,$$$$,$(1)) # Helper that expands to a shell command to echo a string exactly as it is in # make. 
This uses printf instead of echo because printf's behaviour with respect # to escape sequences is more portable than echo's across different shells # (e.g., dash, bash). exact_echo = printf '%%s\n' '$(call escape_quotes,$(1))' """ """ # Helper to compare the command we're about to run against the command # we logged the last time we ran the command. Produces an empty # string (false) when the commands match. # Tricky point: Make has no string-equality test function. # The kernel uses the following, but it seems like it would have false # positives, where one string reordered its arguments. # arg_check = $(strip $(filter-out $(cmd_$(1)), $(cmd_$@)) \\ # $(filter-out $(cmd_$@), $(cmd_$(1)))) # We instead substitute each for the empty string into the other, and # say they're equal if both substitutions produce the empty string. # .d files contain """ + SPACE_REPLACEMENT + \ """ instead of spaces, take that into account. command_changed = $(or $(subst $(cmd_$(1)),,$(cmd_$(call replace_spaces,$@))),\\ $(subst $(cmd_$(call replace_spaces,$@)),,$(cmd_$(1)))) # Helper that is non-empty when a prerequisite changes. # Normally make does this implicitly, but we force rules to always run # so we can check their command lines. # $? -- new prerequisites # $| -- order-only dependencies prereq_changed = $(filter-out FORCE_DO_CMD,$(filter-out $|,$?)) # Helper that executes all postbuilds until one fails. define do_postbuilds @E=0;\\ for p in $(POSTBUILDS); do\\ eval $$p;\\ E=$$?;\\ if [ $$E -ne 0 ]; then\\ break;\\ fi;\\ done;\\ if [ $$E -ne 0 ]; then\\ rm -rf "$@";\\ exit $$E;\\ fi endef # do_cmd: run a command via the above cmd_foo names, if necessary. # Should always run for a given target to handle command-line changes. # Second argument, if non-zero, makes it do asm/C/C++ dependency munging. # Third argument, if non-zero, makes it do POSTBUILDS processing. # Note: We intentionally do NOT call dirx for depfile, since it contains """ + \ SPACE_REPLACEMENT + """ for # spaces already and dirx strips the """ + SPACE_REPLACEMENT + \ """ characters. define do_cmd $(if $(or $(command_changed),$(prereq_changed)), @$(call exact_echo, $($(quiet)cmd_$(1))) @mkdir -p "$(call dirx,$@)" "$(dir $(depfile))" $(if $(findstring flock,$(word %(flock_index)d,$(cmd_$1))), @$(cmd_$(1)) @echo " $(quiet_cmd_$(1)): Finished", @$(cmd_$(1)) ) @$(call exact_echo,$(call escape_vars,cmd_$(call replace_spaces,$@) := $(cmd_$(1)))) > $(depfile) @$(if $(2),$(fixup_dep)) $(if $(and $(3), $(POSTBUILDS)), $(call do_postbuilds) ) ) endef # Declare the "%(default_target)s" target first so it is the default, # even though we don't have the deps yet. .PHONY: %(default_target)s %(default_target)s: # make looks for ways to re-generate included makefiles, but in our case, we # don't have a direct way. Explicitly telling make that it has nothing to do # for them makes it go faster. %%.d: ; # Use FORCE_DO_CMD to force a target to run. Should be coupled with # do_cmd. .PHONY: FORCE_DO_CMD FORCE_DO_CMD: """) SHARED_HEADER_MAC_COMMANDS = """ quiet_cmd_objc = CXX($(TOOLSET)) $@ cmd_objc = $(CC.$(TOOLSET)) $(GYP_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $< quiet_cmd_objcxx = CXX($(TOOLSET)) $@ cmd_objcxx = $(CXX.$(TOOLSET)) $(GYP_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $< # Commands for precompiled header files. 
quiet_cmd_pch_c = CXX($(TOOLSET)) $@ cmd_pch_c = $(CC.$(TOOLSET)) $(GYP_PCH_CFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $< quiet_cmd_pch_cc = CXX($(TOOLSET)) $@ cmd_pch_cc = $(CC.$(TOOLSET)) $(GYP_PCH_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $< quiet_cmd_pch_m = CXX($(TOOLSET)) $@ cmd_pch_m = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $< quiet_cmd_pch_mm = CXX($(TOOLSET)) $@ cmd_pch_mm = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $< # gyp-mac-tool is written next to the root Makefile by gyp. # Use $(4) for the command, since $(2) and $(3) are used as flag by do_cmd # already. quiet_cmd_mac_tool = MACTOOL $(4) $< cmd_mac_tool = ./gyp-mac-tool $(4) $< "$@" quiet_cmd_mac_package_framework = PACKAGE FRAMEWORK $@ cmd_mac_package_framework = ./gyp-mac-tool package-framework "$@" $(4) quiet_cmd_infoplist = INFOPLIST $@ cmd_infoplist = $(CC.$(TOOLSET)) -E -P -Wno-trigraphs -x c $(INFOPLIST_DEFINES) "$<" -o "$@" """ def WriteRootHeaderSuffixRules(writer): extensions = sorted(COMPILABLE_EXTENSIONS.keys(), key=str.lower) writer.write('# Suffix rules, putting all outputs into $(obj).\n') for ext in extensions: writer.write('$(obj).$(TOOLSET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD\n' % ext) writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext]) writer.write('\n# Try building from generated source, too.\n') for ext in extensions: writer.write( '$(obj).$(TOOLSET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD\n' % ext) writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext]) writer.write('\n') for ext in extensions: writer.write('$(obj).$(TOOLSET)/%%.o: $(obj)/%%%s FORCE_DO_CMD\n' % ext) writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext]) writer.write('\n') SHARED_HEADER_SUFFIX_RULES_COMMENT1 = ("""\ # Suffix rules, putting all outputs into $(obj). """) SHARED_HEADER_SUFFIX_RULES_COMMENT2 = ("""\ # Try building from generated source, too. """) SHARED_FOOTER = """\ # "all" is a concatenation of the "all" targets from all the included # sub-makefiles. This is just here to clarify. all: # Add in dependency-tracking rules. $(all_deps) is the list of every single # target in our tree. Only consider the ones with .d (dependency) info: d_files := $(wildcard $(foreach f,$(all_deps),$(depsdir)/$(f).d)) ifneq ($(d_files),) include $(d_files) endif """ header = """\ # This file is generated by gyp; do not edit. """ # Maps every compilable file extension to the do_cmd that compiles it. COMPILABLE_EXTENSIONS = { '.c': 'cc', '.cc': 'cxx', '.cpp': 'cxx', '.cxx': 'cxx', '.s': 'cc', '.S': 'cc', } def Compilable(filename): """Return true if the file is compilable (should be in OBJS).""" for res in (filename.endswith(e) for e in COMPILABLE_EXTENSIONS): if res: return True return False def Linkable(filename): """Return true if the file is linkable (should be on the link line).""" return filename.endswith('.o') def Target(filename): """Translate a compilable filename to its .o target.""" return os.path.splitext(filename)[0] + '.o' def EscapeShellArgument(s): """Quotes an argument so that it will be interpreted literally by a POSIX shell. Taken from http://stackoverflow.com/questions/35817/whats-the-best-way-to-escape-ossystem-calls-in-python """ return "'" + s.replace("'", "'\\''") + "'" def EscapeMakeVariableExpansion(s): """Make has its own variable expansion syntax using $. 
We must escape it for string to be interpreted literally.""" return s.replace('$', '$$') def EscapeCppDefine(s): """Escapes a CPP define so that it will reach the compiler unaltered.""" s = EscapeShellArgument(s) s = EscapeMakeVariableExpansion(s) # '#' characters must be escaped even embedded in a string, else Make will # treat it as the start of a comment. return s.replace('#', r'\#') def QuoteIfNecessary(string): """TODO: Should this ideally be replaced with one or more of the above functions?""" if '"' in string: string = '"' + string.replace('"', '\\"') + '"' return string def StringToMakefileVariable(string): """Convert a string to a value that is acceptable as a make variable name.""" return re.sub('[^a-zA-Z0-9_]', '_', string) srcdir_prefix = '' def Sourceify(path): """Convert a path to its source directory form.""" if '$(' in path: return path if os.path.isabs(path): return path return srcdir_prefix + path def QuoteSpaces(s, quote=r'\ '): return s.replace(' ', quote) # Map from qualified target to path to output. target_outputs = {} # Map from qualified target to any linkable output. A subset # of target_outputs. E.g. when mybinary depends on liba, we want to # include liba in the linker line; when otherbinary depends on # mybinary, we just want to build mybinary first. target_link_deps = {} class MakefileWriter: """MakefileWriter packages up the writing of one target-specific foobar.mk. Its only real entry point is Write(), and is mostly used for namespacing. """ def __init__(self, generator_flags, flavor): self.generator_flags = generator_flags self.flavor = flavor self.suffix_rules_srcdir = {} self.suffix_rules_objdir1 = {} self.suffix_rules_objdir2 = {} # Generate suffix rules for all compilable extensions. for ext in COMPILABLE_EXTENSIONS.keys(): # Suffix rules for source folder. self.suffix_rules_srcdir.update({ext: ("""\ $(obj).$(TOOLSET)/$(TARGET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD @$(call do_cmd,%s,1) """ % (ext, COMPILABLE_EXTENSIONS[ext]))}) # Suffix rules for generated source files. self.suffix_rules_objdir1.update({ext: ("""\ $(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD @$(call do_cmd,%s,1) """ % (ext, COMPILABLE_EXTENSIONS[ext]))}) self.suffix_rules_objdir2.update({ext: ("""\ $(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj)/%%%s FORCE_DO_CMD @$(call do_cmd,%s,1) """ % (ext, COMPILABLE_EXTENSIONS[ext]))}) def Write(self, qualified_target, base_path, output_filename, spec, configs, part_of_all): """The main entry point: writes a .mk file for a single target. Arguments: qualified_target: target we're generating base_path: path relative to source root we're building in, used to resolve target-relative paths output_filename: output .mk file name to write spec, configs: gyp info part_of_all: flag indicating this target is part of 'all' """ gyp.common.EnsureDirExists(output_filename) self.fp = open(output_filename, 'w') self.fp.write(header) self.qualified_target = qualified_target self.path = base_path self.target = spec['target_name'] self.type = spec['type'] self.toolset = spec['toolset'] self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec) if self.flavor == 'mac': self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec) else: self.xcode_settings = None deps, link_deps = self.ComputeDeps(spec) # Some of the generation below can add extra output, sources, or # link dependencies. All of the out params of the functions that # follow use names like extra_foo. 
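    # extra_sources are folded into the list handled by WriteSources(),
    # extra_outputs become prerequisites that serialize actions/rules before
    # compilation, and extra_link_deps are added to the link line in
    # WriteTarget() further below.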
extra_outputs = [] extra_sources = [] extra_link_deps = [] extra_mac_bundle_resources = [] mac_bundle_deps = [] if self.is_mac_bundle: self.output = self.ComputeMacBundleOutput(spec) self.output_binary = self.ComputeMacBundleBinaryOutput(spec) else: self.output = self.output_binary = self.ComputeOutput(spec) self.is_standalone_static_library = bool( spec.get('standalone_static_library', 0)) self._INSTALLABLE_TARGETS = ('executable', 'loadable_module', 'shared_library') if (self.is_standalone_static_library or self.type in self._INSTALLABLE_TARGETS): self.alias = os.path.basename(self.output) install_path = self._InstallableTargetInstallPath() else: self.alias = self.output install_path = self.output self.WriteLn("TOOLSET := " + self.toolset) self.WriteLn("TARGET := " + self.target) # Actions must come first, since they can generate more OBJs for use below. if 'actions' in spec: self.WriteActions(spec['actions'], extra_sources, extra_outputs, extra_mac_bundle_resources, part_of_all) # Rules must be early like actions. if 'rules' in spec: self.WriteRules(spec['rules'], extra_sources, extra_outputs, extra_mac_bundle_resources, part_of_all) if 'copies' in spec: self.WriteCopies(spec['copies'], extra_outputs, part_of_all) # Bundle resources. if self.is_mac_bundle: all_mac_bundle_resources = ( spec.get('mac_bundle_resources', []) + extra_mac_bundle_resources) self.WriteMacBundleResources(all_mac_bundle_resources, mac_bundle_deps) self.WriteMacInfoPlist(mac_bundle_deps) # Sources. all_sources = spec.get('sources', []) + extra_sources if all_sources: self.WriteSources( configs, deps, all_sources, extra_outputs, extra_link_deps, part_of_all, gyp.xcode_emulation.MacPrefixHeader( self.xcode_settings, lambda p: Sourceify(self.Absolutify(p)), self.Pchify)) sources = filter(Compilable, all_sources) if sources: self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT1) extensions = set([os.path.splitext(s)[1] for s in sources]) for ext in extensions: if ext in self.suffix_rules_srcdir: self.WriteLn(self.suffix_rules_srcdir[ext]) self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT2) for ext in extensions: if ext in self.suffix_rules_objdir1: self.WriteLn(self.suffix_rules_objdir1[ext]) for ext in extensions: if ext in self.suffix_rules_objdir2: self.WriteLn(self.suffix_rules_objdir2[ext]) self.WriteLn('# End of this set of suffix rules') # Add dependency from bundle to bundle binary. if self.is_mac_bundle: mac_bundle_deps.append(self.output_binary) self.WriteTarget(spec, configs, deps, extra_link_deps + link_deps, mac_bundle_deps, extra_outputs, part_of_all) # Update global list of target outputs, used in dependency tracking. target_outputs[qualified_target] = install_path # Update global list of link dependencies. if self.type in ('static_library', 'shared_library'): target_link_deps[qualified_target] = self.output_binary # Currently any versions have the same effect, but in future the behavior # could be different. if self.generator_flags.get('android_ndk_version', None): self.WriteAndroidNdkModuleRule(self.target, all_sources, link_deps) self.fp.close() def WriteSubMake(self, output_filename, makefile_path, targets, build_dir): """Write a "sub-project" Makefile. This is a small, wrapper Makefile that calls the top-level Makefile to build the targets from a single gyp file (i.e. a sub-project). 
Arguments: output_filename: sub-project Makefile name to write makefile_path: path to the top-level Makefile targets: list of "all" targets for this sub-project build_dir: build output directory, relative to the sub-project """ gyp.common.EnsureDirExists(output_filename) self.fp = open(output_filename, 'w') self.fp.write(header) # For consistency with other builders, put sub-project build output in the # sub-project dir (see test/subdirectory/gyptest-subdir-all.py). self.WriteLn('export builddir_name ?= %s' % os.path.join(os.path.dirname(output_filename), build_dir)) self.WriteLn('.PHONY: all') self.WriteLn('all:') if makefile_path: makefile_path = ' -C ' + makefile_path self.WriteLn('\t$(MAKE)%s %s' % (makefile_path, ' '.join(targets))) self.fp.close() def WriteActions(self, actions, extra_sources, extra_outputs, extra_mac_bundle_resources, part_of_all): """Write Makefile code for any 'actions' from the gyp input. extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these actions (used to make other pieces dependent on these actions) part_of_all: flag indicating this target is part of 'all' """ env = self.GetSortedXcodeEnv() for action in actions: name = StringToMakefileVariable('%s_%s' % (self.qualified_target, action['action_name'])) self.WriteLn('### Rules for action "%s":' % action['action_name']) inputs = action['inputs'] outputs = action['outputs'] # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set() for out in outputs: dir = os.path.split(out)[0] if dir: dirs.add(dir) if int(action.get('process_outputs_as_sources', False)): extra_sources += outputs if int(action.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += outputs # Write the actual command. action_commands = action['action'] if self.flavor == 'mac': action_commands = [gyp.xcode_emulation.ExpandEnvVars(command, env) for command in action_commands] command = gyp.common.EncodePOSIXShellList(action_commands) if 'message' in action: self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, action['message'])) else: self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, name)) if len(dirs) > 0: command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command cd_action = 'cd %s; ' % Sourceify(self.path or '.') # command and cd_action get written to a toplevel variable called # cmd_foo. Toplevel variables can't handle things that change per # makefile like $(TARGET), so hardcode the target. command = command.replace('$(TARGET)', self.target) cd_action = cd_action.replace('$(TARGET)', self.target) # Set LD_LIBRARY_PATH in case the action runs an executable from this # build which links to shared libs from this build. # actions run on the host, so they should in theory only use host # libraries, but until everything is made cross-compile safe, also use # target libraries. # TODO(piman): when everything is cross-compile safe, remove lib.target self.WriteLn('cmd_%s = LD_LIBRARY_PATH=$(builddir)/lib.host:' '$(builddir)/lib.target:$$LD_LIBRARY_PATH; ' 'export LD_LIBRARY_PATH; ' '%s%s' % (name, cd_action, command)) self.WriteLn() outputs = map(self.Absolutify, outputs) # The makefile rules are all relative to the top dir, but the gyp actions # are defined relative to their containing dir. This replaces the obj # variable for the action rule with an absolute version so that the output # goes in the right place. 
# Only write the 'obj' and 'builddir' rules for the "primary" output (:1); # it's superfluous for the "extra outputs", and this avoids accidentally # writing duplicate dummy rules for those outputs. # Same for environment. self.WriteLn("%s: obj := $(abs_obj)" % QuoteSpaces(outputs[0])) self.WriteLn("%s: builddir := $(abs_builddir)" % QuoteSpaces(outputs[0])) self.WriteSortedXcodeEnv(outputs[0], self.GetSortedXcodeEnv()) for input in inputs: assert ' ' not in input, ( "Spaces in action input filenames not supported (%s)" % input) for output in outputs: assert ' ' not in output, ( "Spaces in action output filenames not supported (%s)" % output) # See the comment in WriteCopies about expanding env vars. outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs] inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs] self.WriteDoCmd(outputs, map(Sourceify, map(self.Absolutify, inputs)), part_of_all=part_of_all, command=name) # Stuff the outputs in a variable so we can refer to them later. outputs_variable = 'action_%s_outputs' % name self.WriteLn('%s := %s' % (outputs_variable, ' '.join(outputs))) extra_outputs.append('$(%s)' % outputs_variable) self.WriteLn() self.WriteLn() def WriteRules(self, rules, extra_sources, extra_outputs, extra_mac_bundle_resources, part_of_all): """Write Makefile code for any 'rules' from the gyp input. extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these rules (used to make other pieces dependent on these rules) part_of_all: flag indicating this target is part of 'all' """ env = self.GetSortedXcodeEnv() for rule in rules: name = StringToMakefileVariable('%s_%s' % (self.qualified_target, rule['rule_name'])) count = 0 self.WriteLn('### Generated for rule %s:' % name) all_outputs = [] for rule_source in rule.get('rule_sources', []): dirs = set() (rule_source_dirname, rule_source_basename) = os.path.split(rule_source) (rule_source_root, rule_source_ext) = \ os.path.splitext(rule_source_basename) outputs = [self.ExpandInputRoot(out, rule_source_root, rule_source_dirname) for out in rule['outputs']] for out in outputs: dir = os.path.dirname(out) if dir: dirs.add(dir) if int(rule.get('process_outputs_as_sources', False)): extra_sources += outputs if int(rule.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += outputs inputs = map(Sourceify, map(self.Absolutify, [rule_source] + rule.get('inputs', []))) actions = ['$(call do_cmd,%s_%d)' % (name, count)] if name == 'resources_grit': # HACK: This is ugly. Grit intentionally doesn't touch the # timestamp of its output file when the file doesn't change, # which is fine in hash-based dependency systems like scons # and forge, but not kosher in the make world. After some # discussion, hacking around it here seems like the least # amount of pain. actions += ['@touch --no-create $@'] # See the comment in WriteCopies about expanding env vars. outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs] inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs] outputs = map(self.Absolutify, outputs) all_outputs += outputs # Only write the 'obj' and 'builddir' rules for the "primary" output # (:1); it's superfluous for the "extra outputs", and this avoids # accidentally writing duplicate dummy rules for those outputs. 
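        # These are target-specific make variables: they override 'obj' and
        # 'builddir' only for the recipes that build outputs[0], so the
        # generated command sees absolute paths without affecting other rules.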
self.WriteLn('%s: obj := $(abs_obj)' % outputs[0]) self.WriteLn('%s: builddir := $(abs_builddir)' % outputs[0]) self.WriteMakeRule(outputs, inputs + ['FORCE_DO_CMD'], actions) # Spaces in rule filenames are not supported, but rule variables have # spaces in them (e.g. RULE_INPUT_PATH expands to '$(abspath $<)'). # The spaces within the variables are valid, so remove the variables # before checking. variables_with_spaces = re.compile(r'\$\([^ ]* \$<\)') for output in outputs: output = re.sub(variables_with_spaces, '', output) assert ' ' not in output, ( "Spaces in rule filenames not yet supported (%s)" % output) self.WriteLn('all_deps += %s' % ' '.join(outputs)) action = [self.ExpandInputRoot(ac, rule_source_root, rule_source_dirname) for ac in rule['action']] mkdirs = '' if len(dirs) > 0: mkdirs = 'mkdir -p %s; ' % ' '.join(dirs) cd_action = 'cd %s; ' % Sourceify(self.path or '.') # action, cd_action, and mkdirs get written to a toplevel variable # called cmd_foo. Toplevel variables can't handle things that change # per makefile like $(TARGET), so hardcode the target. if self.flavor == 'mac': action = [gyp.xcode_emulation.ExpandEnvVars(command, env) for command in action] action = gyp.common.EncodePOSIXShellList(action) action = action.replace('$(TARGET)', self.target) cd_action = cd_action.replace('$(TARGET)', self.target) mkdirs = mkdirs.replace('$(TARGET)', self.target) # Set LD_LIBRARY_PATH in case the rule runs an executable from this # build which links to shared libs from this build. # rules run on the host, so they should in theory only use host # libraries, but until everything is made cross-compile safe, also use # target libraries. # TODO(piman): when everything is cross-compile safe, remove lib.target self.WriteLn( "cmd_%(name)s_%(count)d = LD_LIBRARY_PATH=" "$(builddir)/lib.host:$(builddir)/lib.target:$$LD_LIBRARY_PATH; " "export LD_LIBRARY_PATH; " "%(cd_action)s%(mkdirs)s%(action)s" % { 'action': action, 'cd_action': cd_action, 'count': count, 'mkdirs': mkdirs, 'name': name, }) self.WriteLn( 'quiet_cmd_%(name)s_%(count)d = RULE %(name)s_%(count)d $@' % { 'count': count, 'name': name, }) self.WriteLn() count += 1 outputs_variable = 'rule_%s_outputs' % name self.WriteList(all_outputs, outputs_variable) extra_outputs.append('$(%s)' % outputs_variable) self.WriteLn('### Finished generating for rule: %s' % name) self.WriteLn() self.WriteLn('### Finished generating for all rules') self.WriteLn('') def WriteCopies(self, copies, extra_outputs, part_of_all): """Write Makefile code for any 'copies' from the gyp input. extra_outputs: a list that will be filled in with any outputs of this action (used to make other pieces dependent on this action) part_of_all: flag indicating this target is part of 'all' """ self.WriteLn('### Generated for copy rule.') variable = StringToMakefileVariable(self.qualified_target + '_copies') outputs = [] for copy in copies: for path in copy['files']: # Absolutify() may call normpath, and will strip trailing slashes. path = Sourceify(self.Absolutify(path)) filename = os.path.split(path)[1] output = Sourceify(self.Absolutify(os.path.join(copy['destination'], filename))) # If the output path has variables in it, which happens in practice for # 'copies', writing the environment as target-local doesn't work, # because the variables are already needed for the target name. 
# Copying the environment variables into global make variables doesn't # work either, because then the .d files will potentially contain spaces # after variable expansion, and .d file handling cannot handle spaces. # As a workaround, manually expand variables at gyp time. Since 'copies' # can't run scripts, there's no need to write the env then. # WriteDoCmd() will escape spaces for .d files. env = self.GetSortedXcodeEnv() output = gyp.xcode_emulation.ExpandEnvVars(output, env) path = gyp.xcode_emulation.ExpandEnvVars(path, env) self.WriteDoCmd([output], [path], 'copy', part_of_all) outputs.append(output) self.WriteLn('%s = %s' % (variable, ' '.join(map(QuoteSpaces, outputs)))) extra_outputs.append('$(%s)' % variable) self.WriteLn() def WriteMacBundleResources(self, resources, bundle_deps): """Writes Makefile code for 'mac_bundle_resources'.""" self.WriteLn('### Generated for mac_bundle_resources') for output, res in gyp.xcode_emulation.GetMacBundleResources( generator_default_variables['PRODUCT_DIR'], self.xcode_settings, map(Sourceify, map(self.Absolutify, resources))): self.WriteDoCmd([output], [res], 'mac_tool,,,copy-bundle-resource', part_of_all=True) bundle_deps.append(output) def WriteMacInfoPlist(self, bundle_deps): """Write Makefile code for bundle Info.plist files.""" info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist( generator_default_variables['PRODUCT_DIR'], self.xcode_settings, lambda p: Sourceify(self.Absolutify(p))) if not info_plist: return if defines: # Create an intermediate file to store preprocessed results. intermediate_plist = ('$(obj).$(TOOLSET)/$(TARGET)/' + os.path.basename(info_plist)) self.WriteList(defines, intermediate_plist + ': INFOPLIST_DEFINES', '-D', quoter=EscapeCppDefine) self.WriteMakeRule([intermediate_plist], [info_plist], ['$(call do_cmd,infoplist)', # "Convert" the plist so that any weird whitespace changes from the # preprocessor do not affect the XML parser in mac_tool. '@plutil -convert xml1 $@ $@']) info_plist = intermediate_plist # plists can contain envvars and substitute them into the file. self.WriteSortedXcodeEnv( out, self.GetSortedXcodeEnv(additional_settings=extra_env)) self.WriteDoCmd([out], [info_plist], 'mac_tool,,,copy-info-plist', part_of_all=True) bundle_deps.append(out) def WriteSources(self, configs, deps, sources, extra_outputs, extra_link_deps, part_of_all, precompiled_header): """Write Makefile code for any 'sources' from the gyp input. These are source files necessary to build the current target. configs, deps, sources: input from gyp. extra_outputs: a list of extra outputs this action should be dependent on; used to serialize action/rules before compilation extra_link_deps: a list that will be filled in with any outputs of compilation (to be used in link lines) part_of_all: flag indicating this target is part of 'all' """ # Write configuration-specific variables for CFLAGS, etc. 
for configname in sorted(configs.keys()): config = configs[configname] self.WriteList(config.get('defines'), 'DEFS_%s' % configname, prefix='-D', quoter=EscapeCppDefine) if self.flavor == 'mac': cflags = self.xcode_settings.GetCflags(configname) cflags_c = self.xcode_settings.GetCflagsC(configname) cflags_cc = self.xcode_settings.GetCflagsCC(configname) cflags_objc = self.xcode_settings.GetCflagsObjC(configname) cflags_objcc = self.xcode_settings.GetCflagsObjCC(configname) else: cflags = config.get('cflags') cflags_c = config.get('cflags_c') cflags_cc = config.get('cflags_cc') self.WriteLn("# Flags passed to all source files."); self.WriteList(cflags, 'CFLAGS_%s' % configname) self.WriteLn("# Flags passed to only C files."); self.WriteList(cflags_c, 'CFLAGS_C_%s' % configname) self.WriteLn("# Flags passed to only C++ files."); self.WriteList(cflags_cc, 'CFLAGS_CC_%s' % configname) if self.flavor == 'mac': self.WriteLn("# Flags passed to only ObjC files."); self.WriteList(cflags_objc, 'CFLAGS_OBJC_%s' % configname) self.WriteLn("# Flags passed to only ObjC++ files."); self.WriteList(cflags_objcc, 'CFLAGS_OBJCC_%s' % configname) includes = config.get('include_dirs') if includes: includes = map(Sourceify, map(self.Absolutify, includes)) self.WriteList(includes, 'INCS_%s' % configname, prefix='-I') compilable = filter(Compilable, sources) objs = map(self.Objectify, map(self.Absolutify, map(Target, compilable))) self.WriteList(objs, 'OBJS') for obj in objs: assert ' ' not in obj, ( "Spaces in object filenames not supported (%s)" % obj) self.WriteLn('# Add to the list of files we specially track ' 'dependencies for.') self.WriteLn('all_deps += $(OBJS)') self.WriteLn() # Make sure our dependencies are built first. if deps: self.WriteMakeRule(['$(OBJS)'], deps, comment = 'Make sure our dependencies are built ' 'before any of us.', order_only = True) # Make sure the actions and rules run first. # If they generate any extra headers etc., the per-.o file dep tracking # will catch the proper rebuilds, so order only is still ok here. if extra_outputs: self.WriteMakeRule(['$(OBJS)'], extra_outputs, comment = 'Make sure our actions/rules run ' 'before any of us.', order_only = True) pchdeps = precompiled_header.GetObjDependencies(compilable, objs ) if pchdeps: self.WriteLn('# Dependencies from obj files to their precompiled headers') for source, obj, gch in pchdeps: self.WriteLn('%s: %s' % (obj, gch)) self.WriteLn('# End precompiled header dependencies') if objs: extra_link_deps.append('$(OBJS)') self.WriteLn("""\ # CFLAGS et al overrides must be target-local. 
# See "Target-specific Variable Values" in the GNU Make manual.""") self.WriteLn("$(OBJS): TOOLSET := $(TOOLSET)") self.WriteLn("$(OBJS): GYP_CFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('c') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_C_$(BUILDTYPE))") self.WriteLn("$(OBJS): GYP_CXXFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('cc') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_CC_$(BUILDTYPE))") if self.flavor == 'mac': self.WriteLn("$(OBJS): GYP_OBJCFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('m') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_C_$(BUILDTYPE)) " "$(CFLAGS_OBJC_$(BUILDTYPE))") self.WriteLn("$(OBJS): GYP_OBJCXXFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('mm') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_CC_$(BUILDTYPE)) " "$(CFLAGS_OBJCC_$(BUILDTYPE))") self.WritePchTargets(precompiled_header.GetPchBuildCommands()) # If there are any object files in our input file list, link them into our # output. extra_link_deps += filter(Linkable, sources) self.WriteLn() def WritePchTargets(self, pch_commands): """Writes make rules to compile prefix headers.""" if not pch_commands: return for gch, lang_flag, lang, input in pch_commands: extra_flags = { 'c': '$(CFLAGS_C_$(BUILDTYPE))', 'cc': '$(CFLAGS_CC_$(BUILDTYPE))', 'm': '$(CFLAGS_C_$(BUILDTYPE)) $(CFLAGS_OBJC_$(BUILDTYPE))', 'mm': '$(CFLAGS_CC_$(BUILDTYPE)) $(CFLAGS_OBJCC_$(BUILDTYPE))', }[lang] var_name = { 'c': 'GYP_PCH_CFLAGS', 'cc': 'GYP_PCH_CXXFLAGS', 'm': 'GYP_PCH_OBJCFLAGS', 'mm': 'GYP_PCH_OBJCXXFLAGS', }[lang] self.WriteLn("%s: %s := %s " % (gch, var_name, lang_flag) + "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "$(CFLAGS_$(BUILDTYPE)) " + extra_flags) self.WriteLn('%s: %s FORCE_DO_CMD' % (gch, input)) self.WriteLn('\t@$(call do_cmd,pch_%s,1)' % lang) self.WriteLn('') assert ' ' not in gch, ( "Spaces in gch filenames not supported (%s)" % gch) self.WriteLn('all_deps += %s' % gch) self.WriteLn('') def ComputeOutputBasename(self, spec): """Return the 'output basename' of a gyp spec. E.g., the loadable module 'foobar' in directory 'baz' will produce 'libfoobar.so' """ assert not self.is_mac_bundle if self.flavor == 'mac' and self.type in ( 'static_library', 'executable', 'shared_library', 'loadable_module'): return self.xcode_settings.GetExecutablePath() target = spec['target_name'] target_prefix = '' target_ext = '' if self.type == 'static_library': if target[:3] == 'lib': target = target[3:] target_prefix = 'lib' target_ext = '.a' elif self.type in ('loadable_module', 'shared_library'): if target[:3] == 'lib': target = target[3:] target_prefix = 'lib' target_ext = '.so' elif self.type == 'none': target = '%s.stamp' % target elif self.type != 'executable': print ("ERROR: What output file should be generated?", "type", self.type, "target", target) target_prefix = spec.get('product_prefix', target_prefix) target = spec.get('product_name', target) product_ext = spec.get('product_extension') if product_ext: target_ext = '.' + product_ext return target_prefix + target + target_ext def _InstallImmediately(self): return self.toolset == 'target' and self.flavor == 'mac' and self.type in ( 'static_library', 'executable', 'shared_library', 'loadable_module') def ComputeOutput(self, spec): """Return the 'output' (full output path) of a gyp spec. 
E.g., the loadable module 'foobar' in directory 'baz' will produce '$(obj)/baz/libfoobar.so' """ assert not self.is_mac_bundle path = os.path.join('$(obj).' + self.toolset, self.path) if self.type == 'executable' or self._InstallImmediately(): path = '$(builddir)' path = spec.get('product_dir', path) return os.path.join(path, self.ComputeOutputBasename(spec)) def ComputeMacBundleOutput(self, spec): """Return the 'output' (full output path) to a bundle output directory.""" assert self.is_mac_bundle path = generator_default_variables['PRODUCT_DIR'] return os.path.join(path, self.xcode_settings.GetWrapperName()) def ComputeMacBundleBinaryOutput(self, spec): """Return the 'output' (full output path) to the binary in a bundle.""" path = generator_default_variables['PRODUCT_DIR'] return os.path.join(path, self.xcode_settings.GetExecutablePath()) def ComputeDeps(self, spec): """Compute the dependencies of a gyp spec. Returns a tuple (deps, link_deps), where each is a list of filenames that will need to be put in front of make for either building (deps) or linking (link_deps). """ deps = [] link_deps = [] if 'dependencies' in spec: deps.extend([target_outputs[dep] for dep in spec['dependencies'] if target_outputs[dep]]) for dep in spec['dependencies']: if dep in target_link_deps: link_deps.append(target_link_deps[dep]) deps.extend(link_deps) # TODO: It seems we need to transitively link in libraries (e.g. -lfoo)? # This hack makes it work: # link_deps.extend(spec.get('libraries', [])) return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps)) def WriteDependencyOnExtraOutputs(self, target, extra_outputs): self.WriteMakeRule([self.output_binary], extra_outputs, comment = 'Build our special outputs first.', order_only = True) def WriteTarget(self, spec, configs, deps, link_deps, bundle_deps, extra_outputs, part_of_all): """Write Makefile code to produce the final target of the gyp spec. spec, configs: input from gyp. deps, link_deps: dependency lists; see ComputeDeps() extra_outputs: any extra outputs that our target should depend on part_of_all: flag indicating this target is part of 'all' """ self.WriteLn('### Rules for final target.') if extra_outputs: self.WriteDependencyOnExtraOutputs(self.output_binary, extra_outputs) self.WriteMakeRule(extra_outputs, deps, comment=('Preserve order dependency of ' 'special output on deps.'), order_only = True) target_postbuilds = {} if self.type != 'none': for configname in sorted(configs.keys()): config = configs[configname] if self.flavor == 'mac': ldflags = self.xcode_settings.GetLdflags(configname, generator_default_variables['PRODUCT_DIR'], lambda p: Sourceify(self.Absolutify(p))) # TARGET_POSTBUILDS_$(BUILDTYPE) is added to postbuilds later on. gyp_to_build = gyp.common.InvertRelativePath(self.path) target_postbuild = self.xcode_settings.AddImplicitPostbuilds( configname, QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build, self.output))), QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build, self.output_binary)))) if target_postbuild: target_postbuilds[configname] = target_postbuild else: ldflags = config.get('ldflags', []) # Compute an rpath for this output if needed. if any(dep.endswith('.so') or '.so.' in dep for dep in deps): # We want to get the literal string "$ORIGIN" into the link command, # so we need lots of escaping. 
ldflags.append(r'-Wl,-rpath=\$$ORIGIN/lib.%s/' % self.toolset) ldflags.append(r'-Wl,-rpath-link=\$(builddir)/lib.%s/' % self.toolset) library_dirs = config.get('library_dirs', []) ldflags += [('-L%s' % library_dir) for library_dir in library_dirs] self.WriteList(ldflags, 'LDFLAGS_%s' % configname) if self.flavor == 'mac': self.WriteList(self.xcode_settings.GetLibtoolflags(configname), 'LIBTOOLFLAGS_%s' % configname) libraries = spec.get('libraries') if libraries: # Remove duplicate entries libraries = gyp.common.uniquer(libraries) if self.flavor == 'mac': libraries = self.xcode_settings.AdjustLibraries(libraries) self.WriteList(libraries, 'LIBS') self.WriteLn('%s: GYP_LDFLAGS := $(LDFLAGS_$(BUILDTYPE))' % QuoteSpaces(self.output_binary)) self.WriteLn('%s: LIBS := $(LIBS)' % QuoteSpaces(self.output_binary)) if self.flavor == 'mac': self.WriteLn('%s: GYP_LIBTOOLFLAGS := $(LIBTOOLFLAGS_$(BUILDTYPE))' % QuoteSpaces(self.output_binary)) # Postbuild actions. Like actions, but implicitly depend on the target's # output. postbuilds = [] if self.flavor == 'mac': if target_postbuilds: postbuilds.append('$(TARGET_POSTBUILDS_$(BUILDTYPE))') postbuilds.extend( gyp.xcode_emulation.GetSpecPostbuildCommands(spec)) if postbuilds: # Envvars may be referenced by TARGET_POSTBUILDS_$(BUILDTYPE), # so we must output its definition first, since we declare variables # using ":=". self.WriteSortedXcodeEnv(self.output, self.GetSortedXcodePostbuildEnv()) for configname in target_postbuilds: self.WriteLn('%s: TARGET_POSTBUILDS_%s := %s' % (QuoteSpaces(self.output), configname, gyp.common.EncodePOSIXShellList(target_postbuilds[configname]))) # Postbuilds expect to be run in the gyp file's directory, so insert an # implicit postbuild to cd to there. postbuilds.insert(0, gyp.common.EncodePOSIXShellList(['cd', self.path])) for i in xrange(len(postbuilds)): if not postbuilds[i].startswith('$'): postbuilds[i] = EscapeShellArgument(postbuilds[i]) self.WriteLn('%s: builddir := $(abs_builddir)' % QuoteSpaces(self.output)) self.WriteLn('%s: POSTBUILDS := %s' % ( QuoteSpaces(self.output), ' '.join(postbuilds))) # A bundle directory depends on its dependencies such as bundle resources # and bundle binary. When all dependencies have been built, the bundle # needs to be packaged. if self.is_mac_bundle: # If the framework doesn't contain a binary, then nothing depends # on the actions -- make the framework depend on them directly too. self.WriteDependencyOnExtraOutputs(self.output, extra_outputs) # Bundle dependencies. Note that the code below adds actions to this # target, so if you move these two lines, move the lines below as well. self.WriteList(map(QuoteSpaces, bundle_deps), 'BUNDLE_DEPS') self.WriteLn('%s: $(BUNDLE_DEPS)' % QuoteSpaces(self.output)) # After the framework is built, package it. Needs to happen before # postbuilds, since postbuilds depend on this. if self.type in ('shared_library', 'loadable_module'): self.WriteLn('\t@$(call do_cmd,mac_package_framework,,,%s)' % self.xcode_settings.GetFrameworkVersion()) # Bundle postbuilds can depend on the whole bundle, so run them after # the bundle is packaged, not already after the bundle binary is done. if postbuilds: self.WriteLn('\t@$(call do_postbuilds)') postbuilds = [] # Don't write postbuilds for target's output. # Needed by test/mac/gyptest-rebuild.py. self.WriteLn('\t@true # No-op, used by tests') # Since this target depends on binary and resources which are in # nested subfolders, the framework directory will be older than # its dependencies usually. 
To prevent this rule from executing # on every build (expensive, especially with postbuilds), expliclity # update the time on the framework directory. self.WriteLn('\t@touch -c %s' % QuoteSpaces(self.output)) if postbuilds: assert not self.is_mac_bundle, ('Postbuilds for bundles should be done ' 'on the bundle, not the binary (target \'%s\')' % self.target) assert 'product_dir' not in spec, ('Postbuilds do not work with ' 'custom product_dir') if self.type == 'executable': self.WriteLn('%s: LD_INPUTS := %s' % ( QuoteSpaces(self.output_binary), ' '.join(map(QuoteSpaces, link_deps)))) if self.toolset == 'host' and self.flavor == 'android': self.WriteDoCmd([self.output_binary], link_deps, 'link_host', part_of_all, postbuilds=postbuilds) else: self.WriteDoCmd([self.output_binary], link_deps, 'link', part_of_all, postbuilds=postbuilds) elif self.type == 'static_library': for link_dep in link_deps: assert ' ' not in link_dep, ( "Spaces in alink input filenames not supported (%s)" % link_dep) if (self.flavor not in ('mac', 'openbsd', 'win') and not self.is_standalone_static_library): self.WriteDoCmd([self.output_binary], link_deps, 'alink_thin', part_of_all, postbuilds=postbuilds) else: self.WriteDoCmd([self.output_binary], link_deps, 'alink', part_of_all, postbuilds=postbuilds) elif self.type == 'shared_library': self.WriteLn('%s: LD_INPUTS := %s' % ( QuoteSpaces(self.output_binary), ' '.join(map(QuoteSpaces, link_deps)))) self.WriteDoCmd([self.output_binary], link_deps, 'solink', part_of_all, postbuilds=postbuilds) elif self.type == 'loadable_module': for link_dep in link_deps: assert ' ' not in link_dep, ( "Spaces in module input filenames not supported (%s)" % link_dep) if self.toolset == 'host' and self.flavor == 'android': self.WriteDoCmd([self.output_binary], link_deps, 'solink_module_host', part_of_all, postbuilds=postbuilds) else: self.WriteDoCmd( [self.output_binary], link_deps, 'solink_module', part_of_all, postbuilds=postbuilds) elif self.type == 'none': # Write a stamp line. self.WriteDoCmd([self.output_binary], deps, 'touch', part_of_all, postbuilds=postbuilds) else: print "WARNING: no output for", self.type, target # Add an alias for each target (if there are any outputs). # Installable target aliases are created below. if ((self.output and self.output != self.target) and (self.type not in self._INSTALLABLE_TARGETS)): self.WriteMakeRule([self.target], [self.output], comment='Add target alias', phony = True) if part_of_all: self.WriteMakeRule(['all'], [self.target], comment = 'Add target alias to "all" target.', phony = True) # Add special-case rules for our installable targets. # 1) They need to install to the build dir or "product" dir. # 2) They get shortcuts for building (e.g. "make chrome"). # 3) They are part of "make all". if (self.type in self._INSTALLABLE_TARGETS or self.is_standalone_static_library): if self.type == 'shared_library': file_desc = 'shared library' elif self.type == 'static_library': file_desc = 'static library' else: file_desc = 'executable' install_path = self._InstallableTargetInstallPath() installable_deps = [self.output] if (self.flavor == 'mac' and not 'product_dir' in spec and self.toolset == 'target'): # On mac, products are created in install_path immediately. assert install_path == self.output, '%s != %s' % ( install_path, self.output) # Point the target alias to the final binary output. 
self.WriteMakeRule([self.target], [install_path], comment='Add target alias', phony = True) if install_path != self.output: assert not self.is_mac_bundle # See comment a few lines above. self.WriteDoCmd([install_path], [self.output], 'copy', comment = 'Copy this to the %s output path.' % file_desc, part_of_all=part_of_all) installable_deps.append(install_path) if self.output != self.alias and self.alias != self.target: self.WriteMakeRule([self.alias], installable_deps, comment = 'Short alias for building this %s.' % file_desc, phony = True) if part_of_all: self.WriteMakeRule(['all'], [install_path], comment = 'Add %s to "all" target.' % file_desc, phony = True) def WriteList(self, value_list, variable=None, prefix='', quoter=QuoteIfNecessary): """Write a variable definition that is a list of values. E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out foo = blaha blahb but in a pretty-printed style. """ values = '' if value_list: value_list = [quoter(prefix + l) for l in value_list] values = ' \\\n\t' + ' \\\n\t'.join(value_list) self.fp.write('%s :=%s\n\n' % (variable, values)) def WriteDoCmd(self, outputs, inputs, command, part_of_all, comment=None, postbuilds=False): """Write a Makefile rule that uses do_cmd. This makes the outputs dependent on the command line that was run, as well as support the V= make command line flag. """ suffix = '' if postbuilds: assert ',' not in command suffix = ',,1' # Tell do_cmd to honor $POSTBUILDS self.WriteMakeRule(outputs, inputs, actions = ['$(call do_cmd,%s%s)' % (command, suffix)], comment = comment, force = True) # Add our outputs to the list of targets we read depfiles from. # all_deps is only used for deps file reading, and for deps files we replace # spaces with ? because escaping doesn't work with make's $(sort) and # other functions. outputs = [QuoteSpaces(o, SPACE_REPLACEMENT) for o in outputs] self.WriteLn('all_deps += %s' % ' '.join(outputs)) def WriteMakeRule(self, outputs, inputs, actions=None, comment=None, order_only=False, force=False, phony=False): """Write a Makefile rule, with some extra tricks. outputs: a list of outputs for the rule (note: this is not directly supported by make; see comments below) inputs: a list of inputs for the rule actions: a list of shell commands to run for the rule comment: a comment to put in the Makefile above the rule (also useful for making this Python script's code self-documenting) order_only: if true, makes the dependency order-only force: if true, include FORCE_DO_CMD as an order-only dep phony: if true, the rule does not actually generate the named output, the output is just a name to run the rule """ outputs = map(QuoteSpaces, outputs) inputs = map(QuoteSpaces, inputs) if comment: self.WriteLn('# ' + comment) if phony: self.WriteLn('.PHONY: ' + ' '.join(outputs)) # TODO(evanm): just make order_only a list of deps instead of these hacks. if order_only: order_insert = '| ' pick_output = ' '.join(outputs) else: order_insert = '' pick_output = outputs[0] if force: force_append = ' FORCE_DO_CMD' else: force_append = '' if actions: self.WriteLn("%s: TOOLSET := $(TOOLSET)" % outputs[0]) self.WriteLn('%s: %s%s%s' % (pick_output, order_insert, ' '.join(inputs), force_append)) if actions: for action in actions: self.WriteLn('\t%s' % action) if not order_only and len(outputs) > 1: # If we have more than one output, a rule like # foo bar: baz # that for *each* output we must run the action, potentially # in parallel. 
That is not what we're trying to write -- what # we want is that we run the action once and it generates all # the files. # http://www.gnu.org/software/hello/manual/automake/Multiple-Outputs.html # discusses this problem and has this solution: # 1) Write the naive rule that would produce parallel runs of # the action. # 2) Make the outputs seralized on each other, so we won't start # a parallel run until the first run finishes, at which point # we'll have generated all the outputs and we're done. self.WriteLn('%s: %s' % (' '.join(outputs[1:]), outputs[0])) # Add a dummy command to the "extra outputs" rule, otherwise make seems to # think these outputs haven't (couldn't have?) changed, and thus doesn't # flag them as changed (i.e. include in '$?') when evaluating dependent # rules, which in turn causes do_cmd() to skip running dependent commands. self.WriteLn('%s: ;' % (' '.join(outputs[1:]))) self.WriteLn() def WriteAndroidNdkModuleRule(self, module_name, all_sources, link_deps): """Write a set of LOCAL_XXX definitions for Android NDK. These variable definitions will be used by Android NDK but do nothing for non-Android applications. Arguments: module_name: Android NDK module name, which must be unique among all module names. all_sources: A list of source files (will be filtered by Compilable). link_deps: A list of link dependencies, which must be sorted in the order from dependencies to dependents. """ if self.type not in ('executable', 'shared_library', 'static_library'): return self.WriteLn('# Variable definitions for Android applications') self.WriteLn('include $(CLEAR_VARS)') self.WriteLn('LOCAL_MODULE := ' + module_name) self.WriteLn('LOCAL_CFLAGS := $(CFLAGS_$(BUILDTYPE)) ' '$(DEFS_$(BUILDTYPE)) ' # LOCAL_CFLAGS is applied to both of C and C++. There is # no way to specify $(CFLAGS_C_$(BUILDTYPE)) only for C # sources. '$(CFLAGS_C_$(BUILDTYPE)) ' # $(INCS_$(BUILDTYPE)) includes the prefix '-I' while # LOCAL_C_INCLUDES does not expect it. So put it in # LOCAL_CFLAGS. '$(INCS_$(BUILDTYPE))') # LOCAL_CXXFLAGS is obsolete and LOCAL_CPPFLAGS is preferred. self.WriteLn('LOCAL_CPPFLAGS := $(CFLAGS_CC_$(BUILDTYPE))') self.WriteLn('LOCAL_C_INCLUDES :=') self.WriteLn('LOCAL_LDLIBS := $(LDFLAGS_$(BUILDTYPE)) $(LIBS)') # Detect the C++ extension. cpp_ext = {'.cc': 0, '.cpp': 0, '.cxx': 0} default_cpp_ext = '.cpp' for filename in all_sources: ext = os.path.splitext(filename)[1] if ext in cpp_ext: cpp_ext[ext] += 1 if cpp_ext[ext] > cpp_ext[default_cpp_ext]: default_cpp_ext = ext self.WriteLn('LOCAL_CPP_EXTENSION := ' + default_cpp_ext) self.WriteList(map(self.Absolutify, filter(Compilable, all_sources)), 'LOCAL_SRC_FILES') # Filter out those which do not match prefix and suffix and produce # the resulting list without prefix and suffix. 
def DepsToModules(deps, prefix, suffix): modules = [] for filepath in deps: filename = os.path.basename(filepath) if filename.startswith(prefix) and filename.endswith(suffix): modules.append(filename[len(prefix):-len(suffix)]) return modules # Retrieve the default value of 'SHARED_LIB_SUFFIX' params = {'flavor': 'linux'} default_variables = {} CalculateVariables(default_variables, params) self.WriteList( DepsToModules(link_deps, generator_default_variables['SHARED_LIB_PREFIX'], default_variables['SHARED_LIB_SUFFIX']), 'LOCAL_SHARED_LIBRARIES') self.WriteList( DepsToModules(link_deps, generator_default_variables['STATIC_LIB_PREFIX'], generator_default_variables['STATIC_LIB_SUFFIX']), 'LOCAL_STATIC_LIBRARIES') if self.type == 'executable': self.WriteLn('include $(BUILD_EXECUTABLE)') elif self.type == 'shared_library': self.WriteLn('include $(BUILD_SHARED_LIBRARY)') elif self.type == 'static_library': self.WriteLn('include $(BUILD_STATIC_LIBRARY)') self.WriteLn() def WriteLn(self, text=''): self.fp.write(text + '\n') def GetSortedXcodeEnv(self, additional_settings=None): return gyp.xcode_emulation.GetSortedXcodeEnv( self.xcode_settings, "$(abs_builddir)", os.path.join("$(abs_srcdir)", self.path), "$(BUILDTYPE)", additional_settings) def GetSortedXcodePostbuildEnv(self): # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack. # TODO(thakis): It would be nice to have some general mechanism instead. strip_save_file = self.xcode_settings.GetPerTargetSetting( 'CHROMIUM_STRIP_SAVE_FILE', '') # Even if strip_save_file is empty, explicitly write it. Else a postbuild # might pick up an export from an earlier target. return self.GetSortedXcodeEnv( additional_settings={'CHROMIUM_STRIP_SAVE_FILE': strip_save_file}) def WriteSortedXcodeEnv(self, target, env): for k, v in env: # For # foo := a\ b # the escaped space does the right thing. For # export foo := a\ b # it does not -- the backslash is written to the env as literal character. # So don't escape spaces in |env[k]|. self.WriteLn('%s: export %s := %s' % (QuoteSpaces(target), k, v)) def Objectify(self, path): """Convert a path to its output directory form.""" if '$(' in path: path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/' % self.toolset) if not '$(obj)' in path: path = '$(obj).%s/$(TARGET)/%s' % (self.toolset, path) return path def Pchify(self, path, lang): """Convert a prefix header path to its output directory form.""" path = self.Absolutify(path) if '$(' in path: path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/pch-%s' % (self.toolset, lang)) return path return '$(obj).%s/$(TARGET)/pch-%s/%s' % (self.toolset, lang, path) def Absolutify(self, path): """Convert a subdirectory-relative path into a base-relative path. Skips over paths that contain variables.""" if '$(' in path: # Don't call normpath in this case, as it might collapse the # path too aggressively if it features '..'. However it's still # important to strip trailing slashes. return path.rstrip('/') return os.path.normpath(os.path.join(self.path, path)) def ExpandInputRoot(self, template, expansion, dirname): if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template: return template path = template % { 'INPUT_ROOT': expansion, 'INPUT_DIRNAME': dirname, } return path def _InstallableTargetInstallPath(self): """Returns the location of the final output for an installable target.""" # Xcode puts shared_library results into PRODUCT_DIR, and some gyp files # rely on this. Emulate this behavior for mac. 
# XXX(TooTallNate): disabling this code since we don't want this behavior... #if (self.type == 'shared_library' and # (self.flavor != 'mac' or self.toolset != 'target')): # # Install all shared libs into a common directory (per toolset) for # # convenient access with LD_LIBRARY_PATH. # return '$(builddir)/lib.%s/%s' % (self.toolset, self.alias) return '$(builddir)/' + self.alias def WriteAutoRegenerationRule(params, root_makefile, makefile_name, build_files): """Write the target to regenerate the Makefile.""" options = params['options'] build_files_args = [gyp.common.RelativePath(filename, options.toplevel_dir) for filename in params['build_files_arg']] gyp_binary = gyp.common.FixIfRelativePath(params['gyp_binary'], options.toplevel_dir) if not gyp_binary.startswith(os.sep): gyp_binary = os.path.join('.', gyp_binary) root_makefile.write( "quiet_cmd_regen_makefile = ACTION Regenerating $@\n" "cmd_regen_makefile = cd $(srcdir); %(cmd)s\n" "%(makefile_name)s: %(deps)s\n" "\t$(call do_cmd,regen_makefile)\n\n" % { 'makefile_name': makefile_name, 'deps': ' '.join(map(Sourceify, build_files)), 'cmd': gyp.common.EncodePOSIXShellList( [gyp_binary, '-fmake'] + gyp.RegenerateFlags(options) + build_files_args)}) def PerformBuild(data, configurations, params): options = params['options'] for config in configurations: arguments = ['make'] if options.toplevel_dir and options.toplevel_dir != '.': arguments += '-C', options.toplevel_dir arguments.append('BUILDTYPE=' + config) print 'Building [%s]: %s' % (config, arguments) subprocess.check_call(arguments) def GenerateOutput(target_list, target_dicts, data, params): options = params['options'] flavor = gyp.common.GetFlavor(params) generator_flags = params.get('generator_flags', {}) builddir_name = generator_flags.get('output_dir', 'out') android_ndk_version = generator_flags.get('android_ndk_version', None) default_target = generator_flags.get('default_target', 'all') def CalculateMakefilePath(build_file, base_name): """Determine where to write a Makefile for a given gyp file.""" # Paths in gyp files are relative to the .gyp file, but we want # paths relative to the source root for the master makefile. Grab # the path of the .gyp file as the base to relativize against. # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp". base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.depth) # We write the file in the base_path directory. output_file = os.path.join(options.depth, base_path, base_name) if options.generator_output: output_file = os.path.join( options.depth, options.generator_output, base_path, base_name) base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.toplevel_dir) return base_path, output_file # TODO: search for the first non-'Default' target. This can go # away when we add verification that all targets have the # necessary configurations. default_configuration = None toolsets = set([target_dicts[target]['toolset'] for target in target_list]) for target in target_list: spec = target_dicts[target] if spec['default_configuration'] != 'Default': default_configuration = spec['default_configuration'] break if not default_configuration: default_configuration = 'Default' srcdir = '.' 
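  # Illustrative sketch of the path handling just below (values here are
  # hypothetical): with options.suffix == '' and
  # options.generator_output == 'gypfiles/out', the root makefile is written
  # to
  #   <toplevel_dir>/gypfiles/out/Makefile
  # and srcdir is rewritten to the relative path back to the source tree
  # ('../..'), with srcdir_prefix set to '$(srcdir)/' so generated rules can
  # refer back to source files.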
makefile_name = 'Makefile' + options.suffix makefile_path = os.path.join(options.toplevel_dir, makefile_name) if options.generator_output: global srcdir_prefix makefile_path = os.path.join( options.toplevel_dir, options.generator_output, makefile_name) srcdir = gyp.common.RelativePath(srcdir, options.generator_output) srcdir_prefix = '$(srcdir)/' flock_command= 'flock' header_params = { 'default_target': default_target, 'builddir': builddir_name, 'default_configuration': default_configuration, 'flock': flock_command, 'flock_index': 1, 'link_commands': LINK_COMMANDS_LINUX, 'extra_commands': '', 'srcdir': srcdir, } if flavor == 'mac': flock_command = './gyp-mac-tool flock' header_params.update({ 'flock': flock_command, 'flock_index': 2, 'link_commands': LINK_COMMANDS_MAC, 'extra_commands': SHARED_HEADER_MAC_COMMANDS, }) elif flavor == 'android': header_params.update({ 'link_commands': LINK_COMMANDS_ANDROID, }) elif flavor == 'solaris': header_params.update({ 'flock': './gyp-flock-tool flock', 'flock_index': 2, }) elif flavor == 'freebsd': # Note: OpenBSD has sysutils/flock. lockf seems to be FreeBSD specific. header_params.update({ 'flock': 'lockf', }) elif flavor == 'aix': header_params.update({ 'link_commands': LINK_COMMANDS_AIX, 'flock': './gyp-flock-tool flock', 'flock_index': 2, }) header_params.update({ 'CC.target': GetEnvironFallback(('CC_target', 'CC'), '$(CC)'), 'AR.target': GetEnvironFallback(('AR_target', 'AR'), '$(AR)'), 'CXX.target': GetEnvironFallback(('CXX_target', 'CXX'), '$(CXX)'), 'LINK.target': GetEnvironFallback(('LINK_target', 'LINK'), '$(LINK)'), 'CC.host': GetEnvironFallback(('CC_host',), 'gcc'), 'AR.host': GetEnvironFallback(('AR_host',), 'ar'), 'CXX.host': GetEnvironFallback(('CXX_host',), 'g++'), 'LINK.host': GetEnvironFallback(('LINK_host',), '$(CXX.host)'), }) build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings_array = data[build_file].get('make_global_settings', []) wrappers = {} wrappers['LINK'] = '%s $(builddir)/linker.lock' % flock_command for key, value in make_global_settings_array: if key.endswith('_wrapper'): wrappers[key[:-len('_wrapper')]] = '$(abspath %s)' % value make_global_settings = '' for key, value in make_global_settings_array: if re.match('.*_wrapper', key): continue if value[0] != '$': value = '$(abspath %s)' % value wrapper = wrappers.get(key) if wrapper: value = '%s %s' % (wrapper, value) del wrappers[key] if key in ('CC', 'CC.host', 'CXX', 'CXX.host'): make_global_settings += ( 'ifneq (,$(filter $(origin %s), undefined default))\n' % key) # Let gyp-time envvars win over global settings. env_key = key.replace('.', '_') # CC.host -> CC_host if env_key in os.environ: value = os.environ[env_key] make_global_settings += ' %s = %s\n' % (key, value) make_global_settings += 'endif\n' else: make_global_settings += '%s ?= %s\n' % (key, value) # TODO(ukai): define cmd when only wrapper is specified in # make_global_settings. header_params['make_global_settings'] = make_global_settings gyp.common.EnsureDirExists(makefile_path) root_makefile = open(makefile_path, 'w') root_makefile.write(SHARED_HEADER % header_params) # Currently any versions have the same effect, but in future the behavior # could be different. 
if android_ndk_version: root_makefile.write( '# Define LOCAL_PATH for build of Android applications.\n' 'LOCAL_PATH := $(call my-dir)\n' '\n') for toolset in toolsets: root_makefile.write('TOOLSET := %s\n' % toolset) WriteRootHeaderSuffixRules(root_makefile) # Put build-time support tools next to the root Makefile. dest_path = os.path.dirname(makefile_path) gyp.common.CopyTool(flavor, dest_path) # Find the list of targets that derive from the gyp file(s) being built. needed_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, build_file): needed_targets.add(target) build_files = set() include_list = set() for qualified_target in target_list: build_file, target, toolset = gyp.common.ParseQualifiedTarget( qualified_target) this_make_global_settings = data[build_file].get('make_global_settings', []) assert make_global_settings_array == this_make_global_settings, ( "make_global_settings needs to be the same for all targets. %s vs. %s" % (this_make_global_settings, make_global_settings)) build_files.add(gyp.common.RelativePath(build_file, options.toplevel_dir)) included_files = data[build_file]['included_files'] for included_file in included_files: # The included_files entries are relative to the dir of the build file # that included them, so we have to undo that and then make them relative # to the root dir. relative_include_file = gyp.common.RelativePath( gyp.common.UnrelativePath(included_file, build_file), options.toplevel_dir) abs_include_file = os.path.abspath(relative_include_file) # If the include file is from the ~/.gyp dir, we should use absolute path # so that relocating the src dir doesn't break the path. if (params['home_dot_gyp'] and abs_include_file.startswith(params['home_dot_gyp'])): build_files.add(abs_include_file) else: build_files.add(relative_include_file) base_path, output_file = CalculateMakefilePath(build_file, target + '.' + toolset + options.suffix + '.mk') spec = target_dicts[qualified_target] configs = spec['configurations'] if flavor == 'mac': gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec) writer = MakefileWriter(generator_flags, flavor) writer.Write(qualified_target, base_path, output_file, spec, configs, part_of_all=qualified_target in needed_targets) # Our root_makefile lives at the source root. Compute the relative path # from there to the output_file for including. mkfile_rel_path = gyp.common.RelativePath(output_file, os.path.dirname(makefile_path)) include_list.add(mkfile_rel_path) # Write out per-gyp (sub-project) Makefiles. depth_rel_path = gyp.common.RelativePath(options.depth, os.getcwd()) for build_file in build_files: # The paths in build_files were relativized above, so undo that before # testing against the non-relativized items in target_list and before # calculating the Makefile path. build_file = os.path.join(depth_rel_path, build_file) gyp_targets = [target_dicts[target]['target_name'] for target in target_list if target.startswith(build_file) and target in needed_targets] # Only generate Makefiles for gyp files with targets. if not gyp_targets: continue base_path, output_file = CalculateMakefilePath(build_file, os.path.splitext(os.path.basename(build_file))[0] + '.Makefile') makefile_rel_path = gyp.common.RelativePath(os.path.dirname(makefile_path), os.path.dirname(output_file)) writer.WriteSubMake(output_file, makefile_rel_path, gyp_targets, builddir_name) # Write out the sorted list of includes. 
  root_makefile.write('\n')
  for include_file in sorted(include_list):
    # We wrap each .mk include in an if statement so users can tell make to
    # not load a file by setting NO_LOAD.  The below make code says, only
    # load the .mk file if the .mk filename doesn't start with a token in
    # NO_LOAD.
    root_makefile.write(
        "ifeq ($(strip $(foreach prefix,$(NO_LOAD),\\\n"
        "    $(findstring $(join ^,$(prefix)),\\\n"
        "                 $(join ^," + include_file + ")))),)\n")
    root_makefile.write("  include " + include_file + "\n")
    root_makefile.write("endif\n")
  root_makefile.write('\n')

  if (not generator_flags.get('standalone')
      and generator_flags.get('auto_regeneration', True)):
    WriteAutoRegenerationRule(params, root_makefile, makefile_name, build_files)

  root_makefile.write(SHARED_FOOTER)

  root_makefile.close()

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py

# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

import collections
import copy
import ntpath
import os
import posixpath
import re
import subprocess
import sys

import gyp.common
import gyp.easy_xml as easy_xml
import gyp.MSVSNew as MSVSNew
import gyp.MSVSProject as MSVSProject
import gyp.MSVSSettings as MSVSSettings
import gyp.MSVSToolFile as MSVSToolFile
import gyp.MSVSUserFile as MSVSUserFile
import gyp.MSVSUtil as MSVSUtil
import gyp.MSVSVersion as MSVSVersion
from gyp.common import GypError


# TODO: Remove once bots are on 2.7, http://crbug.com/241769
def _import_OrderedDict():
  import collections
  try:
    return collections.OrderedDict
  except AttributeError:
    import gyp.ordered_dict
    return gyp.ordered_dict.OrderedDict
OrderedDict = _import_OrderedDict()


# Regular expression for validating Visual Studio GUIDs.  If the GUID
# contains lowercase hex letters, MSVS will be fine. However,
# IncrediBuild BuildConsole will parse the solution file, but then
# silently skip building the target causing hard to track down errors.
# Note that this only happens with the BuildConsole, and does not occur
# if IncrediBuild is executed from inside Visual Studio.  This regex
# validates that the string looks like a GUID with all uppercase hex
# letters.
VALID_MSVS_GUID_CHARS = re.compile('^[A-F0-9\-]+$') generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '.exe', 'STATIC_LIB_PREFIX': '', 'SHARED_LIB_PREFIX': '', 'STATIC_LIB_SUFFIX': '.lib', 'SHARED_LIB_SUFFIX': '.dll', 'INTERMEDIATE_DIR': '$(IntDir)', 'SHARED_INTERMEDIATE_DIR': '$(OutDir)obj/global_intermediate', 'OS': 'win', 'PRODUCT_DIR': '$(OutDir)', 'LIB_DIR': '$(OutDir)lib', 'RULE_INPUT_ROOT': '$(InputName)', 'RULE_INPUT_DIRNAME': '$(InputDir)', 'RULE_INPUT_EXT': '$(InputExt)', 'RULE_INPUT_NAME': '$(InputFileName)', 'RULE_INPUT_PATH': '$(InputPath)', 'CONFIGURATION_NAME': '$(ConfigurationName)', } # The msvs specific sections that hold paths generator_additional_path_sections = [ 'msvs_cygwin_dirs', 'msvs_props', ] generator_additional_non_configuration_keys = [ 'msvs_cygwin_dirs', 'msvs_cygwin_shell', 'msvs_large_pdb', 'msvs_shard', 'msvs_external_builder', 'msvs_external_builder_out_dir', 'msvs_external_builder_build_cmd', 'msvs_external_builder_clean_cmd', ] # List of precompiled header related keys. precomp_keys = [ 'msvs_precompiled_header', 'msvs_precompiled_source', ] cached_username = None cached_domain = None # Based on http://code.activestate.com/recipes/576694/. class OrderedSet(collections.MutableSet): def __init__(self, iterable=None): self.end = end = [] end += [None, end, end] # sentinel node for doubly linked list self.map = {} # key --> [key, prev, next] if iterable is not None: self |= iterable def __len__(self): return len(self.map) def discard(self, key): if key in self.map: key, prev, next = self.map.pop(key) prev[2] = next next[1] = prev def __contains__(self, key): return key in self.map def add(self, key): if key not in self.map: end = self.end curr = end[1] curr[2] = end[1] = self.map[key] = [key, curr, end] def update(self, iterable): for i in iterable: if i not in self: self.add(i) def __iter__(self): end = self.end curr = end[2] while curr is not end: yield curr[0] curr = curr[2] # TODO(gspencer): Switch the os.environ calls to be # win32api.GetDomainName() and win32api.GetUserName() once the # python version in depot_tools has been updated to work on Vista # 64-bit. def _GetDomainAndUserName(): if sys.platform not in ('win32', 'cygwin'): return ('DOMAIN', 'USERNAME') global cached_username global cached_domain if not cached_domain or not cached_username: domain = os.environ.get('USERDOMAIN') username = os.environ.get('USERNAME') if not domain or not username: call = subprocess.Popen(['net', 'config', 'Workstation'], stdout=subprocess.PIPE) config = call.communicate()[0] username_re = re.compile('^User name\s+(\S+)', re.MULTILINE) username_match = username_re.search(config) if username_match: username = username_match.group(1) domain_re = re.compile('^Logon domain\s+(\S+)', re.MULTILINE) domain_match = domain_re.search(config) if domain_match: domain = domain_match.group(1) cached_domain = domain cached_username = username return (cached_domain, cached_username) fixpath_prefix = None def _NormalizedSource(source): """Normalize the path. But not if that gets rid of a variable, as this may expand to something larger than one directory. Arguments: source: The path to be normalize.d Returns: The normalized path. """ normalized = os.path.normpath(source) if source.count('$') == normalized.count('$'): source = normalized return source def _FixPath(path): """Convert paths to a form that will make sense in a vcproj file. Arguments: path: The path to convert, may contain / etc. Returns: The path with all slashes made into backslashes. 
""" if fixpath_prefix and path and not os.path.isabs(path) and not path[0] == '$': path = os.path.join(fixpath_prefix, path) path = path.replace('/', '\\') path = _NormalizedSource(path) if path and path[-1] == '\\': path = path[:-1] return path def _FixPaths(paths): """Fix each of the paths of the list.""" return [_FixPath(i) for i in paths] def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None, list_excluded=True, msvs_version=None): """Converts a list split source file paths into a vcproj folder hierarchy. Arguments: sources: A list of source file paths split. prefix: A list of source file path layers meant to apply to each of sources. excluded: A set of excluded files. msvs_version: A MSVSVersion object. Returns: A hierarchy of filenames and MSVSProject.Filter objects that matches the layout of the source tree. For example: _ConvertSourcesToFilterHierarchy([['a', 'bob1.c'], ['b', 'bob2.c']], prefix=['joe']) --> [MSVSProject.Filter('a', contents=['joe\\a\\bob1.c']), MSVSProject.Filter('b', contents=['joe\\b\\bob2.c'])] """ if not prefix: prefix = [] result = [] excluded_result = [] folders = OrderedDict() # Gather files into the final result, excluded, or folders. for s in sources: if len(s) == 1: filename = _NormalizedSource('\\'.join(prefix + s)) if filename in excluded: excluded_result.append(filename) else: result.append(filename) elif msvs_version and not msvs_version.UsesVcxproj(): # For MSVS 2008 and earlier, we need to process all files before walking # the sub folders. if not folders.get(s[0]): folders[s[0]] = [] folders[s[0]].append(s[1:]) else: contents = _ConvertSourcesToFilterHierarchy([s[1:]], prefix + [s[0]], excluded=excluded, list_excluded=list_excluded, msvs_version=msvs_version) contents = MSVSProject.Filter(s[0], contents=contents) result.append(contents) # Add a folder for excluded files. if excluded_result and list_excluded: excluded_folder = MSVSProject.Filter('_excluded_files', contents=excluded_result) result.append(excluded_folder) if msvs_version and msvs_version.UsesVcxproj(): return result # Populate all the folders. for f in folders: contents = _ConvertSourcesToFilterHierarchy(folders[f], prefix=prefix + [f], excluded=excluded, list_excluded=list_excluded) contents = MSVSProject.Filter(f, contents=contents) result.append(contents) return result def _ToolAppend(tools, tool_name, setting, value, only_if_unset=False): if not value: return _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset) def _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset=False): # TODO(bradnelson): ugly hack, fix this more generally!!! 
if 'Directories' in setting or 'Dependencies' in setting: if type(value) == str: value = value.replace('/', '\\') else: value = [i.replace('/', '\\') for i in value] if not tools.get(tool_name): tools[tool_name] = dict() tool = tools[tool_name] if tool.get(setting): if only_if_unset: return if type(tool[setting]) == list and type(value) == list: tool[setting] += value else: raise TypeError( 'Appending "%s" to a non-list setting "%s" for tool "%s" is ' 'not allowed, previous value: %s' % ( value, setting, tool_name, str(tool[setting]))) else: tool[setting] = value def _ConfigPlatform(config_data): return config_data.get('msvs_configuration_platform', 'Win32') def _ConfigBaseName(config_name, platform_name): if config_name.endswith('_' + platform_name): return config_name[0:-len(platform_name) - 1] else: return config_name def _ConfigFullName(config_name, config_data): platform_name = _ConfigPlatform(config_data) return '%s|%s' % (_ConfigBaseName(config_name, platform_name), platform_name) def _BuildCommandLineForRuleRaw(spec, cmd, cygwin_shell, has_input_path, quote_cmd, do_setup_env): if [x for x in cmd if '$(InputDir)' in x]: input_dir_preamble = ( 'set INPUTDIR=$(InputDir)\n' 'set INPUTDIR=%INPUTDIR:$(ProjectDir)=%\n' 'set INPUTDIR=%INPUTDIR:~0,-1%\n' ) else: input_dir_preamble = '' if cygwin_shell: # Find path to cygwin. cygwin_dir = _FixPath(spec.get('msvs_cygwin_dirs', ['.'])[0]) # Prepare command. direct_cmd = cmd direct_cmd = [i.replace('$(IntDir)', '`cygpath -m "${INTDIR}"`') for i in direct_cmd] direct_cmd = [i.replace('$(OutDir)', '`cygpath -m "${OUTDIR}"`') for i in direct_cmd] direct_cmd = [i.replace('$(InputDir)', '`cygpath -m "${INPUTDIR}"`') for i in direct_cmd] if has_input_path: direct_cmd = [i.replace('$(InputPath)', '`cygpath -m "${INPUTPATH}"`') for i in direct_cmd] direct_cmd = ['\\"%s\\"' % i.replace('"', '\\\\\\"') for i in direct_cmd] # direct_cmd = gyp.common.EncodePOSIXShellList(direct_cmd) direct_cmd = ' '.join(direct_cmd) # TODO(quote): regularize quoting path names throughout the module cmd = '' if do_setup_env: cmd += 'call "$(ProjectDir)%(cygwin_dir)s\\setup_env.bat" && ' cmd += 'set CYGWIN=nontsec&& ' if direct_cmd.find('NUMBER_OF_PROCESSORS') >= 0: cmd += 'set /a NUMBER_OF_PROCESSORS_PLUS_1=%%NUMBER_OF_PROCESSORS%%+1&& ' if direct_cmd.find('INTDIR') >= 0: cmd += 'set INTDIR=$(IntDir)&& ' if direct_cmd.find('OUTDIR') >= 0: cmd += 'set OUTDIR=$(OutDir)&& ' if has_input_path and direct_cmd.find('INPUTPATH') >= 0: cmd += 'set INPUTPATH=$(InputPath) && ' cmd += 'bash -c "%(cmd)s"' cmd = cmd % {'cygwin_dir': cygwin_dir, 'cmd': direct_cmd} return input_dir_preamble + cmd else: # Convert cat --> type to mimic unix. if cmd[0] == 'cat': command = ['type'] else: command = [cmd[0].replace('/', '\\')] # Add call before command to ensure that commands can be tied together one # after the other without aborting in Incredibuild, since IB makes a bat # file out of the raw command string, and some commands (like python) are # actually batch files themselves. command.insert(0, 'call') # Fix the paths # TODO(quote): This is a really ugly heuristic, and will miss path fixing # for arguments like "--arg=path" or "/opt:path". # If the argument starts with a slash or dash, it's probably a command line # switch arguments = [i if (i[:1] in "/-") else _FixPath(i) for i in cmd[1:]] arguments = [i.replace('$(InputDir)', '%INPUTDIR%') for i in arguments] arguments = [MSVSSettings.FixVCMacroSlashes(i) for i in arguments] if quote_cmd: # Support a mode for using cmd directly. 
# Convert any paths to native form (first element is used directly). # TODO(quote): regularize quoting path names throughout the module arguments = ['"%s"' % i for i in arguments] # Collapse into a single command. return input_dir_preamble + ' '.join(command + arguments) def _BuildCommandLineForRule(spec, rule, has_input_path, do_setup_env): # Currently this weird argument munging is used to duplicate the way a # python script would need to be run as part of the chrome tree. # Eventually we should add some sort of rule_default option to set this # per project. For now the behavior chrome needs is the default. mcs = rule.get('msvs_cygwin_shell') if mcs is None: mcs = int(spec.get('msvs_cygwin_shell', 1)) elif isinstance(mcs, str): mcs = int(mcs) quote_cmd = int(rule.get('msvs_quote_cmd', 1)) return _BuildCommandLineForRuleRaw(spec, rule['action'], mcs, has_input_path, quote_cmd, do_setup_env=do_setup_env) def _AddActionStep(actions_dict, inputs, outputs, description, command): """Merge action into an existing list of actions. Care must be taken so that actions which have overlapping inputs either don't get assigned to the same input, or get collapsed into one. Arguments: actions_dict: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. inputs: list of inputs outputs: list of outputs description: description of the action command: command line to execute """ # Require there to be at least one input (call sites will ensure this). assert inputs action = { 'inputs': inputs, 'outputs': outputs, 'description': description, 'command': command, } # Pick where to stick this action. # While less than optimal in terms of build time, attach them to the first # input for now. chosen_input = inputs[0] # Add it there. if chosen_input not in actions_dict: actions_dict[chosen_input] = [] actions_dict[chosen_input].append(action) def _AddCustomBuildToolForMSVS(p, spec, primary_input, inputs, outputs, description, cmd): """Add a custom build tool to execute something. Arguments: p: the target project spec: the target project dict primary_input: input file to attach the build tool to inputs: list of inputs outputs: list of outputs description: description of the action cmd: command line to execute """ inputs = _FixPaths(inputs) outputs = _FixPaths(outputs) tool = MSVSProject.Tool( 'VCCustomBuildTool', {'Description': description, 'AdditionalDependencies': ';'.join(inputs), 'Outputs': ';'.join(outputs), 'CommandLine': cmd, }) # Add to the properties of primary input for each config. for config_name, c_data in spec['configurations'].iteritems(): p.AddFileConfig(_FixPath(primary_input), _ConfigFullName(config_name, c_data), tools=[tool]) def _AddAccumulatedActionsToMSVS(p, spec, actions_dict): """Add actions accumulated into an actions_dict, merging as needed. Arguments: p: the target project spec: the target project dict actions_dict: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. """ for primary_input in actions_dict: inputs = OrderedSet() outputs = OrderedSet() descriptions = [] commands = [] for action in actions_dict[primary_input]: inputs.update(OrderedSet(action['inputs'])) outputs.update(OrderedSet(action['outputs'])) descriptions.append(action['description']) commands.append(action['command']) # Add the custom build step for one input file. 
description = ', and also '.join(descriptions) command = '\r\n'.join(commands) _AddCustomBuildToolForMSVS(p, spec, primary_input=primary_input, inputs=inputs, outputs=outputs, description=description, cmd=command) def _RuleExpandPath(path, input_file): """Given the input file to which a rule applied, string substitute a path. Arguments: path: a path to string expand input_file: the file to which the rule applied. Returns: The string substituted path. """ path = path.replace('$(InputName)', os.path.splitext(os.path.split(input_file)[1])[0]) path = path.replace('$(InputDir)', os.path.dirname(input_file)) path = path.replace('$(InputExt)', os.path.splitext(os.path.split(input_file)[1])[1]) path = path.replace('$(InputFileName)', os.path.split(input_file)[1]) path = path.replace('$(InputPath)', input_file) return path def _FindRuleTriggerFiles(rule, sources): """Find the list of files which a particular rule applies to. Arguments: rule: the rule in question sources: the set of all known source files for this project Returns: The list of sources that trigger a particular rule. """ return rule.get('rule_sources', []) def _RuleInputsAndOutputs(rule, trigger_file): """Find the inputs and outputs generated by a rule. Arguments: rule: the rule in question. trigger_file: the main trigger for this rule. Returns: The pair of (inputs, outputs) involved in this rule. """ raw_inputs = _FixPaths(rule.get('inputs', [])) raw_outputs = _FixPaths(rule.get('outputs', [])) inputs = OrderedSet() outputs = OrderedSet() inputs.add(trigger_file) for i in raw_inputs: inputs.add(_RuleExpandPath(i, trigger_file)) for o in raw_outputs: outputs.add(_RuleExpandPath(o, trigger_file)) return (inputs, outputs) def _GenerateNativeRulesForMSVS(p, rules, output_dir, spec, options): """Generate a native rules file. Arguments: p: the target project rules: the set of rules to include output_dir: the directory in which the project/gyp resides spec: the project dict options: global generator options """ rules_filename = '%s%s.rules' % (spec['target_name'], options.suffix) rules_file = MSVSToolFile.Writer(os.path.join(output_dir, rules_filename), spec['target_name']) # Add each rule. for r in rules: rule_name = r['rule_name'] rule_ext = r['extension'] inputs = _FixPaths(r.get('inputs', [])) outputs = _FixPaths(r.get('outputs', [])) # Skip a rule with no action and no inputs. if 'action' not in r and not r.get('rule_sources', []): continue cmd = _BuildCommandLineForRule(spec, r, has_input_path=True, do_setup_env=True) rules_file.AddCustomBuildRule(name=rule_name, description=r.get('message', rule_name), extensions=[rule_ext], additional_dependencies=inputs, outputs=outputs, cmd=cmd) # Write out rules file. rules_file.WriteIfChanged() # Add rules file to project. p.AddToolFile(rules_filename) def _Cygwinify(path): path = path.replace('$(OutDir)', '$(OutDirCygwin)') path = path.replace('$(IntDir)', '$(IntDirCygwin)') return path def _GenerateExternalRules(rules, output_dir, spec, sources, options, actions_to_add): """Generate an external makefile to do a set of rules. Arguments: rules: the list of rules to include output_dir: path containing project and gyp files spec: project specification data sources: set of sources known options: global generator options actions_to_add: The list of actions we will add to. """ filename = '%s_rules%s.mk' % (spec['target_name'], options.suffix) mk_file = gyp.common.WriteOnDiff(os.path.join(output_dir, filename)) # Find cygwin style versions of some paths. 
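  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): _Cygwinify above
  # only renames the MSVS macros, e.g.
  #
  #   _Cygwinify('$(OutDir)\\gen\\foo.h')  ->  '$(OutDirCygwin)\\gen\\foo.h'
  #
  # ('gen\\foo.h' is a made-up path).  The two makefile lines written just
  # below then define OutDirCygwin/IntDirCygwin via `cygpath -u`, so the
  # Windows-style value of $(OutDir) is resolved to a POSIX path at make
  # time.
  # -------------------------------------------------------------------------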
mk_file.write('OutDirCygwin:=$(shell cygpath -u "$(OutDir)")\n') mk_file.write('IntDirCygwin:=$(shell cygpath -u "$(IntDir)")\n') # Gather stuff needed to emit all: target. all_inputs = OrderedSet() all_outputs = OrderedSet() all_output_dirs = OrderedSet() first_outputs = [] for rule in rules: trigger_files = _FindRuleTriggerFiles(rule, sources) for tf in trigger_files: inputs, outputs = _RuleInputsAndOutputs(rule, tf) all_inputs.update(OrderedSet(inputs)) all_outputs.update(OrderedSet(outputs)) # Only use one target from each rule as the dependency for # 'all' so we don't try to build each rule multiple times. first_outputs.append(list(outputs)[0]) # Get the unique output directories for this rule. output_dirs = [os.path.split(i)[0] for i in outputs] for od in output_dirs: all_output_dirs.add(od) first_outputs_cyg = [_Cygwinify(i) for i in first_outputs] # Write out all: target, including mkdir for each output directory. mk_file.write('all: %s\n' % ' '.join(first_outputs_cyg)) for od in all_output_dirs: if od: mk_file.write('\tmkdir -p `cygpath -u "%s"`\n' % od) mk_file.write('\n') # Define how each output is generated. for rule in rules: trigger_files = _FindRuleTriggerFiles(rule, sources) for tf in trigger_files: # Get all the inputs and outputs for this rule for this trigger file. inputs, outputs = _RuleInputsAndOutputs(rule, tf) inputs = [_Cygwinify(i) for i in inputs] outputs = [_Cygwinify(i) for i in outputs] # Prepare the command line for this rule. cmd = [_RuleExpandPath(c, tf) for c in rule['action']] cmd = ['"%s"' % i for i in cmd] cmd = ' '.join(cmd) # Add it to the makefile. mk_file.write('%s: %s\n' % (' '.join(outputs), ' '.join(inputs))) mk_file.write('\t%s\n\n' % cmd) # Close up the file. mk_file.close() # Add makefile to list of sources. sources.add(filename) # Add a build action to call makefile. cmd = ['make', 'OutDir=$(OutDir)', 'IntDir=$(IntDir)', '-j', '${NUMBER_OF_PROCESSORS_PLUS_1}', '-f', filename] cmd = _BuildCommandLineForRuleRaw(spec, cmd, True, False, True, True) # Insert makefile as 0'th input, so it gets the action attached there, # as this is easier to understand from in the IDE. all_inputs = list(all_inputs) all_inputs.insert(0, filename) _AddActionStep(actions_to_add, inputs=_FixPaths(all_inputs), outputs=_FixPaths(all_outputs), description='Running external rules for %s' % spec['target_name'], command=cmd) def _EscapeEnvironmentVariableExpansion(s): """Escapes % characters. Escapes any % characters so that Windows-style environment variable expansions will leave them alone. See http://connect.microsoft.com/VisualStudio/feedback/details/106127/cl-d-name-text-containing-percentage-characters-doesnt-compile to understand why we have to do this. Args: s: The string to be escaped. Returns: The escaped string. """ s = s.replace('%', '%%') return s quote_replacer_regex = re.compile(r'(\\*)"') def _EscapeCommandLineArgumentForMSVS(s): """Escapes a Windows command-line argument. So that the Win32 CommandLineToArgv function will turn the escaped result back into the original string. See http://msdn.microsoft.com/en-us/library/17w5ykft.aspx ("Parsing C++ Command-Line Arguments") to understand why we have to do this. Args: s: the string to be escaped. Returns: the escaped string. """ def _Replace(match): # For a literal quote, CommandLineToArgv requires an odd number of # backslashes preceding it, and it produces half as many literal backslashes # (rounded down). So we need to produce 2n+1 backslashes. 
  return 2 * match.group(1) + '\\"'

  # Escape all quotes so that they are interpreted literally.
  s = quote_replacer_regex.sub(_Replace, s)
  # Now add unescaped quotes so that any whitespace is interpreted literally.
  s = '"' + s + '"'
  return s


delimiters_replacer_regex = re.compile(r'(\\*)([,;]+)')


def _EscapeVCProjCommandLineArgListItem(s):
  """Escapes command line arguments for MSVS.

  The VCProj format stores string lists in a single string using commas and
  semi-colons as separators, which must be quoted if they are to be
  interpreted literally. However, command-line arguments may already have
  quotes, and the VCProj parser is ignorant of the backslash escaping
  convention used by CommandLineToArgv, so the command-line quotes and the
  VCProj quotes may not be the same quotes. So to store a general
  command-line argument in a VCProj list, we need to parse the existing
  quoting according to VCProj's convention and quote any delimiters that are
  not already quoted by that convention. The quotes that we add will also be
  seen by CommandLineToArgv, so if backslashes precede them then we also
  have to escape those backslashes according to the CommandLineToArgv
  convention.

  Args:
    s: the string to be escaped.
  Returns:
    the escaped string.
  """

  def _Replace(match):
    # For a non-literal quote, CommandLineToArgv requires an even number of
    # backslashes preceding it, and it produces half as many literal
    # backslashes. So we need to produce 2n backslashes.
    return 2 * match.group(1) + '"' + match.group(2) + '"'

  segments = s.split('"')
  # The unquoted segments are at the even-numbered indices.
  for i in range(0, len(segments), 2):
    segments[i] = delimiters_replacer_regex.sub(_Replace, segments[i])
  # Concatenate back into a single string.
  s = '"'.join(segments)
  if len(segments) % 2 == 0:
    # String ends while still quoted according to VCProj's convention. This
    # means the delimiter and the next list item that follow this one in the
    # .vcproj file will be misinterpreted as part of this item. There is
    # nothing we can do about this. Adding an extra quote would correct the
    # problem in the VCProj but cause the same problem on the final
    # command-line. Moving the item to the end of the list works, but that
    # is only possible if there is only one such item. Let's just warn the
    # user.
    print >> sys.stderr, ('Warning: MSVS may misinterpret the odd number of ' +
                          'quotes in ' + s)
  return s


def _EscapeCppDefineForMSVS(s):
  """Escapes a CPP define so that it will reach the compiler unaltered."""
  s = _EscapeEnvironmentVariableExpansion(s)
  s = _EscapeCommandLineArgumentForMSVS(s)
  s = _EscapeVCProjCommandLineArgListItem(s)
  # cl.exe replaces literal # characters with = in preprocessor definitions
  # for some reason. Octal-encode to work around that.
  s = s.replace('#', '\\%03o' % ord('#'))
  return s


quote_replacer_regex2 = re.compile(r'(\\+)"')


def _EscapeCommandLineArgumentForMSBuild(s):
  """Escapes a Windows command-line argument for use by MSBuild."""

  def _Replace(match):
    return (len(match.group(1)) / 2 * 4) * '\\' + '\\"'

  # Escape all quotes so that they are interpreted literally.
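  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): the MSVS escaping
  # helpers defined above compose as follows (the example inputs are made up):
  #
  #   _EscapeEnvironmentVariableExpansion('100%')    ->  '100%%'
  #   _EscapeCommandLineArgumentForMSVS('say "hi"')  ->  '"say \"hi\""'
  #
  # and _EscapeCppDefineForMSVS finishes by rewriting '#' to '\043'
  # (octal 043 == 35 == ord('#')), since cl.exe would otherwise turn '#'
  # into '=' inside /D definitions.
  # -------------------------------------------------------------------------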
s = quote_replacer_regex2.sub(_Replace, s) return s def _EscapeMSBuildSpecialCharacters(s): escape_dictionary = { '%': '%25', '$': '%24', '@': '%40', "'": '%27', ';': '%3B', '?': '%3F', '*': '%2A' } result = ''.join([escape_dictionary.get(c, c) for c in s]) return result def _EscapeCppDefineForMSBuild(s): """Escapes a CPP define so that it will reach the compiler unaltered.""" s = _EscapeEnvironmentVariableExpansion(s) s = _EscapeCommandLineArgumentForMSBuild(s) s = _EscapeMSBuildSpecialCharacters(s) # cl.exe replaces literal # characters with = in preprocesor definitions for # some reason. Octal-encode to work around that. s = s.replace('#', '\\%03o' % ord('#')) return s def _GenerateRulesForMSVS(p, output_dir, options, spec, sources, excluded_sources, actions_to_add): """Generate all the rules for a particular project. Arguments: p: the project output_dir: directory to emit rules to options: global options passed to the generator spec: the specification for this project sources: the set of all known source files in this project excluded_sources: the set of sources excluded from normal processing actions_to_add: deferred list of actions to add in """ rules = spec.get('rules', []) rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))] rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))] # Handle rules that use a native rules file. if rules_native: _GenerateNativeRulesForMSVS(p, rules_native, output_dir, spec, options) # Handle external rules (non-native rules). if rules_external: _GenerateExternalRules(rules_external, output_dir, spec, sources, options, actions_to_add) _AdjustSourcesForRules(spec, rules, sources, excluded_sources) def _AdjustSourcesForRules(spec, rules, sources, excluded_sources): # Add outputs generated by each rule (if applicable). for rule in rules: # Done if not processing outputs as sources. if int(rule.get('process_outputs_as_sources', False)): # Add in the outputs from this rule. trigger_files = _FindRuleTriggerFiles(rule, sources) for trigger_file in trigger_files: inputs, outputs = _RuleInputsAndOutputs(rule, trigger_file) inputs = OrderedSet(_FixPaths(inputs)) outputs = OrderedSet(_FixPaths(outputs)) inputs.remove(_FixPath(trigger_file)) sources.update(inputs) if not spec.get('msvs_external_builder'): excluded_sources.update(inputs) sources.update(outputs) def _FilterActionsFromExcluded(excluded_sources, actions_to_add): """Take inputs with actions attached out of the list of exclusions. Arguments: excluded_sources: list of source files not to be built. actions_to_add: dict of actions keyed on source file they're attached to. Returns: excluded_sources with files that have actions attached removed. """ must_keep = OrderedSet(_FixPaths(actions_to_add.keys())) return [s for s in excluded_sources if s not in must_keep] def _GetDefaultConfiguration(spec): return spec['configurations'][spec['default_configuration']] def _GetGuidOfProject(proj_path, spec): """Get the guid for the project. Arguments: proj_path: Path of the vcproj or vcxproj file to generate. spec: The target dictionary containing the properties of the target. Returns: the guid. Raises: ValueError: if the specified GUID is invalid. """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) # Decide the guid of the project. guid = default_config.get('msvs_guid') if guid: if VALID_MSVS_GUID_CHARS.match(guid) is None: raise ValueError('Invalid MSVS guid: "%s". Must match regex: "%s".' 
% (guid, VALID_MSVS_GUID_CHARS.pattern)) guid = '{%s}' % guid guid = guid or MSVSNew.MakeGuid(proj_path) return guid def _GetMsbuildToolsetOfProject(proj_path, spec, version): """Get the platform toolset for the project. Arguments: proj_path: Path of the vcproj or vcxproj file to generate. spec: The target dictionary containing the properties of the target. version: The MSVSVersion object. Returns: the platform toolset string or None. """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) toolset = default_config.get('msbuild_toolset') if not toolset and version.DefaultToolset(): toolset = version.DefaultToolset() return toolset def _GenerateProject(project, options, version, generator_flags): """Generates a vcproj file. Arguments: project: the MSVSProject object. options: global generator options. version: the MSVSVersion object. generator_flags: dict of generator-specific flags. Returns: A list of source files that cannot be found on disk. """ default_config = _GetDefaultConfiguration(project.spec) # Skip emitting anything if told to with msvs_existing_vcproj option. if default_config.get('msvs_existing_vcproj'): return [] if version.UsesVcxproj(): return _GenerateMSBuildProject(project, options, version, generator_flags) else: return _GenerateMSVSProject(project, options, version, generator_flags) def _GenerateMSVSProject(project, options, version, generator_flags): """Generates a .vcproj file. It may create .rules and .user files too. Arguments: project: The project object we will generate the file for. options: Global options passed to the generator. version: The VisualStudioVersion object. generator_flags: dict of generator-specific flags. """ spec = project.spec gyp.common.EnsureDirExists(project.path) platforms = _GetUniquePlatforms(spec) p = MSVSProject.Writer(project.path, version, spec['target_name'], project.guid, platforms) # Get directory project file is in. project_dir = os.path.split(project.path)[0] gyp_path = _NormalizedSource(project.build_file) relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir) config_type = _GetMSVSConfigurationType(spec, project.build_file) for config_name, config in spec['configurations'].iteritems(): _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config) # Prepare list of sources and excluded sources. gyp_file = os.path.split(project.build_file)[1] sources, excluded_sources = _PrepareListOfSources(spec, generator_flags, gyp_file) # Add rules. actions_to_add = {} _GenerateRulesForMSVS(p, project_dir, options, spec, sources, excluded_sources, actions_to_add) list_excluded = generator_flags.get('msvs_list_excluded_files', True) sources, excluded_sources, excluded_idl = ( _AdjustSourcesAndConvertToFilterHierarchy(spec, options, project_dir, sources, excluded_sources, list_excluded, version)) # Add in files. missing_sources = _VerifySourcesExist(sources, project_dir) p.AddFiles(sources) _AddToolFilesToMSVS(p, spec) _HandlePreCompiledHeaders(p, sources, spec) _AddActions(actions_to_add, spec, relative_path_of_gyp_file) _AddCopies(actions_to_add, spec) _WriteMSVSUserFile(project.path, version, spec) # NOTE: this stanza must appear after all actions have been decided. # Don't excluded sources with actions attached, or they won't run. excluded_sources = _FilterActionsFromExcluded( excluded_sources, actions_to_add) _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl, list_excluded) _AddAccumulatedActionsToMSVS(p, spec, actions_to_add) # Write it out. 
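  # -------------------------------------------------------------------------
  # Editor's aside (summary, not part of gyp): by this point
  # _GenerateMSVSProject has added the per-configuration tool settings,
  # gathered sources and rules, folded the sources into a filter hierarchy,
  # attached actions and copies, kept any excluded file that still owns an
  # action, and marked the rest ExcludedFromBuild.  The WriteIfChanged() call
  # below is the final step; judging by its name it is meant to rewrite the
  # .vcproj only when the generated content actually differs.
  # -------------------------------------------------------------------------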
p.WriteIfChanged() return missing_sources def _GetUniquePlatforms(spec): """Returns the list of unique platforms for this spec, e.g ['win32', ...]. Arguments: spec: The target dictionary containing the properties of the target. Returns: The MSVSUserFile object created. """ # Gather list of unique platforms. platforms = OrderedSet() for configuration in spec['configurations']: platforms.add(_ConfigPlatform(spec['configurations'][configuration])) platforms = list(platforms) return platforms def _CreateMSVSUserFile(proj_path, version, spec): """Generates a .user file for the user running this Gyp program. Arguments: proj_path: The path of the project file being created. The .user file shares the same path (with an appropriate suffix). version: The VisualStudioVersion object. spec: The target dictionary containing the properties of the target. Returns: The MSVSUserFile object created. """ (domain, username) = _GetDomainAndUserName() vcuser_filename = '.'.join([proj_path, domain, username, 'user']) user_file = MSVSUserFile.Writer(vcuser_filename, version, spec['target_name']) return user_file def _GetMSVSConfigurationType(spec, build_file): """Returns the configuration type for this project. It's a number defined by Microsoft. May raise an exception. Args: spec: The target dictionary containing the properties of the target. build_file: The path of the gyp file. Returns: An integer, the configuration type. """ try: config_type = { 'executable': '1', # .exe 'shared_library': '2', # .dll 'loadable_module': '2', # .dll 'static_library': '4', # .lib 'none': '10', # Utility type }[spec['type']] except KeyError: if spec.get('type'): raise GypError('Target type %s is not a valid target type for ' 'target %s in %s.' % (spec['type'], spec['target_name'], build_file)) else: raise GypError('Missing type field for target %s in %s.' % (spec['target_name'], build_file)) return config_type def _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config): """Adds a configuration to the MSVS project. Many settings in a vcproj file are specific to a configuration. This function the main part of the vcproj file that's configuration specific. Arguments: p: The target project being generated. spec: The target dictionary containing the properties of the target. config_type: The configuration type, a number as defined by Microsoft. config_name: The name of the configuration. config: The dictionary that defines the special processing to be done for this configuration. """ # Get the information for this configuration include_dirs, resource_include_dirs = _GetIncludeDirs(config) libraries = _GetLibraries(spec) library_dirs = _GetLibraryDirs(config) out_file, vc_tool, _ = _GetOutputFilePathAndTool(spec, msbuild=False) defines = _GetDefines(config) defines = [_EscapeCppDefineForMSVS(d) for d in defines] disabled_warnings = _GetDisabledWarnings(config) prebuild = config.get('msvs_prebuild') postbuild = config.get('msvs_postbuild') def_file = _GetModuleDefinition(spec) precompiled_header = config.get('msvs_precompiled_header') # Prepare the list of tools as a dictionary. tools = dict() # Add in user specified msvs_settings. msvs_settings = config.get('msvs_settings', {}) MSVSSettings.ValidateMSVSSettings(msvs_settings) # Prevent default library inheritance from the environment. 
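  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): the _ToolAppend
  # calls below accumulate settings into a plain nested dict, keyed by tool
  # name, before _ConvertToolsToExpectedForm turns it into Tool objects.
  # With one made-up include dir, define and library it ends up shaped like:
  #
  #   tools = {
  #     'VCLinkerTool': {
  #       'AdditionalDependencies': ['$(NOINHERIT)', 'ws2_32.lib'],
  #     },
  #     'VCCLCompilerTool': {
  #       'AdditionalIncludeDirectories': ['..\\include'],
  #       'PreprocessorDefinitions': ['FOO=1'],
  #     },
  #   }
  #
  # Repeated calls extend list-valued settings; appending to an existing
  # non-list setting raises TypeError, and only_if_unset=True turns the call
  # into a no-op when the setting is already present.
  # -------------------------------------------------------------------------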
_ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', ['$(NOINHERIT)']) for tool in msvs_settings: settings = config['msvs_settings'][tool] for setting in settings: _ToolAppend(tools, tool, setting, settings[setting]) # Add the information to the appropriate tool _ToolAppend(tools, 'VCCLCompilerTool', 'AdditionalIncludeDirectories', include_dirs) _ToolAppend(tools, 'VCResourceCompilerTool', 'AdditionalIncludeDirectories', resource_include_dirs) # Add in libraries. _ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', libraries) _ToolAppend(tools, 'VCLinkerTool', 'AdditionalLibraryDirectories', library_dirs) if out_file: _ToolAppend(tools, vc_tool, 'OutputFile', out_file, only_if_unset=True) # Add defines. _ToolAppend(tools, 'VCCLCompilerTool', 'PreprocessorDefinitions', defines) _ToolAppend(tools, 'VCResourceCompilerTool', 'PreprocessorDefinitions', defines) # Change program database directory to prevent collisions. _ToolAppend(tools, 'VCCLCompilerTool', 'ProgramDataBaseFileName', '$(IntDir)$(ProjectName)\\vc80.pdb', only_if_unset=True) # Add disabled warnings. _ToolAppend(tools, 'VCCLCompilerTool', 'DisableSpecificWarnings', disabled_warnings) # Add Pre-build. _ToolAppend(tools, 'VCPreBuildEventTool', 'CommandLine', prebuild) # Add Post-build. _ToolAppend(tools, 'VCPostBuildEventTool', 'CommandLine', postbuild) # Turn on precompiled headers if appropriate. if precompiled_header: precompiled_header = os.path.split(precompiled_header)[1] _ToolAppend(tools, 'VCCLCompilerTool', 'UsePrecompiledHeader', '2') _ToolAppend(tools, 'VCCLCompilerTool', 'PrecompiledHeaderThrough', precompiled_header) _ToolAppend(tools, 'VCCLCompilerTool', 'ForcedIncludeFiles', precompiled_header) # Loadable modules don't generate import libraries; # tell dependent projects to not expect one. if spec['type'] == 'loadable_module': _ToolAppend(tools, 'VCLinkerTool', 'IgnoreImportLibrary', 'true') # Set the module definition file if any. if def_file: _ToolAppend(tools, 'VCLinkerTool', 'ModuleDefinitionFile', def_file) _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name) def _GetIncludeDirs(config): """Returns the list of directories to be used for #include directives. Arguments: config: The dictionary that defines the special processing to be done for this configuration. Returns: The list of directory paths. """ # TODO(bradnelson): include_dirs should really be flexible enough not to # require this sort of thing. include_dirs = ( config.get('include_dirs', []) + config.get('msvs_system_include_dirs', [])) resource_include_dirs = config.get('resource_include_dirs', include_dirs) include_dirs = _FixPaths(include_dirs) resource_include_dirs = _FixPaths(resource_include_dirs) return include_dirs, resource_include_dirs def _GetLibraryDirs(config): """Returns the list of directories to be used for library search paths. Arguments: config: The dictionary that defines the special processing to be done for this configuration. Returns: The list of directory paths. """ library_dirs = config.get('library_dirs', []) library_dirs = _FixPaths(library_dirs) return library_dirs def _GetLibraries(spec): """Returns the list of libraries for this configuration. Arguments: spec: The target dictionary containing the properties of the target. Returns: The list of directory paths. """ libraries = spec.get('libraries', []) # Strip out -l, as it is not used on windows (but is needed so we can pass # in libraries that are assumed to be in the default library path). 
  # Also remove duplicate entries, leaving only the last duplicate, while
  # preserving order.
  found = OrderedSet()
  unique_libraries_list = []
  for entry in reversed(libraries):
    library = re.sub('^\-l', '', entry)
    if not os.path.splitext(library)[1]:
      library += '.lib'
    if library not in found:
      found.add(library)
      unique_libraries_list.append(library)
  unique_libraries_list.reverse()
  return unique_libraries_list


def _GetOutputFilePathAndTool(spec, msbuild):
  """Returns the path and tool to use for this target.

  Figures out the path of the file this spec will create and the name of
  the VC tool that will create it.

  Arguments:
    spec: The target dictionary containing the properties of the target.
  Returns:
    A triple of (file path, name of the vc tool, name of the msbuild tool)
  """
  # Select a name for the output file.
  out_file = ''
  vc_tool = ''
  msbuild_tool = ''
  output_file_map = {
      'executable': ('VCLinkerTool', 'Link', '$(OutDir)', '.exe'),
      'shared_library': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'),
      'loadable_module': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'),
      'static_library': ('VCLibrarianTool', 'Lib', '$(OutDir)lib\\', '.lib'),
  }
  output_file_props = output_file_map.get(spec['type'])
  if output_file_props and int(spec.get('msvs_auto_output_file', 1)):
    vc_tool, msbuild_tool, out_dir, suffix = output_file_props
    if spec.get('standalone_static_library', 0):
      out_dir = '$(OutDir)'
    out_dir = spec.get('product_dir', out_dir)
    product_extension = spec.get('product_extension')
    if product_extension:
      suffix = '.' + product_extension
    elif msbuild:
      suffix = '$(TargetExt)'
    prefix = spec.get('product_prefix', '')
    product_name = spec.get('product_name', '$(ProjectName)')
    out_file = ntpath.join(out_dir, prefix + product_name + suffix)
  return out_file, vc_tool, msbuild_tool


def _GetOutputTargetExt(spec):
  """Returns the extension for this target, including the dot.

  If product_extension is specified, set target_extension to this to avoid
  MSB8012, returns None otherwise. Ignores any target_extension settings in
  the input files.

  Arguments:
    spec: The target dictionary containing the properties of the target.
  Returns:
    A string with the extension, or None
  """
  target_extension = spec.get('product_extension')
  if target_extension:
    return '.' + target_extension
  return None


def _GetDefines(config):
  """Returns the list of preprocessor definitions for this configuration.

  Arguments:
    config: The dictionary that defines the special processing to be done
        for this configuration.
  Returns:
    The list of preprocessor definitions.
  """
  defines = []
  for d in config.get('defines', []):
    if type(d) == list:
      fd = '='.join([str(dpart) for dpart in d])
    else:
      fd = str(d)
    defines.append(fd)
  return defines


def _GetDisabledWarnings(config):
  return [str(i) for i in config.get('msvs_disabled_warnings', [])]


def _GetModuleDefinition(spec):
  def_file = ''
  if spec['type'] in ['shared_library', 'loadable_module', 'executable']:
    def_files = [s for s in spec.get('sources', []) if s.endswith('.def')]
    if len(def_files) == 1:
      def_file = _FixPath(def_files[0])
    elif def_files:
      raise ValueError(
          'Multiple module definition files in one target, target %s lists '
          'multiple .def files: %s' % (
              spec['target_name'], ' '.join(def_files)))
  return def_file


def _ConvertToolsToExpectedForm(tools):
  """Convert tools to a form expected by Visual Studio.

  Arguments:
    tools: A dictionary of settings; the tool name is the key.
  Returns:
    A list of Tool objects.
  """
  tool_list = []
  for tool, settings in tools.iteritems():
    # Collapse settings with lists.
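    # -----------------------------------------------------------------------
    # Editor's aside (illustrative sketch, not part of gyp): this loop
    # flattens list-valued settings into the single strings the .vcproj
    # format stores.  Most lists are joined with ';', but 'AdditionalOptions'
    # and the linker's 'AdditionalDependencies' are joined with spaces:
    #
    #   {'AdditionalIncludeDirectories': ['a', 'b'],
    #    'AdditionalOptions': ['/MP', '/bigobj']}
    #   ->  AdditionalIncludeDirectories = 'a;b'
    #       AdditionalOptions            = '/MP /bigobj'
    #
    # ('a', 'b', '/MP' and '/bigobj' are made-up example values.)
    # -----------------------------------------------------------------------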
settings_fixed = {} for setting, value in settings.iteritems(): if type(value) == list: if ((tool == 'VCLinkerTool' and setting == 'AdditionalDependencies') or setting == 'AdditionalOptions'): settings_fixed[setting] = ' '.join(value) else: settings_fixed[setting] = ';'.join(value) else: settings_fixed[setting] = value # Add in this tool. tool_list.append(MSVSProject.Tool(tool, settings_fixed)) return tool_list def _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name): """Add to the project file the configuration specified by config. Arguments: p: The target project being generated. spec: the target project dict. tools: A dictionary of settings; the tool name is the key. config: The dictionary that defines the special processing to be done for this configuration. config_type: The configuration type, a number as defined by Microsoft. config_name: The name of the configuration. """ attributes = _GetMSVSAttributes(spec, config, config_type) # Add in this configuration. tool_list = _ConvertToolsToExpectedForm(tools) p.AddConfig(_ConfigFullName(config_name, config), attrs=attributes, tools=tool_list) def _GetMSVSAttributes(spec, config, config_type): # Prepare configuration attributes. prepared_attrs = {} source_attrs = config.get('msvs_configuration_attributes', {}) for a in source_attrs: prepared_attrs[a] = source_attrs[a] # Add props files. vsprops_dirs = config.get('msvs_props', []) vsprops_dirs = _FixPaths(vsprops_dirs) if vsprops_dirs: prepared_attrs['InheritedPropertySheets'] = ';'.join(vsprops_dirs) # Set configuration type. prepared_attrs['ConfigurationType'] = config_type output_dir = prepared_attrs.get('OutputDirectory', '$(SolutionDir)$(ConfigurationName)') prepared_attrs['OutputDirectory'] = _FixPath(output_dir) + '\\' if 'IntermediateDirectory' not in prepared_attrs: intermediate = '$(ConfigurationName)\\obj\\$(ProjectName)' prepared_attrs['IntermediateDirectory'] = _FixPath(intermediate) + '\\' else: intermediate = _FixPath(prepared_attrs['IntermediateDirectory']) + '\\' intermediate = MSVSSettings.FixVCMacroSlashes(intermediate) prepared_attrs['IntermediateDirectory'] = intermediate return prepared_attrs def _AddNormalizedSources(sources_set, sources_array): sources_set.update(_NormalizedSource(s) for s in sources_array) def _PrepareListOfSources(spec, generator_flags, gyp_file): """Prepare list of sources and excluded sources. Besides the sources specified directly in the spec, adds the gyp file so that a change to it will cause a re-compile. Also adds appropriate sources for actions and copies. Assumes later stage will un-exclude files which have custom build steps attached. Arguments: spec: The target dictionary containing the properties of the target. gyp_file: The name of the gyp file. Returns: A pair of (list of sources, list of excluded sources). The sources will be relative to the gyp file. """ sources = OrderedSet() _AddNormalizedSources(sources, spec.get('sources', [])) excluded_sources = OrderedSet() # Add in the gyp file. if not generator_flags.get('standalone'): sources.add(gyp_file) # Add in 'action' inputs and outputs. for a in spec.get('actions', []): inputs = a['inputs'] inputs = [_NormalizedSource(i) for i in inputs] # Add all inputs to sources and excluded sources. inputs = OrderedSet(inputs) sources.update(inputs) if not spec.get('msvs_external_builder'): excluded_sources.update(inputs) if int(a.get('process_outputs_as_sources', False)): _AddNormalizedSources(sources, a.get('outputs', [])) # Add in 'copies' inputs and outputs. 
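  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): unless a
  # configuration overrides them via 'msvs_configuration_attributes',
  # _GetMSVSAttributes above falls back to
  #
  #   OutputDirectory       = '$(SolutionDir)$(ConfigurationName)\'
  #   IntermediateDirectory = '$(ConfigurationName)\obj\$(ProjectName)\'
  #
  # joins any 'msvs_props' sheets into InheritedPropertySheets, and records
  # the numeric ConfigurationType computed earlier.
  # -------------------------------------------------------------------------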
for cpy in spec.get('copies', []): _AddNormalizedSources(sources, cpy.get('files', [])) return (sources, excluded_sources) def _AdjustSourcesAndConvertToFilterHierarchy( spec, options, gyp_dir, sources, excluded_sources, list_excluded, version): """Adjusts the list of sources and excluded sources. Also converts the sets to lists. Arguments: spec: The target dictionary containing the properties of the target. options: Global generator options. gyp_dir: The path to the gyp file being processed. sources: A set of sources to be included for this project. excluded_sources: A set of sources to be excluded for this project. version: A MSVSVersion object. Returns: A trio of (list of sources, list of excluded sources, path of excluded IDL file) """ # Exclude excluded sources coming into the generator. excluded_sources.update(OrderedSet(spec.get('sources_excluded', []))) # Add excluded sources into sources for good measure. sources.update(excluded_sources) # Convert to proper windows form. # NOTE: sources goes from being a set to a list here. # NOTE: excluded_sources goes from being a set to a list here. sources = _FixPaths(sources) # Convert to proper windows form. excluded_sources = _FixPaths(excluded_sources) excluded_idl = _IdlFilesHandledNonNatively(spec, sources) precompiled_related = _GetPrecompileRelatedFiles(spec) # Find the excluded ones, minus the precompiled header related ones. fully_excluded = [i for i in excluded_sources if i not in precompiled_related] # Convert to folders and the right slashes. sources = [i.split('\\') for i in sources] sources = _ConvertSourcesToFilterHierarchy(sources, excluded=fully_excluded, list_excluded=list_excluded, msvs_version=version) # Prune filters with a single child to flatten ugly directory structures # such as ../../src/modules/module1 etc. while len(sources) == 1 and isinstance(sources[0], MSVSProject.Filter): sources = sources[0].contents return sources, excluded_sources, excluded_idl def _IdlFilesHandledNonNatively(spec, sources): # If any non-native rules use 'idl' as an extension exclude idl files. # Gather a list here to use later. using_idl = False for rule in spec.get('rules', []): if rule['extension'] == 'idl' and int(rule.get('msvs_external_rule', 0)): using_idl = True break if using_idl: excluded_idl = [i for i in sources if i.endswith('.idl')] else: excluded_idl = [] return excluded_idl def _GetPrecompileRelatedFiles(spec): # Gather a list of precompiled header related sources. precompiled_related = [] for _, config in spec['configurations'].iteritems(): for k in precomp_keys: f = config.get(k) if f: precompiled_related.append(_FixPath(f)) return precompiled_related def _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl, list_excluded): exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl) for file_name, excluded_configs in exclusions.iteritems(): if (not list_excluded and len(excluded_configs) == len(spec['configurations'])): # If we're not listing excluded files, then they won't appear in the # project, so don't try to configure them to be excluded. pass else: for config_name, config in excluded_configs: p.AddFileConfig(file_name, _ConfigFullName(config_name, config), {'ExcludedFromBuild': 'true'}) def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl): exclusions = {} # Exclude excluded sources from being built. 
for f in excluded_sources: excluded_configs = [] for config_name, config in spec['configurations'].iteritems(): precomped = [_FixPath(config.get(i, '')) for i in precomp_keys] # Don't do this for ones that are precompiled header related. if f not in precomped: excluded_configs.append((config_name, config)) exclusions[f] = excluded_configs # If any non-native rules use 'idl' as an extension exclude idl files. # Exclude them now. for f in excluded_idl: excluded_configs = [] for config_name, config in spec['configurations'].iteritems(): excluded_configs.append((config_name, config)) exclusions[f] = excluded_configs return exclusions def _AddToolFilesToMSVS(p, spec): # Add in tool files (rules). tool_files = OrderedSet() for _, config in spec['configurations'].iteritems(): for f in config.get('msvs_tool_files', []): tool_files.add(f) for f in tool_files: p.AddToolFile(f) def _HandlePreCompiledHeaders(p, sources, spec): # Pre-compiled header source stubs need a different compiler flag # (generate precompiled header) and any source file not of the same # kind (i.e. C vs. C++) as the precompiled header source stub needs # to have use of precompiled headers disabled. extensions_excluded_from_precompile = [] for config_name, config in spec['configurations'].iteritems(): source = config.get('msvs_precompiled_source') if source: source = _FixPath(source) # UsePrecompiledHeader=1 for if using precompiled headers. tool = MSVSProject.Tool('VCCLCompilerTool', {'UsePrecompiledHeader': '1'}) p.AddFileConfig(source, _ConfigFullName(config_name, config), {}, tools=[tool]) basename, extension = os.path.splitext(source) if extension == '.c': extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx'] else: extensions_excluded_from_precompile = ['.c'] def DisableForSourceTree(source_tree): for source in source_tree: if isinstance(source, MSVSProject.Filter): DisableForSourceTree(source.contents) else: basename, extension = os.path.splitext(source) if extension in extensions_excluded_from_precompile: for config_name, config in spec['configurations'].iteritems(): tool = MSVSProject.Tool('VCCLCompilerTool', {'UsePrecompiledHeader': '0', 'ForcedIncludeFiles': '$(NOINHERIT)'}) p.AddFileConfig(_FixPath(source), _ConfigFullName(config_name, config), {}, tools=[tool]) # Do nothing if there was no precompiled source. if extensions_excluded_from_precompile: DisableForSourceTree(sources) def _AddActions(actions_to_add, spec, relative_path_of_gyp_file): # Add actions. actions = spec.get('actions', []) # Don't setup_env every time. When all the actions are run together in one # batch file in VS, the PATH will grow too long. # Membership in this set means that the cygwin environment has been set up, # and does not need to be set up again. have_setup_env = set() for a in actions: # Attach actions to the gyp file if nothing else is there. inputs = a.get('inputs') or [relative_path_of_gyp_file] attached_to = inputs[0] need_setup_env = attached_to not in have_setup_env cmd = _BuildCommandLineForRule(spec, a, has_input_path=False, do_setup_env=need_setup_env) have_setup_env.add(attached_to) # Add the action. _AddActionStep(actions_to_add, inputs=inputs, outputs=a.get('outputs', []), description=a.get('message', a['action_name']), command=cmd) def _WriteMSVSUserFile(project_path, version, spec): # Add run_as and test targets. 
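  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): a .user file is
  # only written when the target declares 'run_as' or 'test'.  A spec
  # fragment such as
  #
  #   'run_as': {
  #     'action': ['$(TargetPath)', '--some-flag'],
  #     'working_directory': '.',
  #   }
  #
  # becomes per-configuration debug settings via AddDebugSettings below,
  # while a bare 'test': 1 falls back to
  # ['$(TargetPath)', '--gtest_print_time'].  ('--some-flag' is a made-up
  # argument.)
  # -------------------------------------------------------------------------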
if 'run_as' in spec: run_as = spec['run_as'] action = run_as.get('action', []) environment = run_as.get('environment', []) working_directory = run_as.get('working_directory', '.') elif int(spec.get('test', 0)): action = ['$(TargetPath)', '--gtest_print_time'] environment = [] working_directory = '.' else: return # Nothing to add # Write out the user file. user_file = _CreateMSVSUserFile(project_path, version, spec) for config_name, c_data in spec['configurations'].iteritems(): user_file.AddDebugSettings(_ConfigFullName(config_name, c_data), action, environment, working_directory) user_file.WriteIfChanged() def _AddCopies(actions_to_add, spec): copies = _GetCopies(spec) for inputs, outputs, cmd, description in copies: _AddActionStep(actions_to_add, inputs=inputs, outputs=outputs, description=description, command=cmd) def _GetCopies(spec): copies = [] # Add copies. for cpy in spec.get('copies', []): for src in cpy.get('files', []): dst = os.path.join(cpy['destination'], os.path.basename(src)) # _AddCustomBuildToolForMSVS() will call _FixPath() on the inputs and # outputs, so do the same for our generated command line. if src.endswith('/'): src_bare = src[:-1] base_dir = posixpath.split(src_bare)[0] outer_dir = posixpath.split(src_bare)[1] cmd = 'cd "%s" && xcopy /e /f /y "%s" "%s\\%s\\"' % ( _FixPath(base_dir), outer_dir, _FixPath(dst), outer_dir) copies.append(([src], ['dummy_copies', dst], cmd, 'Copying %s to %s' % (src, dst))) else: cmd = 'mkdir "%s" 2>nul & set ERRORLEVEL=0 & copy /Y "%s" "%s"' % ( _FixPath(cpy['destination']), _FixPath(src), _FixPath(dst)) copies.append(([src], [dst], cmd, 'Copying %s to %s' % (src, dst))) return copies def _GetPathDict(root, path): # |path| will eventually be empty (in the recursive calls) if it was initially # relative; otherwise it will eventually end up as '\', 'D:\', etc. if not path or path.endswith(os.sep): return root parent, folder = os.path.split(path) parent_dict = _GetPathDict(root, parent) if folder not in parent_dict: parent_dict[folder] = dict() return parent_dict[folder] def _DictsToFolders(base_path, bucket, flat): # Convert to folders recursively. children = [] for folder, contents in bucket.iteritems(): if type(contents) == dict: folder_children = _DictsToFolders(os.path.join(base_path, folder), contents, flat) if flat: children += folder_children else: folder_children = MSVSNew.MSVSFolder(os.path.join(base_path, folder), name='(' + folder + ')', entries=folder_children) children.append(folder_children) else: children.append(contents) return children def _CollapseSingles(parent, node): # Recursively explorer the tree of dicts looking for projects which are # the sole item in a folder which has the same name as the project. Bring # such projects up one level. if (type(node) == dict and len(node) == 1 and node.keys()[0] == parent + '.vcproj'): return node[node.keys()[0]] if type(node) != dict: return node for child in node: node[child] = _CollapseSingles(child, node[child]) return node def _GatherSolutionFolders(sln_projects, project_objects, flat): root = {} # Convert into a tree of dicts on path. for p in sln_projects: gyp_file, target = gyp.common.ParseQualifiedTarget(p)[0:2] gyp_dir = os.path.dirname(gyp_file) path_dict = _GetPathDict(root, gyp_dir) path_dict[target + '.vcproj'] = project_objects[p] # Walk down from the top until we hit a folder that has more than one entry. # In practice, this strips the top-level "src/" dir from the hierarchy in # the solution. 
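  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): for gyp files that
  # all live under one top-level directory the path tree looks like
  #
  #   root = {'src': {'base': {'base.vcproj': <project>},
  #                   'net':  {'net.vcproj':  <project>}}}
  #
  # (the 'src', 'base' and 'net' names are made up).  The loop below strips
  # the lone 'src' level, _CollapseSingles lifts folders whose only entry is
  # '<folder>.vcproj', and _DictsToFolders converts whatever remains into
  # MSVSNew.MSVSFolder entries (or a flat list for flat solutions).
  # -------------------------------------------------------------------------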
while len(root) == 1 and type(root[root.keys()[0]]) == dict: root = root[root.keys()[0]] # Collapse singles. root = _CollapseSingles('', root) # Merge buckets until everything is a root entry. return _DictsToFolders('', root, flat) def _GetPathOfProject(qualified_target, spec, options, msvs_version): default_config = _GetDefaultConfiguration(spec) proj_filename = default_config.get('msvs_existing_vcproj') if not proj_filename: proj_filename = (spec['target_name'] + options.suffix + msvs_version.ProjectExtension()) build_file = gyp.common.BuildFile(qualified_target) proj_path = os.path.join(os.path.dirname(build_file), proj_filename) fix_prefix = None if options.generator_output: project_dir_path = os.path.dirname(os.path.abspath(proj_path)) proj_path = os.path.join(options.generator_output, proj_path) fix_prefix = gyp.common.RelativePath(project_dir_path, os.path.dirname(proj_path)) return proj_path, fix_prefix def _GetPlatformOverridesOfProject(spec): # Prepare a dict indicating which project configurations are used for which # solution configurations for this target. config_platform_overrides = {} for config_name, c in spec['configurations'].iteritems(): config_fullname = _ConfigFullName(config_name, c) platform = c.get('msvs_target_platform', _ConfigPlatform(c)) fixed_config_fullname = '%s|%s' % ( _ConfigBaseName(config_name, _ConfigPlatform(c)), platform) config_platform_overrides[config_fullname] = fixed_config_fullname return config_platform_overrides def _CreateProjectObjects(target_list, target_dicts, options, msvs_version): """Create a MSVSProject object for the targets found in target list. Arguments: target_list: the list of targets to generate project objects for. target_dicts: the dictionary of specifications. options: global generator options. msvs_version: the MSVSVersion object. Returns: A set of created projects, keyed by target. """ global fixpath_prefix # Generate each project. projects = {} for qualified_target in target_list: spec = target_dicts[qualified_target] if spec['toolset'] != 'target': raise GypError( 'Multiple toolsets not supported in msvs build (target %s)' % qualified_target) proj_path, fixpath_prefix = _GetPathOfProject(qualified_target, spec, options, msvs_version) guid = _GetGuidOfProject(proj_path, spec) overrides = _GetPlatformOverridesOfProject(spec) build_file = gyp.common.BuildFile(qualified_target) # Create object for this project. obj = MSVSNew.MSVSProject( proj_path, name=spec['target_name'], guid=guid, spec=spec, build_file=build_file, config_platform_overrides=overrides, fixpath_prefix=fixpath_prefix) # Set project toolset if any (MS build only) if msvs_version.UsesVcxproj(): obj.set_msbuild_toolset( _GetMsbuildToolsetOfProject(proj_path, spec, msvs_version)) projects[qualified_target] = obj # Set all the dependencies, but not if we are using an external builder like # ninja for project in projects.values(): if not project.spec.get('msvs_external_builder'): deps = project.spec.get('dependencies', []) deps = [projects[d] for d in deps] project.set_dependencies(deps) return projects def _InitNinjaFlavor(options, target_list, target_dicts): """Initialize targets for the ninja flavor. This sets up the necessary variables in the targets to generate msvs projects that use ninja as an external builder. The variables in the spec are only set if they have not been set. This allows individual specs to override the default values initialized here. Arguments: options: Options provided to the generator. 
target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. """ for qualified_target in target_list: spec = target_dicts[qualified_target] if spec.get('msvs_external_builder'): # The spec explicitly defined an external builder, so don't change it. continue path_to_ninja = spec.get('msvs_path_to_ninja', 'ninja.exe') spec['msvs_external_builder'] = 'ninja' if not spec.get('msvs_external_builder_out_dir'): spec['msvs_external_builder_out_dir'] = \ options.depth + '/out/$(Configuration)' if not spec.get('msvs_external_builder_build_cmd'): spec['msvs_external_builder_build_cmd'] = [ path_to_ninja, '-C', '$(OutDir)', '$(ProjectName)', ] if not spec.get('msvs_external_builder_clean_cmd'): spec['msvs_external_builder_clean_cmd'] = [ path_to_ninja, '-C', '$(OutDir)', '-t', 'clean', '$(ProjectName)', ] def CalculateVariables(default_variables, params): """Generated variables that require params to be known.""" generator_flags = params.get('generator_flags', {}) # Select project file format version (if unset, default to auto detecting). msvs_version = MSVSVersion.SelectVisualStudioVersion( generator_flags.get('msvs_version', 'auto')) # Stash msvs_version for later (so we don't have to probe the system twice). params['msvs_version'] = msvs_version # Set a variable so conditions can be based on msvs_version. default_variables['MSVS_VERSION'] = msvs_version.ShortName() # To determine processor word size on Windows, in addition to checking # PROCESSOR_ARCHITECTURE (which reflects the word size of the current # process), it is also necessary to check PROCESSOR_ARCITEW6432 (which # contains the actual word size of the system when running thru WOW64). if (os.environ.get('PROCESSOR_ARCHITECTURE', '').find('64') >= 0 or os.environ.get('PROCESSOR_ARCHITEW6432', '').find('64') >= 0): default_variables['MSVS_OS_BITS'] = 64 else: default_variables['MSVS_OS_BITS'] = 32 if gyp.common.GetFlavor(params) == 'ninja': default_variables['SHARED_INTERMEDIATE_DIR'] = '$(OutDir)gen' def PerformBuild(data, configurations, params): options = params['options'] msvs_version = params['msvs_version'] devenv = os.path.join(msvs_version.path, 'Common7', 'IDE', 'devenv.com') for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue sln_path = build_file_root + options.suffix + '.sln' if options.generator_output: sln_path = os.path.join(options.generator_output, sln_path) for config in configurations: arguments = [devenv, sln_path, '/Build', config] print 'Building [%s]: %s' % (config, arguments) rtn = subprocess.check_call(arguments) def GenerateOutput(target_list, target_dicts, data, params): """Generate .sln and .vcproj files. This is the entry point for this generator. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. data: Dictionary containing per .gyp data. """ global fixpath_prefix options = params['options'] # Get the project file format version back out of where we stashed it in # GeneratorCalculatedVariables. msvs_version = params['msvs_version'] generator_flags = params.get('generator_flags', {}) # Optionally shard targets marked with 'msvs_shard': SHARD_COUNT. (target_list, target_dicts) = MSVSUtil.ShardTargets(target_list, target_dicts) # Optionally use the large PDB workaround for targets marked with # 'msvs_large_pdb': 1. 
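  # -------------------------------------------------------------------------
  # Editor's aside (illustrative sketch, not part of gyp): when GenerateOutput
  # runs with the ninja flavor, the _InitNinjaFlavor call a few lines below
  # rewrites each spec so the generated project shells out to ninja instead
  # of building natively, roughly:
  #
  #   spec['msvs_external_builder']           = 'ninja'
  #   spec['msvs_external_builder_out_dir']   = <depth> + '/out/$(Configuration)'
  #   spec['msvs_external_builder_build_cmd'] = ['ninja.exe', '-C', '$(OutDir)',
  #                                              '$(ProjectName)']
  #   spec['msvs_external_builder_clean_cmd'] = ['ninja.exe', '-C', '$(OutDir)',
  #                                              '-t', 'clean', '$(ProjectName)']
  #
  # Values a target has already set are left untouched.
  # -------------------------------------------------------------------------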
(target_list, target_dicts) = MSVSUtil.InsertLargePdbShims( target_list, target_dicts, generator_default_variables) # Optionally configure each spec to use ninja as the external builder. if params.get('flavor') == 'ninja': _InitNinjaFlavor(options, target_list, target_dicts) # Prepare the set of configurations. configs = set() for qualified_target in target_list: spec = target_dicts[qualified_target] for config_name, config in spec['configurations'].iteritems(): configs.add(_ConfigFullName(config_name, config)) configs = list(configs) # Figure out all the projects that will be generated and their guids project_objects = _CreateProjectObjects(target_list, target_dicts, options, msvs_version) # Generate each project. missing_sources = [] for project in project_objects.values(): fixpath_prefix = project.fixpath_prefix missing_sources.extend(_GenerateProject(project, options, msvs_version, generator_flags)) fixpath_prefix = None for build_file in data: # Validate build_file extension if not build_file.endswith('.gyp'): continue sln_path = os.path.splitext(build_file)[0] + options.suffix + '.sln' if options.generator_output: sln_path = os.path.join(options.generator_output, sln_path) # Get projects in the solution, and their dependents. sln_projects = gyp.common.BuildFileTargets(target_list, build_file) sln_projects += gyp.common.DeepDependencyTargets(target_dicts, sln_projects) # Create folder hierarchy. root_entries = _GatherSolutionFolders( sln_projects, project_objects, flat=msvs_version.FlatSolution()) # Create solution. sln = MSVSNew.MSVSSolution(sln_path, entries=root_entries, variants=configs, websiteProperties=False, version=msvs_version) sln.Write() if missing_sources: error_message = "Missing input files:\n" + \ '\n'.join(set(missing_sources)) if generator_flags.get('msvs_error_on_missing_sources', False): raise GypError(error_message) else: print >> sys.stdout, "Warning: " + error_message def _GenerateMSBuildFiltersFile(filters_path, source_files, extension_to_rule_name): """Generate the filters file. This file is used by Visual Studio to organize the presentation of source files into folders. Arguments: filters_path: The path of the file to be created. source_files: The hierarchical structure of all the sources. extension_to_rule_name: A dictionary mapping file extensions to rules. """ filter_group = [] source_group = [] _AppendFiltersForMSBuild('', source_files, extension_to_rule_name, filter_group, source_group) if filter_group: content = ['Project', {'ToolsVersion': '4.0', 'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003' }, ['ItemGroup'] + filter_group, ['ItemGroup'] + source_group ] easy_xml.WriteXmlIfChanged(content, filters_path, pretty=True, win32=True) elif os.path.exists(filters_path): # We don't need this filter anymore. Delete the old filter file. os.unlink(filters_path) def _AppendFiltersForMSBuild(parent_filter_name, sources, extension_to_rule_name, filter_group, source_group): """Creates the list of filters and sources to be added in the filter file. Args: parent_filter_name: The name of the filter under which the sources are found. sources: The hierarchy of filters and sources to process. extension_to_rule_name: A dictionary mapping file extensions to rules. filter_group: The list to which filter entries will be appended. source_group: The list to which source entries will be appeneded. """ for source in sources: if isinstance(source, MSVSProject.Filter): # We have a sub-filter. Create the name of that sub-filter. 
if not parent_filter_name: filter_name = source.name else: filter_name = '%s\\%s' % (parent_filter_name, source.name) # Add the filter to the group. filter_group.append( ['Filter', {'Include': filter_name}, ['UniqueIdentifier', MSVSNew.MakeGuid(source.name)]]) # Recurse and add its dependents. _AppendFiltersForMSBuild(filter_name, source.contents, extension_to_rule_name, filter_group, source_group) else: # It's a source. Create a source entry. _, element = _MapFileToMsBuildSourceType(source, extension_to_rule_name) source_entry = [element, {'Include': source}] # Specify the filter it is part of, if any. if parent_filter_name: source_entry.append(['Filter', parent_filter_name]) source_group.append(source_entry) def _MapFileToMsBuildSourceType(source, extension_to_rule_name): """Returns the group and element type of the source file. Arguments: source: The source file name. extension_to_rule_name: A dictionary mapping file extensions to rules. Returns: A pair of (group this file should be part of, the label of element) """ _, ext = os.path.splitext(source) if ext in extension_to_rule_name: group = 'rule' element = extension_to_rule_name[ext] elif ext in ['.cc', '.cpp', '.c', '.cxx']: group = 'compile' element = 'ClCompile' elif ext in ['.h', '.hxx']: group = 'include' element = 'ClInclude' elif ext == '.rc': group = 'resource' element = 'ResourceCompile' elif ext == '.idl': group = 'midl' element = 'Midl' else: group = 'none' element = 'None' return (group, element) def _GenerateRulesForMSBuild(output_dir, options, spec, sources, excluded_sources, props_files_of_rules, targets_files_of_rules, actions_to_add, extension_to_rule_name): # MSBuild rules are implemented using three files: an XML file, a .targets # file and a .props file. # See http://blogs.msdn.com/b/vcblog/archive/2010/04/21/quick-help-on-vs2010-custom-build-rule.aspx # for more details. rules = spec.get('rules', []) rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))] rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))] msbuild_rules = [] for rule in rules_native: # Skip a rule with no action and no inputs. if 'action' not in rule and not rule.get('rule_sources', []): continue msbuild_rule = MSBuildRule(rule, spec) msbuild_rules.append(msbuild_rule) extension_to_rule_name[msbuild_rule.extension] = msbuild_rule.rule_name if msbuild_rules: base = spec['target_name'] + options.suffix props_name = base + '.props' targets_name = base + '.targets' xml_name = base + '.xml' props_files_of_rules.add(props_name) targets_files_of_rules.add(targets_name) props_path = os.path.join(output_dir, props_name) targets_path = os.path.join(output_dir, targets_name) xml_path = os.path.join(output_dir, xml_name) _GenerateMSBuildRulePropsFile(props_path, msbuild_rules) _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules) _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules) if rules_external: _GenerateExternalRules(rules_external, output_dir, spec, sources, options, actions_to_add) _AdjustSourcesForRules(spec, rules, sources, excluded_sources) class MSBuildRule(object): """Used to store information used to generate an MSBuild rule. Attributes: rule_name: The rule name, sanitized to use in XML. target_name: The name of the target. after_targets: The name of the AfterTargets element. before_targets: The name of the BeforeTargets element. depends_on: The name of the DependsOn element. compute_output: The name of the ComputeOutput element. dirs_to_make: The name of the DirsToMake element. 
inputs: The name of the _inputs element. tlog: The name of the _tlog element. extension: The extension this rule applies to. description: The message displayed when this rule is invoked. additional_dependencies: A string listing additional dependencies. outputs: The outputs of this rule. command: The command used to run the rule. """ def __init__(self, rule, spec): self.display_name = rule['rule_name'] # Assure that the rule name is only characters and numbers self.rule_name = re.sub(r'\W', '_', self.display_name) # Create the various element names, following the example set by the # Visual Studio 2008 to 2010 conversion. I don't know if VS2010 # is sensitive to the exact names. self.target_name = '_' + self.rule_name self.after_targets = self.rule_name + 'AfterTargets' self.before_targets = self.rule_name + 'BeforeTargets' self.depends_on = self.rule_name + 'DependsOn' self.compute_output = 'Compute%sOutput' % self.rule_name self.dirs_to_make = self.rule_name + 'DirsToMake' self.inputs = self.rule_name + '_inputs' self.tlog = self.rule_name + '_tlog' self.extension = rule['extension'] if not self.extension.startswith('.'): self.extension = '.' + self.extension self.description = MSVSSettings.ConvertVCMacrosToMSBuild( rule.get('message', self.rule_name)) old_additional_dependencies = _FixPaths(rule.get('inputs', [])) self.additional_dependencies = ( ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in old_additional_dependencies])) old_outputs = _FixPaths(rule.get('outputs', [])) self.outputs = ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in old_outputs]) old_command = _BuildCommandLineForRule(spec, rule, has_input_path=True, do_setup_env=True) self.command = MSVSSettings.ConvertVCMacrosToMSBuild(old_command) def _GenerateMSBuildRulePropsFile(props_path, msbuild_rules): """Generate the .props file.""" content = ['Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003'}] for rule in msbuild_rules: content.extend([ ['PropertyGroup', {'Condition': "'$(%s)' == '' and '$(%s)' == '' and " "'$(ConfigurationType)' != 'Makefile'" % (rule.before_targets, rule.after_targets) }, [rule.before_targets, 'Midl'], [rule.after_targets, 'CustomBuild'], ], ['PropertyGroup', [rule.depends_on, {'Condition': "'$(ConfigurationType)' != 'Makefile'"}, '_SelectedFiles;$(%s)' % rule.depends_on ], ], ['ItemDefinitionGroup', [rule.rule_name, ['CommandLineTemplate', rule.command], ['Outputs', rule.outputs], ['ExecutionDescription', rule.description], ['AdditionalDependencies', rule.additional_dependencies], ], ] ]) easy_xml.WriteXmlIfChanged(content, props_path, pretty=True, win32=True) def _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules): """Generate the .targets file.""" content = ['Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003' } ] item_group = [ 'ItemGroup', ['PropertyPageSchema', {'Include': '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'} ] ] for rule in msbuild_rules: item_group.append( ['AvailableItemName', {'Include': rule.rule_name}, ['Targets', rule.target_name], ]) content.append(item_group) for rule in msbuild_rules: content.append( ['UsingTask', {'TaskName': rule.rule_name, 'TaskFactory': 'XamlTaskFactory', 'AssemblyName': 'Microsoft.Build.Tasks.v4.0' }, ['Task', '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'], ]) for rule in msbuild_rules: rule_name = rule.rule_name target_outputs = '%%(%s.Outputs)' % rule_name target_inputs = ('%%(%s.Identity);%%(%s.AdditionalDependencies);' '$(MSBuildProjectFile)') % (rule_name, 
rule_name) rule_inputs = '%%(%s.Identity)' % rule_name extension_condition = ("'%(Extension)'=='.obj' or " "'%(Extension)'=='.res' or " "'%(Extension)'=='.rsc' or " "'%(Extension)'=='.lib'") remove_section = [ 'ItemGroup', {'Condition': "'@(SelectedFiles)' != ''"}, [rule_name, {'Remove': '@(%s)' % rule_name, 'Condition': "'%(Identity)' != '@(SelectedFiles)'" } ] ] inputs_section = [ 'ItemGroup', [rule.inputs, {'Include': '%%(%s.AdditionalDependencies)' % rule_name}] ] logging_section = [ 'ItemGroup', [rule.tlog, {'Include': '%%(%s.Outputs)' % rule_name, 'Condition': ("'%%(%s.Outputs)' != '' and " "'%%(%s.ExcludedFromBuild)' != 'true'" % (rule_name, rule_name)) }, ['Source', "@(%s, '|')" % rule_name], ['Inputs', "@(%s -> '%%(Fullpath)', ';')" % rule.inputs], ], ] message_section = [ 'Message', {'Importance': 'High', 'Text': '%%(%s.ExecutionDescription)' % rule_name } ] write_tlog_section = [ 'WriteLinesToFile', {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule.tlog, rule.tlog), 'File': '$(IntDir)$(ProjectName).write.1.tlog', 'Lines': "^%%(%s.Source);@(%s->'%%(Fullpath)')" % (rule.tlog, rule.tlog) } ] read_tlog_section = [ 'WriteLinesToFile', {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule.tlog, rule.tlog), 'File': '$(IntDir)$(ProjectName).read.1.tlog', 'Lines': "^%%(%s.Source);%%(%s.Inputs)" % (rule.tlog, rule.tlog) } ] command_and_input_section = [ rule_name, {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule_name, rule_name), 'CommandLineTemplate': '%%(%s.CommandLineTemplate)' % rule_name, 'AdditionalOptions': '%%(%s.AdditionalOptions)' % rule_name, 'Inputs': rule_inputs } ] content.extend([ ['Target', {'Name': rule.target_name, 'BeforeTargets': '$(%s)' % rule.before_targets, 'AfterTargets': '$(%s)' % rule.after_targets, 'Condition': "'@(%s)' != ''" % rule_name, 'DependsOnTargets': '$(%s);%s' % (rule.depends_on, rule.compute_output), 'Outputs': target_outputs, 'Inputs': target_inputs }, remove_section, inputs_section, logging_section, message_section, write_tlog_section, read_tlog_section, command_and_input_section, ], ['PropertyGroup', ['ComputeLinkInputsTargets', '$(ComputeLinkInputsTargets);', '%s;' % rule.compute_output ], ['ComputeLibInputsTargets', '$(ComputeLibInputsTargets);', '%s;' % rule.compute_output ], ], ['Target', {'Name': rule.compute_output, 'Condition': "'@(%s)' != ''" % rule_name }, ['ItemGroup', [rule.dirs_to_make, {'Condition': "'@(%s)' != '' and " "'%%(%s.ExcludedFromBuild)' != 'true'" % (rule_name, rule_name), 'Include': '%%(%s.Outputs)' % rule_name } ], ['Link', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ['Lib', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ['ImpLib', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ], ['MakeDir', {'Directories': ("@(%s->'%%(RootDir)%%(Directory)')" % rule.dirs_to_make) } ] ], ]) easy_xml.WriteXmlIfChanged(content, targets_path, pretty=True, win32=True) def _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules): # Generate the .xml file content = [ 'ProjectSchemaDefinitions', {'xmlns': ('clr-namespace:Microsoft.Build.Framework.XamlTypes;' 'assembly=Microsoft.Build.Framework'), 'xmlns:x': 'http://schemas.microsoft.com/winfx/2006/xaml', 'xmlns:sys': 'clr-namespace:System;assembly=mscorlib', 'xmlns:transformCallback': 'Microsoft.Cpp.Dev10.ConvertPropertyCallback' } ] for rule in msbuild_rules: content.extend([ 
['Rule', {'Name': rule.rule_name, 'PageTemplate': 'tool', 'DisplayName': rule.display_name, 'Order': '200' }, ['Rule.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': rule.rule_name } ] ], ['Rule.Categories', ['Category', {'Name': 'General'}, ['Category.DisplayName', ['sys:String', 'General'], ], ], ['Category', {'Name': 'Command Line', 'Subtype': 'CommandLine' }, ['Category.DisplayName', ['sys:String', 'Command Line'], ], ], ], ['StringListProperty', {'Name': 'Inputs', 'Category': 'Command Line', 'IsRequired': 'true', 'Switch': ' ' }, ['StringListProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': rule.rule_name, 'SourceType': 'Item' } ] ], ], ['StringProperty', {'Name': 'CommandLineTemplate', 'DisplayName': 'Command Line', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['DynamicEnumProperty', {'Name': rule.before_targets, 'Category': 'General', 'EnumProvider': 'Targets', 'IncludeInCommandLine': 'False' }, ['DynamicEnumProperty.DisplayName', ['sys:String', 'Execute Before'], ], ['DynamicEnumProperty.Description', ['sys:String', 'Specifies the targets for the build customization' ' to run before.' ], ], ['DynamicEnumProperty.ProviderSettings', ['NameValuePair', {'Name': 'Exclude', 'Value': '^%s|^Compute' % rule.before_targets } ] ], ['DynamicEnumProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'HasConfigurationCondition': 'true' } ] ], ], ['DynamicEnumProperty', {'Name': rule.after_targets, 'Category': 'General', 'EnumProvider': 'Targets', 'IncludeInCommandLine': 'False' }, ['DynamicEnumProperty.DisplayName', ['sys:String', 'Execute After'], ], ['DynamicEnumProperty.Description', ['sys:String', ('Specifies the targets for the build customization' ' to run after.') ], ], ['DynamicEnumProperty.ProviderSettings', ['NameValuePair', {'Name': 'Exclude', 'Value': '^%s|^Compute' % rule.after_targets } ] ], ['DynamicEnumProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': '', 'HasConfigurationCondition': 'true' } ] ], ], ['StringListProperty', {'Name': 'Outputs', 'DisplayName': 'Outputs', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['StringProperty', {'Name': 'ExecutionDescription', 'DisplayName': 'Execution Description', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['StringListProperty', {'Name': 'AdditionalDependencies', 'DisplayName': 'Additional Dependencies', 'IncludeInCommandLine': 'False', 'Visible': 'false' } ], ['StringProperty', {'Subtype': 'AdditionalOptions', 'Name': 'AdditionalOptions', 'Category': 'Command Line' }, ['StringProperty.DisplayName', ['sys:String', 'Additional Options'], ], ['StringProperty.Description', ['sys:String', 'Additional Options'], ], ], ], ['ItemType', {'Name': rule.rule_name, 'DisplayName': rule.display_name } ], ['FileExtension', {'Name': '*' + rule.extension, 'ContentType': rule.rule_name } ], ['ContentType', {'Name': rule.rule_name, 'DisplayName': '', 'ItemType': rule.rule_name } ] ]) easy_xml.WriteXmlIfChanged(content, xml_path, pretty=True, win32=True) def _GetConfigurationAndPlatform(name, settings): configuration = name.rsplit('_', 1)[0] platform = settings.get('msvs_configuration_platform', 'Win32') return (configuration, platform) def _GetConfigurationCondition(name, settings): return (r"'$(Configuration)|$(Platform)'=='%s|%s'" % _GetConfigurationAndPlatform(name, settings)) def _GetMSBuildProjectConfigurations(configurations): group = ['ItemGroup', {'Label': 'ProjectConfigurations'}] for (name, settings) in 
sorted(configurations.iteritems()): configuration, platform = _GetConfigurationAndPlatform(name, settings) designation = '%s|%s' % (configuration, platform) group.append( ['ProjectConfiguration', {'Include': designation}, ['Configuration', configuration], ['Platform', platform]]) return [group] def _GetMSBuildGlobalProperties(spec, guid, gyp_file_name): namespace = os.path.splitext(gyp_file_name)[0] return [ ['PropertyGroup', {'Label': 'Globals'}, ['ProjectGuid', guid], ['Keyword', 'Win32Proj'], ['RootNamespace', namespace], ] ] def _GetMSBuildConfigurationDetails(spec, build_file): properties = {} for name, settings in spec['configurations'].iteritems(): msbuild_attributes = _GetMSBuildAttributes(spec, settings, build_file) condition = _GetConfigurationCondition(name, settings) character_set = msbuild_attributes.get('CharacterSet') _AddConditionalProperty(properties, condition, 'ConfigurationType', msbuild_attributes['ConfigurationType']) if character_set: _AddConditionalProperty(properties, condition, 'CharacterSet', character_set) return _GetMSBuildPropertyGroup(spec, 'Configuration', properties) def _GetMSBuildLocalProperties(msbuild_toolset): # Currently the only local property we support is PlatformToolset properties = {} if msbuild_toolset: properties = [ ['PropertyGroup', {'Label': 'Locals'}, ['PlatformToolset', msbuild_toolset], ] ] return properties def _GetMSBuildPropertySheets(configurations): user_props = r'$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props' additional_props = {} props_specified = False for name, settings in sorted(configurations.iteritems()): configuration = _GetConfigurationCondition(name, settings) if settings.has_key('msbuild_props'): additional_props[configuration] = _FixPaths(settings['msbuild_props']) props_specified = True else: additional_props[configuration] = '' if not props_specified: return [ ['ImportGroup', {'Label': 'PropertySheets'}, ['Import', {'Project': user_props, 'Condition': "exists('%s')" % user_props, 'Label': 'LocalAppDataPlatform' } ] ] ] else: sheets = [] for condition, props in additional_props.iteritems(): import_group = [ 'ImportGroup', {'Label': 'PropertySheets', 'Condition': condition }, ['Import', {'Project': user_props, 'Condition': "exists('%s')" % user_props, 'Label': 'LocalAppDataPlatform' } ] ] for props_file in props: import_group.append(['Import', {'Project':props_file}]) sheets.append(import_group) return sheets def _ConvertMSVSBuildAttributes(spec, config, build_file): config_type = _GetMSVSConfigurationType(spec, build_file) msvs_attributes = _GetMSVSAttributes(spec, config, config_type) msbuild_attributes = {} for a in msvs_attributes: if a in ['IntermediateDirectory', 'OutputDirectory']: directory = MSVSSettings.ConvertVCMacrosToMSBuild(msvs_attributes[a]) if not directory.endswith('\\'): directory += '\\' msbuild_attributes[a] = directory elif a == 'CharacterSet': msbuild_attributes[a] = _ConvertMSVSCharacterSet(msvs_attributes[a]) elif a == 'ConfigurationType': msbuild_attributes[a] = _ConvertMSVSConfigurationType(msvs_attributes[a]) else: print 'Warning: Do not know how to convert MSVS attribute ' + a return msbuild_attributes def _ConvertMSVSCharacterSet(char_set): if char_set.isdigit(): char_set = { '0': 'MultiByte', '1': 'Unicode', '2': 'MultiByte', }[char_set] return char_set def _ConvertMSVSConfigurationType(config_type): if config_type.isdigit(): config_type = { '1': 'Application', '2': 'DynamicLibrary', '4': 'StaticLibrary', '10': 'Utility' }[config_type] return config_type def _GetMSBuildAttributes(spec, 
config, build_file): if 'msbuild_configuration_attributes' not in config: msbuild_attributes = _ConvertMSVSBuildAttributes(spec, config, build_file) else: config_type = _GetMSVSConfigurationType(spec, build_file) config_type = _ConvertMSVSConfigurationType(config_type) msbuild_attributes = config.get('msbuild_configuration_attributes', {}) msbuild_attributes.setdefault('ConfigurationType', config_type) output_dir = msbuild_attributes.get('OutputDirectory', '$(SolutionDir)$(Configuration)') msbuild_attributes['OutputDirectory'] = _FixPath(output_dir) + '\\' if 'IntermediateDirectory' not in msbuild_attributes: intermediate = _FixPath('$(Configuration)') + '\\' msbuild_attributes['IntermediateDirectory'] = intermediate if 'CharacterSet' in msbuild_attributes: msbuild_attributes['CharacterSet'] = _ConvertMSVSCharacterSet( msbuild_attributes['CharacterSet']) if 'TargetName' not in msbuild_attributes: prefix = spec.get('product_prefix', '') product_name = spec.get('product_name', '$(ProjectName)') target_name = prefix + product_name msbuild_attributes['TargetName'] = target_name if 'TargetExt' not in msbuild_attributes and 'product_extension' in spec: ext = spec.get('product_extension') msbuild_attributes['TargetExt'] = '.' + ext if spec.get('msvs_external_builder'): external_out_dir = spec.get('msvs_external_builder_out_dir', '.') msbuild_attributes['OutputDirectory'] = _FixPath(external_out_dir) + '\\' # Make sure that 'TargetPath' matches 'Lib.OutputFile' or 'Link.OutputFile' # (depending on the tool used) to avoid MSB8012 warning. msbuild_tool_map = { 'executable': 'Link', 'shared_library': 'Link', 'loadable_module': 'Link', 'static_library': 'Lib', } msbuild_tool = msbuild_tool_map.get(spec['type']) if msbuild_tool: msbuild_settings = config['finalized_msbuild_settings'] out_file = msbuild_settings[msbuild_tool].get('OutputFile') if out_file: msbuild_attributes['TargetPath'] = _FixPath(out_file) target_ext = msbuild_settings[msbuild_tool].get('TargetExt') if target_ext: msbuild_attributes['TargetExt'] = target_ext return msbuild_attributes def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file): # TODO(jeanluc) We could optimize out the following and do it only if # there are actions. # TODO(jeanluc) Handle the equivalent of setting 'CYGWIN=nontsec'. new_paths = [] cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.'])[0] if cygwin_dirs: cyg_path = '$(MSBuildProjectDirectory)\\%s\\bin\\' % _FixPath(cygwin_dirs) new_paths.append(cyg_path) # TODO(jeanluc) Change the convention to have both a cygwin_dir and a # python_dir. 
python_path = cyg_path.replace('cygwin\\bin', 'python_26') new_paths.append(python_path) if new_paths: new_paths = '$(ExecutablePath);' + ';'.join(new_paths) properties = {} for (name, configuration) in sorted(configurations.iteritems()): condition = _GetConfigurationCondition(name, configuration) attributes = _GetMSBuildAttributes(spec, configuration, build_file) msbuild_settings = configuration['finalized_msbuild_settings'] _AddConditionalProperty(properties, condition, 'IntDir', attributes['IntermediateDirectory']) _AddConditionalProperty(properties, condition, 'OutDir', attributes['OutputDirectory']) _AddConditionalProperty(properties, condition, 'TargetName', attributes['TargetName']) if 'TargetExt' in attributes: _AddConditionalProperty(properties, condition, 'TargetExt', attributes['TargetExt']) if attributes.get('TargetPath'): _AddConditionalProperty(properties, condition, 'TargetPath', attributes['TargetPath']) if attributes.get('TargetExt'): _AddConditionalProperty(properties, condition, 'TargetExt', attributes['TargetExt']) if new_paths: _AddConditionalProperty(properties, condition, 'ExecutablePath', new_paths) tool_settings = msbuild_settings.get('', {}) for name, value in sorted(tool_settings.iteritems()): formatted_value = _GetValueFormattedForMSBuild('', name, value) _AddConditionalProperty(properties, condition, name, formatted_value) return _GetMSBuildPropertyGroup(spec, None, properties) def _AddConditionalProperty(properties, condition, name, value): """Adds a property / conditional value pair to a dictionary. Arguments: properties: The dictionary to be modified. The key is the name of the property. The value is itself a dictionary; its key is the value and the value a list of condition for which this value is true. condition: The condition under which the named property has the value. name: The name of the property. value: The value of the property. """ if name not in properties: properties[name] = {} values = properties[name] if value not in values: values[value] = [] conditions = values[value] conditions.append(condition) # Regex for msvs variable references ( i.e. $(FOO) ). MSVS_VARIABLE_REFERENCE = re.compile('\$\(([a-zA-Z_][a-zA-Z0-9_]*)\)') def _GetMSBuildPropertyGroup(spec, label, properties): """Returns a PropertyGroup definition for the specified properties. Arguments: spec: The target project dict. label: An optional label for the PropertyGroup. properties: The dictionary to be converted. The key is the name of the property. The value is itself a dictionary; its key is the value and the value a list of condition for which this value is true. """ group = ['PropertyGroup'] if label: group.append({'Label': label}) num_configurations = len(spec['configurations']) def GetEdges(node): # Use a definition of edges such that user_of_variable -> used_varible. # This happens to be easier in this case, since a variable's # definition contains all variables it references in a single string. edges = set() for value in sorted(properties[node].keys()): # Add to edges all $(...) references to variables. # # Variable references that refer to names not in properties are excluded # These can exist for instance to refer built in definitions like # $(SolutionDir). # # Self references are ignored. Self reference is used in a few places to # append to the default value. I.e. 
PATH=$(PATH);other_path edges.update(set([v for v in MSVS_VARIABLE_REFERENCE.findall(value) if v in properties and v != node])) return edges properties_ordered = gyp.common.TopologicallySorted( properties.keys(), GetEdges) # Walk properties in the reverse of a topological sort on # user_of_variable -> used_variable as this ensures variables are # defined before they are used. # NOTE: reverse(topsort(DAG)) = topsort(reverse_edges(DAG)) for name in reversed(properties_ordered): values = properties[name] for value, conditions in sorted(values.iteritems()): if len(conditions) == num_configurations: # If the value is the same all configurations, # just add one unconditional entry. group.append([name, value]) else: for condition in conditions: group.append([name, {'Condition': condition}, value]) return [group] def _GetMSBuildToolSettingsSections(spec, configurations): groups = [] for (name, configuration) in sorted(configurations.iteritems()): msbuild_settings = configuration['finalized_msbuild_settings'] group = ['ItemDefinitionGroup', {'Condition': _GetConfigurationCondition(name, configuration)} ] for tool_name, tool_settings in sorted(msbuild_settings.iteritems()): # Skip the tool named '' which is a holder of global settings handled # by _GetMSBuildConfigurationGlobalProperties. if tool_name: if tool_settings: tool = [tool_name] for name, value in sorted(tool_settings.iteritems()): formatted_value = _GetValueFormattedForMSBuild(tool_name, name, value) tool.append([name, formatted_value]) group.append(tool) groups.append(group) return groups def _FinalizeMSBuildSettings(spec, configuration): if 'msbuild_settings' in configuration: converted = False msbuild_settings = configuration['msbuild_settings'] MSVSSettings.ValidateMSBuildSettings(msbuild_settings) else: converted = True msvs_settings = configuration.get('msvs_settings', {}) msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(msvs_settings) include_dirs, resource_include_dirs = _GetIncludeDirs(configuration) libraries = _GetLibraries(spec) library_dirs = _GetLibraryDirs(configuration) out_file, _, msbuild_tool = _GetOutputFilePathAndTool(spec, msbuild=True) target_ext = _GetOutputTargetExt(spec) defines = _GetDefines(configuration) if converted: # Visual Studio 2010 has TR1 defines = [d for d in defines if d != '_HAS_TR1=0'] # Warn of ignored settings ignored_settings = ['msvs_prebuild', 'msvs_postbuild', 'msvs_tool_files'] for ignored_setting in ignored_settings: value = configuration.get(ignored_setting) if value: print ('Warning: The automatic conversion to MSBuild does not handle ' '%s. Ignoring setting of %s' % (ignored_setting, str(value))) defines = [_EscapeCppDefineForMSBuild(d) for d in defines] disabled_warnings = _GetDisabledWarnings(configuration) # TODO(jeanluc) Validate & warn that we don't translate # prebuild = configuration.get('msvs_prebuild') # postbuild = configuration.get('msvs_postbuild') def_file = _GetModuleDefinition(spec) precompiled_header = configuration.get('msvs_precompiled_header') # Add the information to the appropriate tool # TODO(jeanluc) We could optimize and generate these settings only if # the corresponding files are found, e.g. don't generate ResourceCompile # if you don't have any resources. 
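  # Illustrative sketch of the shape msbuild_settings takes after the appends
  # below; the concrete values are hypothetical, but the tool -> setting ->
  # value nesting matches how _GetMSBuildToolSettingsSections iterates the
  # finalized settings:
  #   {'ClCompile': {'AdditionalIncludeDirectories': ['..\\include'],
  #                  'PreprocessorDefinitions': ['NDEBUG']},
  #    'Link': {'AdditionalDependencies': ['ws2_32.lib']}}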
_ToolAppend(msbuild_settings, 'ClCompile', 'AdditionalIncludeDirectories', include_dirs) _ToolAppend(msbuild_settings, 'ResourceCompile', 'AdditionalIncludeDirectories', resource_include_dirs) # Add in libraries, note that even for empty libraries, we want this # set, to prevent inheriting default libraries from the enviroment. _ToolSetOrAppend(msbuild_settings, 'Link', 'AdditionalDependencies', libraries) _ToolAppend(msbuild_settings, 'Link', 'AdditionalLibraryDirectories', library_dirs) if out_file: _ToolAppend(msbuild_settings, msbuild_tool, 'OutputFile', out_file, only_if_unset=True) if target_ext: _ToolAppend(msbuild_settings, msbuild_tool, 'TargetExt', target_ext, only_if_unset=True) # Add defines. _ToolAppend(msbuild_settings, 'ClCompile', 'PreprocessorDefinitions', defines) _ToolAppend(msbuild_settings, 'ResourceCompile', 'PreprocessorDefinitions', defines) # Add disabled warnings. _ToolAppend(msbuild_settings, 'ClCompile', 'DisableSpecificWarnings', disabled_warnings) # Turn on precompiled headers if appropriate. if precompiled_header: precompiled_header = os.path.split(precompiled_header)[1] _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeader', 'Use') _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeaderFile', precompiled_header) _ToolAppend(msbuild_settings, 'ClCompile', 'ForcedIncludeFiles', [precompiled_header]) # Loadable modules don't generate import libraries; # tell dependent projects to not expect one. if spec['type'] == 'loadable_module': _ToolAppend(msbuild_settings, '', 'IgnoreImportLibrary', 'true') # Set the module definition file if any. if def_file: _ToolAppend(msbuild_settings, 'Link', 'ModuleDefinitionFile', def_file) configuration['finalized_msbuild_settings'] = msbuild_settings def _GetValueFormattedForMSBuild(tool_name, name, value): if type(value) == list: # For some settings, VS2010 does not automatically extends the settings # TODO(jeanluc) Is this what we want? if name in ['AdditionalIncludeDirectories', 'AdditionalLibraryDirectories', 'AdditionalOptions', 'DelayLoadDLLs', 'DisableSpecificWarnings', 'PreprocessorDefinitions']: value.append('%%(%s)' % name) # For most tools, entries in a list should be separated with ';' but some # settings use a space. Check for those first. exceptions = { 'ClCompile': ['AdditionalOptions'], 'Link': ['AdditionalOptions'], 'Lib': ['AdditionalOptions']} if tool_name in exceptions and name in exceptions[tool_name]: char = ' ' else: char = ';' formatted_value = char.join( [MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in value]) else: formatted_value = MSVSSettings.ConvertVCMacrosToMSBuild(value) return formatted_value def _VerifySourcesExist(sources, root_dir): """Verifies that all source files exist on disk. Checks that all regular source files, i.e. not created at run time, exist on disk. Missing files cause needless recompilation but no otherwise visible errors. Arguments: sources: A recursive list of Filter/file names. root_dir: The root directory for the relative path names. Returns: A list of source files that cannot be found on disk. 
""" missing_sources = [] for source in sources: if isinstance(source, MSVSProject.Filter): missing_sources.extend(_VerifySourcesExist(source.contents, root_dir)) else: if '$' not in source: full_path = os.path.join(root_dir, source) if not os.path.exists(full_path): missing_sources.append(full_path) return missing_sources def _GetMSBuildSources(spec, sources, exclusions, extension_to_rule_name, actions_spec, sources_handled_by_action, list_excluded): groups = ['none', 'midl', 'include', 'compile', 'resource', 'rule'] grouped_sources = {} for g in groups: grouped_sources[g] = [] _AddSources2(spec, sources, exclusions, grouped_sources, extension_to_rule_name, sources_handled_by_action, list_excluded) sources = [] for g in groups: if grouped_sources[g]: sources.append(['ItemGroup'] + grouped_sources[g]) if actions_spec: sources.append(['ItemGroup'] + actions_spec) return sources def _AddSources2(spec, sources, exclusions, grouped_sources, extension_to_rule_name, sources_handled_by_action, list_excluded): extensions_excluded_from_precompile = [] for source in sources: if isinstance(source, MSVSProject.Filter): _AddSources2(spec, source.contents, exclusions, grouped_sources, extension_to_rule_name, sources_handled_by_action, list_excluded) else: if not source in sources_handled_by_action: detail = [] excluded_configurations = exclusions.get(source, []) if len(excluded_configurations) == len(spec['configurations']): detail.append(['ExcludedFromBuild', 'true']) else: for config_name, configuration in sorted(excluded_configurations): condition = _GetConfigurationCondition(config_name, configuration) detail.append(['ExcludedFromBuild', {'Condition': condition}, 'true']) # Add precompile if needed for config_name, configuration in spec['configurations'].iteritems(): precompiled_source = configuration.get('msvs_precompiled_source', '') if precompiled_source != '': precompiled_source = _FixPath(precompiled_source) if not extensions_excluded_from_precompile: # If the precompiled header is generated by a C source, we must # not try to use it for C++ sources, and vice versa. basename, extension = os.path.splitext(precompiled_source) if extension == '.c': extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx'] else: extensions_excluded_from_precompile = ['.c'] if precompiled_source == source: condition = _GetConfigurationCondition(config_name, configuration) detail.append(['PrecompiledHeader', {'Condition': condition}, 'Create' ]) else: # Turn off precompiled header usage for source files of a # different type than the file that generated the # precompiled header. for extension in extensions_excluded_from_precompile: if source.endswith(extension): detail.append(['PrecompiledHeader', '']) detail.append(['ForcedIncludeFiles', '']) group, element = _MapFileToMsBuildSourceType(source, extension_to_rule_name) grouped_sources[group].append([element, {'Include': source}] + detail) def _GetMSBuildProjectReferences(project): references = [] if project.dependencies: group = ['ItemGroup'] for dependency in project.dependencies: guid = dependency.guid project_dir = os.path.split(project.path)[0] relative_path = gyp.common.RelativePath(dependency.path, project_dir) project_ref = ['ProjectReference', {'Include': relative_path}, ['Project', guid], ['ReferenceOutputAssembly', 'false'] ] for config in dependency.spec.get('configurations', {}).itervalues(): # If it's disabled in any config, turn it off in the reference. 
if config.get('msvs_2010_disable_uldi_when_referenced', 0): project_ref.append(['UseLibraryDependencyInputs', 'false']) break group.append(project_ref) references.append(group) return references def _GenerateMSBuildProject(project, options, version, generator_flags): spec = project.spec configurations = spec['configurations'] project_dir, project_file_name = os.path.split(project.path) gyp.common.EnsureDirExists(project.path) # Prepare list of sources and excluded sources. gyp_path = _NormalizedSource(project.build_file) relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir) gyp_file = os.path.split(project.build_file)[1] sources, excluded_sources = _PrepareListOfSources(spec, generator_flags, gyp_file) # Add rules. actions_to_add = {} props_files_of_rules = set() targets_files_of_rules = set() extension_to_rule_name = {} list_excluded = generator_flags.get('msvs_list_excluded_files', True) # Don't generate rules if we are using an external builder like ninja. if not spec.get('msvs_external_builder'): _GenerateRulesForMSBuild(project_dir, options, spec, sources, excluded_sources, props_files_of_rules, targets_files_of_rules, actions_to_add, extension_to_rule_name) else: rules = spec.get('rules', []) _AdjustSourcesForRules(spec, rules, sources, excluded_sources) sources, excluded_sources, excluded_idl = ( _AdjustSourcesAndConvertToFilterHierarchy(spec, options, project_dir, sources, excluded_sources, list_excluded, version)) # Don't add actions if we are using an external builder like ninja. if not spec.get('msvs_external_builder'): _AddActions(actions_to_add, spec, project.build_file) _AddCopies(actions_to_add, spec) # NOTE: this stanza must appear after all actions have been decided. # Don't excluded sources with actions attached, or they won't run. 
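  # Illustrative case (hypothetical file name): if 'version.in' landed in
  # excluded_sources but is also a key in actions_to_add, the filtering below
  # presumably drops it from the excluded list so the CustomBuild action
  # attached to it still runs (see the NOTE above).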
excluded_sources = _FilterActionsFromExcluded( excluded_sources, actions_to_add) exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl) actions_spec, sources_handled_by_action = _GenerateActionsForMSBuild( spec, actions_to_add) _GenerateMSBuildFiltersFile(project.path + '.filters', sources, extension_to_rule_name) missing_sources = _VerifySourcesExist(sources, project_dir) for configuration in configurations.itervalues(): _FinalizeMSBuildSettings(spec, configuration) # Add attributes to root element import_default_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.Default.props'}]] import_cpp_props_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.props'}]] import_cpp_targets_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.targets'}]] macro_section = [['PropertyGroup', {'Label': 'UserMacros'}]] content = [ 'Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003', 'ToolsVersion': version.ProjectVersion(), 'DefaultTargets': 'Build' }] content += _GetMSBuildProjectConfigurations(configurations) content += _GetMSBuildGlobalProperties(spec, project.guid, project_file_name) content += import_default_section content += _GetMSBuildConfigurationDetails(spec, project.build_file) content += _GetMSBuildLocalProperties(project.msbuild_toolset) content += import_cpp_props_section content += _GetMSBuildExtensions(props_files_of_rules) content += _GetMSBuildPropertySheets(configurations) content += macro_section content += _GetMSBuildConfigurationGlobalProperties(spec, configurations, project.build_file) content += _GetMSBuildToolSettingsSections(spec, configurations) content += _GetMSBuildSources( spec, sources, exclusions, extension_to_rule_name, actions_spec, sources_handled_by_action, list_excluded) content += _GetMSBuildProjectReferences(project) content += import_cpp_targets_section content += _GetMSBuildExtensionTargets(targets_files_of_rules) if spec.get('msvs_external_builder'): content += _GetMSBuildExternalBuilderTargets(spec) # TODO(jeanluc) File a bug to get rid of runas. We had in MSVS: # has_run_as = _WriteMSVSUserFile(project.path, version, spec) easy_xml.WriteXmlIfChanged(content, project.path, pretty=True, win32=True) return missing_sources def _GetMSBuildExternalBuilderTargets(spec): """Return a list of MSBuild targets for external builders. Right now, only "Build" and "Clean" targets are generated. Arguments: spec: The gyp target spec. Returns: List of MSBuild 'Target' specs. 
""" build_cmd = _BuildCommandLineForRuleRaw( spec, spec['msvs_external_builder_build_cmd'], False, False, False, False) build_target = ['Target', {'Name': 'Build'}] build_target.append(['Exec', {'Command': build_cmd}]) clean_cmd = _BuildCommandLineForRuleRaw( spec, spec['msvs_external_builder_clean_cmd'], False, False, False, False) clean_target = ['Target', {'Name': 'Clean'}] clean_target.append(['Exec', {'Command': clean_cmd}]) return [build_target, clean_target] def _GetMSBuildExtensions(props_files_of_rules): extensions = ['ImportGroup', {'Label': 'ExtensionSettings'}] for props_file in props_files_of_rules: extensions.append(['Import', {'Project': props_file}]) return [extensions] def _GetMSBuildExtensionTargets(targets_files_of_rules): targets_node = ['ImportGroup', {'Label': 'ExtensionTargets'}] for targets_file in sorted(targets_files_of_rules): targets_node.append(['Import', {'Project': targets_file}]) return [targets_node] def _GenerateActionsForMSBuild(spec, actions_to_add): """Add actions accumulated into an actions_to_add, merging as needed. Arguments: spec: the target project dict actions_to_add: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. Returns: A pair of (action specification, the sources handled by this action). """ sources_handled_by_action = OrderedSet() actions_spec = [] for primary_input, actions in actions_to_add.iteritems(): inputs = OrderedSet() outputs = OrderedSet() descriptions = [] commands = [] for action in actions: inputs.update(OrderedSet(action['inputs'])) outputs.update(OrderedSet(action['outputs'])) descriptions.append(action['description']) cmd = action['command'] # For most actions, add 'call' so that actions that invoke batch files # return and continue executing. msbuild_use_call provides a way to # disable this but I have not seen any adverse effect from doing that # for everything. if action.get('msbuild_use_call', True): cmd = 'call ' + cmd commands.append(cmd) # Add the custom build action for one input file. description = ', and also '.join(descriptions) # We can't join the commands simply with && because the command line will # get too long. See also _AddActions: cygwin's setup_env mustn't be called # for every invocation or the command that sets the PATH will grow too # long. command = ( '\r\nif %errorlevel% neq 0 exit /b %errorlevel%\r\n'.join(commands)) _AddMSBuildAction(spec, primary_input, inputs, outputs, command, description, sources_handled_by_action, actions_spec) return actions_spec, sources_handled_by_action def _AddMSBuildAction(spec, primary_input, inputs, outputs, cmd, description, sources_handled_by_action, actions_spec): command = MSVSSettings.ConvertVCMacrosToMSBuild(cmd) primary_input = _FixPath(primary_input) inputs_array = _FixPaths(inputs) outputs_array = _FixPaths(outputs) additional_inputs = ';'.join([i for i in inputs_array if i != primary_input]) outputs = ';'.join(outputs_array) sources_handled_by_action.add(primary_input) action_spec = ['CustomBuild', {'Include': primary_input}] action_spec.extend( # TODO(jeanluc) 'Document' for all or just if as_sources? 
[['FileType', 'Document'], ['Command', command], ['Message', description], ['Outputs', outputs] ]) if additional_inputs: action_spec.append(['AdditionalInputs', additional_inputs]) actions_spec.append(action_spec)
lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py000755 000766 000024 00000001772 12455173731 034140 0ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64 #!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the msvs.py file. """ import gyp.generator.msvs as msvs import unittest import StringIO class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def test_GetLibraries(self): self.assertEqual( msvs._GetLibraries({}), []) self.assertEqual( msvs._GetLibraries({'libraries': []}), []) self.assertEqual( msvs._GetLibraries({'other':'foo', 'libraries': ['a.lib']}), ['a.lib']) self.assertEqual( msvs._GetLibraries({'libraries': ['-la']}), ['a.lib']) self.assertEqual( msvs._GetLibraries({'libraries': ['a.lib', 'b.lib', 'c.lib', '-lb.lib', '-lb.lib', 'd.lib', 'a.lib']}), ['c.lib', 'b.lib', 'd.lib', 'a.lib']) if __name__ == '__main__': unittest.main()
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py000644 000766 000024 00000256075 12455173731 033274 0ustar00iojsstaff000000 000000 # Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import copy import hashlib import json import multiprocessing import os.path import re import signal import subprocess import sys import gyp import gyp.common import gyp.msvs_emulation import gyp.MSVSUtil as MSVSUtil import gyp.xcode_emulation from cStringIO import StringIO from gyp.common import GetEnvironFallback import gyp.ninja_syntax as ninja_syntax generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_PREFIX': 'lib', # Gyp expects the following variables to be expandable by the build # system to the appropriate locations. Ninja prefers paths to be # known at gyp time. To resolve this, introduce special # variables starting with $! and $| (which begin with a $ so gyp knows it # should be treated specially, but is otherwise an invalid # ninja/shell variable) that are passed to gyp here but expanded # before writing out into the target .ninja files; see # ExpandSpecial. # $! is used for variables that represent a path and that can only appear at # the start of a string, while $| is used for variables that can appear # anywhere in a string.
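  # Illustrative expansions (see ExpandSpecial in NinjaWriter below; the
  # concrete paths and configuration name are hypothetical):
  #   '$!PRODUCT_DIR/gen/foo.h'   -> 'out/Debug/gen/foo.h' when a product_dir
  #                                  is supplied, or 'gen/foo.h' when the cwd
  #                                  is already the product dir.
  #   'lib$|CONFIGURATION_NAME.a' -> 'libDebug.a' for a configuration named
  #                                  'Debug'.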
'INTERMEDIATE_DIR': '$!INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR': '$!PRODUCT_DIR/gen', 'PRODUCT_DIR': '$!PRODUCT_DIR', 'CONFIGURATION_NAME': '$|CONFIGURATION_NAME', # Special variables that may be used by gyp 'rule' targets. # We generate definitions for these variables on the fly when processing a # rule. 'RULE_INPUT_ROOT': '${root}', 'RULE_INPUT_DIRNAME': '${dirname}', 'RULE_INPUT_PATH': '${source}', 'RULE_INPUT_EXT': '${ext}', 'RULE_INPUT_NAME': '${name}', } # Placates pylint. generator_additional_non_configuration_keys = [] generator_additional_path_sections = [] generator_extra_sources_for_rules = [] generator_filelist_paths = None # TODO: figure out how to not build extra host objects in the non-cross-compile # case when this is enabled, and enable unconditionally. generator_supports_multiple_toolsets = ( os.environ.get('GYP_CROSSCOMPILE') or os.environ.get('AR_host') or os.environ.get('CC_host') or os.environ.get('CXX_host') or os.environ.get('AR_target') or os.environ.get('CC_target') or os.environ.get('CXX_target')) def StripPrefix(arg, prefix): if arg.startswith(prefix): return arg[len(prefix):] return arg def QuoteShellArgument(arg, flavor): """Quote a string such that it will be interpreted as a single argument by the shell.""" # Rather than attempting to enumerate the bad shell characters, just # whitelist common OK ones and quote anything else. if re.match(r'^[a-zA-Z0-9_=.\\/-]+$', arg): return arg # No quoting necessary. if flavor == 'win': return gyp.msvs_emulation.QuoteForRspFile(arg) return "'" + arg.replace("'", "'" + '"\'"' + "'") + "'" def Define(d, flavor): """Takes a preprocessor define and returns a -D parameter that's ninja- and shell-escaped.""" if flavor == 'win': # cl.exe replaces literal # characters with = in preprocesor definitions for # some reason. Octal-encode to work around that. d = d.replace('#', '\\%03o' % ord('#')) return QuoteShellArgument(ninja_syntax.escape('-D' + d), flavor) def AddArch(output, arch): """Adds an arch string to an output path.""" output, extension = os.path.splitext(output) return '%s.%s%s' % (output, arch, extension) class Target: """Target represents the paths used within a single gyp target. Conceptually, building a single target A is a series of steps: 1) actions/rules/copies generates source/resources/etc. 2) compiles generates .o files 3) link generates a binary (library/executable) 4) bundle merges the above in a mac bundle (Any of these steps can be optional.) From a build ordering perspective, a dependent target B could just depend on the last output of this series of steps. But some dependent commands sometimes need to reach inside the box. For example, when linking B it needs to get the path to the static library generated by A. This object stores those paths. To keep things simple, member variables only store concrete paths to single files, while methods compute derived values like "the last output of the target". """ def __init__(self, type): # Gyp type ("static_library", etc.) of this target. self.type = type # File representing whether any input dependencies necessary for # dependent actions have completed. self.preaction_stamp = None # File representing whether any input dependencies necessary for # dependent compiles have completed. self.precompile_stamp = None # File representing the completion of actions/rules/copies, if any. self.actions_stamp = None # Path to the output of the link step, if any. self.binary = None # Path to the file representing the completion of building the bundle, # if any. 
self.bundle = None # On Windows, incremental linking requires linking against all the .objs # that compose a .lib (rather than the .lib itself). That list is stored # here. self.component_objs = None # Windows only. The import .lib is the output of a build step, but # because dependents only link against the lib (not both the lib and the # dll) we keep track of the import library here. self.import_lib = None def Linkable(self): """Return true if this is a target that can be linked against.""" return self.type in ('static_library', 'shared_library') def UsesToc(self, flavor): """Return true if the target should produce a restat rule based on a TOC file.""" # For bundles, the .TOC should be produced for the binary, not for # FinalOutput(). But the naive approach would put the TOC file into the # bundle, so don't do this for bundles for now. if flavor == 'win' or self.bundle: return False return self.type in ('shared_library', 'loadable_module') def PreActionInput(self, flavor): """Return the path, if any, that should be used as a dependency of any dependent action step.""" if self.UsesToc(flavor): return self.FinalOutput() + '.TOC' return self.FinalOutput() or self.preaction_stamp def PreCompileInput(self): """Return the path, if any, that should be used as a dependency of any dependent compile step.""" return self.actions_stamp or self.precompile_stamp def FinalOutput(self): """Return the last output of the target, which depends on all prior steps.""" return self.bundle or self.binary or self.actions_stamp # A small discourse on paths as used within the Ninja build: # All files we produce (both at gyp and at build time) appear in the # build directory (e.g. out/Debug). # # Paths within a given .gyp file are always relative to the directory # containing the .gyp file. Call these "gyp paths". This includes # sources as well as the starting directory a given gyp rule/action # expects to be run from. We call the path from the source root to # the gyp file the "base directory" within the per-.gyp-file # NinjaWriter code. # # All paths as written into the .ninja files are relative to the build # directory. Call these paths "ninja paths". # # We translate between these two notions of paths with two helper # functions: # # - GypPathToNinja translates a gyp path (i.e. relative to the .gyp file) # into the equivalent ninja path. # # - GypPathToUniqueOutput translates a gyp path into a ninja path to write # an output file; the result can be namespaced such that it is unique # to the input file name as well as the output target name. class NinjaWriter: def __init__(self, qualified_target, target_outputs, base_dir, build_dir, output_file, toplevel_build, output_file_name, flavor, toplevel_dir=None): """ base_dir: path from source root to directory containing this gyp file, by gyp semantics, all input paths are relative to this build_dir: path from source root to build output toplevel_dir: path to the toplevel directory """ self.qualified_target = qualified_target self.target_outputs = target_outputs self.base_dir = base_dir self.build_dir = build_dir self.ninja = ninja_syntax.Writer(output_file) self.toplevel_build = toplevel_build self.output_file_name = output_file_name self.flavor = flavor self.abs_build_dir = None if toplevel_dir is not None: self.abs_build_dir = os.path.abspath(os.path.join(toplevel_dir, build_dir)) self.obj_ext = '.obj' if flavor == 'win' else '.o' if flavor == 'win': # See docstring of msvs_emulation.GenerateEnvironmentFiles(). 
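      # Resulting mapping from the loop below:
      #   self.win_env == {'x86': 'environment.x86', 'x64': 'environment.x64'}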
self.win_env = {} for arch in ('x86', 'x64'): self.win_env[arch] = 'environment.' + arch # Relative path from build output dir to base dir. build_to_top = gyp.common.InvertRelativePath(build_dir, toplevel_dir) self.build_to_base = os.path.join(build_to_top, base_dir) # Relative path from base dir to build dir. base_to_top = gyp.common.InvertRelativePath(base_dir, toplevel_dir) self.base_to_build = os.path.join(base_to_top, build_dir) def ExpandSpecial(self, path, product_dir=None): """Expand specials like $!PRODUCT_DIR in |path|. If |product_dir| is None, assumes the cwd is already the product dir. Otherwise, |product_dir| is the relative path to the product dir. """ PRODUCT_DIR = '$!PRODUCT_DIR' if PRODUCT_DIR in path: if product_dir: path = path.replace(PRODUCT_DIR, product_dir) else: path = path.replace(PRODUCT_DIR + '/', '') path = path.replace(PRODUCT_DIR + '\\', '') path = path.replace(PRODUCT_DIR, '.') INTERMEDIATE_DIR = '$!INTERMEDIATE_DIR' if INTERMEDIATE_DIR in path: int_dir = self.GypPathToUniqueOutput('gen') # GypPathToUniqueOutput generates a path relative to the product dir, # so insert product_dir in front if it is provided. path = path.replace(INTERMEDIATE_DIR, os.path.join(product_dir or '', int_dir)) CONFIGURATION_NAME = '$|CONFIGURATION_NAME' path = path.replace(CONFIGURATION_NAME, self.config_name) return path def ExpandRuleVariables(self, path, root, dirname, source, ext, name): if self.flavor == 'win': path = self.msvs_settings.ConvertVSMacros( path, config=self.config_name) path = path.replace(generator_default_variables['RULE_INPUT_ROOT'], root) path = path.replace(generator_default_variables['RULE_INPUT_DIRNAME'], dirname) path = path.replace(generator_default_variables['RULE_INPUT_PATH'], source) path = path.replace(generator_default_variables['RULE_INPUT_EXT'], ext) path = path.replace(generator_default_variables['RULE_INPUT_NAME'], name) return path def GypPathToNinja(self, path, env=None): """Translate a gyp path to a ninja path, optionally expanding environment variable references in |path| with |env|. See the above discourse on path conversions.""" if env: if self.flavor == 'mac': path = gyp.xcode_emulation.ExpandEnvVars(path, env) elif self.flavor == 'win': path = gyp.msvs_emulation.ExpandMacros(path, env) if path.startswith('$!'): expanded = self.ExpandSpecial(path) if self.flavor == 'win': expanded = os.path.normpath(expanded) return expanded if '$|' in path: path = self.ExpandSpecial(path) assert '$' not in path, path return os.path.normpath(os.path.join(self.build_to_base, path)) def GypPathToUniqueOutput(self, path, qualified=True): """Translate a gyp path to a ninja path for writing output. If qualified is True, qualify the resulting filename with the name of the target. This is necessary when e.g. compiling the same path twice for two separate output targets. See the above discourse on path conversions.""" path = self.ExpandSpecial(path) assert not path.startswith('$'), path # Translate the path following this scheme: # Input: foo/bar.gyp, target targ, references baz/out.o # Output: obj/foo/baz/targ.out.o (if qualified) # obj/foo/baz/out.o (otherwise) # (and obj.host instead of obj for cross-compiles) # # Why this scheme and not some other one? # 1) for a given input, you can compute all derived outputs by matching # its path, even if the input is brought via a gyp file with '..'. # 2) simple files like libraries and stamps have a simple filename. obj = 'obj' if self.toolset != 'target': obj += '.' 
+ self.toolset path_dir, path_basename = os.path.split(path) if qualified: path_basename = self.name + '.' + path_basename return os.path.normpath(os.path.join(obj, self.base_dir, path_dir, path_basename)) def WriteCollapsedDependencies(self, name, targets): """Given a list of targets, return a path for a single file representing the result of building all the targets or None. Uses a stamp file if necessary.""" assert targets == filter(None, targets), targets if len(targets) == 0: return None if len(targets) > 1: stamp = self.GypPathToUniqueOutput(name + '.stamp') targets = self.ninja.build(stamp, 'stamp', targets) self.ninja.newline() return targets[0] def _SubninjaNameForArch(self, arch): output_file_base = os.path.splitext(self.output_file_name)[0] return '%s.%s.ninja' % (output_file_base, arch) def WriteSpec(self, spec, config_name, generator_flags): """The main entry point for NinjaWriter: write the build rules for a spec. Returns a Target object, which represents the output paths for this spec. Returns None if there are no outputs (e.g. a settings-only 'none' type target).""" self.config_name = config_name self.name = spec['target_name'] self.toolset = spec['toolset'] config = spec['configurations'][config_name] self.target = Target(spec['type']) self.is_standalone_static_library = bool( spec.get('standalone_static_library', 0)) # Track if this target contains any C++ files, to decide if gcc or g++ # should be used for linking. self.uses_cpp = False self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec) self.xcode_settings = self.msvs_settings = None if self.flavor == 'mac': self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec) if self.flavor == 'win': self.msvs_settings = gyp.msvs_emulation.MsvsSettings(spec, generator_flags) arch = self.msvs_settings.GetArch(config_name) self.ninja.variable('arch', self.win_env[arch]) self.ninja.variable('cc', '$cl_' + arch) self.ninja.variable('cxx', '$cl_' + arch) if self.flavor == 'mac': self.archs = self.xcode_settings.GetActiveArchs(config_name) if len(self.archs) > 1: self.arch_subninjas = dict( (arch, ninja_syntax.Writer( OpenOutput(os.path.join(self.toplevel_build, self._SubninjaNameForArch(arch)), 'w'))) for arch in self.archs) # Compute predepends for all rules. # actions_depends is the dependencies this target depends on before running # any of its action/rule/copy steps. # compile_depends is the dependencies this target depends on before running # any of its compile steps. actions_depends = [] compile_depends = [] # TODO(evan): it is rather confusing which things are lists and which # are strings. Fix these. if 'dependencies' in spec: for dep in spec['dependencies']: if dep in self.target_outputs: target = self.target_outputs[dep] actions_depends.append(target.PreActionInput(self.flavor)) compile_depends.append(target.PreCompileInput()) actions_depends = filter(None, actions_depends) compile_depends = filter(None, compile_depends) actions_depends = self.WriteCollapsedDependencies('actions_depends', actions_depends) compile_depends = self.WriteCollapsedDependencies('compile_depends', compile_depends) self.target.preaction_stamp = actions_depends self.target.precompile_stamp = compile_depends # Write out actions, rules, and copies. These must happen before we # compile any sources, so compute a list of predependencies for sources # while we do it. 
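    # Illustrative behaviour of the WriteCollapsedDependencies calls above
    # (target and stamp names are hypothetical):
    #   []                     -> None (nothing to depend on)
    #   ['gen/foo.stamp']      -> 'gen/foo.stamp' (single dependency passed
    #                             through unchanged)
    #   ['gen/a.h', 'gen/b.h'] -> one 'obj/.../<target>.actions_depends.stamp'
    #                             built via a single ninja 'stamp' edge over both.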
extra_sources = [] mac_bundle_depends = [] self.target.actions_stamp = self.WriteActionsRulesCopies( spec, extra_sources, actions_depends, mac_bundle_depends) # If we have actions/rules/copies, we depend directly on those, but # otherwise we depend on dependent target's actions/rules/copies etc. # We never need to explicitly depend on previous target's link steps, # because no compile ever depends on them. compile_depends_stamp = (self.target.actions_stamp or compile_depends) # Write out the compilation steps, if any. link_deps = [] sources = extra_sources + spec.get('sources', []) if sources: if self.flavor == 'mac' and len(self.archs) > 1: # Write subninja file containing compile and link commands scoped to # a single arch if a fat binary is being built. for arch in self.archs: self.ninja.subninja(self._SubninjaNameForArch(arch)) pch = None if self.flavor == 'win': gyp.msvs_emulation.VerifyMissingSources( sources, self.abs_build_dir, generator_flags, self.GypPathToNinja) pch = gyp.msvs_emulation.PrecompiledHeader( self.msvs_settings, config_name, self.GypPathToNinja, self.GypPathToUniqueOutput, self.obj_ext) else: pch = gyp.xcode_emulation.MacPrefixHeader( self.xcode_settings, self.GypPathToNinja, lambda path, lang: self.GypPathToUniqueOutput(path + '-' + lang)) link_deps = self.WriteSources( self.ninja, config_name, config, sources, compile_depends_stamp, pch, spec) # Some actions/rules output 'sources' that are already object files. obj_outputs = [f for f in sources if f.endswith(self.obj_ext)] if obj_outputs: if self.flavor != 'mac' or len(self.archs) == 1: link_deps += [self.GypPathToNinja(o) for o in obj_outputs] else: print "Warning: Actions/rules writing object files don't work with " \ "multiarch targets, dropping. (target %s)" % spec['target_name'] if self.flavor == 'win' and self.target.type == 'static_library': self.target.component_objs = link_deps # Write out a link step, if needed. output = None is_empty_bundle = not link_deps and not mac_bundle_depends if link_deps or self.target.actions_stamp or actions_depends: output = self.WriteTarget(spec, config_name, config, link_deps, self.target.actions_stamp or actions_depends) if self.is_mac_bundle: mac_bundle_depends.append(output) # Bundle all of the above together, if needed. if self.is_mac_bundle: output = self.WriteMacBundle(spec, mac_bundle_depends, is_empty_bundle) if not output: return None assert self.target.FinalOutput(), output return self.target def _WinIdlRule(self, source, prebuild, outputs): """Handle the implicit VS .idl rule for one source file. 
Fills |outputs| with files that are generated.""" outdir, output, vars, flags = self.msvs_settings.GetIdlBuildData( source, self.config_name) outdir = self.GypPathToNinja(outdir) def fix_path(path, rel=None): path = os.path.join(outdir, path) dirname, basename = os.path.split(source) root, ext = os.path.splitext(basename) path = self.ExpandRuleVariables( path, root, dirname, source, ext, basename) if rel: path = os.path.relpath(path, rel) return path vars = [(name, fix_path(value, outdir)) for name, value in vars] output = [fix_path(p) for p in output] vars.append(('outdir', outdir)) vars.append(('idlflags', flags)) input = self.GypPathToNinja(source) self.ninja.build(output, 'idl', input, variables=vars, order_only=prebuild) outputs.extend(output) def WriteWinIdlFiles(self, spec, prebuild): """Writes rules to match MSVS's implicit idl handling.""" assert self.flavor == 'win' if self.msvs_settings.HasExplicitIdlRules(spec): return [] outputs = [] for source in filter(lambda x: x.endswith('.idl'), spec['sources']): self._WinIdlRule(source, prebuild, outputs) return outputs def WriteActionsRulesCopies(self, spec, extra_sources, prebuild, mac_bundle_depends): """Write out the Actions, Rules, and Copies steps. Return a path representing the outputs of these steps.""" outputs = [] if self.is_mac_bundle: mac_bundle_resources = spec.get('mac_bundle_resources', [])[:] else: mac_bundle_resources = [] extra_mac_bundle_resources = [] if 'actions' in spec: outputs += self.WriteActions(spec['actions'], extra_sources, prebuild, extra_mac_bundle_resources) if 'rules' in spec: outputs += self.WriteRules(spec['rules'], extra_sources, prebuild, mac_bundle_resources, extra_mac_bundle_resources) if 'copies' in spec: outputs += self.WriteCopies(spec['copies'], prebuild, mac_bundle_depends) if 'sources' in spec and self.flavor == 'win': outputs += self.WriteWinIdlFiles(spec, prebuild) stamp = self.WriteCollapsedDependencies('actions_rules_copies', outputs) if self.is_mac_bundle: self.WriteMacBundleResources( extra_mac_bundle_resources + mac_bundle_resources, mac_bundle_depends) self.WriteMacInfoPlist(mac_bundle_depends) return stamp def GenerateDescription(self, verb, message, fallback): """Generate and return a description of a build step. |verb| is the short summary, e.g. ACTION or RULE. |message| is a hand-written description, or None if not available. |fallback| is the gyp-level name of the step, usable as a fallback. """ if self.toolset != 'target': verb += '(%s)' % self.toolset if message: return '%s %s' % (verb, self.ExpandSpecial(message)) else: return '%s %s: %s' % (verb, self.name, fallback) def WriteActions(self, actions, extra_sources, prebuild, extra_mac_bundle_resources): # Actions cd into the base directory. env = self.GetSortedXcodeEnv() if self.flavor == 'win': env = self.msvs_settings.GetVSMacroEnv( '$!PRODUCT_DIR', config=self.config_name) all_outputs = [] for action in actions: # First write out a rule for the action. 
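      # Illustrative naming (hypothetical action): an action with
      # action['action_name'] == 'make_version_header' yields a ninja rule named
      # 'make_version_header_<md5 hex of self.qualified_target>', so identical
      # action names on different targets cannot collide.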
name = '%s_%s' % (action['action_name'], hashlib.md5(self.qualified_target).hexdigest()) description = self.GenerateDescription('ACTION', action.get('message', None), name) is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(action) if self.flavor == 'win' else False) args = action['action'] rule_name, _ = self.WriteNewNinjaRule(name, args, description, is_cygwin, env=env) inputs = [self.GypPathToNinja(i, env) for i in action['inputs']] if int(action.get('process_outputs_as_sources', False)): extra_sources += action['outputs'] if int(action.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += action['outputs'] outputs = [self.GypPathToNinja(o, env) for o in action['outputs']] # Then write out an edge using the rule. self.ninja.build(outputs, rule_name, inputs, order_only=prebuild) all_outputs += outputs self.ninja.newline() return all_outputs def WriteRules(self, rules, extra_sources, prebuild, mac_bundle_resources, extra_mac_bundle_resources): env = self.GetSortedXcodeEnv() all_outputs = [] for rule in rules: # First write out a rule for the rule action. name = '%s_%s' % (rule['rule_name'], hashlib.md5(self.qualified_target).hexdigest()) # Skip a rule with no action and no inputs. if 'action' not in rule and not rule.get('rule_sources', []): continue args = rule['action'] description = self.GenerateDescription( 'RULE', rule.get('message', None), ('%s ' + generator_default_variables['RULE_INPUT_PATH']) % name) is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(rule) if self.flavor == 'win' else False) rule_name, args = self.WriteNewNinjaRule( name, args, description, is_cygwin, env=env) # TODO: if the command references the outputs directly, we should # simplify it to just use $out. # Rules can potentially make use of some special variables which # must vary per source file. # Compute the list of variables we'll need to provide. special_locals = ('source', 'root', 'dirname', 'ext', 'name') needed_variables = set(['source']) for argument in args: for var in special_locals: if ('${%s}' % var) in argument: needed_variables.add(var) def cygwin_munge(path): if is_cygwin: return path.replace('\\', '/') return path # For each source file, write an edge that generates all the outputs. for source in rule.get('rule_sources', []): source = os.path.normpath(source) dirname, basename = os.path.split(source) root, ext = os.path.splitext(basename) # Gather the list of inputs and outputs, expanding $vars if possible. outputs = [self.ExpandRuleVariables(o, root, dirname, source, ext, basename) for o in rule['outputs']] inputs = [self.ExpandRuleVariables(i, root, dirname, source, ext, basename) for i in rule.get('inputs', [])] if int(rule.get('process_outputs_as_sources', False)): extra_sources += outputs was_mac_bundle_resource = source in mac_bundle_resources if was_mac_bundle_resource or \ int(rule.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += outputs # Note: This is n_resources * n_outputs_in_rule. Put to-be-removed # items in a set and remove them all in a single pass if this becomes # a performance issue. if was_mac_bundle_resource: mac_bundle_resources.remove(source) extra_bindings = [] for var in needed_variables: if var == 'root': extra_bindings.append(('root', cygwin_munge(root))) elif var == 'dirname': # '$dirname' is a parameter to the rule action, which means # it shouldn't be converted to a Ninja path. But we don't # want $!PRODUCT_DIR in there either. 
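            # Illustrative expansion (hypothetical paths): with dirname of
            # '$!PRODUCT_DIR/gen' and self.base_to_build of '../../out/Debug',
            # ExpandSpecial below yields '../../out/Debug/gen', keeping the
            # binding a plain relative path rather than a gyp-special one.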
dirname_expanded = self.ExpandSpecial(dirname, self.base_to_build) extra_bindings.append(('dirname', cygwin_munge(dirname_expanded))) elif var == 'source': # '$source' is a parameter to the rule action, which means # it shouldn't be converted to a Ninja path. But we don't # want $!PRODUCT_DIR in there either. source_expanded = self.ExpandSpecial(source, self.base_to_build) extra_bindings.append(('source', cygwin_munge(source_expanded))) elif var == 'ext': extra_bindings.append(('ext', ext)) elif var == 'name': extra_bindings.append(('name', cygwin_munge(basename))) else: assert var == None, repr(var) inputs = [self.GypPathToNinja(i, env) for i in inputs] outputs = [self.GypPathToNinja(o, env) for o in outputs] extra_bindings.append(('unique_name', hashlib.md5(outputs[0]).hexdigest())) self.ninja.build(outputs, rule_name, self.GypPathToNinja(source), implicit=inputs, order_only=prebuild, variables=extra_bindings) all_outputs.extend(outputs) return all_outputs def WriteCopies(self, copies, prebuild, mac_bundle_depends): outputs = [] env = self.GetSortedXcodeEnv() for copy in copies: for path in copy['files']: # Normalize the path so trailing slashes don't confuse us. path = os.path.normpath(path) basename = os.path.split(path)[1] src = self.GypPathToNinja(path, env) dst = self.GypPathToNinja(os.path.join(copy['destination'], basename), env) outputs += self.ninja.build(dst, 'copy', src, order_only=prebuild) if self.is_mac_bundle: # gyp has mac_bundle_resources to copy things into a bundle's # Resources folder, but there's no built-in way to copy files to other # places in the bundle. Hence, some targets use copies for this. Check # if this file is copied into the current bundle, and if so add it to # the bundle depends so that dependent targets get rebuilt if the copy # input changes. if dst.startswith(self.xcode_settings.GetBundleContentsFolderPath()): mac_bundle_depends.append(dst) return outputs def WriteMacBundleResources(self, resources, bundle_depends): """Writes ninja edges for 'mac_bundle_resources'.""" for output, res in gyp.xcode_emulation.GetMacBundleResources( generator_default_variables['PRODUCT_DIR'], self.xcode_settings, map(self.GypPathToNinja, resources)): output = self.ExpandSpecial(output) self.ninja.build(output, 'mac_tool', res, variables=[('mactool_cmd', 'copy-bundle-resource')]) bundle_depends.append(output) def WriteMacInfoPlist(self, bundle_depends): """Write build rules for bundle Info.plist files.""" info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist( generator_default_variables['PRODUCT_DIR'], self.xcode_settings, self.GypPathToNinja) if not info_plist: return out = self.ExpandSpecial(out) if defines: # Create an intermediate file to store preprocessed results. 
intermediate_plist = self.GypPathToUniqueOutput( os.path.basename(info_plist)) defines = ' '.join([Define(d, self.flavor) for d in defines]) info_plist = self.ninja.build( intermediate_plist, 'preprocess_infoplist', info_plist, variables=[('defines',defines)]) env = self.GetSortedXcodeEnv(additional_settings=extra_env) env = self.ComputeExportEnvString(env) keys = self.xcode_settings.GetExtraPlistItems(self.config_name) keys = QuoteShellArgument(json.dumps(keys), self.flavor) self.ninja.build(out, 'copy_infoplist', info_plist, variables=[('env', env), ('keys', keys)]) bundle_depends.append(out) def WriteSources(self, ninja_file, config_name, config, sources, predepends, precompiled_header, spec): """Write build rules to compile all of |sources|.""" if self.toolset == 'host': self.ninja.variable('ar', '$ar_host') self.ninja.variable('cc', '$cc_host') self.ninja.variable('cxx', '$cxx_host') self.ninja.variable('ld', '$ld_host') self.ninja.variable('ldxx', '$ldxx_host') if self.flavor != 'mac' or len(self.archs) == 1: return self.WriteSourcesForArch( self.ninja, config_name, config, sources, predepends, precompiled_header, spec) else: return dict((arch, self.WriteSourcesForArch( self.arch_subninjas[arch], config_name, config, sources, predepends, precompiled_header, spec, arch=arch)) for arch in self.archs) def WriteSourcesForArch(self, ninja_file, config_name, config, sources, predepends, precompiled_header, spec, arch=None): """Write build rules to compile all of |sources|.""" extra_defines = [] if self.flavor == 'mac': cflags = self.xcode_settings.GetCflags(config_name, arch=arch) cflags_c = self.xcode_settings.GetCflagsC(config_name) cflags_cc = self.xcode_settings.GetCflagsCC(config_name) cflags_objc = ['$cflags_c'] + \ self.xcode_settings.GetCflagsObjC(config_name) cflags_objcc = ['$cflags_cc'] + \ self.xcode_settings.GetCflagsObjCC(config_name) elif self.flavor == 'win': cflags = self.msvs_settings.GetCflags(config_name) cflags_c = self.msvs_settings.GetCflagsC(config_name) cflags_cc = self.msvs_settings.GetCflagsCC(config_name) extra_defines = self.msvs_settings.GetComputedDefines(config_name) # See comment at cc_command for why there's two .pdb files. pdbpath_c = pdbpath_cc = self.msvs_settings.GetCompilerPdbName( config_name, self.ExpandSpecial) if not pdbpath_c: obj = 'obj' if self.toolset != 'target': obj += '.' + self.toolset pdbpath = os.path.normpath(os.path.join(obj, self.base_dir, self.name)) pdbpath_c = pdbpath + '.c.pdb' pdbpath_cc = pdbpath + '.cc.pdb' self.WriteVariableList(ninja_file, 'pdbname_c', [pdbpath_c]) self.WriteVariableList(ninja_file, 'pdbname_cc', [pdbpath_cc]) self.WriteVariableList(ninja_file, 'pchprefix', [self.name]) else: cflags = config.get('cflags', []) cflags_c = config.get('cflags_c', []) cflags_cc = config.get('cflags_cc', []) # Respect environment variables related to build, but target-specific # flags can still override them. 
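    # Illustration (environment values assumed): with CPPFLAGS='-DNDEBUG' and
    # CFLAGS='-O2' exported, a gyp-level cflags_c of ['-O0'] becomes
    # ['-DNDEBUG', '-O2', '-O0'], so the target-specific '-O0' still wins
    # because compilers take the last optimization flag they see.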
if self.toolset == 'target': cflags_c = (os.environ.get('CPPFLAGS', '').split() + os.environ.get('CFLAGS', '').split() + cflags_c) cflags_cc = (os.environ.get('CPPFLAGS', '').split() + os.environ.get('CXXFLAGS', '').split() + cflags_cc) defines = config.get('defines', []) + extra_defines self.WriteVariableList(ninja_file, 'defines', [Define(d, self.flavor) for d in defines]) if self.flavor == 'win': self.WriteVariableList(ninja_file, 'rcflags', [QuoteShellArgument(self.ExpandSpecial(f), self.flavor) for f in self.msvs_settings.GetRcflags(config_name, self.GypPathToNinja)]) include_dirs = config.get('include_dirs', []) env = self.GetSortedXcodeEnv() if self.flavor == 'win': env = self.msvs_settings.GetVSMacroEnv('$!PRODUCT_DIR', config=config_name) include_dirs = self.msvs_settings.AdjustIncludeDirs(include_dirs, config_name) self.WriteVariableList(ninja_file, 'includes', [QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor) for i in include_dirs]) pch_commands = precompiled_header.GetPchBuildCommands(arch) if self.flavor == 'mac': # Most targets use no precompiled headers, so only write these if needed. for ext, var in [('c', 'cflags_pch_c'), ('cc', 'cflags_pch_cc'), ('m', 'cflags_pch_objc'), ('mm', 'cflags_pch_objcc')]: include = precompiled_header.GetInclude(ext, arch) if include: ninja_file.variable(var, include) self.WriteVariableList(ninja_file, 'cflags', map(self.ExpandSpecial, cflags)) self.WriteVariableList(ninja_file, 'cflags_c', map(self.ExpandSpecial, cflags_c)) self.WriteVariableList(ninja_file, 'cflags_cc', map(self.ExpandSpecial, cflags_cc)) if self.flavor == 'mac': self.WriteVariableList(ninja_file, 'cflags_objc', map(self.ExpandSpecial, cflags_objc)) self.WriteVariableList(ninja_file, 'cflags_objcc', map(self.ExpandSpecial, cflags_objcc)) ninja_file.newline() outputs = [] has_rc_source = False for source in sources: filename, ext = os.path.splitext(source) ext = ext[1:] obj_ext = self.obj_ext if ext in ('cc', 'cpp', 'cxx'): command = 'cxx' self.uses_cpp = True elif ext == 'c' or (ext == 'S' and self.flavor != 'win'): command = 'cc' elif ext == 's' and self.flavor != 'win': # Doesn't generate .o.d files. command = 'cc_s' elif (self.flavor == 'win' and ext == 'asm' and self.msvs_settings.GetArch(config_name) == 'x86' and not self.msvs_settings.HasExplicitAsmRules(spec)): # Asm files only get auto assembled for x86 (not x64). command = 'asm' # Add the _asm suffix as msvs is capable of handling .cc and # .asm files of the same name without collision. obj_ext = '_asm.obj' elif self.flavor == 'mac' and ext == 'm': command = 'objc' elif self.flavor == 'mac' and ext == 'mm': command = 'objcxx' self.uses_cpp = True elif self.flavor == 'win' and ext == 'rc': command = 'rc' obj_ext = '.res' has_rc_source = True else: # Ignore unhandled extensions. 
continue input = self.GypPathToNinja(source) output = self.GypPathToUniqueOutput(filename + obj_ext) if arch is not None: output = AddArch(output, arch) implicit = precompiled_header.GetObjDependencies([input], [output], arch) variables = [] if self.flavor == 'win': variables, output, implicit = precompiled_header.GetFlagsModifications( input, output, implicit, command, cflags_c, cflags_cc, self.ExpandSpecial) ninja_file.build(output, command, input, implicit=[gch for _, _, gch in implicit], order_only=predepends, variables=variables) outputs.append(output) if has_rc_source: resource_include_dirs = config.get('resource_include_dirs', include_dirs) self.WriteVariableList(ninja_file, 'resource_includes', [QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor) for i in resource_include_dirs]) self.WritePchTargets(ninja_file, pch_commands) ninja_file.newline() return outputs def WritePchTargets(self, ninja_file, pch_commands): """Writes ninja rules to compile prefix headers.""" if not pch_commands: return for gch, lang_flag, lang, input in pch_commands: var_name = { 'c': 'cflags_pch_c', 'cc': 'cflags_pch_cc', 'm': 'cflags_pch_objc', 'mm': 'cflags_pch_objcc', }[lang] map = { 'c': 'cc', 'cc': 'cxx', 'm': 'objc', 'mm': 'objcxx', } cmd = map.get(lang) ninja_file.build(gch, cmd, input, variables=[(var_name, lang_flag)]) def WriteLink(self, spec, config_name, config, link_deps): """Write out a link step. Fills out target.binary. """ if self.flavor != 'mac' or len(self.archs) == 1: return self.WriteLinkForArch( self.ninja, spec, config_name, config, link_deps) else: output = self.ComputeOutput(spec) inputs = [self.WriteLinkForArch(self.arch_subninjas[arch], spec, config_name, config, link_deps[arch], arch=arch) for arch in self.archs] extra_bindings = [] if not self.is_mac_bundle: self.AppendPostbuildVariable(extra_bindings, spec, output, output) self.ninja.build(output, 'lipo', inputs, variables=extra_bindings) return output def WriteLinkForArch(self, ninja_file, spec, config_name, config, link_deps, arch=None): """Write out a link step. Fills out target.binary. """ command = { 'executable': 'link', 'loadable_module': 'solink_module', 'shared_library': 'solink', }[spec['type']] command_suffix = '' implicit_deps = set() solibs = set() if 'dependencies' in spec: # Two kinds of dependencies: # - Linkable dependencies (like a .a or a .so): add them to the link line. 
# - Non-linkable dependencies (like a rule that generates a file # and writes a stamp file): add them to implicit_deps extra_link_deps = set() for dep in spec['dependencies']: target = self.target_outputs.get(dep) if not target: continue linkable = target.Linkable() if linkable: new_deps = [] if (self.flavor == 'win' and target.component_objs and self.msvs_settings.IsUseLibraryDependencyInputs(config_name)): new_deps = target.component_objs elif self.flavor == 'win' and target.import_lib: new_deps = [target.import_lib] elif target.UsesToc(self.flavor): solibs.add(target.binary) implicit_deps.add(target.binary + '.TOC') else: new_deps = [target.binary] for new_dep in new_deps: if new_dep not in extra_link_deps: extra_link_deps.add(new_dep) link_deps.append(new_dep) final_output = target.FinalOutput() if not linkable or final_output != target.binary: implicit_deps.add(final_output) extra_bindings = [] if self.uses_cpp and self.flavor != 'win': extra_bindings.append(('ld', '$ldxx')) output = self.ComputeOutput(spec, arch) if arch is None and not self.is_mac_bundle: self.AppendPostbuildVariable(extra_bindings, spec, output, output) is_executable = spec['type'] == 'executable' # The ldflags config key is not used on mac or win. On those platforms # linker flags are set via xcode_settings and msvs_settings, respectively. env_ldflags = os.environ.get('LDFLAGS', '').split() if self.flavor == 'mac': ldflags = self.xcode_settings.GetLdflags(config_name, self.ExpandSpecial(generator_default_variables['PRODUCT_DIR']), self.GypPathToNinja, arch) ldflags = env_ldflags + ldflags elif self.flavor == 'win': manifest_base_name = self.GypPathToUniqueOutput( self.ComputeOutputFileName(spec)) ldflags, intermediate_manifest, manifest_files = \ self.msvs_settings.GetLdflags(config_name, self.GypPathToNinja, self.ExpandSpecial, manifest_base_name, output, is_executable, self.toplevel_build) ldflags = env_ldflags + ldflags self.WriteVariableList(ninja_file, 'manifests', manifest_files) implicit_deps = implicit_deps.union(manifest_files) if intermediate_manifest: self.WriteVariableList( ninja_file, 'intermediatemanifest', [intermediate_manifest]) command_suffix = _GetWinLinkRuleNameSuffix( self.msvs_settings.IsEmbedManifest(config_name)) def_file = self.msvs_settings.GetDefFile(self.GypPathToNinja) if def_file: implicit_deps.add(def_file) else: # Respect environment variables related to build, but target-specific # flags can still override them. 
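      # Illustration (environment value assumed): LDFLAGS='-Wl,--as-needed'
      # is prepended here, so a gyp-level ldflags of ['-Wl,-O1'] comes out as
      # ['-Wl,--as-needed', '-Wl,-O1'].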
ldflags = env_ldflags + config.get('ldflags', []) if is_executable and len(solibs): rpath = 'lib/' if self.toolset != 'target': rpath += self.toolset ldflags.append('-Wl,-rpath=\$$ORIGIN/%s' % rpath) ldflags.append('-Wl,-rpath-link=%s' % rpath) self.WriteVariableList(ninja_file, 'ldflags', gyp.common.uniquer(map(self.ExpandSpecial, ldflags))) library_dirs = config.get('library_dirs', []) if self.flavor == 'win': library_dirs = [self.msvs_settings.ConvertVSMacros(l, config_name) for l in library_dirs] library_dirs = ['/LIBPATH:' + QuoteShellArgument(self.GypPathToNinja(l), self.flavor) for l in library_dirs] else: library_dirs = [QuoteShellArgument('-L' + self.GypPathToNinja(l), self.flavor) for l in library_dirs] libraries = gyp.common.uniquer(map(self.ExpandSpecial, spec.get('libraries', []))) if self.flavor == 'mac': libraries = self.xcode_settings.AdjustLibraries(libraries, config_name) elif self.flavor == 'win': libraries = self.msvs_settings.AdjustLibraries(libraries) self.WriteVariableList(ninja_file, 'libs', library_dirs + libraries) linked_binary = output if command in ('solink', 'solink_module'): extra_bindings.append(('soname', os.path.split(output)[1])) extra_bindings.append(('lib', gyp.common.EncodePOSIXShellArgument(output))) if self.flavor == 'win': extra_bindings.append(('binary', output)) if '/NOENTRY' not in ldflags: self.target.import_lib = output + '.lib' extra_bindings.append(('implibflag', '/IMPLIB:%s' % self.target.import_lib)) pdbname = self.msvs_settings.GetPDBName( config_name, self.ExpandSpecial, output + '.pdb') output = [output, self.target.import_lib] if pdbname: output.append(pdbname) elif not self.is_mac_bundle: output = [output, output + '.TOC'] else: command = command + '_notoc' elif self.flavor == 'win': extra_bindings.append(('binary', output)) pdbname = self.msvs_settings.GetPDBName( config_name, self.ExpandSpecial, output + '.pdb') if pdbname: output = [output, pdbname] if len(solibs): extra_bindings.append(('solibs', gyp.common.EncodePOSIXShellList(solibs))) ninja_file.build(output, command + command_suffix, link_deps, implicit=list(implicit_deps), variables=extra_bindings) return linked_binary def WriteTarget(self, spec, config_name, config, link_deps, compile_deps): extra_link_deps = any(self.target_outputs.get(dep).Linkable() for dep in spec.get('dependencies', []) if dep in self.target_outputs) if spec['type'] == 'none' or (not link_deps and not extra_link_deps): # TODO(evan): don't call this function for 'none' target types, as # it doesn't do anything, and we fake out a 'binary' with a stamp file. 
self.target.binary = compile_deps self.target.type = 'none' elif spec['type'] == 'static_library': self.target.binary = self.ComputeOutput(spec) if (self.flavor not in ('mac', 'openbsd', 'win') and not self.is_standalone_static_library): self.ninja.build(self.target.binary, 'alink_thin', link_deps, order_only=compile_deps) else: variables = [] if self.xcode_settings: libtool_flags = self.xcode_settings.GetLibtoolflags(config_name) if libtool_flags: variables.append(('libtool_flags', libtool_flags)) if self.msvs_settings: libflags = self.msvs_settings.GetLibFlags(config_name, self.GypPathToNinja) variables.append(('libflags', libflags)) if self.flavor != 'mac' or len(self.archs) == 1: self.AppendPostbuildVariable(variables, spec, self.target.binary, self.target.binary) self.ninja.build(self.target.binary, 'alink', link_deps, order_only=compile_deps, variables=variables) else: inputs = [] for arch in self.archs: output = self.ComputeOutput(spec, arch) self.arch_subninjas[arch].build(output, 'alink', link_deps[arch], order_only=compile_deps, variables=variables) inputs.append(output) # TODO: It's not clear if libtool_flags should be passed to the alink # call that combines single-arch .a files into a fat .a file. self.AppendPostbuildVariable(variables, spec, self.target.binary, self.target.binary) self.ninja.build(self.target.binary, 'alink', inputs, # FIXME: test proving order_only=compile_deps isn't # needed. variables=variables) else: self.target.binary = self.WriteLink(spec, config_name, config, link_deps) return self.target.binary def WriteMacBundle(self, spec, mac_bundle_depends, is_empty): assert self.is_mac_bundle package_framework = spec['type'] in ('shared_library', 'loadable_module') output = self.ComputeMacBundleOutput() if is_empty: output += '.stamp' variables = [] self.AppendPostbuildVariable(variables, spec, output, self.target.binary, is_command_start=not package_framework) if package_framework and not is_empty: variables.append(('version', self.xcode_settings.GetFrameworkVersion())) self.ninja.build(output, 'package_framework', mac_bundle_depends, variables=variables) else: self.ninja.build(output, 'stamp', mac_bundle_depends, variables=variables) self.target.bundle = output return output def GetSortedXcodeEnv(self, additional_settings=None): """Returns the variables Xcode would set for build steps.""" assert self.abs_build_dir abs_build_dir = self.abs_build_dir return gyp.xcode_emulation.GetSortedXcodeEnv( self.xcode_settings, abs_build_dir, os.path.join(abs_build_dir, self.build_to_base), self.config_name, additional_settings) def GetSortedXcodePostbuildEnv(self): """Returns the variables Xcode would set for postbuild steps.""" postbuild_settings = {} # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack. # TODO(thakis): It would be nice to have some general mechanism instead. 
strip_save_file = self.xcode_settings.GetPerTargetSetting( 'CHROMIUM_STRIP_SAVE_FILE') if strip_save_file: postbuild_settings['CHROMIUM_STRIP_SAVE_FILE'] = strip_save_file return self.GetSortedXcodeEnv(additional_settings=postbuild_settings) def AppendPostbuildVariable(self, variables, spec, output, binary, is_command_start=False): """Adds a 'postbuild' variable if there is a postbuild for |output|.""" postbuild = self.GetPostbuildCommand(spec, output, binary, is_command_start) if postbuild: variables.append(('postbuilds', postbuild)) def GetPostbuildCommand(self, spec, output, output_binary, is_command_start): """Returns a shell command that runs all the postbuilds, and removes |output| if any of them fails. If |is_command_start| is False, then the returned string will start with ' && '.""" if not self.xcode_settings or spec['type'] == 'none' or not output: return '' output = QuoteShellArgument(output, self.flavor) postbuilds = gyp.xcode_emulation.GetSpecPostbuildCommands(spec, quiet=True) if output_binary is not None: postbuilds = self.xcode_settings.AddImplicitPostbuilds( self.config_name, os.path.normpath(os.path.join(self.base_to_build, output)), QuoteShellArgument( os.path.normpath(os.path.join(self.base_to_build, output_binary)), self.flavor), postbuilds, quiet=True) if not postbuilds: return '' # Postbuilds expect to be run in the gyp file's directory, so insert an # implicit postbuild to cd to there. postbuilds.insert(0, gyp.common.EncodePOSIXShellList( ['cd', self.build_to_base])) env = self.ComputeExportEnvString(self.GetSortedXcodePostbuildEnv()) # G will be non-null if any postbuild fails. Run all postbuilds in a # subshell. commands = env + ' (' + \ ' && '.join([ninja_syntax.escape(command) for command in postbuilds]) command_string = (commands + '); G=$$?; ' # Remove the final output if any postbuild failed. '((exit $$G) || rm -rf %s) ' % output + '&& exit $$G)') if is_command_start: return '(' + command_string + ' && ' else: return '$ && (' + command_string def ComputeExportEnvString(self, env): """Given an environment, returns a string looking like 'export FOO=foo; export BAR="${FOO} bar;' that exports |env| to the shell.""" export_str = [] for k, v in env: export_str.append('export %s=%s;' % (k, ninja_syntax.escape(gyp.common.EncodePOSIXShellArgument(v)))) return ' '.join(export_str) def ComputeMacBundleOutput(self): """Return the 'output' (full output path) to a bundle output directory.""" assert self.is_mac_bundle path = generator_default_variables['PRODUCT_DIR'] return self.ExpandSpecial( os.path.join(path, self.xcode_settings.GetWrapperName())) def ComputeOutputFileName(self, spec, type=None): """Compute the filename of the final output for the current target.""" if not type: type = spec['type'] default_variables = copy.copy(generator_default_variables) CalculateVariables(default_variables, {'flavor': self.flavor}) # Compute filename prefix: the product prefix, or a default for # the product type. DEFAULT_PREFIX = { 'loadable_module': default_variables['SHARED_LIB_PREFIX'], 'shared_library': default_variables['SHARED_LIB_PREFIX'], 'static_library': default_variables['STATIC_LIB_PREFIX'], 'executable': default_variables['EXECUTABLE_PREFIX'], } prefix = spec.get('product_prefix', DEFAULT_PREFIX.get(type, '')) # Compute filename extension: the product extension, or a default # for the product type. 
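    # Illustrative result (target name assumed): on linux a 'shared_library'
    # target named 'foo' combines the 'lib' prefix and '.so' suffix into
    # 'libfoo.so', and a target already named 'libfoo' is not doubled because
    # the extra 'lib' is snipped below; cf. the checks in ninja_test.py.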
DEFAULT_EXTENSION = { 'loadable_module': default_variables['SHARED_LIB_SUFFIX'], 'shared_library': default_variables['SHARED_LIB_SUFFIX'], 'static_library': default_variables['STATIC_LIB_SUFFIX'], 'executable': default_variables['EXECUTABLE_SUFFIX'], } extension = spec.get('product_extension') if extension: extension = '.' + extension else: extension = DEFAULT_EXTENSION.get(type, '') if 'product_name' in spec: # If we were given an explicit name, use that. target = spec['product_name'] else: # Otherwise, derive a name from the target name. target = spec['target_name'] if prefix == 'lib': # Snip out an extra 'lib' from libs if appropriate. target = StripPrefix(target, 'lib') if type in ('static_library', 'loadable_module', 'shared_library', 'executable'): return '%s%s%s' % (prefix, target, extension) elif type == 'none': return '%s.stamp' % target else: raise Exception('Unhandled output type %s' % type) def ComputeOutput(self, spec, arch=None): """Compute the path for the final output of the spec.""" type = spec['type'] if self.flavor == 'win': override = self.msvs_settings.GetOutputName(self.config_name, self.ExpandSpecial) if override: return override if arch is None and self.flavor == 'mac' and type in ( 'static_library', 'executable', 'shared_library', 'loadable_module'): filename = self.xcode_settings.GetExecutablePath() else: filename = self.ComputeOutputFileName(spec, type) if arch is None and 'product_dir' in spec: path = os.path.join(spec['product_dir'], filename) return self.ExpandSpecial(path) # Some products go into the output root, libraries go into shared library # dir, and everything else goes into the normal place. type_in_output_root = ['executable', 'loadable_module'] if self.flavor == 'mac' and self.toolset == 'target': type_in_output_root += ['shared_library', 'static_library'] elif self.flavor == 'win' and self.toolset == 'target': type_in_output_root += ['shared_library'] if arch is not None: # Make sure partial executables don't end up in a bundle or the regular # output directory. archdir = 'arch' if self.toolset != 'target': archdir = os.path.join('arch', '%s' % self.toolset) return os.path.join(archdir, AddArch(filename, arch)) elif type in type_in_output_root or self.is_standalone_static_library: return filename elif type == 'shared_library': libdir = 'lib' if self.toolset != 'target': libdir = os.path.join('lib', '%s' % self.toolset) return os.path.join(libdir, filename) else: return self.GypPathToUniqueOutput(filename, qualified=False) def WriteVariableList(self, ninja_file, var, values): assert not isinstance(values, str) if values is None: values = [] ninja_file.variable(var, ' '.join(values)) def WriteNewNinjaRule(self, name, args, description, is_cygwin, env): """Write out a new ninja "rule" statement for a given command. Returns the name of the new rule, and a copy of |args| with variables expanded.""" if self.flavor == 'win': args = [self.msvs_settings.ConvertVSMacros( arg, self.base_to_build, config=self.config_name) for arg in args] description = self.msvs_settings.ConvertVSMacros( description, config=self.config_name) elif self.flavor == 'mac': # |env| is an empty list on non-mac. args = [gyp.xcode_emulation.ExpandEnvVars(arg, env) for arg in args] description = gyp.xcode_emulation.ExpandEnvVars(description, env) # TODO: we shouldn't need to qualify names; we do it because # currently the ninja rule namespace is global, but it really # should be scoped to the subninja. rule_name = self.name if self.toolset == 'target': rule_name += '.' 
+ self.toolset rule_name += '.' + name rule_name = re.sub('[^a-zA-Z0-9_]', '_', rule_name) # Remove variable references, but not if they refer to the magic rule # variables. This is not quite right, as it also protects these for # actions, not just for rules where they are valid. Good enough. protect = [ '${root}', '${dirname}', '${source}', '${ext}', '${name}' ] protect = '(?!' + '|'.join(map(re.escape, protect)) + ')' description = re.sub(protect + r'\$', '_', description) # gyp dictates that commands are run from the base directory. # cd into the directory before running, and adjust paths in # the arguments to point to the proper locations. rspfile = None rspfile_content = None args = [self.ExpandSpecial(arg, self.base_to_build) for arg in args] if self.flavor == 'win': rspfile = rule_name + '.$unique_name.rsp' # The cygwin case handles this inside the bash sub-shell. run_in = '' if is_cygwin else ' ' + self.build_to_base if is_cygwin: rspfile_content = self.msvs_settings.BuildCygwinBashCommandLine( args, self.build_to_base) else: rspfile_content = gyp.msvs_emulation.EncodeRspFileList(args) command = ('%s gyp-win-tool action-wrapper $arch ' % sys.executable + rspfile + run_in) else: env = self.ComputeExportEnvString(env) command = gyp.common.EncodePOSIXShellList(args) command = 'cd %s; ' % self.build_to_base + env + command # GYP rules/actions express being no-ops by not touching their outputs. # Avoid executing downstream dependencies in this case by specifying # restat=1 to ninja. self.ninja.rule(rule_name, command, description, restat=True, rspfile=rspfile, rspfile_content=rspfile_content) self.ninja.newline() return rule_name, args def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" global generator_additional_non_configuration_keys global generator_additional_path_sections flavor = gyp.common.GetFlavor(params) if flavor == 'mac': default_variables.setdefault('OS', 'mac') default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib') default_variables.setdefault('SHARED_LIB_DIR', generator_default_variables['PRODUCT_DIR']) default_variables.setdefault('LIB_DIR', generator_default_variables['PRODUCT_DIR']) # Copy additional generator configuration data from Xcode, which is shared # by the Mac Ninja generator. import gyp.generator.xcode as xcode_generator generator_additional_non_configuration_keys = getattr(xcode_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(xcode_generator, 'generator_additional_path_sections', []) global generator_extra_sources_for_rules generator_extra_sources_for_rules = getattr(xcode_generator, 'generator_extra_sources_for_rules', []) elif flavor == 'win': default_variables.setdefault('OS', 'win') default_variables['EXECUTABLE_SUFFIX'] = '.exe' default_variables['STATIC_LIB_PREFIX'] = '' default_variables['STATIC_LIB_SUFFIX'] = '.lib' default_variables['SHARED_LIB_PREFIX'] = '' default_variables['SHARED_LIB_SUFFIX'] = '.dll' # Copy additional generator configuration data from VS, which is shared # by the Windows Ninja generator. 
import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) else: operating_system = flavor if flavor == 'android': operating_system = 'linux' # Keep this legacy behavior for now. default_variables.setdefault('OS', operating_system) default_variables.setdefault('SHARED_LIB_SUFFIX', '.so') default_variables.setdefault('SHARED_LIB_DIR', os.path.join('$!PRODUCT_DIR', 'lib')) default_variables.setdefault('LIB_DIR', os.path.join('$!PRODUCT_DIR', 'obj')) def ComputeOutputDir(params): """Returns the path from the toplevel_dir to the build output directory.""" # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to ninja easier, ninja doesn't put anything here. generator_dir = os.path.relpath(params['options'].generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = params.get('generator_flags', {}).get('output_dir', 'out') # Relative path from source root to our output files. e.g. "out" return os.path.normpath(os.path.join(generator_dir, output_dir)) def CalculateGeneratorInputInfo(params): """Called by __init__ to initialize generator values based on params.""" # E.g. "out/gypfiles" toplevel = params['options'].toplevel_dir qualified_out_dir = os.path.normpath(os.path.join( toplevel, ComputeOutputDir(params), 'gypfiles')) global generator_filelist_paths generator_filelist_paths = { 'toplevel': toplevel, 'qualified_out_dir': qualified_out_dir, } def OpenOutput(path, mode='w'): """Open |path| for writing, creating directories if necessary.""" gyp.common.EnsureDirExists(path) return open(path, mode) def CommandWithWrapper(cmd, wrappers, prog): wrapper = wrappers.get(cmd, '') if wrapper: return wrapper + ' ' + prog return prog def GetDefaultConcurrentLinks(): """Returns a best-guess for a number of concurrent links.""" if sys.platform in ('win32', 'cygwin'): import ctypes class MEMORYSTATUSEX(ctypes.Structure): _fields_ = [ ("dwLength", ctypes.c_ulong), ("dwMemoryLoad", ctypes.c_ulong), ("ullTotalPhys", ctypes.c_ulonglong), ("ullAvailPhys", ctypes.c_ulonglong), ("ullTotalPageFile", ctypes.c_ulonglong), ("ullAvailPageFile", ctypes.c_ulonglong), ("ullTotalVirtual", ctypes.c_ulonglong), ("ullAvailVirtual", ctypes.c_ulonglong), ("sullAvailExtendedVirtual", ctypes.c_ulonglong), ] stat = MEMORYSTATUSEX() stat.dwLength = ctypes.sizeof(stat) ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat)) mem_limit = max(1, stat.ullTotalPhys / (4 * (2 ** 30))) # total / 4GB hard_cap = max(1, int(os.getenv('GYP_LINK_CONCURRENCY_MAX', 2**32))) # return min(mem_limit, hard_cap) # TODO(scottmg): Temporary speculative fix for OOM on builders # See http://crbug.com/333000. return 2 elif sys.platform.startswith('linux'): with open("/proc/meminfo") as meminfo: memtotal_re = re.compile(r'^MemTotal:\s*(\d*)\s*kB') for line in meminfo: match = memtotal_re.match(line) if not match: continue # Allow 8Gb per link on Linux because Gold is quite memory hungry return max(1, int(match.group(1)) / (8 * (2 ** 20))) return 1 elif sys.platform == 'darwin': try: avail_bytes = int(subprocess.check_output(['sysctl', '-n', 'hw.memsize'])) # A static library debug build of Chromium's unit_tests takes ~2.7GB, so # 4GB per ld process allows for some more bloat. 
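      # Worked example (machine size assumed): with 16 GiB of physical memory,
      # sysctl reports 17179869184 bytes and 17179869184 / (4 * 2**30) = 4,
      # so up to four link steps may run concurrently.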
return max(1, avail_bytes / (4 * (2 ** 30))) # total / 4GB except: return 1 else: # TODO(scottmg): Implement this for other platforms. return 1 def _GetWinLinkRuleNameSuffix(embed_manifest): """Returns the suffix used to select an appropriate linking rule depending on whether the manifest embedding is enabled.""" return '_embed' if embed_manifest else '' def _AddWinLinkRules(master_ninja, embed_manifest): """Adds link rules for Windows platform to |master_ninja|.""" def FullLinkCommand(ldcmd, out, binary_type): resource_name = { 'exe': '1', 'dll': '2', }[binary_type] return '%(python)s gyp-win-tool link-with-manifests $arch %(embed)s ' \ '%(out)s "%(ldcmd)s" %(resname)s $mt $rc "$intermediatemanifest" ' \ '$manifests' % { 'python': sys.executable, 'out': out, 'ldcmd': ldcmd, 'resname': resource_name, 'embed': embed_manifest } rule_name_suffix = _GetWinLinkRuleNameSuffix(embed_manifest) use_separate_mspdbsrv = ( int(os.environ.get('GYP_USE_SEPARATE_MSPDBSRV', '0')) != 0) dlldesc = 'LINK%s(DLL) $binary' % rule_name_suffix.upper() dllcmd = ('%s gyp-win-tool link-wrapper $arch %s ' '$ld /nologo $implibflag /DLL /OUT:$binary ' '@$binary.rsp' % (sys.executable, use_separate_mspdbsrv)) dllcmd = FullLinkCommand(dllcmd, '$binary', 'dll') master_ninja.rule('solink' + rule_name_suffix, description=dlldesc, command=dllcmd, rspfile='$binary.rsp', rspfile_content='$libs $in_newline $ldflags', restat=True, pool='link_pool') master_ninja.rule('solink_module' + rule_name_suffix, description=dlldesc, command=dllcmd, rspfile='$binary.rsp', rspfile_content='$libs $in_newline $ldflags', restat=True, pool='link_pool') # Note that ldflags goes at the end so that it has the option of # overriding default settings earlier in the command line. exe_cmd = ('%s gyp-win-tool link-wrapper $arch %s ' '$ld /nologo /OUT:$binary @$binary.rsp' % (sys.executable, use_separate_mspdbsrv)) exe_cmd = FullLinkCommand(exe_cmd, '$binary', 'exe') master_ninja.rule('link' + rule_name_suffix, description='LINK%s $binary' % rule_name_suffix.upper(), command=exe_cmd, rspfile='$binary.rsp', rspfile_content='$in_newline $libs $ldflags', pool='link_pool') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name): options = params['options'] flavor = gyp.common.GetFlavor(params) generator_flags = params.get('generator_flags', {}) # build_dir: relative path from source root to our output files. # e.g. "out/Debug" build_dir = os.path.normpath( os.path.join(ComputeOutputDir(params), config_name)) toplevel_build = os.path.join(options.toplevel_dir, build_dir) master_ninja_file = OpenOutput(os.path.join(toplevel_build, 'build.ninja')) master_ninja = ninja_syntax.Writer(master_ninja_file, width=120) # Put build-time support tools in out/{config_name}. gyp.common.CopyTool(flavor, toplevel_build) # Grab make settings for CC/CXX. # The rules are # - The priority from low to high is gcc/g++, the 'make_global_settings' in # gyp, the environment variable. # - If there is no 'make_global_settings' for CC.host/CXX.host or # 'CC_host'/'CXX_host' enviroment variable, cc_host/cxx_host should be set # to cc/cxx. if flavor == 'win': # Overridden by local arch choice in the use_deps case. # Chromium's ffmpeg c99conv.py currently looks for a 'cc =' line in # build.ninja so needs something valid here. 
http://crbug.com/233985 cc = 'cl.exe' cxx = 'cl.exe' ld = 'link.exe' ld_host = '$ld' else: cc = 'cc' cxx = 'c++' ld = '$cc' ldxx = '$cxx' ld_host = '$cc_host' ldxx_host = '$cxx_host' cc_host = None cxx_host = None cc_host_global_setting = None cxx_host_global_setting = None build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings = data[build_file].get('make_global_settings', []) build_to_root = gyp.common.InvertRelativePath(build_dir, options.toplevel_dir) wrappers = {} for key, value in make_global_settings: if key == 'CC': cc = os.path.join(build_to_root, value) if key == 'CXX': cxx = os.path.join(build_to_root, value) if key == 'CC.host': cc_host = os.path.join(build_to_root, value) cc_host_global_setting = value if key == 'CXX.host': cxx_host = os.path.join(build_to_root, value) cxx_host_global_setting = value if key.endswith('_wrapper'): wrappers[key[:-len('_wrapper')]] = os.path.join(build_to_root, value) # Support wrappers from environment variables too. for key, value in os.environ.iteritems(): if key.lower().endswith('_wrapper'): key_prefix = key[:-len('_wrapper')] key_prefix = re.sub(r'\.HOST$', '.host', key_prefix) wrappers[key_prefix] = os.path.join(build_to_root, value) if flavor == 'win': cl_paths = gyp.msvs_emulation.GenerateEnvironmentFiles( toplevel_build, generator_flags, OpenOutput) for arch, path in cl_paths.iteritems(): master_ninja.variable( 'cl_' + arch, CommandWithWrapper('CC', wrappers, QuoteShellArgument(path, flavor))) cc = GetEnvironFallback(['CC_target', 'CC'], cc) master_ninja.variable('cc', CommandWithWrapper('CC', wrappers, cc)) cxx = GetEnvironFallback(['CXX_target', 'CXX'], cxx) master_ninja.variable('cxx', CommandWithWrapper('CXX', wrappers, cxx)) if flavor == 'win': master_ninja.variable('ld', ld) master_ninja.variable('idl', 'midl.exe') master_ninja.variable('ar', 'lib.exe') master_ninja.variable('rc', 'rc.exe') master_ninja.variable('asm', 'ml.exe') master_ninja.variable('mt', 'mt.exe') else: master_ninja.variable('ld', CommandWithWrapper('LINK', wrappers, ld)) master_ninja.variable('ldxx', CommandWithWrapper('LINK', wrappers, ldxx)) master_ninja.variable('ar', GetEnvironFallback(['AR_target', 'AR'], 'ar')) if generator_supports_multiple_toolsets: if not cc_host: cc_host = cc if not cxx_host: cxx_host = cxx master_ninja.variable('ar_host', GetEnvironFallback(['AR_host'], 'ar')) cc_host = GetEnvironFallback(['CC_host'], cc_host) cxx_host = GetEnvironFallback(['CXX_host'], cxx_host) # The environment variable could be used in 'make_global_settings', like # ['CC.host', '$(CC)'] or ['CXX.host', '$(CXX)'], transform them here. 
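  # Illustration (settings assumed): if make_global_settings contains
  # ['CC.host', '$(CC)'] and cc resolved to 'clang' above, the substitution
  # below turns cc_host into 'clang'.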
if '$(CC)' in cc_host and cc_host_global_setting: cc_host = cc_host_global_setting.replace('$(CC)', cc) if '$(CXX)' in cxx_host and cxx_host_global_setting: cxx_host = cxx_host_global_setting.replace('$(CXX)', cxx) master_ninja.variable('cc_host', CommandWithWrapper('CC.host', wrappers, cc_host)) master_ninja.variable('cxx_host', CommandWithWrapper('CXX.host', wrappers, cxx_host)) if flavor == 'win': master_ninja.variable('ld_host', ld_host) else: master_ninja.variable('ld_host', CommandWithWrapper( 'LINK', wrappers, ld_host)) master_ninja.variable('ldxx_host', CommandWithWrapper( 'LINK', wrappers, ldxx_host)) master_ninja.newline() master_ninja.pool('link_pool', depth=GetDefaultConcurrentLinks()) master_ninja.newline() deps = 'msvc' if flavor == 'win' else 'gcc' if flavor != 'win': master_ninja.rule( 'cc', description='CC $out', command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_c ' '$cflags_pch_c -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'cc_s', description='CC $out', command=('$cc $defines $includes $cflags $cflags_c ' '$cflags_pch_c -c $in -o $out')) master_ninja.rule( 'cxx', description='CXX $out', command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_cc ' '$cflags_pch_cc -c $in -o $out'), depfile='$out.d', deps=deps) else: # TODO(scottmg) Separate pdb names is a test to see if it works around # http://crbug.com/142362. It seems there's a race between the creation of # the .pdb by the precompiled header step for .cc and the compilation of # .c files. This should be handled by mspdbsrv, but rarely errors out with # c1xx : fatal error C1033: cannot open program database # By making the rules target separate pdb files this might be avoided. cc_command = ('ninja -t msvc -e $arch ' + '-- ' '$cc /nologo /showIncludes /FC ' '@$out.rsp /c $in /Fo$out /Fd$pdbname_c ') cxx_command = ('ninja -t msvc -e $arch ' + '-- ' '$cxx /nologo /showIncludes /FC ' '@$out.rsp /c $in /Fo$out /Fd$pdbname_cc ') master_ninja.rule( 'cc', description='CC $out', command=cc_command, rspfile='$out.rsp', rspfile_content='$defines $includes $cflags $cflags_c', deps=deps) master_ninja.rule( 'cxx', description='CXX $out', command=cxx_command, rspfile='$out.rsp', rspfile_content='$defines $includes $cflags $cflags_cc', deps=deps) master_ninja.rule( 'idl', description='IDL $in', command=('%s gyp-win-tool midl-wrapper $arch $outdir ' '$tlb $h $dlldata $iid $proxy $in ' '$idlflags' % sys.executable)) master_ninja.rule( 'rc', description='RC $in', # Note: $in must be last otherwise rc.exe complains. command=('%s gyp-win-tool rc-wrapper ' '$arch $rc $defines $resource_includes $rcflags /fo$out $in' % sys.executable)) master_ninja.rule( 'asm', description='ASM $in', command=('%s gyp-win-tool asm-wrapper ' '$arch $asm $defines $includes /c /Fo $out $in' % sys.executable)) if flavor != 'mac' and flavor != 'win': master_ninja.rule( 'alink', description='AR $out', command='rm -f $out && $ar rcs $out $in') master_ninja.rule( 'alink_thin', description='AR $out', command='rm -f $out && $ar rcsT $out $in') # This allows targets that only need to depend on $lib's API to declare an # order-only dependency on $lib.TOC and avoid relinking such downstream # dependencies when $lib changes only in non-public ways. # The resulting string leaves an uninterpolated %{suffix} which # is used in the final substitution below. mtime_preserving_solink_base = ( 'if [ ! -e $lib -o ! 
-e ${lib}.TOC ]; then ' '%(solink)s && %(extract_toc)s > ${lib}.TOC; else ' '%(solink)s && %(extract_toc)s > ${lib}.tmp && ' 'if ! cmp -s ${lib}.tmp ${lib}.TOC; then mv ${lib}.tmp ${lib}.TOC ; ' 'fi; fi' % { 'solink': '$ld -shared $ldflags -o $lib -Wl,-soname=$soname %(suffix)s', 'extract_toc': ('{ readelf -d ${lib} | grep SONAME ; ' 'nm -gD -f p ${lib} | cut -f1-2 -d\' \'; }')}) master_ninja.rule( 'solink', description='SOLINK $lib', restat=True, command=(mtime_preserving_solink_base % { 'suffix': '-Wl,--whole-archive $in $solibs -Wl,--no-whole-archive ' '$libs'}), pool='link_pool') master_ninja.rule( 'solink_module', description='SOLINK(module) $lib', restat=True, command=(mtime_preserving_solink_base % { 'suffix': '-Wl,--start-group $in $solibs -Wl,--end-group ' '$libs'}), pool='link_pool') master_ninja.rule( 'link', description='LINK $out', command=('$ld $ldflags -o $out ' '-Wl,--start-group $in $solibs -Wl,--end-group $libs'), pool='link_pool') elif flavor == 'win': master_ninja.rule( 'alink', description='LIB $out', command=('%s gyp-win-tool link-wrapper $arch False ' '$ar /nologo /ignore:4221 /OUT:$out @$out.rsp' % sys.executable), rspfile='$out.rsp', rspfile_content='$in_newline $libflags') _AddWinLinkRules(master_ninja, embed_manifest=True) _AddWinLinkRules(master_ninja, embed_manifest=False) else: master_ninja.rule( 'objc', description='OBJC $out', command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_objc ' '$cflags_pch_objc -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'objcxx', description='OBJCXX $out', command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_objcc ' '$cflags_pch_objcc -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'alink', description='LIBTOOL-STATIC $out, POSTBUILDS', command='rm -f $out && ' './gyp-mac-tool filter-libtool libtool $libtool_flags ' '-static -o $out $in' '$postbuilds') master_ninja.rule( 'lipo', description='LIPO $out, POSTBUILDS', command='rm -f $out && lipo -create $in -output $out$postbuilds') # Record the public interface of $lib in $lib.TOC. See the corresponding # comment in the posix section above for details. solink_base = '$ld %(type)s $ldflags -o $lib %(suffix)s' mtime_preserving_solink_base = ( 'if [ ! -e $lib -o ! -e ${lib}.TOC ] || ' # Always force dependent targets to relink if this library # reexports something. Handling this correctly would require # recursive TOC dumping but this is rare in practice, so punt. 'otool -l $lib | grep -q LC_REEXPORT_DYLIB ; then ' '%(solink)s && %(extract_toc)s > ${lib}.TOC; ' 'else ' '%(solink)s && %(extract_toc)s > ${lib}.tmp && ' 'if ! 
cmp -s ${lib}.tmp ${lib}.TOC; then ' 'mv ${lib}.tmp ${lib}.TOC ; ' 'fi; ' 'fi' % { 'solink': solink_base, 'extract_toc': '{ otool -l $lib | grep LC_ID_DYLIB -A 5; ' 'nm -gP $lib | cut -f1-2 -d\' \' | grep -v U$$; true; }'}) solink_suffix = '$in $solibs $libs$postbuilds' master_ninja.rule( 'solink', description='SOLINK $lib, POSTBUILDS', restat=True, command=mtime_preserving_solink_base % {'suffix': solink_suffix, 'type': '-shared'}, pool='link_pool') master_ninja.rule( 'solink_notoc', description='SOLINK $lib, POSTBUILDS', restat=True, command=solink_base % {'suffix':solink_suffix, 'type': '-shared'}, pool='link_pool') solink_module_suffix = '$in $solibs $libs$postbuilds' master_ninja.rule( 'solink_module', description='SOLINK(module) $lib, POSTBUILDS', restat=True, command=mtime_preserving_solink_base % {'suffix': solink_module_suffix, 'type': '-bundle'}, pool='link_pool') master_ninja.rule( 'solink_module_notoc', description='SOLINK(module) $lib, POSTBUILDS', restat=True, command=solink_base % {'suffix': solink_module_suffix, 'type': '-bundle'}, pool='link_pool') master_ninja.rule( 'link', description='LINK $out, POSTBUILDS', command=('$ld $ldflags -o $out ' '$in $solibs $libs$postbuilds'), pool='link_pool') master_ninja.rule( 'preprocess_infoplist', description='PREPROCESS INFOPLIST $out', command=('$cc -E -P -Wno-trigraphs -x c $defines $in -o $out && ' 'plutil -convert xml1 $out $out')) master_ninja.rule( 'copy_infoplist', description='COPY INFOPLIST $in', command='$env ./gyp-mac-tool copy-info-plist $in $out $keys') master_ninja.rule( 'mac_tool', description='MACTOOL $mactool_cmd $in', command='$env ./gyp-mac-tool $mactool_cmd $in $out') master_ninja.rule( 'package_framework', description='PACKAGE FRAMEWORK $out, POSTBUILDS', command='./gyp-mac-tool package-framework $out $version$postbuilds ' '&& touch $out') if flavor == 'win': master_ninja.rule( 'stamp', description='STAMP $out', command='%s gyp-win-tool stamp $out' % sys.executable) master_ninja.rule( 'copy', description='COPY $in $out', command='%s gyp-win-tool recursive-mirror $in $out' % sys.executable) else: master_ninja.rule( 'stamp', description='STAMP $out', command='${postbuilds}touch $out') master_ninja.rule( 'copy', description='COPY $in $out', command='rm -rf $out && cp -af $in $out') master_ninja.newline() all_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, os.path.normpath(build_file)): all_targets.add(target) all_outputs = set() # target_outputs is a map from qualified target name to a Target object. target_outputs = {} # target_short_names is a map from target short name to a list of Target # objects. target_short_names = {} for qualified_target in target_list: # qualified_target is like: third_party/icu/icu.gyp:icui18n#target build_file, name, toolset = \ gyp.common.ParseQualifiedTarget(qualified_target) this_make_global_settings = data[build_file].get('make_global_settings', []) assert make_global_settings == this_make_global_settings, ( "make_global_settings needs to be the same for all targets. %s vs. %s" % (this_make_global_settings, make_global_settings)) spec = target_dicts[qualified_target] if flavor == 'mac': gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec) build_file = gyp.common.RelativePath(build_file, options.toplevel_dir) base_path = os.path.dirname(build_file) obj = 'obj' if toolset != 'target': obj += '.' 
+ toolset output_file = os.path.join(obj, base_path, name + '.ninja') ninja_output = StringIO() writer = NinjaWriter(qualified_target, target_outputs, base_path, build_dir, ninja_output, toplevel_build, output_file, flavor, toplevel_dir=options.toplevel_dir) target = writer.WriteSpec(spec, config_name, generator_flags) if ninja_output.tell() > 0: # Only create files for ninja files that actually have contents. with OpenOutput(os.path.join(toplevel_build, output_file)) as ninja_file: ninja_file.write(ninja_output.getvalue()) ninja_output.close() master_ninja.subninja(output_file) if target: if name != target.FinalOutput() and spec['toolset'] == 'target': target_short_names.setdefault(name, []).append(target) target_outputs[qualified_target] = target if qualified_target in all_targets: all_outputs.add(target.FinalOutput()) if target_short_names: # Write a short name to build this target. This benefits both the # "build chrome" case as well as the gyp tests, which expect to be # able to run actions and build libraries by their short name. master_ninja.newline() master_ninja.comment('Short names for targets.') for short_name in target_short_names: master_ninja.build(short_name, 'phony', [x.FinalOutput() for x in target_short_names[short_name]]) if all_outputs: master_ninja.newline() master_ninja.build('all', 'phony', list(all_outputs)) master_ninja.default(generator_flags.get('default_target', 'all')) master_ninja_file.close() def PerformBuild(data, configurations, params): options = params['options'] for config in configurations: builddir = os.path.join(options.toplevel_dir, 'out', config) arguments = ['ninja', '-C', builddir] print 'Building [%s]: %s' % (config, arguments) subprocess.check_call(arguments) def CallGenerateOutputForConfig(arglist): # Ignore the interrupt signal so that the parent process catches it and # kills all multiprocessing children. signal.signal(signal.SIGINT, signal.SIG_IGN) (target_list, target_dicts, data, params, config_name) = arglist GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) def GenerateOutput(target_list, target_dicts, data, params): # Update target_dicts for iOS device builds. 
  target_dicts = gyp.xcode_emulation.CloneConfigurationForDeviceAndEmulator(
      target_dicts)
  user_config = params.get('generator_flags', {}).get('config', None)
  if gyp.common.GetFlavor(params) == 'win':
    target_list, target_dicts = MSVSUtil.ShardTargets(target_list, target_dicts)
    target_list, target_dicts = MSVSUtil.InsertLargePdbShims(
        target_list, target_dicts, generator_default_variables)

  if user_config:
    GenerateOutputForConfig(target_list, target_dicts, data, params,
                            user_config)
  else:
    config_names = target_dicts[target_list[0]]['configurations'].keys()
    if params['parallel']:
      try:
        pool = multiprocessing.Pool(len(config_names))
        arglists = []
        for config_name in config_names:
          arglists.append(
              (target_list, target_dicts, data, params, config_name))
        pool.map(CallGenerateOutputForConfig, arglists)
      except KeyboardInterrupt, e:
        pool.terminate()
        raise e
    else:
      for config_name in config_names:
        GenerateOutputForConfig(target_list, target_dicts, data, params,
                                config_name)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py

#!/usr/bin/env python

# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

""" Unit tests for the ninja.py file. """

import gyp.generator.ninja as ninja
import unittest
import StringIO
import sys
import TestCommon


class TestPrefixesAndSuffixes(unittest.TestCase):
  def test_BinaryNamesWindows(self):
    writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.',
                               'build.ninja', 'win')
    spec = { 'target_name': 'wee' }
    self.assertTrue(writer.ComputeOutputFileName(spec, 'executable').
        endswith('.exe'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
        endswith('.dll'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
        endswith('.lib'))

  def test_BinaryNamesLinux(self):
    writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.',
                               'build.ninja', 'linux')
    spec = { 'target_name': 'wee' }
    self.assertTrue('.' not in writer.ComputeOutputFileName(spec,
                                                            'executable'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
        startswith('lib'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
        startswith('lib'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
        endswith('.so'))
    self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
        endswith('.a'))

if __name__ == '__main__':
  unittest.main()

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py

# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

import filecmp
import gyp.common
import gyp.xcodeproj_file
import errno
import os
import sys
import posixpath
import re
import shutil
import subprocess
import tempfile


# Project files generated by this module will use _intermediate_var as a
# custom Xcode setting whose value is a DerivedSources-like directory that's
# project-specific and configuration-specific. The normal choice,
# DERIVED_FILE_DIR, is target-specific, which is thought to be too restrictive
# as it is likely that multiple targets within a single project file will want
# to access the same set of generated files. The other option,
# PROJECT_DERIVED_FILE_DIR, is unsuitable because while it is project-specific,
# it is not configuration-specific. INTERMEDIATE_DIR is defined as
# $(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION).
_intermediate_var = 'INTERMEDIATE_DIR'

# SHARED_INTERMEDIATE_DIR is the same, except that it is shared among all
# targets that share the same BUILT_PRODUCTS_DIR.
_shared_intermediate_var = 'SHARED_INTERMEDIATE_DIR'

_library_search_paths_var = 'LIBRARY_SEARCH_PATHS'

generator_default_variables = {
  'EXECUTABLE_PREFIX': '',
  'EXECUTABLE_SUFFIX': '',
  'STATIC_LIB_PREFIX': 'lib',
  'SHARED_LIB_PREFIX': 'lib',
  'STATIC_LIB_SUFFIX': '.a',
  'SHARED_LIB_SUFFIX': '.dylib',
  # INTERMEDIATE_DIR is a place for targets to build up intermediate products.
  # It is specific to each build environment. It is only guaranteed to exist
  # and be constant within the context of a project, corresponding to a single
  # input file. Some build environments may allow their intermediate directory
  # to be shared on a wider scale, but this is not guaranteed.
  'INTERMEDIATE_DIR': '$(%s)' % _intermediate_var,
  'OS': 'mac',
  'PRODUCT_DIR': '$(BUILT_PRODUCTS_DIR)',
  'LIB_DIR': '$(BUILT_PRODUCTS_DIR)',
  'RULE_INPUT_ROOT': '$(INPUT_FILE_BASE)',
  'RULE_INPUT_EXT': '$(INPUT_FILE_SUFFIX)',
  'RULE_INPUT_NAME': '$(INPUT_FILE_NAME)',
  'RULE_INPUT_PATH': '$(INPUT_FILE_PATH)',
  'RULE_INPUT_DIRNAME': '$(INPUT_FILE_DIRNAME)',
  'SHARED_INTERMEDIATE_DIR': '$(%s)' % _shared_intermediate_var,
  'CONFIGURATION_NAME': '$(CONFIGURATION)',
}

# The Xcode-specific sections that hold paths.
generator_additional_path_sections = [
  'mac_bundle_resources',
  'mac_framework_headers',
  'mac_framework_private_headers',
  # 'mac_framework_dirs', input already handles _dirs endings.
] # The Xcode-specific keys that exist on targets and aren't moved down to # configurations. generator_additional_non_configuration_keys = [ 'mac_bundle', 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers', 'mac_xctest_bundle', 'xcode_create_dependents_test_runner', ] # We want to let any rules apply to files that are resources also. generator_extra_sources_for_rules = [ 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers', ] # Xcode's standard set of library directories, which don't need to be duplicated # in LIBRARY_SEARCH_PATHS. This list is not exhaustive, but that's okay. xcode_standard_library_dirs = frozenset([ '$(SDKROOT)/usr/lib', '$(SDKROOT)/usr/local/lib', ]) def CreateXCConfigurationList(configuration_names): xccl = gyp.xcodeproj_file.XCConfigurationList({'buildConfigurations': []}) if len(configuration_names) == 0: configuration_names = ['Default'] for configuration_name in configuration_names: xcbc = gyp.xcodeproj_file.XCBuildConfiguration({ 'name': configuration_name}) xccl.AppendProperty('buildConfigurations', xcbc) xccl.SetProperty('defaultConfigurationName', configuration_names[0]) return xccl class XcodeProject(object): def __init__(self, gyp_path, path, build_file_dict): self.gyp_path = gyp_path self.path = path self.project = gyp.xcodeproj_file.PBXProject(path=path) projectDirPath = gyp.common.RelativePath( os.path.dirname(os.path.abspath(self.gyp_path)), os.path.dirname(path) or '.') self.project.SetProperty('projectDirPath', projectDirPath) self.project_file = \ gyp.xcodeproj_file.XCProjectFile({'rootObject': self.project}) self.build_file_dict = build_file_dict # TODO(mark): add destructor that cleans up self.path if created_dir is # True and things didn't complete successfully. Or do something even # better with "try"? self.created_dir = False try: os.makedirs(self.path) self.created_dir = True except OSError, e: if e.errno != errno.EEXIST: raise def Finalize1(self, xcode_targets, serialize_all_tests): # Collect a list of all of the build configuration names used by the # various targets in the file. It is very heavily advised to keep each # target in an entire project (even across multiple project files) using # the same set of configuration names. configurations = [] for xct in self.project.GetProperty('targets'): xccl = xct.GetProperty('buildConfigurationList') xcbcs = xccl.GetProperty('buildConfigurations') for xcbc in xcbcs: name = xcbc.GetProperty('name') if name not in configurations: configurations.append(name) # Replace the XCConfigurationList attached to the PBXProject object with # a new one specifying all of the configuration names used by the various # targets. try: xccl = CreateXCConfigurationList(configurations) self.project.SetProperty('buildConfigurationList', xccl) except: sys.stderr.write("Problem with gyp file %s\n" % self.gyp_path) raise # The need for this setting is explained above where _intermediate_var is # defined. The comments below about wanting to avoid project-wide build # settings apply here too, but this needs to be set on a project-wide basis # so that files relative to the _intermediate_var setting can be displayed # properly in the Xcode UI. # # Note that for configuration-relative files such as anything relative to # _intermediate_var, for the purposes of UI tree view display, Xcode will # only resolve the configuration name once, when the project file is # opened. 
If the active build configuration is changed, the project file # must be closed and reopened if it is desired for the tree view to update. # This is filed as Apple radar 6588391. xccl.SetBuildSetting(_intermediate_var, '$(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION)') xccl.SetBuildSetting(_shared_intermediate_var, '$(SYMROOT)/DerivedSources/$(CONFIGURATION)') # Set user-specified project-wide build settings and config files. This # is intended to be used very sparingly. Really, almost everything should # go into target-specific build settings sections. The project-wide # settings are only intended to be used in cases where Xcode attempts to # resolve variable references in a project context as opposed to a target # context, such as when resolving sourceTree references while building up # the tree tree view for UI display. # Any values set globally are applied to all configurations, then any # per-configuration values are applied. for xck, xcv in self.build_file_dict.get('xcode_settings', {}).iteritems(): xccl.SetBuildSetting(xck, xcv) if 'xcode_config_file' in self.build_file_dict: config_ref = self.project.AddOrGetFileInRootGroup( self.build_file_dict['xcode_config_file']) xccl.SetBaseConfiguration(config_ref) build_file_configurations = self.build_file_dict.get('configurations', {}) if build_file_configurations: for config_name in configurations: build_file_configuration_named = \ build_file_configurations.get(config_name, {}) if build_file_configuration_named: xcc = xccl.ConfigurationNamed(config_name) for xck, xcv in build_file_configuration_named.get('xcode_settings', {}).iteritems(): xcc.SetBuildSetting(xck, xcv) if 'xcode_config_file' in build_file_configuration_named: config_ref = self.project.AddOrGetFileInRootGroup( build_file_configurations[config_name]['xcode_config_file']) xcc.SetBaseConfiguration(config_ref) # Sort the targets based on how they appeared in the input. # TODO(mark): Like a lot of other things here, this assumes internal # knowledge of PBXProject - in this case, of its "targets" property. # ordinary_targets are ordinary targets that are already in the project # file. run_test_targets are the targets that run unittests and should be # used for the Run All Tests target. support_targets are the action/rule # targets used by GYP file targets, just kept for the assert check. ordinary_targets = [] run_test_targets = [] support_targets = [] # targets is full list of targets in the project. targets = [] # does the it define it's own "all"? has_custom_all = False # targets_for_all is the list of ordinary_targets that should be listed # in this project's "All" target. It includes each non_runtest_target # that does not have suppress_wildcard set. targets_for_all = [] for target in self.build_file_dict['targets']: target_name = target['target_name'] toolset = target['toolset'] qualified_target = gyp.common.QualifiedTarget(self.gyp_path, target_name, toolset) xcode_target = xcode_targets[qualified_target] # Make sure that the target being added to the sorted list is already in # the unsorted list. 
assert xcode_target in self.project._properties['targets'] targets.append(xcode_target) ordinary_targets.append(xcode_target) if xcode_target.support_target: support_targets.append(xcode_target.support_target) targets.append(xcode_target.support_target) if not int(target.get('suppress_wildcard', False)): targets_for_all.append(xcode_target) if target_name.lower() == 'all': has_custom_all = True; # If this target has a 'run_as' attribute, add its target to the # targets, and add it to the test targets. if target.get('run_as'): # Make a target to run something. It should have one # dependency, the parent xcode target. xccl = CreateXCConfigurationList(configurations) run_target = gyp.xcodeproj_file.PBXAggregateTarget({ 'name': 'Run ' + target_name, 'productName': xcode_target.GetProperty('productName'), 'buildConfigurationList': xccl, }, parent=self.project) run_target.AddDependency(xcode_target) command = target['run_as'] script = '' if command.get('working_directory'): script = script + 'cd "%s"\n' % \ gyp.xcodeproj_file.ConvertVariablesToShellSyntax( command.get('working_directory')) if command.get('environment'): script = script + "\n".join( ['export %s="%s"' % (key, gyp.xcodeproj_file.ConvertVariablesToShellSyntax(val)) for (key, val) in command.get('environment').iteritems()]) + "\n" # Some test end up using sockets, files on disk, etc. and can get # confused if more then one test runs at a time. The generator # flag 'xcode_serialize_all_test_runs' controls the forcing of all # tests serially. It defaults to True. To get serial runs this # little bit of python does the same as the linux flock utility to # make sure only one runs at a time. command_prefix = '' if serialize_all_tests: command_prefix = \ """python -c "import fcntl, subprocess, sys file = open('$TMPDIR/GYP_serialize_test_runs', 'a') fcntl.flock(file.fileno(), fcntl.LOCK_EX) sys.exit(subprocess.call(sys.argv[1:]))" """ # If we were unable to exec for some reason, we want to exit # with an error, and fixup variable references to be shell # syntax instead of xcode syntax. script = script + 'exec ' + command_prefix + '%s\nexit 1\n' % \ gyp.xcodeproj_file.ConvertVariablesToShellSyntax( gyp.common.EncodePOSIXShellList(command.get('action'))) ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'shellScript': script, 'showEnvVarsInLog': 0, }) run_target.AppendProperty('buildPhases', ssbp) # Add the run target to the project file. targets.append(run_target) run_test_targets.append(run_target) xcode_target.test_runner = run_target # Make sure that the list of targets being replaced is the same length as # the one replacing it, but allow for the added test runner targets. assert len(self.project._properties['targets']) == \ len(ordinary_targets) + len(support_targets) self.project._properties['targets'] = targets # Get rid of unnecessary levels of depth in groups like the Source group. self.project.RootGroupsTakeOverOnlyChildren(True) # Sort the groups nicely. Do this after sorting the targets, because the # Products group is sorted based on the order of the targets. self.project.SortGroups() # Create an "All" target if there's more than one target in this project # file and the project didn't define its own "All" target. Put a generated # "All" target first so that people opening up the project for the first # time will build everything by default. 
if len(targets_for_all) > 1 and not has_custom_all: xccl = CreateXCConfigurationList(configurations) all_target = gyp.xcodeproj_file.PBXAggregateTarget( { 'buildConfigurationList': xccl, 'name': 'All', }, parent=self.project) for target in targets_for_all: all_target.AddDependency(target) # TODO(mark): This is evil because it relies on internal knowledge of # PBXProject._properties. It's important to get the "All" target first, # though. self.project._properties['targets'].insert(0, all_target) # The same, but for run_test_targets. if len(run_test_targets) > 1: xccl = CreateXCConfigurationList(configurations) run_all_tests_target = gyp.xcodeproj_file.PBXAggregateTarget( { 'buildConfigurationList': xccl, 'name': 'Run All Tests', }, parent=self.project) for run_test_target in run_test_targets: run_all_tests_target.AddDependency(run_test_target) # Insert after the "All" target, which must exist if there is more than # one run_test_target. self.project._properties['targets'].insert(1, run_all_tests_target) def Finalize2(self, xcode_targets, xcode_target_to_target_dict): # Finalize2 needs to happen in a separate step because the process of # updating references to other projects depends on the ordering of targets # within remote project files. Finalize1 is responsible for sorting duty, # and once all project files are sorted, Finalize2 can come in and update # these references. # To support making a "test runner" target that will run all the tests # that are direct dependents of any given target, we look for # xcode_create_dependents_test_runner being set on an Aggregate target, # and generate a second target that will run the tests runners found under # the marked target. for bf_tgt in self.build_file_dict['targets']: if int(bf_tgt.get('xcode_create_dependents_test_runner', 0)): tgt_name = bf_tgt['target_name'] toolset = bf_tgt['toolset'] qualified_target = gyp.common.QualifiedTarget(self.gyp_path, tgt_name, toolset) xcode_target = xcode_targets[qualified_target] if isinstance(xcode_target, gyp.xcodeproj_file.PBXAggregateTarget): # Collect all the run test targets. all_run_tests = [] pbxtds = xcode_target.GetProperty('dependencies') for pbxtd in pbxtds: pbxcip = pbxtd.GetProperty('targetProxy') dependency_xct = pbxcip.GetProperty('remoteGlobalIDString') if hasattr(dependency_xct, 'test_runner'): all_run_tests.append(dependency_xct.test_runner) # Directly depend on all the runners as they depend on the target # that builds them. if len(all_run_tests) > 0: run_all_target = gyp.xcodeproj_file.PBXAggregateTarget({ 'name': 'Run %s Tests' % tgt_name, 'productName': tgt_name, }, parent=self.project) for run_test_target in all_run_tests: run_all_target.AddDependency(run_test_target) # Insert the test runner after the related target. idx = self.project._properties['targets'].index(xcode_target) self.project._properties['targets'].insert(idx + 1, run_all_target) # Update all references to other projects, to make sure that the lists of # remote products are complete. Otherwise, Xcode will fill them in when # it opens the project file, which will result in unnecessary diffs. # TODO(mark): This is evil because it relies on internal knowledge of # PBXProject._other_pbxprojects. for other_pbxproject in self.project._other_pbxprojects.keys(): self.project.AddOrGetProjectReference(other_pbxproject) self.project.SortRemoteProductReferences() # Give everything an ID. self.project_file.ComputeIDs() # Make sure that no two objects in the project file have the same ID. 
If # multiple objects wind up with the same ID, upon loading the file, Xcode # will only recognize one object (the last one in the file?) and the # results are unpredictable. self.project_file.EnsureNoIDCollisions() def Write(self): # Write the project file to a temporary location first. Xcode watches for # changes to the project file and presents a UI sheet offering to reload # the project when it does change. However, in some cases, especially when # multiple projects are open or when Xcode is busy, things don't work so # seamlessly. Sometimes, Xcode is able to detect that a project file has # changed but can't unload it because something else is referencing it. # To mitigate this problem, and to avoid even having Xcode present the UI # sheet when an open project is rewritten for inconsequential changes, the # project file is written to a temporary file in the xcodeproj directory # first. The new temporary file is then compared to the existing project # file, if any. If they differ, the new file replaces the old; otherwise, # the new project file is simply deleted. Xcode properly detects a file # being renamed over an open project file as a change and so it remains # able to present the "project file changed" sheet under this system. # Writing to a temporary file first also avoids the possible problem of # Xcode rereading an incomplete project file. (output_fd, new_pbxproj_path) = \ tempfile.mkstemp(suffix='.tmp', prefix='project.pbxproj.gyp.', dir=self.path) try: output_file = os.fdopen(output_fd, 'wb') self.project_file.Print(output_file) output_file.close() pbxproj_path = os.path.join(self.path, 'project.pbxproj') same = False try: same = filecmp.cmp(pbxproj_path, new_pbxproj_path, False) except OSError, e: if e.errno != errno.ENOENT: raise if same: # The new file is identical to the old one, just get rid of the new # one. os.unlink(new_pbxproj_path) else: # The new file is different from the old one, or there is no old one. # Rename the new file to the permanent name. # # tempfile.mkstemp uses an overly restrictive mode, resulting in a # file that can only be read by the owner, regardless of the umask. # There's no reason to not respect the umask here, which means that # an extra hoop is required to fetch it and reset the new file's mode. # # No way to get the umask without setting a new one? Set a safe one # and then set it back to the old value. umask = os.umask(077) os.umask(umask) os.chmod(new_pbxproj_path, 0666 & ~umask) os.rename(new_pbxproj_path, pbxproj_path) except Exception: # Don't leave turds behind. In fact, if this code was responsible for # creating the xcodeproj directory, get rid of that too. os.unlink(new_pbxproj_path) if self.created_dir: shutil.rmtree(self.path, True) raise def AddSourceToTarget(source, type, pbxp, xct): # TODO(mark): Perhaps source_extensions and library_extensions can be made a # little bit fancier. source_extensions = ['c', 'cc', 'cpp', 'cxx', 'm', 'mm', 's'] # .o is conceptually more of a "source" than a "library," but Xcode thinks # of "sources" as things to compile and "libraries" (or "frameworks") as # things to link with. Adding an object file to an Xcode target's frameworks # phase works properly. 
library_extensions = ['a', 'dylib', 'framework', 'o'] basename = posixpath.basename(source) (root, ext) = posixpath.splitext(basename) if ext: ext = ext[1:].lower() if ext in source_extensions and type != 'none': xct.SourcesPhase().AddFile(source) elif ext in library_extensions and type != 'none': xct.FrameworksPhase().AddFile(source) else: # Files that aren't added to a sources or frameworks build phase can still # go into the project file, just not as part of a build phase. pbxp.AddOrGetFileInRootGroup(source) def AddResourceToTarget(resource, pbxp, xct): # TODO(mark): Combine with AddSourceToTarget above? Or just inline this call # where it's used. xct.ResourcesPhase().AddFile(resource) def AddHeaderToTarget(header, pbxp, xct, is_public): # TODO(mark): Combine with AddSourceToTarget above? Or just inline this call # where it's used. settings = '{ATTRIBUTES = (%s, ); }' % ('Private', 'Public')[is_public] xct.HeadersPhase().AddFile(header, settings) _xcode_variable_re = re.compile('(\$\((.*?)\))') def ExpandXcodeVariables(string, expansions): """Expands Xcode-style $(VARIABLES) in string per the expansions dict. In some rare cases, it is appropriate to expand Xcode variables when a project file is generated. For any substring $(VAR) in string, if VAR is a key in the expansions dict, $(VAR) will be replaced with expansions[VAR]. Any $(VAR) substring in string for which VAR is not a key in the expansions dict will remain in the returned string. """ matches = _xcode_variable_re.findall(string) if matches == None: return string matches.reverse() for match in matches: (to_replace, variable) = match if not variable in expansions: continue replacement = expansions[variable] string = re.sub(re.escape(to_replace), replacement, string) return string _xcode_define_re = re.compile(r'([\\\"\' ])') def EscapeXcodeDefine(s): """We must escape the defines that we give to XCode so that it knows not to split on spaces and to respect backslash and quote literals. 
However, we must not quote the define, or Xcode will incorrectly intepret variables especially $(inherited).""" return re.sub(_xcode_define_re, r'\\\1', s) def PerformBuild(data, configurations, params): options = params['options'] for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue xcodeproj_path = build_file_root + options.suffix + '.xcodeproj' if options.generator_output: xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path) for config in configurations: arguments = ['xcodebuild', '-project', xcodeproj_path] arguments += ['-configuration', config] print "Building [%s]: %s" % (config, arguments) subprocess.check_call(arguments) def GenerateOutput(target_list, target_dicts, data, params): options = params['options'] generator_flags = params.get('generator_flags', {}) parallel_builds = generator_flags.get('xcode_parallel_builds', True) serialize_all_tests = \ generator_flags.get('xcode_serialize_all_test_runs', True) project_version = generator_flags.get('xcode_project_version', None) skip_excluded_files = \ not generator_flags.get('xcode_list_excluded_files', True) xcode_projects = {} for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue xcodeproj_path = build_file_root + options.suffix + '.xcodeproj' if options.generator_output: xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path) xcp = XcodeProject(build_file, xcodeproj_path, build_file_dict) xcode_projects[build_file] = xcp pbxp = xcp.project if parallel_builds: pbxp.SetProperty('attributes', {'BuildIndependentTargetsInParallel': 'YES'}) if project_version: xcp.project_file.SetXcodeVersion(project_version) # Add gyp/gypi files to project if not generator_flags.get('standalone'): main_group = pbxp.GetProperty('mainGroup') build_group = gyp.xcodeproj_file.PBXGroup({'name': 'Build'}) main_group.AppendChild(build_group) for included_file in build_file_dict['included_files']: build_group.AddOrGetFileByPath(included_file, False) xcode_targets = {} xcode_target_to_target_dict = {} for qualified_target in target_list: [build_file, target_name, toolset] = \ gyp.common.ParseQualifiedTarget(qualified_target) spec = target_dicts[qualified_target] if spec['toolset'] != 'target': raise Exception( 'Multiple toolsets not supported in xcode build (target %s)' % qualified_target) configuration_names = [spec['default_configuration']] for configuration_name in sorted(spec['configurations'].keys()): if configuration_name not in configuration_names: configuration_names.append(configuration_name) xcp = xcode_projects[build_file] pbxp = xcp.project # Set up the configurations for the target according to the list of names # supplied. xccl = CreateXCConfigurationList(configuration_names) # Create an XCTarget subclass object for the target. The type with # "+bundle" appended will be used if the target has "mac_bundle" set. # loadable_modules not in a mac_bundle are mapped to # com.googlecode.gyp.xcode.bundle, a pseudo-type that xcode.py interprets # to create a single-file mh_bundle. 
_types = { 'executable': 'com.apple.product-type.tool', 'loadable_module': 'com.googlecode.gyp.xcode.bundle', 'shared_library': 'com.apple.product-type.library.dynamic', 'static_library': 'com.apple.product-type.library.static', 'executable+bundle': 'com.apple.product-type.application', 'loadable_module+bundle': 'com.apple.product-type.bundle', 'loadable_module+xctest': 'com.apple.product-type.bundle.unit-test', 'shared_library+bundle': 'com.apple.product-type.framework', } target_properties = { 'buildConfigurationList': xccl, 'name': target_name, } type = spec['type'] is_xctest = int(spec.get('mac_xctest_bundle', 0)) is_bundle = int(spec.get('mac_bundle', 0)) or is_xctest if type != 'none': type_bundle_key = type if is_xctest: type_bundle_key += '+xctest' assert type == 'loadable_module', ( 'mac_xctest_bundle targets must have type loadable_module ' '(target %s)' % target_name) elif is_bundle: type_bundle_key += '+bundle' xctarget_type = gyp.xcodeproj_file.PBXNativeTarget try: target_properties['productType'] = _types[type_bundle_key] except KeyError, e: gyp.common.ExceptionAppend(e, "-- unknown product type while " "writing target %s" % target_name) raise else: xctarget_type = gyp.xcodeproj_file.PBXAggregateTarget assert not is_bundle, ( 'mac_bundle targets cannot have type none (target "%s")' % target_name) assert not is_xctest, ( 'mac_xctest_bundle targets cannot have type none (target "%s")' % target_name) target_product_name = spec.get('product_name') if target_product_name is not None: target_properties['productName'] = target_product_name xct = xctarget_type(target_properties, parent=pbxp, force_outdir=spec.get('product_dir'), force_prefix=spec.get('product_prefix'), force_extension=spec.get('product_extension')) pbxp.AppendProperty('targets', xct) xcode_targets[qualified_target] = xct xcode_target_to_target_dict[xct] = spec spec_actions = spec.get('actions', []) spec_rules = spec.get('rules', []) # Xcode has some "issues" with checking dependencies for the "Compile # sources" step with any source files/headers generated by actions/rules. # To work around this, if a target is building anything directly (not # type "none"), then a second target is used to run the GYP actions/rules # and is made a dependency of this target. This way the work is done # before the dependency checks for what should be recompiled. support_xct = None if type != 'none' and (spec_actions or spec_rules): support_xccl = CreateXCConfigurationList(configuration_names); support_target_properties = { 'buildConfigurationList': support_xccl, 'name': target_name + ' Support', } if target_product_name: support_target_properties['productName'] = \ target_product_name + ' Support' support_xct = \ gyp.xcodeproj_file.PBXAggregateTarget(support_target_properties, parent=pbxp) pbxp.AppendProperty('targets', support_xct) xct.AddDependency(support_xct) # Hang the support target off the main target so it can be tested/found # by the generator during Finalize. xct.support_target = support_xct prebuild_index = 0 # Add custom shell script phases for "actions" sections. for action in spec_actions: # There's no need to write anything into the script to ensure that the # output directories already exist, because Xcode will look at the # declared outputs and automatically ensure that they exist for us. # Do we have a message to print when this action runs? 
message = action.get('message') if message: message = 'echo note: ' + gyp.common.EncodePOSIXShellArgument(message) else: message = '' # Turn the list into a string that can be passed to a shell. action_string = gyp.common.EncodePOSIXShellList(action['action']) # Convert Xcode-type variable references to sh-compatible environment # variable references. message_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax(message) action_string_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax( action_string) script = '' # Include the optional message if message_sh: script += message_sh + '\n' # Be sure the script runs in exec, and that if exec fails, the script # exits signalling an error. script += 'exec ' + action_string_sh + '\nexit 1\n' ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'inputPaths': action['inputs'], 'name': 'Action "' + action['action_name'] + '"', 'outputPaths': action['outputs'], 'shellScript': script, 'showEnvVarsInLog': 0, }) if support_xct: support_xct.AppendProperty('buildPhases', ssbp) else: # TODO(mark): this assumes too much knowledge of the internals of # xcodeproj_file; some of these smarts should move into xcodeproj_file # itself. xct._properties['buildPhases'].insert(prebuild_index, ssbp) prebuild_index = prebuild_index + 1 # TODO(mark): Should verify that at most one of these is specified. if int(action.get('process_outputs_as_sources', False)): for output in action['outputs']: AddSourceToTarget(output, type, pbxp, xct) if int(action.get('process_outputs_as_mac_bundle_resources', False)): for output in action['outputs']: AddResourceToTarget(output, pbxp, xct) # tgt_mac_bundle_resources holds the list of bundle resources so # the rule processing can check against it. if is_bundle: tgt_mac_bundle_resources = spec.get('mac_bundle_resources', []) else: tgt_mac_bundle_resources = [] # Add custom shell script phases driving "make" for "rules" sections. # # Xcode's built-in rule support is almost powerful enough to use directly, # but there are a few significant deficiencies that render them unusable. # There are workarounds for some of its inadequacies, but in aggregate, # the workarounds added complexity to the generator, and some workarounds # actually require input files to be crafted more carefully than I'd like. # Consequently, until Xcode rules are made more capable, "rules" input # sections will be handled in Xcode output by shell script build phases # performed prior to the compilation phase. # # The following problems with Xcode rules were found. The numbers are # Apple radar IDs. I hope that these shortcomings are addressed, I really # liked having the rules handled directly in Xcode during the period that # I was prototyping this. # # 6588600 Xcode compiles custom script rule outputs too soon, compilation # fails. This occurs when rule outputs from distinct inputs are # interdependent. The only workaround is to put rules and their # inputs in a separate target from the one that compiles the rule # outputs. This requires input file cooperation and it means that # process_outputs_as_sources is unusable. # 6584932 Need to declare that custom rule outputs should be excluded from # compilation. A possible workaround is to lie to Xcode about a # rule's output, giving it a dummy file it doesn't know how to # compile. The rule action script would need to touch the dummy. # 6584839 I need a way to declare additional inputs to a custom rule. 
# A possible workaround is a shell script phase prior to # compilation that touches a rule's primary input files if any # would-be additional inputs are newer than the output. Modifying # the source tree - even just modification times - feels dirty. # 6564240 Xcode "custom script" build rules always dump all environment # variables. This is a low-prioroty problem and is not a # show-stopper. rules_by_ext = {} for rule in spec_rules: rules_by_ext[rule['extension']] = rule # First, some definitions: # # A "rule source" is a file that was listed in a target's "sources" # list and will have a rule applied to it on the basis of matching the # rule's "extensions" attribute. Rule sources are direct inputs to # rules. # # Rule definitions may specify additional inputs in their "inputs" # attribute. These additional inputs are used for dependency tracking # purposes. # # A "concrete output" is a rule output with input-dependent variables # resolved. For example, given a rule with: # 'extension': 'ext', 'outputs': ['$(INPUT_FILE_BASE).cc'], # if the target's "sources" list contained "one.ext" and "two.ext", # the "concrete output" for rule input "two.ext" would be "two.cc". If # a rule specifies multiple outputs, each input file that the rule is # applied to will have the same number of concrete outputs. # # If any concrete outputs are outdated or missing relative to their # corresponding rule_source or to any specified additional input, the # rule action must be performed to generate the concrete outputs. # concrete_outputs_by_rule_source will have an item at the same index # as the rule['rule_sources'] that it corresponds to. Each item is a # list of all of the concrete outputs for the rule_source. concrete_outputs_by_rule_source = [] # concrete_outputs_all is a flat list of all concrete outputs that this # rule is able to produce, given the known set of input files # (rule_sources) that apply to it. concrete_outputs_all = [] # messages & actions are keyed by the same indices as rule['rule_sources'] # and concrete_outputs_by_rule_source. They contain the message and # action to perform after resolving input-dependent variables. The # message is optional, in which case None is stored for each rule source. messages = [] actions = [] for rule_source in rule.get('rule_sources', []): rule_source_dirname, rule_source_basename = \ posixpath.split(rule_source) (rule_source_root, rule_source_ext) = \ posixpath.splitext(rule_source_basename) # These are the same variable names that Xcode uses for its own native # rule support. Because Xcode's rule engine is not being used, they # need to be expanded as they are written to the makefile. rule_input_dict = { 'INPUT_FILE_BASE': rule_source_root, 'INPUT_FILE_SUFFIX': rule_source_ext, 'INPUT_FILE_NAME': rule_source_basename, 'INPUT_FILE_PATH': rule_source, 'INPUT_FILE_DIRNAME': rule_source_dirname, } concrete_outputs_for_this_rule_source = [] for output in rule.get('outputs', []): # Fortunately, Xcode and make both use $(VAR) format for their # variables, so the expansion is the only transformation necessary. # Any remaning $(VAR)-type variables in the string can be given # directly to make, which will pick up the correct settings from # what Xcode puts into the environment. concrete_output = ExpandXcodeVariables(output, rule_input_dict) concrete_outputs_for_this_rule_source.append(concrete_output) # Add all concrete outputs to the project. 
pbxp.AddOrGetFileInRootGroup(concrete_output) concrete_outputs_by_rule_source.append( \ concrete_outputs_for_this_rule_source) concrete_outputs_all.extend(concrete_outputs_for_this_rule_source) # TODO(mark): Should verify that at most one of these is specified. if int(rule.get('process_outputs_as_sources', False)): for output in concrete_outputs_for_this_rule_source: AddSourceToTarget(output, type, pbxp, xct) # If the file came from the mac_bundle_resources list or if the rule # is marked to process outputs as bundle resource, do so. was_mac_bundle_resource = rule_source in tgt_mac_bundle_resources if was_mac_bundle_resource or \ int(rule.get('process_outputs_as_mac_bundle_resources', False)): for output in concrete_outputs_for_this_rule_source: AddResourceToTarget(output, pbxp, xct) # Do we have a message to print when this rule runs? message = rule.get('message') if message: message = gyp.common.EncodePOSIXShellArgument(message) message = ExpandXcodeVariables(message, rule_input_dict) messages.append(message) # Turn the list into a string that can be passed to a shell. action_string = gyp.common.EncodePOSIXShellList(rule['action']) action = ExpandXcodeVariables(action_string, rule_input_dict) actions.append(action) if len(concrete_outputs_all) > 0: # TODO(mark): There's a possibilty for collision here. Consider # target "t" rule "A_r" and target "t_A" rule "r". makefile_name = '%s.make' % re.sub( '[^a-zA-Z0-9_]', '_' , '%s_%s' % (target_name, rule['rule_name'])) makefile_path = os.path.join(xcode_projects[build_file].path, makefile_name) # TODO(mark): try/close? Write to a temporary file and swap it only # if it's got changes? makefile = open(makefile_path, 'wb') # make will build the first target in the makefile by default. By # convention, it's called "all". List all (or at least one) # concrete output for each rule source as a prerequisite of the "all" # target. makefile.write('all: \\\n') for concrete_output_index in \ xrange(0, len(concrete_outputs_by_rule_source)): # Only list the first (index [0]) concrete output of each input # in the "all" target. Otherwise, a parallel make (-j > 1) would # attempt to process each input multiple times simultaneously. # Otherwise, "all" could just contain the entire list of # concrete_outputs_all. concrete_output = \ concrete_outputs_by_rule_source[concrete_output_index][0] if concrete_output_index == len(concrete_outputs_by_rule_source) - 1: eol = '' else: eol = ' \\' makefile.write(' %s%s\n' % (concrete_output, eol)) for (rule_source, concrete_outputs, message, action) in \ zip(rule['rule_sources'], concrete_outputs_by_rule_source, messages, actions): makefile.write('\n') # Add a rule that declares it can build each concrete output of a # rule source. Collect the names of the directories that are # required. concrete_output_dirs = [] for concrete_output_index in xrange(0, len(concrete_outputs)): concrete_output = concrete_outputs[concrete_output_index] if concrete_output_index == 0: bol = '' else: bol = ' ' makefile.write('%s%s \\\n' % (bol, concrete_output)) concrete_output_dir = posixpath.dirname(concrete_output) if (concrete_output_dir and concrete_output_dir not in concrete_output_dirs): concrete_output_dirs.append(concrete_output_dir) makefile.write(' : \\\n') # The prerequisites for this rule are the rule source itself and # the set of additional rule inputs, if any. 
prerequisites = [rule_source] prerequisites.extend(rule.get('inputs', [])) for prerequisite_index in xrange(0, len(prerequisites)): prerequisite = prerequisites[prerequisite_index] if prerequisite_index == len(prerequisites) - 1: eol = '' else: eol = ' \\' makefile.write(' %s%s\n' % (prerequisite, eol)) # Make sure that output directories exist before executing the rule # action. if len(concrete_output_dirs) > 0: makefile.write('\t@mkdir -p "%s"\n' % '" "'.join(concrete_output_dirs)) # The rule message and action have already had the necessary variable # substitutions performed. if message: # Mark it with note: so Xcode picks it up in build output. makefile.write('\t@echo note: %s\n' % message) makefile.write('\t%s\n' % action) makefile.close() # It might be nice to ensure that needed output directories exist # here rather than in each target in the Makefile, but that wouldn't # work if there ever was a concrete output that had an input-dependent # variable anywhere other than in the leaf position. # Don't declare any inputPaths or outputPaths. If they're present, # Xcode will provide a slight optimization by only running the script # phase if any output is missing or outdated relative to any input. # Unfortunately, it will also assume that all outputs are touched by # the script, and if the outputs serve as files in a compilation # phase, they will be unconditionally rebuilt. Since make might not # rebuild everything that could be declared here as an output, this # extra compilation activity is unnecessary. With inputPaths and # outputPaths not supplied, make will always be called, but it knows # enough to not do anything when everything is up-to-date. # To help speed things up, pass -j COUNT to make so it does some work # in parallel. Don't use ncpus because Xcode will build ncpus targets # in parallel and if each target happens to have a rules step, there # would be ncpus^2 things going. With a machine that has 2 quad-core # Xeons, a build can quickly run out of processes based on # scheduling/other tasks, and randomly failing builds are no good. script = \ """JOB_COUNT="$(/usr/sbin/sysctl -n hw.ncpu)" if [ "${JOB_COUNT}" -gt 4 ]; then JOB_COUNT=4 fi exec xcrun make -f "${PROJECT_FILE_PATH}/%s" -j "${JOB_COUNT}" exit 1 """ % makefile_name ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'name': 'Rule "' + rule['rule_name'] + '"', 'shellScript': script, 'showEnvVarsInLog': 0, }) if support_xct: support_xct.AppendProperty('buildPhases', ssbp) else: # TODO(mark): this assumes too much knowledge of the internals of # xcodeproj_file; some of these smarts should move into xcodeproj_file # itself. xct._properties['buildPhases'].insert(prebuild_index, ssbp) prebuild_index = prebuild_index + 1 # Extra rule inputs also go into the project file. Concrete outputs were # already added when they were computed. groups = ['inputs', 'inputs_excluded'] if skip_excluded_files: groups = [x for x in groups if not x.endswith('_excluded')] for group in groups: for item in rule.get(group, []): pbxp.AddOrGetFileInRootGroup(item) # Add "sources". for source in spec.get('sources', []): (source_root, source_extension) = posixpath.splitext(source) if source_extension[1:] not in rules_by_ext: # AddSourceToTarget will add the file to a root group if it's not # already there. AddSourceToTarget(source, type, pbxp, xct) else: pbxp.AddOrGetFileInRootGroup(source) # Add "mac_bundle_resources" and "mac_framework_private_headers" if # it's a bundle of any type. 
if is_bundle: for resource in tgt_mac_bundle_resources: (resource_root, resource_extension) = posixpath.splitext(resource) if resource_extension[1:] not in rules_by_ext: AddResourceToTarget(resource, pbxp, xct) else: pbxp.AddOrGetFileInRootGroup(resource) for header in spec.get('mac_framework_private_headers', []): AddHeaderToTarget(header, pbxp, xct, False) # Add "mac_framework_headers". These can be valid for both frameworks # and static libraries. if is_bundle or type == 'static_library': for header in spec.get('mac_framework_headers', []): AddHeaderToTarget(header, pbxp, xct, True) # Add "copies". pbxcp_dict = {} for copy_group in spec.get('copies', []): dest = copy_group['destination'] if dest[0] not in ('/', '$'): # Relative paths are relative to $(SRCROOT). dest = '$(SRCROOT)/' + dest # Coalesce multiple "copies" sections in the same target with the same # "destination" property into the same PBXCopyFilesBuildPhase, otherwise # they'll wind up with ID collisions. pbxcp = pbxcp_dict.get(dest, None) if pbxcp is None: pbxcp = gyp.xcodeproj_file.PBXCopyFilesBuildPhase({ 'name': 'Copy to ' + copy_group['destination'] }, parent=xct) pbxcp.SetDestination(dest) # TODO(mark): The usual comment about this knowing too much about # gyp.xcodeproj_file internals applies. xct._properties['buildPhases'].insert(prebuild_index, pbxcp) pbxcp_dict[dest] = pbxcp for file in copy_group['files']: pbxcp.AddFile(file) # Excluded files can also go into the project file. if not skip_excluded_files: for key in ['sources', 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers']: excluded_key = key + '_excluded' for item in spec.get(excluded_key, []): pbxp.AddOrGetFileInRootGroup(item) # So can "inputs" and "outputs" sections of "actions" groups. groups = ['inputs', 'inputs_excluded', 'outputs', 'outputs_excluded'] if skip_excluded_files: groups = [x for x in groups if not x.endswith('_excluded')] for action in spec.get('actions', []): for group in groups: for item in action.get(group, []): # Exclude anything in BUILT_PRODUCTS_DIR. They're products, not # sources. if not item.startswith('$(BUILT_PRODUCTS_DIR)/'): pbxp.AddOrGetFileInRootGroup(item) for postbuild in spec.get('postbuilds', []): action_string_sh = gyp.common.EncodePOSIXShellList(postbuild['action']) script = 'exec ' + action_string_sh + '\nexit 1\n' # Make the postbuild step depend on the output of ld or ar from this # target. Apparently putting the script step after the link step isn't # sufficient to ensure proper ordering in all cases. With an input # declared but no outputs, the script step should run every time, as # desired. ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'inputPaths': ['$(BUILT_PRODUCTS_DIR)/$(EXECUTABLE_PATH)'], 'name': 'Postbuild "' + postbuild['postbuild_name'] + '"', 'shellScript': script, 'showEnvVarsInLog': 0, }) xct.AppendProperty('buildPhases', ssbp) # Add dependencies before libraries, because adding a dependency may imply # adding a library. It's preferable to keep dependencies listed first # during a link phase so that they can override symbols that would # otherwise be provided by libraries, which will usually include system # libraries. On some systems, ld is finicky and even requires the # libraries to be ordered in such a way that unresolved symbols in # earlier-listed libraries may only be resolved by later-listed libraries. # The Mac linker doesn't work that way, but other platforms do, and so # their linker invocations need to be constructed in this way. 
There's # no compelling reason for Xcode's linker invocations to differ. if 'dependencies' in spec: for dependency in spec['dependencies']: xct.AddDependency(xcode_targets[dependency]) # The support project also gets the dependencies (in case they are # needed for the actions/rules to work). if support_xct: support_xct.AddDependency(xcode_targets[dependency]) if 'libraries' in spec: for library in spec['libraries']: xct.FrameworksPhase().AddFile(library) # Add the library's directory to LIBRARY_SEARCH_PATHS if necessary. # I wish Xcode handled this automatically. library_dir = posixpath.dirname(library) if library_dir not in xcode_standard_library_dirs and ( not xct.HasBuildSetting(_library_search_paths_var) or library_dir not in xct.GetBuildSetting(_library_search_paths_var)): xct.AppendBuildSetting(_library_search_paths_var, library_dir) for configuration_name in configuration_names: configuration = spec['configurations'][configuration_name] xcbc = xct.ConfigurationNamed(configuration_name) for include_dir in configuration.get('mac_framework_dirs', []): xcbc.AppendBuildSetting('FRAMEWORK_SEARCH_PATHS', include_dir) for include_dir in configuration.get('include_dirs', []): xcbc.AppendBuildSetting('HEADER_SEARCH_PATHS', include_dir) for library_dir in configuration.get('library_dirs', []): if library_dir not in xcode_standard_library_dirs and ( not xcbc.HasBuildSetting(_library_search_paths_var) or library_dir not in xcbc.GetBuildSetting(_library_search_paths_var)): xcbc.AppendBuildSetting(_library_search_paths_var, library_dir) if 'defines' in configuration: for define in configuration['defines']: set_define = EscapeXcodeDefine(define) xcbc.AppendBuildSetting('GCC_PREPROCESSOR_DEFINITIONS', set_define) if 'xcode_settings' in configuration: for xck, xcv in configuration['xcode_settings'].iteritems(): xcbc.SetBuildSetting(xck, xcv) if 'xcode_config_file' in configuration: config_ref = pbxp.AddOrGetFileInRootGroup( configuration['xcode_config_file']) xcbc.SetBaseConfiguration(config_ref) build_files = [] for build_file, build_file_dict in data.iteritems(): if build_file.endswith('.gyp'): build_files.append(build_file) for build_file in build_files: xcode_projects[build_file].Finalize1(xcode_targets, serialize_all_tests) for build_file in build_files: xcode_projects[build_file].Finalize2(xcode_targets, xcode_target_to_target_dict) for build_file in build_files: xcode_projects[build_file].Write() ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode_test.py��������������������000644 �000766 �000024 �00000001205 12455173731 034236� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env python # Copyright (c) 2013 Google Inc. All rights reserved. 
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

""" Unit tests for the xcode.py file. """

import gyp.generator.xcode as xcode
import unittest
import sys


class TestEscapeXcodeDefine(unittest.TestCase):
  if sys.platform == 'darwin':
    def test_InheritedRemainsUnescaped(self):
      self.assertEqual(xcode.EscapeXcodeDefine('$(inherited)'),
                       '$(inherited)')

    def test_Escaping(self):
      self.assertEqual(xcode.EscapeXcodeDefine('a b"c\\'), 'a\\ b\\"c\\\\')

if __name__ == '__main__':
  unittest.main()

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/data/win/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc

// Copyright (c) 2013 Google Inc. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.

// This file is used to generate an empty .pdb -- with a 4KB pagesize -- that is
// then used during the final link for modules that have large PDBs. Otherwise,
// the linker will generate a pdb with a page size of 1KB, which imposes a limit
// of 1GB on the .pdb. By generating an initial empty .pdb with the compiler
// (rather than the linker), this limit is avoided. With this in place PDBs may
// grow to 2GB.
//
// This file is referenced by the msvs_large_pdb mechanism in MSVSUtil.py.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/gyp/buildbot/buildbot_run.py

#!/usr/bin/env python
# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Argument-less script to select what to run on the buildbots.""" import os import shutil import subprocess import sys if sys.platform in ['win32', 'cygwin']: EXE_SUFFIX = '.exe' else: EXE_SUFFIX = '' BUILDBOT_DIR = os.path.dirname(os.path.abspath(__file__)) TRUNK_DIR = os.path.dirname(BUILDBOT_DIR) ROOT_DIR = os.path.dirname(TRUNK_DIR) ANDROID_DIR = os.path.join(ROOT_DIR, 'android') CMAKE_DIR = os.path.join(ROOT_DIR, 'cmake') CMAKE_BIN_DIR = os.path.join(CMAKE_DIR, 'bin') OUT_DIR = os.path.join(TRUNK_DIR, 'out') def CallSubProcess(*args, **kwargs): """Wrapper around subprocess.call which treats errors as build exceptions.""" retcode = subprocess.call(*args, **kwargs) if retcode != 0: print '@@@STEP_EXCEPTION@@@' sys.exit(1) def PrepareCmake(): """Build CMake 2.8.8 since the version in Precise is 2.8.7.""" if os.environ['BUILDBOT_CLOBBER'] == '1': print '@@@BUILD_STEP Clobber CMake checkout@@@' shutil.rmtree(CMAKE_DIR) # We always build CMake 2.8.8, so no need to do anything # if the directory already exists. if os.path.isdir(CMAKE_DIR): return print '@@@BUILD_STEP Initialize CMake checkout@@@' os.mkdir(CMAKE_DIR) CallSubProcess(['git', 'config', '--global', 'user.name', 'trybot']) CallSubProcess(['git', 'config', '--global', 'user.email', 'chrome-bot@google.com']) CallSubProcess(['git', 'config', '--global', 'color.ui', 'false']) print '@@@BUILD_STEP Sync CMake@@@' CallSubProcess( ['git', 'clone', '--depth', '1', '--single-branch', '--branch', 'v2.8.8', '--', 'git://cmake.org/cmake.git', CMAKE_DIR], cwd=CMAKE_DIR) print '@@@BUILD_STEP Build CMake@@@' CallSubProcess( ['/bin/bash', 'bootstrap', '--prefix=%s' % CMAKE_DIR], cwd=CMAKE_DIR) CallSubProcess( ['make', 'cmake'], cwd=CMAKE_DIR) def PrepareAndroidTree(): """Prepare an Android tree to run 'android' format tests.""" if os.environ['BUILDBOT_CLOBBER'] == '1': print '@@@BUILD_STEP Clobber Android checkout@@@' shutil.rmtree(ANDROID_DIR) # The release of Android we use is static, so there's no need to do anything # if the directory already exists. if os.path.isdir(ANDROID_DIR): return print '@@@BUILD_STEP Initialize Android checkout@@@' os.mkdir(ANDROID_DIR) CallSubProcess(['git', 'config', '--global', 'user.name', 'trybot']) CallSubProcess(['git', 'config', '--global', 'user.email', 'chrome-bot@google.com']) CallSubProcess(['git', 'config', '--global', 'color.ui', 'false']) CallSubProcess( ['repo', 'init', '-u', 'https://android.googlesource.com/platform/manifest', '-b', 'android-4.2.1_r1', '-g', 'all,-notdefault,-device,-darwin,-mips,-x86'], cwd=ANDROID_DIR) print '@@@BUILD_STEP Sync Android@@@' CallSubProcess(['repo', 'sync', '-j4'], cwd=ANDROID_DIR) print '@@@BUILD_STEP Build Android@@@' CallSubProcess( ['/bin/bash', '-c', 'source build/envsetup.sh && lunch full-eng && make -j4'], cwd=ANDROID_DIR) def GypTestFormat(title, format=None, msvs_version=None): """Run the gyp tests for a given format, emitting annotator tags. See annotator docs at: https://sites.google.com/a/chromium.org/dev/developers/testing/chromium-build-infrastructure/buildbot-annotations Args: format: gyp format to test. Returns: 0 for sucesss, 1 for failure. 
""" if not format: format = title print '@@@BUILD_STEP ' + title + '@@@' sys.stdout.flush() env = os.environ.copy() if msvs_version: env['GYP_MSVS_VERSION'] = msvs_version command = ' '.join( [sys.executable, 'trunk/gyptest.py', '--all', '--passed', '--format', format, '--path', CMAKE_BIN_DIR, '--chdir', 'trunk']) if format == 'android': # gyptest needs the environment setup from envsetup/lunch in order to build # using the 'android' backend, so this is done in a single shell. retcode = subprocess.call( ['/bin/bash', '-c', 'source build/envsetup.sh && lunch full-eng && cd %s && %s' % (ROOT_DIR, command)], cwd=ANDROID_DIR, env=env) else: retcode = subprocess.call(command, cwd=ROOT_DIR, env=env, shell=True) if retcode: # Emit failure tag, and keep going. print '@@@STEP_FAILURE@@@' return 1 return 0 def GypBuild(): # Dump out/ directory. print '@@@BUILD_STEP cleanup@@@' print 'Removing %s...' % OUT_DIR shutil.rmtree(OUT_DIR, ignore_errors=True) print 'Done.' retcode = 0 # The Android gyp bot runs on linux so this must be tested first. if os.environ['BUILDBOT_BUILDERNAME'] == 'gyp-android': PrepareAndroidTree() retcode += GypTestFormat('android') elif sys.platform.startswith('linux'): retcode += GypTestFormat('ninja') retcode += GypTestFormat('make') PrepareCmake() retcode += GypTestFormat('cmake') elif sys.platform == 'darwin': retcode += GypTestFormat('ninja') retcode += GypTestFormat('xcode') retcode += GypTestFormat('make') elif sys.platform == 'win32': retcode += GypTestFormat('ninja') if os.environ['BUILDBOT_BUILDERNAME'] == 'gyp-win64': retcode += GypTestFormat('msvs-2010', format='msvs', msvs_version='2010') retcode += GypTestFormat('msvs-2012', format='msvs', msvs_version='2012') else: raise Exception('Unknown platform') if retcode: # TODO(bradnelson): once the annotator supports a postscript (section for # after the build proper that could be used for cumulative failures), # use that instead of this. This isolates the final return value so # that it isn't misattributed to the last stage. print '@@@BUILD_STEP failures@@@' sys.exit(retcode) if __name__ == '__main__': GypBuild() ��������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js�������������������000755 �000766 �000024 �00000006062 12455173731 027740� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node /** * Set the title. */ process.title = 'node-gyp' /** * Module dependencies. */ var gyp = require('../') var log = require('npmlog') /** * Process and execute the selected commands. */ var prog = gyp() var completed = false prog.parseArgv(process.argv) if (prog.todo.length === 0) { if (~process.argv.indexOf('-v') || ~process.argv.indexOf('--version')) { console.log('v%s', prog.version) } else { console.log('%s', prog.usage()) } return process.exit(0) } log.info('it worked if it ends with', 'ok') log.verbose('cli', process.argv) log.info('using', 'node-gyp@%s', prog.version) log.info('using', 'node@%s | %s | %s', process.versions.node, process.platform, process.arch) /** * Change dir if -C/--directory was passed. 
*/ var dir = prog.opts.directory if (dir) { var fs = require('fs') try { var stat = fs.statSync(dir) if (stat.isDirectory()) { log.info('chdir', dir) process.chdir(dir) } else { log.warn('chdir', dir + ' is not a directory') } } catch (e) { if (e.code === 'ENOENT') { log.warn('chdir', dir + ' is not a directory') } else { log.warn('chdir', 'error during chdir() "%s"', e.message) } } } function run () { var command = prog.todo.shift() if (!command) { // done! completed = true log.info('ok') return } prog.commands[command.name](command.args, function (err) { if (err) { log.error(command.name + ' error') log.error('stack', err.stack) errorMessage() log.error('not ok') return process.exit(1) } if (command.name == 'list') { var versions = arguments[1] if (versions.length > 0) { versions.forEach(function (version) { console.log(version) }) } else { console.log('No node development files installed. Use `node-gyp install` to install a version.') } } else if (arguments.length >= 2) { console.log.apply(console, [].slice.call(arguments, 1)) } // now run the next command in the queue process.nextTick(run) }) } process.on('exit', function (code) { if (!completed && !code) { log.error('Completion callback never invoked!') issueMessage() process.exit(6) } }) process.on('uncaughtException', function (err) { log.error('UNCAUGHT EXCEPTION') log.error('stack', err.stack) issueMessage() process.exit(7) }) function errorMessage () { // copied from npm's lib/util/error-handler.js var os = require('os') log.error('System', os.type() + ' ' + os.release()) log.error('command', process.argv .map(JSON.stringify).join(' ')) log.error('cwd', process.cwd()) log.error('node -v', process.version) log.error('node-gyp -v', 'v' + prog.package.version) } function issueMessage () { errorMessage() log.error('', [ 'This is a bug in `node-gyp`.' , 'Try to update node-gyp and file an Issue if it does not help:' , ' <https://github.com/TooTallNate/node-gyp/issues>' ].join('\n')) } // start running the given commands! 
run()

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/.npmignore

node_modules/
npm-debug.log

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/.travis.yml

language: node_js
node_js:
  - 0.6
  - 0.8
  - "0.10"

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/bin/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/examples/
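For context, the run() call above is the final line of bin/node-gyp.js, shown earlier: it drains prog.todo one command at a time, schedules the next command with process.nextTick, and aborts on the first failure. Below is a minimal, self-contained sketch of that sequential queue-runner pattern only; the command names and no-op handlers are hypothetical stand-ins, not part of node-gyp.

// Sketch only: drain a queue of async commands one at a time,
// stopping at the first error. Handlers below are placeholders.
var todo = [
  { name: 'configure', args: [] },
  { name: 'build', args: [] }
]

var commands = {
  // stand-in handlers that simply succeed asynchronously
  configure: function (args, cb) { process.nextTick(function () { cb(null) }) },
  build: function (args, cb) { process.nextTick(function () { cb(null) }) }
}

function run () {
  var command = todo.shift()
  if (!command) {
    console.log('ok')               // queue fully drained
    return
  }
  commands[command.name](command.args, function (err) {
    if (err) {
      console.error(command.name + ' error:', err.message)
      return process.exit(1)        // first failure aborts the run
    }
    process.nextTick(run)           // schedule the next queued command
  })
}

run()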
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/index.js

var path = require('path');
var fs = require('fs');

module.exports = mkdirP.mkdirp = mkdirP.mkdirP = mkdirP;

function mkdirP (p, opts, f, made) {
    if (typeof opts === 'function') {
        f = opts;
        opts = {};
    }
    else if (!opts || typeof opts !== 'object') {
        opts = { mode: opts };
    }

    var mode = opts.mode;
    var xfs = opts.fs || fs;

    if (mode === undefined) {
        mode = 0777 & (~process.umask());
    }
    if (!made) made = null;

    var cb = f || function () {};
    p = path.resolve(p);

    xfs.mkdir(p, mode, function (er) {
        if (!er) {
            made = made || p;
            return cb(null, made);
        }
        switch (er.code) {
            case 'ENOENT':
                mkdirP(path.dirname(p), opts, function (er, made) {
                    if (er) cb(er, made);
                    else mkdirP(p, opts, cb, made);
                });
                break;

            // In the case of any other error, just see if there's a dir
            // there already. If so, then hooray! If not, then something
            // is borked.
            default:
                xfs.stat(p, function (er2, stat) {
                    // if the stat fails, then that's super weird.
                    // let the original error be the failure reason.
                    if (er2 || !stat.isDirectory()) cb(er, made)
                    else cb(null, made);
                });
                break;
        }
    });
}

mkdirP.sync = function sync (p, opts, made) {
    if (!opts || typeof opts !== 'object') {
        opts = { mode: opts };
    }

    var mode = opts.mode;
    var xfs = opts.fs || fs;

    if (mode === undefined) {
        mode = 0777 & (~process.umask());
    }
    if (!made) made = null;

    p = path.resolve(p);

    try {
        xfs.mkdirSync(p, mode);
        made = made || p;
    }
    catch (err0) {
        switch (err0.code) {
            case 'ENOENT' :
                made = sync(path.dirname(p), opts, made);
                sync(p, opts, made);
                break;

            // In the case of any other error, just see if there's a dir
            // there already. If so, then hooray! If not, then something
            // is borked.
            default:
                var stat;
                try {
                    stat = xfs.statSync(p);
                }
                catch (err1) {
                    throw err0;
                }
                if (!stat.isDirectory()) throw err0;
                break;
        }
    }

    return made;
};

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/LICENSE

Copyright 2010 James Halliday (mail@substack.net)

This project is free software released under the MIT/X11 license:

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/package.json

{
  "name": "mkdirp",
  "description": "Recursively mkdir, like `mkdir -p`",
  "version": "0.5.0",
  "author": {
    "name": "James Halliday",
    "email": "mail@substack.net",
    "url": "http://substack.net"
  },
  "main": "./index",
  "keywords": [
    "mkdir",
    "directory"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/substack/node-mkdirp.git"
  },
  "scripts": {
    "test": "tap test/*.js"
  },
  "dependencies": {
    "minimist": "0.0.8"
  },
  "devDependencies": {
    "tap": "~0.4.0",
    "mock-fs": "~2.2.0"
  },
  "bin": {
    "mkdirp": "bin/cmd.js"
  },
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/substack/node-mkdirp/issues"
  },
  "homepage": "https://github.com/substack/node-mkdirp",
  "_id": "mkdirp@0.5.0",
  "dist": {
    "shasum": "1d73076a6df986cd9344e15e71fcc05a4c9abf12",
    "tarball": "http://registry.npmjs.org/mkdirp/-/mkdirp-0.5.0.tgz"
  },
  "_from": "mkdirp@latest",
  "_npmVersion": "1.4.3",
  "_npmUser": {
    "name": "substack",
    "email": "mail@substack.net"
  },
  "maintainers": [
    {
      "name": "substack",
      "email": "mail@substack.net"
    }
  ],
  "directories": {},
  "_shasum": "1d73076a6df986cd9344e15e71fcc05a4c9abf12",
  "_resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.0.tgz",
  "readme": "ERROR: No README data found!"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/README.markdown

# mkdirp

Like `mkdir -p`, but in node.js!
[![build status](https://secure.travis-ci.org/substack/node-mkdirp.png)](http://travis-ci.org/substack/node-mkdirp)

# example

## pow.js

```js
var mkdirp = require('mkdirp');

mkdirp('/tmp/foo/bar/baz', function (err) {
    if (err) console.error(err)
    else console.log('pow!')
});
```

Output

```
pow!
```

And now /tmp/foo/bar/baz exists, huzzah!

# methods

```js
var mkdirp = require('mkdirp');
```

## mkdirp(dir, opts, cb)

Create a new directory and any necessary subdirectories at `dir` with octal
permission string `opts.mode`. If `opts` is a non-object, it will be treated as
the `opts.mode`.

If `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`.

`cb(err, made)` fires with the error or the first directory `made` that had to
be created, if any.

You can optionally pass in an alternate `fs` implementation by passing in
`opts.fs`. Your implementation should have `opts.fs.mkdir(path, mode, cb)` and
`opts.fs.stat(path, cb)`.

## mkdirp.sync(dir, opts)

Synchronously create a new directory and any necessary subdirectories at `dir`
with octal permission string `opts.mode`. If `opts` is a non-object, it will be
treated as the `opts.mode`.

If `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`.

Returns the first directory that had to be created, if any.

You can optionally pass in an alternate `fs` implementation by passing in
`opts.fs`. Your implementation should have `opts.fs.mkdirSync(path, mode)` and
`opts.fs.statSync(path)`.

# usage

This package also ships with a `mkdirp` command.

```
usage: mkdirp [DIR1,DIR2..] {OPTIONS}

  Create each supplied directory including any necessary parent directories
  that don't yet exist.

  If the directory already exists, do nothing.

OPTIONS are:

  -m, --mode   If a directory needs to be created, set the mode as an octal
               permission string.
```

# install

With [npm](http://npmjs.org) do:

```
npm install mkdirp
```

to get the library, or

```
npm install -g mkdirp
```

to get the command.
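As a quick illustration of the `opts` described above (an editor's sketch, not
part of the original README; the in-memory `fakeFs` object is hypothetical and
only implements the two methods `mkdirp.sync` actually calls):

```js
var mkdirp = require('mkdirp');

// explicit octal mode instead of the default 0777 & (~process.umask())
mkdirp.sync('/tmp/foo/locked', { mode: 0700 });

// hypothetical alternate fs implementation: mkdirp.sync only needs
// mkdirSync(path, mode) and statSync(path) on the object passed as opts.fs
var made = {};
var fakeFs = {
  mkdirSync: function (p, mode) { made[p] = mode; },
  statSync: function (p) {
    if (!made[p]) {
      var e = new Error('ENOENT'); e.code = 'ENOENT'; throw e;
    }
    return { isDirectory: function () { return true; } };
  }
};

// returns the first directory that had to be created
console.log(mkdirp.sync('/virtual/a/b/c', { fs: fakeFs }));
// -> '/virtual/a/b/c'
```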
# license

MIT

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/.travis.yml

language: node_js
node_js:
  - "0.8"
  - "0.10"

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/example/
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/index.js

module.exports = function (args, opts) { if (!opts) opts = {}; var flags = { bools : {}, strings : {} }; [].concat(opts['boolean']).filter(Boolean).forEach(function (key) { flags.bools[key] = true; }); [].concat(opts.string).filter(Boolean).forEach(function (key) { flags.strings[key] = true; }); var aliases = {}; Object.keys(opts.alias || {}).forEach(function (key) { aliases[key] = [].concat(opts.alias[key]); aliases[key].forEach(function (x) { aliases[x] = [key].concat(aliases[key].filter(function (y) { return x
!== y; })); }); }); var defaults = opts['default'] || {}; var argv = { _ : [] }; Object.keys(flags.bools).forEach(function (key) { setArg(key, defaults[key] === undefined ? false : defaults[key]); }); var notFlags = []; if (args.indexOf('--') !== -1) { notFlags = args.slice(args.indexOf('--')+1); args = args.slice(0, args.indexOf('--')); } function setArg (key, val) { var value = !flags.strings[key] && isNumber(val) ? Number(val) : val ; setKey(argv, key.split('.'), value); (aliases[key] || []).forEach(function (x) { setKey(argv, x.split('.'), value); }); } for (var i = 0; i < args.length; i++) { var arg = args[i]; if (/^--.+=/.test(arg)) { // Using [\s\S] instead of . because js doesn't support the // 'dotall' regex modifier. See: // http://stackoverflow.com/a/1068308/13216 var m = arg.match(/^--([^=]+)=([\s\S]*)$/); setArg(m[1], m[2]); } else if (/^--no-.+/.test(arg)) { var key = arg.match(/^--no-(.+)/)[1]; setArg(key, false); } else if (/^--.+/.test(arg)) { var key = arg.match(/^--(.+)/)[1]; var next = args[i + 1]; if (next !== undefined && !/^-/.test(next) && !flags.bools[key] && (aliases[key] ? !flags.bools[aliases[key]] : true)) { setArg(key, next); i++; } else if (/^(true|false)$/.test(next)) { setArg(key, next === 'true'); i++; } else { setArg(key, flags.strings[key] ? '' : true); } } else if (/^-[^-]+/.test(arg)) { var letters = arg.slice(1,-1).split(''); var broken = false; for (var j = 0; j < letters.length; j++) { var next = arg.slice(j+2); if (next === '-') { setArg(letters[j], next) continue; } if (/[A-Za-z]/.test(letters[j]) && /-?\d+(\.\d*)?(e-?\d+)?$/.test(next)) { setArg(letters[j], next); broken = true; break; } if (letters[j+1] && letters[j+1].match(/\W/)) { setArg(letters[j], arg.slice(j+2)); broken = true; break; } else { setArg(letters[j], flags.strings[letters[j]] ? '' : true); } } var key = arg.slice(-1)[0]; if (!broken && key !== '-') { if (args[i+1] && !/^(-|--)[^-]/.test(args[i+1]) && !flags.bools[key] && (aliases[key] ? !flags.bools[aliases[key]] : true)) { setArg(key, args[i+1]); i++; } else if (args[i+1] && /true|false/.test(args[i+1])) { setArg(key, args[i+1] === 'true'); i++; } else { setArg(key, flags.strings[key] ? '' : true); } } } else { argv._.push( flags.strings['_'] || !isNumber(arg) ? 
arg : Number(arg) ); } } Object.keys(defaults).forEach(function (key) { if (!hasKey(argv, key.split('.'))) { setKey(argv, key.split('.'), defaults[key]); (aliases[key] || []).forEach(function (x) { setKey(argv, x.split('.'), defaults[key]); }); } }); notFlags.forEach(function(key) { argv._.push(key); }); return argv; }; function hasKey (obj, keys) { var o = obj; keys.slice(0,-1).forEach(function (key) { o = (o[key] || {}); }); var key = keys[keys.length - 1]; return key in o; } function setKey (obj, keys, value) { var o = obj; keys.slice(0,-1).forEach(function (key) { if (o[key] === undefined) o[key] = {}; o = o[key]; }); var key = keys[keys.length - 1]; if (o[key] === undefined || typeof o[key] === 'boolean') { o[key] = value; } else if (Array.isArray(o[key])) { o[key].push(value); } else { o[key] = [ o[key], value ]; } } function isNumber (x) { if (typeof x === 'number') return true; if (/^0x[0-9a-f]+$/i.test(x)) return true; return /^[-+]?(?:\d+(?:\.\d*)?|\.\d+)(e[-+]?\d+)?$/.test(x); } function longest (xs) { return Math.max.apply(null, xs.map(function (x) { return x.length })); } �������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/LICENSE�������000644 �000766 �000024 �00000002061 12455173731 032157� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/package.json��000644 �000766 �000024 �00000002733 12455173731 033446� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "minimist", "version": "0.0.8", "description": "parse argument options", "main": "index.js", "devDependencies": { "tape": "~1.0.4", "tap": "~0.4.0" }, "scripts": { "test": "tap test/*.js" }, "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "ff/5", "firefox/latest", "chrome/10", "chrome/latest", "safari/5.1", "safari/latest", "opera/12" ] }, "repository": { "type": "git", "url": "git://github.com/substack/minimist.git" }, "homepage": "https://github.com/substack/minimist", "keywords": [ "argv", "getopt", "parser", "optimist" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "bugs": { "url": "https://github.com/substack/minimist/issues" }, "_id": "minimist@0.0.8", "dist": { "shasum": "857fcabfc3397d2625b8228262e86aa7a011b05d", "tarball": "http://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz" }, "_from": "minimist@0.0.8", "_npmVersion": "1.4.3", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "directories": {}, "_shasum": "857fcabfc3397d2625b8228262e86aa7a011b05d", "_resolved": "https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz", "readme": "ERROR: No README data found!" } �������������������������������������lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/readme.markdown����������������������000644 �000766 �000024 �00000003147 12455173731 034102� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# minimist parse argument options This module is the guts of optimist's argument parser without all the fanciful decoration. 
[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)

[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)

# example

``` js
var argv = require('minimist')(process.argv.slice(2));
console.dir(argv);
```

```
$ node example/parse.js -a beep -b boop
{ _: [], a: 'beep', b: 'boop' }
```

```
$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz
{ _: [ 'foo', 'bar', 'baz' ],
  x: 3,
  y: 4,
  n: 5,
  a: true,
  b: true,
  c: true,
  beep: 'boop' }
```

# methods

``` js
var parseArgs = require('minimist')
```

## var argv = parseArgs(args, opts={})

Return an argument object `argv` populated with the array arguments from
`args`.

`argv._` contains all the arguments that didn't have an option associated with
them.

Numeric-looking arguments will be returned as numbers unless `opts.string` or
`opts.boolean` is set for that argument name.

Any arguments after `'--'` will not be parsed and will end up in `argv._`.

options can be:

* `opts.string` - a string or array of strings argument names to always treat
as strings
* `opts.boolean` - a string or array of strings to always treat as booleans
* `opts.alias` - an object mapping string names to strings or arrays of string
argument names to use as aliases
* `opts.default` - an object mapping string argument names to default values

# install

With [npm](https://npmjs.org) do:

```
npm install minimist
```

# license

MIT

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/node_modules/minimist/example/parse.js

var argv = require('../')(process.argv.slice(2));
console.dir(argv);

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/examples/pow.js

var mkdirp = require('mkdirp');

mkdirp('/tmp/foo/bar/baz', function (err) {
    if (err) console.error(err)
    else console.log('pow!')
});

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/bin/cmd.js

#!/usr/bin/env node

var mkdirp = require('../');
var minimist = require('minimist');
var fs = require('fs');

var argv = minimist(process.argv.slice(2), {
    alias: { m: 'mode', h: 'help' },
    string: [ 'mode' ]
});
if (argv.help) {
    fs.createReadStream(__dirname + '/usage.txt').pipe(process.stdout);
    return;
}

var paths = argv._.slice();
var mode = argv.mode ? parseInt(argv.mode, 8) : undefined;

(function next () {
    if (paths.length === 0) return;
    var p = paths.shift();

    if (mode === undefined) mkdirp(p, cb)
    else mkdirp(p, mode, cb)

    function cb (err) {
        if (err) {
            console.error(err.message);
            process.exit(1);
        }
        else next();
    }
})();

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/mkdirp/bin/usage.txt

usage: mkdirp [DIR1,DIR2..] {OPTIONS}

  Create each supplied directory including any necessary parent directories
  that don't yet exist.

  If the directory already exists, do nothing.

OPTIONS are:

  -m, --mode   If a directory needs to be created, set the mode as an octal
               permission string.
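The CLI above resolves the `-m/--mode` flag by parsing the octal string with
`parseInt(str, 8)` before handing it to `mkdirp`. A minimal sketch of the same
conversion done programmatically (editor's illustration, not part of the
package; the `'0755'` string stands in for whatever a user would pass via
`--mode`):

```js
var mkdirp = require('mkdirp');

// same conversion cmd.js performs: octal permission string -> number
var modeString = '0755';            // hypothetical user input, e.g. from --mode
var mode = parseInt(modeString, 8); // 493 in decimal

mkdirp('/tmp/foo/from-script', mode, function (err, made) {
  if (err) console.error(err.message);
  else console.log('created', made || '(already existed)');
});
```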
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/.npmignore

# nothing here

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/.travis.yml

language: node_js
node_js:
  - 0.10
  - 0.11

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/benchmark.js

var m = require('./minimatch.js')
var pattern = "**/*.js"
var expand = require('brace-expansion')
var files = expand('x/y/z/{1..1000}.js')
var start = process.hrtime()
for (var i = 0; i < 1000; i++) {
  for (var f = 0; f < files.length; f++) {
    var res = m(pattern, files[f])
  }
  if (!(i%10)) process.stdout.write('.')
}
console.log('done')
var dur = process.hrtime(start)
console.log('%s ms', dur[0]*1e3 + dur[1]/1e6)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/browser.js
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������(function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o<r.length;o++)s(r[o]);return s})({1:[function(require,module,exports){ (function (process){ module.exports = minimatch minimatch.Minimatch = Minimatch var isWindows = false if (typeof process !== 'undefined' && process.platform === 'win32') isWindows = true var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} , expand = require("brace-expansion") // any single thing other than / // don't need to escape / when using new RegExp() , qmark = "[^/]" // * => any number of characters , star = qmark + "*?" // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. , twoStarDot = "(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?" // not a ^ or / followed by a dot, // followed by anything, any number of times. , twoStarNoDot = "(?:(?!(?:\\\/|^)\\.).)*?" // characters that need to be escaped in RegExp. , reSpecials = charSet("().*{}+?[]^$\\!") // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split("").reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} // shortcut: comments match nothing. 
if (!options.nocomment && pattern.charAt(0) === "#") { return false } // "" only matches "" if (pattern.trim() === "") return p === "" return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (isWindows) pattern = pattern.split("\\").join("/") this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function() {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === "#") { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return -1 === s.indexOf(false) }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern , negate = false , options = this.options , negateOffset = 0 if (options.nonegate) return for ( var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === "!" ; i ++) { negate = !negate negateOffset ++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) options = this.options else options = {} } pattern = typeof pattern === "undefined" ? this.pattern : pattern if (typeof pattern === "undefined") { throw new Error("undefined pattern") } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. 
// // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === "**") return GLOBSTAR if (pattern === "") return "" var re = "" , hasMagic = !!options.nocase , escaping = false // ? => one single character , patternListStack = [] , plType , stateChar , inClass = false , reClassStart = -1 , classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. , patternStart = pattern.charAt(0) === "." ? "" // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? "(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))" : "(?!\\.)" , self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case "*": re += star hasMagic = true break case "?": re += qmark hasMagic = true break default: re += "\\"+stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for ( var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i ++ ) { this.debug("%s\t%s %s %j", pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += "\\" + c escaping = false continue } SWITCH: switch (c) { case "/": // completely not allowed, even escaped. // Should already be path-split by now. return false case "\\": clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case "?": case "*": case "+": case "@": case "!": this.debug("%s\t%s %s %j <-- stateChar", pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === "!" && i === classStart + 1) c = "^" re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case "(": if (inClass) { re += "(" continue } if (!stateChar) { re += "\\(" continue } plType = stateChar patternListStack.push({ type: plType , start: i - 1 , reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === "!" ? "(?:(?!" 
: "(?:" this.debug('plType %j %j', stateChar, re) stateChar = false continue case ")": if (inClass || !patternListStack.length) { re += "\\)" continue } clearStateChar() hasMagic = true re += ")" plType = patternListStack.pop().type // negation is (?:(?!js)[^/]*) // The others are (?:<pattern>)<type> switch (plType) { case "!": re += "[^/]*?)" break case "?": case "+": case "*": re += plType case "@": break // the default anyway } continue case "|": if (inClass || !patternListStack.length || escaping) { re += "\\|" escaping = false continue } clearStateChar() re += "|" continue // these are mostly the same in regexp and glob case "[": // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += "\\" + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case "]": // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += "\\" + c escaping = false continue } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === "^" && inClass)) { re += "\\" } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is var cs = pattern.substr(classStart + 1) , sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + "\\[" + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. var pl while (pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = "\\" } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + "|" }) this.debug("tail=%j\n %s", tail, tail) var t = pl.type === "*" ? star : pl.type === "?" ? qmark : "\\" + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + "\\(" + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += "\\\\" } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case ".": case "[": case "(": addPatternStart = true } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. 
if (re !== "" && hasMagic) re = "(?=.)" + re if (addPatternStart) re = patternStart + re // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [ re, hasMagic ] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? "i" : "" , regExp = new RegExp("^" + re + "$", flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) return this.regexp = false var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot , flags = options.nocase ? "i" : "" var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === "string") ? regExpEscape(p) : p._src }).join("\\\/") }).join("|") // must match entire pattern // ending in a * or ** will make it less strict. re = "^(?:" + re + ")$" // can match anything, as long as it's not this. if (this.negate) re = "^(?!" + re + ").*$" try { return this.regexp = new RegExp(re, flags) } catch (ex) { return this.regexp = false } } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug("match", f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === "" if (f === "/" && partial) return true var options = this.options // windows: need to use /, not \ if (isWindows) f = f.split("\\").join("/") // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, "split", f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. var set = this.set this.debug(this.pattern, "set", set) // Find the basename of the path by looking for the last non-empty segment var filename; for (var i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (var i = 0, l = set.length; i < l; i ++) { var pattern = set[i], file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. 
Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug("matchOne", { "this": this , file: file , pattern: pattern }) this.debug("matchOne", file.length, pattern.length) for ( var fi = 0 , pi = 0 , fl = file.length , pl = pattern.length ; (fi < fl) && (pi < pl) ; fi ++, pi ++ ) { this.debug("matchOne loop") var p = pattern[pi] , f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi , pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for ( ; fi < fl; fi ++) { if (file[fi] === "." || file[fi] === ".." || (!options.dot && file[fi].charAt(0) === ".")) return false } return true } // ok, let's see if we can swallow whatever we can. WHILE: while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === "." || swallowee === ".." || (!options.dot && swallowee.charAt(0) === ".")) { this.debug("dot detected!", file, fr, pattern, pr) break WHILE } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr ++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug("\n>>> no match, partial?", file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === "string") { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug("string match", p, f, hit) } else { hit = f.match(p) this.debug("pattern match", p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. 
if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === "") return emptyFileEnd } // should be unreachable. throw new Error("wtf?") } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, "$1") } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&") } }).call(this,require('_process')) },{"_process":5,"brace-expansion":2}],2:[function(require,module,exports){ var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; var expansions = expand(escapeBraces(str)); return expansions.filter(identity).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0]).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. 
// no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { expansions.push([pre, N[j], post[k]].join('')) } } return expansions; } },{"balanced-match":3,"concat-map":4}],3:[function(require,module,exports){ module.exports = balanced; function balanced(a, b, str) { var bal = 0; var m = {}; var ended = false; for (var i = 0; i < str.length; i++) { if (a == str.substr(i, a.length)) { if (!('start' in m)) m.start = i; bal++; } else if (b == str.substr(i, b.length) && 'start' in m) { ended = true; bal--; if (!bal) { m.end = i; m.pre = str.substr(0, m.start); m.body = (m.end - m.start > 1) ? str.substring(m.start + a.length, m.end) : ''; m.post = str.slice(m.end + b.length); return m; } } } // if we opened more than we closed, find the one we closed if (bal && ended) { var start = m.start + a.length; m = balanced(a, b, str.substr(start)); if (m) { m.start += start; m.end += start; m.pre = str.slice(0, start) + m.pre; } return m; } } },{}],4:[function(require,module,exports){ module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (Array.isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; },{}],5:[function(require,module,exports){ // shim for using process in browser var process = module.exports = {}; process.nextTick = (function () { var canSetImmediate = typeof window !== 'undefined' && window.setImmediate; var canMutationObserver = typeof window !== 'undefined' && window.MutationObserver; var canPost = typeof window !== 'undefined' && window.postMessage && window.addEventListener ; if (canSetImmediate) { return function (f) { return window.setImmediate(f) }; } var queue = []; if (canMutationObserver) { var hiddenDiv = document.createElement("div"); var observer = new MutationObserver(function () { var queueList = queue.slice(); queue.length = 0; queueList.forEach(function (fn) { fn(); }); }); observer.observe(hiddenDiv, { attributes: true }); return function nextTick(fn) { if (!queue.length) { hiddenDiv.setAttribute('yes', 'no'); } queue.push(fn); }; } if (canPost) { window.addEventListener('message', function (ev) { var source = ev.source; if ((source === window || source === null) && ev.data === 'process-tick') { ev.stopPropagation(); if (queue.length > 0) { var fn = queue.shift(); fn(); } } }, true); return function nextTick(fn) { queue.push(fn); window.postMessage('process-tick', '*'); }; } return function nextTick(fn) { setTimeout(fn, 0); }; })(); process.title = 'browser'; process.browser = true; process.env = {}; process.argv = []; function noop() {} process.on = noop; process.addListener = noop; process.once = noop; process.off = noop; process.removeListener = noop; process.removeAllListeners = noop; 
process.emit = noop; process.binding = function (name) { throw new Error('process.binding is not supported'); }; // TODO(shtylman) process.cwd = function () { return '/' }; process.chdir = function (dir) { throw new Error('process.chdir is not supported'); }; },{}]},{},[1]); iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/LICENSE��������������������������000644 �000766 �000024 �00000002104 12455173731 026332� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/minimatch.js���������������������000644 �000766 �000024 �00000054441 12455173731 027647� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = minimatch minimatch.Minimatch = Minimatch var isWindows = false if (typeof process !== 'undefined' && process.platform === 'win32') isWindows = true var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} , expand = require("brace-expansion") // any single thing other than / // don't need to escape / when using new RegExp() , qmark = "[^/]" // * => any number of characters , star = qmark + "*?" // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. , twoStarDot = "(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?" 
// not a ^ or / followed by a dot, // followed by anything, any number of times. , twoStarNoDot = "(?:(?!(?:\\\/|^)\\.).)*?" // characters that need to be escaped in RegExp. , reSpecials = charSet("().*{}+?[]^$\\!") // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split("").reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} // shortcut: comments match nothing. if (!options.nocomment && pattern.charAt(0) === "#") { return false } // "" only matches "" if (pattern.trim() === "") return p === "" return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (isWindows) pattern = pattern.split("\\").join("/") this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function() {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === "#") { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. 
set = set.filter(function (s) { return -1 === s.indexOf(false) }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern , negate = false , options = this.options , negateOffset = 0 if (options.nonegate) return for ( var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === "!" ; i ++) { negate = !negate negateOffset ++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) options = this.options else options = {} } pattern = typeof pattern === "undefined" ? this.pattern : pattern if (typeof pattern === "undefined") { throw new Error("undefined pattern") } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === "**") return GLOBSTAR if (pattern === "") return "" var re = "" , hasMagic = !!options.nocase , escaping = false // ? => one single character , patternListStack = [] , plType , stateChar , inClass = false , reClassStart = -1 , classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. , patternStart = pattern.charAt(0) === "." ? "" // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? "(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))" : "(?!\\.)" , self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case "*": re += star hasMagic = true break case "?": re += qmark hasMagic = true break default: re += "\\"+stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for ( var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i ++ ) { this.debug("%s\t%s %s %j", pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += "\\" + c escaping = false continue } SWITCH: switch (c) { case "/": // completely not allowed, even escaped. // Should already be path-split by now. return false case "\\": clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. 
case "?": case "*": case "+": case "@": case "!": this.debug("%s\t%s %s %j <-- stateChar", pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === "!" && i === classStart + 1) c = "^" re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case "(": if (inClass) { re += "(" continue } if (!stateChar) { re += "\\(" continue } plType = stateChar patternListStack.push({ type: plType , start: i - 1 , reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === "!" ? "(?:(?!" : "(?:" this.debug('plType %j %j', stateChar, re) stateChar = false continue case ")": if (inClass || !patternListStack.length) { re += "\\)" continue } clearStateChar() hasMagic = true re += ")" plType = patternListStack.pop().type // negation is (?:(?!js)[^/]*) // The others are (?:<pattern>)<type> switch (plType) { case "!": re += "[^/]*?)" break case "?": case "+": case "*": re += plType case "@": break // the default anyway } continue case "|": if (inClass || !patternListStack.length || escaping) { re += "\\|" escaping = false continue } clearStateChar() re += "|" continue // these are mostly the same in regexp and glob case "[": // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += "\\" + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case "]": // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += "\\" + c escaping = false continue } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === "^" && inClass)) { re += "\\" } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is var cs = pattern.substr(classStart + 1) , sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + "\\[" + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. var pl while (pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = "\\" } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. 
As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + "|" }) this.debug("tail=%j\n %s", tail, tail) var t = pl.type === "*" ? star : pl.type === "?" ? qmark : "\\" + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + "\\(" + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += "\\\\" } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case ".": case "[": case "(": addPatternStart = true } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== "" && hasMagic) re = "(?=.)" + re if (addPatternStart) re = patternStart + re // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [ re, hasMagic ] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? "i" : "" , regExp = new RegExp("^" + re + "$", flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) return this.regexp = false var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot , flags = options.nocase ? "i" : "" var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === "string") ? regExpEscape(p) : p._src }).join("\\\/") }).join("|") // must match entire pattern // ending in a * or ** will make it less strict. re = "^(?:" + re + ")$" // can match anything, as long as it's not this. if (this.negate) re = "^(?!" + re + ").*$" try { return this.regexp = new RegExp(re, flags) } catch (ex) { return this.regexp = false } } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug("match", f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === "" if (f === "/" && partial) return true var options = this.options // windows: need to use /, not \ if (isWindows) f = f.split("\\").join("/") // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, "split", f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. 
var set = this.set this.debug(this.pattern, "set", set) // Find the basename of the path by looking for the last non-empty segment var filename; for (var i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (var i = 0, l = set.length; i < l; i ++) { var pattern = set[i], file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug("matchOne", { "this": this , file: file , pattern: pattern }) this.debug("matchOne", file.length, pattern.length) for ( var fi = 0 , pi = 0 , fl = file.length , pl = pattern.length ; (fi < fl) && (pi < pl) ; fi ++, pi ++ ) { this.debug("matchOne loop") var p = pattern[pi] , f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi , pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for ( ; fi < fl; fi ++) { if (file[fi] === "." || file[fi] === ".." || (!options.dot && file[fi].charAt(0) === ".")) return false } return true } // ok, let's see if we can swallow whatever we can. WHILE: while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === "." || swallowee === ".." || (!options.dot && swallowee.charAt(0) === ".")) { this.debug("dot detected!", file, fr, pattern, pr) break WHILE } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr ++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. 
// If there's more *pattern* left, then if (partial) { // ran out of file this.debug("\n>>> no match, partial?", file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === "string") { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug("string match", p, f, hit) } else { hit = f.match(p) this.debug("pattern match", p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === "") return emptyFileEnd } // should be unreachable. throw new Error("wtf?") } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, "$1") } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&") } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/��������������������000755 �000766 �000024 �00000000000 12456115117 030000� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/package.json���������������������000644 �000766 �000024 �00000002751 12455173731 027623� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "minimatch", "description": "a glob matcher in javascript", "version": "2.0.1", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "main": "minimatch.js", "scripts": { "test": "tap test/*.js", "prepublish": "browserify -o browser.js -e minimatch.js" }, "engines": { "node": "*" }, "dependencies": { "brace-expansion": "^1.0.0" }, "devDependencies": { "browserify": "^6.3.3", "tap": "" }, "license": { "type": "MIT", "url": "http://github.com/isaacs/minimatch/raw/master/LICENSE" }, "gitHead": "eac219d8f665c8043fda9a1cd34eab9b006fae01", "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "homepage": "https://github.com/isaacs/minimatch", "_id": "minimatch@2.0.1", "_shasum": "6c3760b45f66ed1cd5803143ee8d372488f02c37", "_from": "minimatch@>=2.0.1 <2.1.0", "_npmVersion": "2.1.11", "_nodeVersion": "0.10.16", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "6c3760b45f66ed1cd5803143ee8d372488f02c37", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-2.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-2.0.1.tgz", "readme": "ERROR: No README data found!" } �����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/README.md������������������������000644 �000766 �000024 �00000014714 12455173731 026616� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# minimatch A minimal matching utility. [![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch) This is the matching library used internally by npm. It works by converting glob expressions into JavaScript `RegExp` objects. ## Usage ```javascript var minimatch = require("minimatch") minimatch("bar.foo", "*.foo") // true! minimatch("bar.foo", "*.bar") // false! minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! ``` ## Features Supports these glob features: * Brace Expansion * Extended glob matching * "Globstar" `**` matching See: * `man sh` * `man bash` * `man 3 fnmatch` * `man 5 gitignore` ## Minimatch Class Create a minimatch object by instanting the `minimatch.Minimatch` class. ```javascript var Minimatch = require("minimatch").Minimatch var mm = new Minimatch(pattern, options) ``` ### Properties * `pattern` The original pattern the minimatch object represents. * `options` The options supplied to the constructor. * `set` A 2-dimensional array of regexp or string expressions. Each row in the array corresponds to a brace-expanded pattern. Each item in the row corresponds to a single path-part. For example, the pattern `{a,b/c}/d` would expand to a set of patterns like: [ [ a, d ] , [ b, c, d ] ] If a portion of the pattern doesn't have any "magic" in it (that is, it's something like `"foo"` rather than `fo*o?`), then it will be left as a string rather than converted to a regular expression. * `regexp` Created by the `makeRe` method. A single regular expression expressing the entire pattern. This is useful in cases where you wish to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled. 
* `negate` True if the pattern is negated. * `comment` True if the pattern is a comment. * `empty` True if the pattern is `""`. ### Methods * `makeRe` Generate the `regexp` member if necessary, and return it. Will return `false` if the pattern is invalid. * `match(fname)` Return true if the filename matches the pattern, or false otherwise. * `matchOne(fileArray, patternArray, partial)` Take a `/`-split filename, and match it against a single row in the `regExpSet`. This method is mainly for internal use, but is exposed so that it can be used by a glob-walker that needs to avoid excessive filesystem calls. All other methods are internal, and will be called as necessary. ## Functions The top-level exported function has a `cache` property, which is an LRU cache set to store 100 items. So, calling these methods repeatedly with the same pattern and options will use the same Minimatch object, saving the cost of parsing it multiple times. ### minimatch(path, pattern, options) Main export. Tests a path against the pattern using the options. ```javascript var isJS = minimatch(file, "*.js", { matchBase: true }) ``` ### minimatch.filter(pattern, options) Returns a function that tests its supplied argument, suitable for use with `Array.filter`. Example: ```javascript var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) ``` ### minimatch.match(list, pattern, options) Match against the list of files, in the style of fnmatch or glob. If nothing is matched, and options.nonull is set, then return a list containing the pattern itself. ```javascript var javascripts = minimatch.match(fileList, "*.js", {matchBase: true})) ``` ### minimatch.makeRe(pattern, options) Make a regular expression object from the pattern. ## Options All options are `false` by default. ### debug Dump a ton of stuff to stderr. ### nobrace Do not expand `{a,b}` and `{1..3}` brace sets. ### noglobstar Disable `**` matching against multiple folder names. ### dot Allow patterns to match filenames starting with a period, even if the pattern does not explicitly have a period in that spot. Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` is set. ### noext Disable "extglob" style patterns like `+(a|b)`. ### nocase Perform a case-insensitive match. ### nonull When a match is not found by `minimatch.match`, return a list containing the pattern itself if this option is set. When not set, an empty list is returned if there are no matches. ### matchBase If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. ### nocomment Suppress the behavior of treating `#` at the start of a pattern as a comment. ### nonegate Suppress the behavior of treating a leading `!` character as negation. ### flipNegate Returns from negate expressions the same as if they were not negated. (Ie, true on a hit, false on a miss.) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between minimatch and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. 
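
To make the negation rules above concrete, here is a small illustrative sketch (the file names are invented for the example, not taken from the library's tests):

```javascript
var minimatch = require("minimatch")

minimatch("bar.foo", "!*.foo")   // false: the pattern matches, so negation turns the hit into a miss
minimatch("bar.baz", "!*.foo")   // true: paths that do not match a negated pattern pass
minimatch("bar.foo", "!!*.foo")  // true: two leading "!" characters negate twice, cancelling out
minimatch("!bar.foo", "!*.foo", { nonegate: true })  // true: with nonegate, the leading "!" is matched literally
```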
If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. ����������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/����000755 �000766 �000024 �00000000000 12456115117 033056� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/.npmignore�����������������000644 �000766 �000024 �00000000023 12455173731 034776� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������node_modules *.sw* �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/.travis.yml����������������000644 �000766 �000024 �00000000046 12455173731 035115� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.10" 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/example.js�����������������000644 �000766 �000024 �00000000615 12455173731 034777� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var expand = require('./'); console.log(expand('http://any.org/archive{1996..1999}/vol{1..4}/part{a,b,c}.html')); console.log(expand('http://www.numericals.com/file{1..100..10}.txt')); console.log(expand('http://www.letters.com/file{a..z..2}.txt')); console.log(expand('mkdir /usr/local/src/bash/{old,new,dist,bugs}')); console.log(expand('chown root /usr/{ucb/{ex,edit},lib/{ex?.?*,how_ex}}')); �������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/index.bak������������������000644 �000766 �000024 �00000010633 12455173731 034575� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? 
parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; return expand(escapeBraces(str), true).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } var exprCommaBrace = /,.*}/; var exprDollarEnd = /\$$/; var exprNumericSeq = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/; var exprAlphaSeq = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/; var exprIsOptions = /,/; function expand(str, isTop) { var expansions = []; var m = balanced('{', '}', str); if (!m || exprDollarEnd.test(m.pre)) return [str]; var isNumericSequence = exprNumericSeq.test(m.body); var isAlphaSequence = exprAlphaSeq.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = exprIsOptions.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (exprCommaBrace.test(m.post)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str, false); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0], false).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post, false) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post, false) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? 
Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el, false) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { var expansion = pre + N[j] + post[k]; if (!isTop || isSequence || expansion) expansions.push(expansion); } } return expansions; } �����������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/index.js�������������������000644 �000766 �000024 �00000010356 12455173731 034456� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; return expand(escapeBraces(str), true).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str, isTop) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return 
expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0], false).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post, false) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post, false) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el, false) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { var expansion = pre + N[j] + post[k]; if (!isTop || isSequence || expansion) expansions.push(expansion); } } return expansions; } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/��������������000755 �000766 �000024 �00000000000 12456115117 035454� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/package.json���������������000644 �000766 �000024 �00000003625 12455173731 035300� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "brace-expansion", "description": "Brace expansion as known from sh/bash", "version": "1.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/brace-expansion.git" }, "homepage": "https://github.com/juliangruber/brace-expansion", "main": "index.js", "scripts": { "test": "tape test/*.js", "gentest": "bash test/generate.sh" }, "dependencies": { "balanced-match": "^0.2.0", "concat-map": "0.0.0" }, "devDependencies": { "tape": "~1.1.1" }, "keywords": [], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", 
"opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "ceba9627f19c590feb7df404e1d6c41f8c01b93a", "bugs": { "url": "https://github.com/juliangruber/brace-expansion/issues" }, "_id": "brace-expansion@1.0.1", "_shasum": "817708d72ab27a8c312d25efababaea963439ed5", "_from": "brace-expansion@>=1.0.0 <2.0.0", "_npmVersion": "2.1.11", "_nodeVersion": "0.10.16", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" }, { "name": "isaacs", "email": "isaacs@npmjs.com" } ], "dist": { "shasum": "817708d72ab27a8c312d25efababaea963439ed5", "tarball": "http://registry.npmjs.org/brace-expansion/-/brace-expansion-1.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.0.1.tgz", "readme": "ERROR: No README data found!" } �����������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/README.md������������������000644 �000766 �000024 �00000006360 12455173731 034270� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# brace-expansion [Brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html), as known from sh/bash, in JavaScript. [![build status](https://secure.travis-ci.org/juliangruber/brace-expansion.png)](http://travis-ci.org/juliangruber/brace-expansion) [![testling badge](https://ci.testling.com/juliangruber/brace-expansion.png)](https://ci.testling.com/juliangruber/brace-expansion) ## Example ```js var expand = require('brace-expansion'); expand('file-{a,b,c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('-v{,,}') // => ['-v', '-v', '-v'] expand('file{0..2}.jpg') // => ['file0.jpg', 'file1.jpg', 'file2.jpg'] expand('file-{a..c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('file{2..0}.jpg') // => ['file2.jpg', 'file1.jpg', 'file0.jpg'] expand('file{0..4..2}.jpg') // => ['file0.jpg', 'file2.jpg', 'file4.jpg'] expand('file-{a..e..2}.jpg') // => ['file-a.jpg', 'file-c.jpg', 'file-e.jpg'] expand('file{00..10..5}.jpg') // => ['file00.jpg', 'file05.jpg', 'file10.jpg'] expand('{{A..C},{a..c}}') // => ['A', 'B', 'C', 'a', 'b', 'c'] expand('ppp{,config,oe{,conf}}') // => ['ppp', 'pppconfig', 'pppoe', 'pppoeconf'] ``` ## API ```js var expand = require('brace-expansion'); ``` ### var expanded = expand(str) Return an array of all possible and valid expansions of `str`. If none are found, `[str]` is returned. Valid expansions are: ```js /^(.*,)+(.+)?$/ // {a,b,...} ``` A comma seperated list of options, like `{a,b}` or `{a,{b,c}}` or `{,a,}`. ```js /^-?\d+\.\.-?\d+(\.\.-?\d+)?$/ // {x..y[..incr]} ``` A numeric sequence from `x` to `y` inclusive, with optional increment. If `x` or `y` start with a leading `0`, all the numbers will be padded to have equal length. Negative numbers and backwards iteration work too. ```js /^-?\d+\.\.-?\d+(\.\.-?\d+)?$/ // {x..y[..incr]} ``` An alphabetic sequence from `x` to `y` inclusive, with optional increment. `x` and `y` must be exactly one character, and if given, `incr` must be a number. 
For compatibility reasons, the string `${` is not eligible for brace expansion. ## Installation With [npm](https://npmjs.org) do: ```bash npm install brace-expansion ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/���000755 �000766 �000024 �00000000000 12456115117 040277� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/���000755 �000766 �000024 �00000000000 12456115117 037476� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/.travis.yml���������000644 �000766 �000024 �00000000053 12455173731 041612� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.4 - 0.6 
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/������������000755 �000766 �000024 �00000000000 12456115117 041131� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/index.js������������000644 �000766 �000024 �00000000350 12455173731 041146� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (Array.isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/package.json��������000644 �000766 �000024 �00000002753 12455173731 042000� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������{ "name": "concat-map", "description": "concatenative mapdashery", "version": "0.0.0", "repository": { "type": "git", "url": "git://github.com/substack/node-concat-map.git" }, "main": "index.js", "keywords": [ "concat", "map", "functional", "higher-order" ], "directories": { "example": "example", "test": "test" }, "scripts": { "test": "tap test/*.js" }, "devDependencies": { "tap": "~0.2.5" }, "engines": { "node": ">=0.4.0" }, "license": "MIT", "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "_id": "concat-map@0.0.0", "dependencies": {}, "optionalDependencies": {}, "_engineSupported": true, "_npmVersion": "1.1.19", "_nodeVersion": "v0.6.11", "_defaultsLoaded": true, "dist": { "shasum": "604be9c2afb6dc9ba8182e3ff294fdd48e238e6d", "tarball": "http://registry.npmjs.org/concat-map/-/concat-map-0.0.0.tgz" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "_shasum": "604be9c2afb6dc9ba8182e3ff294fdd48e238e6d", "_resolved": 
"https://registry.npmjs.org/concat-map/-/concat-map-0.0.0.tgz", "_from": "concat-map@0.0.0", "bugs": { "url": "https://github.com/substack/node-concat-map/issues" }, "readme": "ERROR: No README data found!", "homepage": "https://github.com/substack/node-concat-map" } ���������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/README.markdown�����000644 �000766 �000024 �00000002021 12455173731 042177� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������concat-map ========== Concatenative mapdashery. [![build status](https://secure.travis-ci.org/substack/node-concat-map.png)](http://travis-ci.org/substack/node-concat-map) example ======= ``` js var concatMap = require('concat-map'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ``` *** ``` [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ] ``` methods ======= ``` js var concatMap = require('concat-map') ``` concatMap(xs, fn) ----------------- Return an array of concatenated elements by calling `fn(x, i)` for each element `x` and each index `i` in the array `xs`. When `fn(x, i)` returns an array, its result will be concatenated with the result array. If `fn(x, i)` returns anything else, that value will be pushed onto the end of the result array. install ======= With [npm](http://npmjs.org) do: ``` npm install concat-map ``` license ======= MIT notes ===== This module was written while sitting high above the ground in a tree. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/map.js������000644 �000766 �000024 �00000000253 12455173731 042251� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������var concatMap = require('../'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? 
==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.npmignore <==

node_modules
.DS_Store

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.travis.yml <==

language: node_js
node_js:
  - "0.8"
  - "0.10"

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/example.js <==

var balanced = require('./');

console.log(balanced('{', '}', 'pre{in{nested}}post'));
console.log(balanced('{', '}', 'pre{first}between{second}post'));
==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/index.js <==

module.exports = balanced;
function balanced(a, b, str) {
  var bal = 0;
  var m = {};
  var ended = false;

  for (var i = 0; i < str.length; i++) {
    if (a == str.substr(i, a.length)) {
      if (!('start' in m)) m.start = i;
      bal++;
    }
    else if (b == str.substr(i, b.length) && 'start' in m) {
      ended = true;
      bal--;
      if (!bal) {
        m.end = i;
        m.pre = str.substr(0, m.start);
        m.body = (m.end - m.start > 1)
          ? str.substring(m.start + a.length, m.end)
          : '';
        m.post = str.slice(m.end + b.length);
        return m;
      }
    }
  }

  // if we opened more than we closed, find the one we closed
  if (bal && ended) {
    var start = m.start + a.length;
    m = balanced(a, b, str.substr(start));
    if (m) {
      m.start += start;
      m.end += start;
      m.pre = str.slice(0, start) + m.pre;
    }
    return m;
  }
}

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/Makefile <==

test:
	@node_modules/.bin/tape test/*.js

.PHONY: test

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/package.json <==

{
  "name": "balanced-match",
  "description": "Match balanced character pairs, like \"{\" and \"}\"",
  "version": "0.2.0",
  "repository": {
    "type": "git",
    "url": "git://github.com/juliangruber/balanced-match.git"
  },
  "homepage": "https://github.com/juliangruber/balanced-match",
  "main": "index.js",
"scripts": { "test": "make test" }, "dependencies": {}, "devDependencies": { "tape": "~1.1.1" }, "keywords": [ "match", "regexp", "test", "balanced", "parse" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "ba40ed78e7114a4a67c51da768a100184dead39c", "bugs": { "url": "https://github.com/juliangruber/balanced-match/issues" }, "_id": "balanced-match@0.2.0", "_shasum": "38f6730c03aab6d5edbb52bd934885e756d71674", "_from": "balanced-match@>=0.2.0 <0.3.0", "_npmVersion": "2.1.8", "_nodeVersion": "0.10.32", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "dist": { "shasum": "38f6730c03aab6d5edbb52bd934885e756d71674", "tarball": "http://registry.npmjs.org/balanced-match/-/balanced-match-0.2.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-0.2.0.tgz", "readme": "ERROR: No README data found!" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/README.md�������000644 �000766 �000024 �00000005173 12455173731 041571� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������# balanced-match Match balanced string pairs, like `{` and `}` or `<b>` and `</b>`. [![build status](https://secure.travis-ci.org/juliangruber/balanced-match.svg)](http://travis-ci.org/juliangruber/balanced-match) [![downloads](https://img.shields.io/npm/dm/balanced-match.svg)](https://www.npmjs.org/package/balanced-match) [![testling badge](https://ci.testling.com/juliangruber/balanced-match.png)](https://ci.testling.com/juliangruber/balanced-match) ## Example Get the first matching pair of braces: ```js var balanced = require('balanced-match'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ``` The matches are: ```bash $ node example.js { start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' } { start: 3, end: 9, pre: 'pre', body: 'first', post: 'between{second}post' } ``` ## API ### var m = balanced(a, b, str) For the first non-nested matching pair of `a` and `b` in `str`, return an object with those keys: * **start** the index of the first match of `a` * **end** the index of the matching `b` * **pre** the preamble, `a` and `b` not included * **body** the match, `a` and `b` not included * **post** the postscript, `a` and `b` not included If there's no match, `undefined` will be returned. If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']`. 
## Installation

With [npm](https://npmjs.org) do:

```bash
npm install balanced-match
```

## License

(MIT)

Copyright (c) 2013 Julian Gruber <julian@juliangruber.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/.npmignore <==

/node_modules

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/CONTRIBUTORS <==

# Authors, sorted by whether or not they are me
Isaac Z. Schlueter <i@izs.me>
Brian Cottingham <spiffytech@gmail.com>
Carlos Brito Lage <carlos@carloslage.net>
Jesse Dailey <jesse.dailey@gmail.com>
Kevin O'Hara <kevinohara80@gmail.com>
Marco Rogers <marco.rogers@gmail.com>
Mark Cavage <mcavage@gmail.com>
Marko Mikulicic <marko.mikulicic@isti.cnr.it>
Nathan Rajlich <nathan@tootallnate.net>
Satheesh Natesan <snateshan@myspace-inc.com>
Trent Mick <trentm@gmail.com>
ashleybrener <ashley@starlogik.com>
n4kz <n4kz@n4kz.com>

==> iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/LICENSE <==

Copyright 2009, 2010, 2011 Isaac Z. Schlueter.
All rights reserved.

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
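The lru-cache files that follow (package.json, README.md, lib/lru-cache.js) document a small API: `set`, `get`, `peek`, `del`, `has`, `forEach`, `keys`, `values`, and `reset`. A minimal usage sketch against the 2.5.x API described in that README (the option values below are arbitrary examples, not defaults):

```js
var LRU = require('lru-cache');

// max is measured with the length function, so this cache holds roughly
// ten characters' worth of values; entries older than 50ms count as stale.
var cache = LRU({
  max: 10,
  maxAge: 50,
  length: function (n) { return String(n).length; },
  dispose: function (key, value) { console.log('dropped', key); }
});

cache.set('a', 'hello');
console.log(cache.get('a'));  // 'hello'
console.log(cache.peek('a')); // 'hello', without refreshing recency
console.log(cache.keys());    // [ 'a' ]
cache.del('a');               // runs dispose('a', 'hello')
console.log(cache.has('a'));  // false
```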
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/package.json���������������������000644 �000766 �000024 �00000010021 12455173731 027502� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "lru-cache", "description": "A cache object that deletes the least-recently-used items.", "version": "2.5.0", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me" }, "scripts": { "test": "tap test --gc" }, "main": "lib/lru-cache.js", "repository": { "type": "git", "url": "git://github.com/isaacs/node-lru-cache.git" }, "devDependencies": { "tap": "", "weak": "" }, "license": { "type": "MIT", "url": "http://github.com/isaacs/node-lru-cache/raw/master/LICENSE" }, "readme": "# lru cache\n\nA cache object that deletes the least-recently-used items.\n\n## Usage:\n\n```javascript\nvar LRU = require(\"lru-cache\")\n , options = { max: 500\n , length: function (n) { return n * 2 }\n , dispose: function (key, n) { n.close() }\n , maxAge: 1000 * 60 * 60 }\n , cache = LRU(options)\n , otherCache = LRU(50) // sets just the max size\n\ncache.set(\"key\", \"value\")\ncache.get(\"key\") // \"value\"\n\ncache.reset() // empty the cache\n```\n\nIf you put more stuff in it, then items will fall out.\n\nIf you try to put an oversized thing in it, then it'll fall out right\naway.\n\n## Options\n\n* `max` The maximum size of the cache, checked by applying the length\n function to all values in the cache. Not setting this is kind of\n silly, since that's the whole purpose of this lib, but it defaults\n to `Infinity`.\n* `maxAge` Maximum age in ms. Items are not pro-actively pruned out\n as they age, but if you try to get an item that is too old, it'll\n drop it and return undefined instead of giving it to you.\n* `length` Function that is used to calculate the length of stored\n items. If you're storing strings or buffers, then you probably want\n to do something like `function(n){return n.length}`. The default is\n `function(n){return 1}`, which is fine if you want to store `n`\n like-sized things.\n* `dispose` Function that is called on items when they are dropped\n from the cache. This can be handy if you want to close file\n descriptors or do other cleanup tasks when items are no longer\n accessible. Called with `key, value`. It's called *before*\n actually removing the item from the internal cache, so if you want\n to immediately put it back in, you'll have to do that in a\n `nextTick` or `setTimeout` callback or it won't do anything.\n* `stale` By default, if you set a `maxAge`, it'll only actually pull\n stale items out of the cache when you `get(key)`. (That is, it's\n not pre-emptively doing a `setTimeout` or anything.) If you set\n `stale:true`, it'll return the stale value before deleting it. 
If\n you don't set this, then it'll return `undefined` when you try to\n get a stale entry, as if it had already been deleted.\n\n## API\n\n* `set(key, value)`\n* `get(key) => value`\n\n Both of these will update the \"recently used\"-ness of the key.\n They do what you think.\n\n* `peek(key)`\n\n Returns the key value (or `undefined` if not found) without\n updating the \"recently used\"-ness of the key.\n\n (If you find yourself using this a lot, you *might* be using the\n wrong sort of data structure, but there are some use cases where\n it's handy.)\n\n* `del(key)`\n\n Deletes a key out of the cache.\n\n* `reset()`\n\n Clear the cache entirely, throwing away all values.\n\n* `has(key)`\n\n Check if a key is in the cache, without updating the recent-ness\n or deleting it for being stale.\n\n* `forEach(function(value,key,cache), [thisp])`\n\n Just like `Array.prototype.forEach`. Iterates over all the keys\n in the cache, in order of recent-ness. (Ie, more recently used\n items are iterated over first.)\n\n* `keys()`\n\n Return an array of the keys in the cache.\n\n* `values()`\n\n Return an array of the values in the cache.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/node-lru-cache/issues" }, "homepage": "https://github.com/isaacs/node-lru-cache", "_id": "lru-cache@2.5.0", "_from": "lru-cache@latest" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/README.md������������������������000644 �000766 �000024 �00000006211 12455173731 026501� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# lru cache A cache object that deletes the least-recently-used items. ## Usage: ```javascript var LRU = require("lru-cache") , options = { max: 500 , length: function (n) { return n * 2 } , dispose: function (key, n) { n.close() } , maxAge: 1000 * 60 * 60 } , cache = LRU(options) , otherCache = LRU(50) // sets just the max size cache.set("key", "value") cache.get("key") // "value" cache.reset() // empty the cache ``` If you put more stuff in it, then items will fall out. If you try to put an oversized thing in it, then it'll fall out right away. ## Options * `max` The maximum size of the cache, checked by applying the length function to all values in the cache. Not setting this is kind of silly, since that's the whole purpose of this lib, but it defaults to `Infinity`. * `maxAge` Maximum age in ms. Items are not pro-actively pruned out as they age, but if you try to get an item that is too old, it'll drop it and return undefined instead of giving it to you. * `length` Function that is used to calculate the length of stored items. If you're storing strings or buffers, then you probably want to do something like `function(n){return n.length}`. 
The default is `function(n){return 1}`, which is fine if you want to store `n` like-sized things. * `dispose` Function that is called on items when they are dropped from the cache. This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer accessible. Called with `key, value`. It's called *before* actually removing the item from the internal cache, so if you want to immediately put it back in, you'll have to do that in a `nextTick` or `setTimeout` callback or it won't do anything. * `stale` By default, if you set a `maxAge`, it'll only actually pull stale items out of the cache when you `get(key)`. (That is, it's not pre-emptively doing a `setTimeout` or anything.) If you set `stale:true`, it'll return the stale value before deleting it. If you don't set this, then it'll return `undefined` when you try to get a stale entry, as if it had already been deleted. ## API * `set(key, value)` * `get(key) => value` Both of these will update the "recently used"-ness of the key. They do what you think. * `peek(key)` Returns the key value (or `undefined` if not found) without updating the "recently used"-ness of the key. (If you find yourself using this a lot, you *might* be using the wrong sort of data structure, but there are some use cases where it's handy.) * `del(key)` Deletes a key out of the cache. * `reset()` Clear the cache entirely, throwing away all values. * `has(key)` Check if a key is in the cache, without updating the recent-ness or deleting it for being stale. * `forEach(function(value,key,cache), [thisp])` Just like `Array.prototype.forEach`. Iterates over all the keys in the cache, in order of recent-ness. (Ie, more recently used items are iterated over first.) * `keys()` Return an array of the keys in the cache. * `values()` Return an array of the values in the cache. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lru-cache/lib/lru-cache.js�����������������000644 �000766 �000024 �00000014455 12455173731 030202� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;(function () { // closure for web browsers if (typeof module === 'object' && module.exports) { module.exports = LRUCache } else { // just set the global for non-node platforms. this.LRUCache = LRUCache } function hOP (obj, key) { return Object.prototype.hasOwnProperty.call(obj, key) } function naiveLength () { return 1 } function LRUCache (options) { if (!(this instanceof LRUCache)) return new LRUCache(options) if (typeof options === 'number') options = { max: options } if (!options) options = {} this._max = options.max // Kind of weird to have a default max of Infinity, but oh well. 
if (!this._max || !(typeof this._max === "number") || this._max <= 0 ) this._max = Infinity this._lengthCalculator = options.length || naiveLength if (typeof this._lengthCalculator !== "function") this._lengthCalculator = naiveLength this._allowStale = options.stale || false this._maxAge = options.maxAge || null this._dispose = options.dispose this.reset() } // resize the cache when the max changes. Object.defineProperty(LRUCache.prototype, "max", { set : function (mL) { if (!mL || !(typeof mL === "number") || mL <= 0 ) mL = Infinity this._max = mL if (this._length > this._max) trim(this) } , get : function () { return this._max } , enumerable : true }) // resize the cache when the lengthCalculator changes. Object.defineProperty(LRUCache.prototype, "lengthCalculator", { set : function (lC) { if (typeof lC !== "function") { this._lengthCalculator = naiveLength this._length = this._itemCount for (var key in this._cache) { this._cache[key].length = 1 } } else { this._lengthCalculator = lC this._length = 0 for (var key in this._cache) { this._cache[key].length = this._lengthCalculator(this._cache[key].value) this._length += this._cache[key].length } } if (this._length > this._max) trim(this) } , get : function () { return this._lengthCalculator } , enumerable : true }) Object.defineProperty(LRUCache.prototype, "length", { get : function () { return this._length } , enumerable : true }) Object.defineProperty(LRUCache.prototype, "itemCount", { get : function () { return this._itemCount } , enumerable : true }) LRUCache.prototype.forEach = function (fn, thisp) { thisp = thisp || this var i = 0; for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { i++ var hit = this._lruList[k] if (this._maxAge && (Date.now() - hit.now > this._maxAge)) { del(this, hit) if (!this._allowStale) hit = undefined } if (hit) { fn.call(thisp, hit.value, hit.key, this) } } } LRUCache.prototype.keys = function () { var keys = new Array(this._itemCount) var i = 0 for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { var hit = this._lruList[k] keys[i++] = hit.key } return keys } LRUCache.prototype.values = function () { var values = new Array(this._itemCount) var i = 0 for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { var hit = this._lruList[k] values[i++] = hit.value } return values } LRUCache.prototype.reset = function () { if (this._dispose && this._cache) { for (var k in this._cache) { this._dispose(k, this._cache[k].value) } } this._cache = Object.create(null) // hash of items by key this._lruList = Object.create(null) // list of items in order of use recency this._mru = 0 // most recently used this._lru = 0 // least recently used this._length = 0 // number of items in the list this._itemCount = 0 } // Provided for debugging/dev purposes only. No promises whatsoever that // this API stays stable. LRUCache.prototype.dump = function () { return this._cache } LRUCache.prototype.dumpLru = function () { return this._lruList } LRUCache.prototype.set = function (key, value) { if (hOP(this._cache, key)) { // dispose of the old one before overwriting if (this._dispose) this._dispose(key, this._cache[key].value) if (this._maxAge) this._cache[key].now = Date.now() this._cache[key].value = value this.get(key) return true } var len = this._lengthCalculator(value) var age = this._maxAge ? Date.now() : 0 var hit = new Entry(key, value, this._mru++, len, age) // oversized objects fall out of cache automatically. 
if (hit.length > this._max) { if (this._dispose) this._dispose(key, value) return false } this._length += hit.length this._lruList[hit.lu] = this._cache[key] = hit this._itemCount ++ if (this._length > this._max) trim(this) return true } LRUCache.prototype.has = function (key) { if (!hOP(this._cache, key)) return false var hit = this._cache[key] if (this._maxAge && (Date.now() - hit.now > this._maxAge)) { return false } return true } LRUCache.prototype.get = function (key) { return get(this, key, true) } LRUCache.prototype.peek = function (key) { return get(this, key, false) } LRUCache.prototype.pop = function () { var hit = this._lruList[this._lru] del(this, hit) return hit || null } LRUCache.prototype.del = function (key) { del(this, this._cache[key]) } function get (self, key, doUse) { var hit = self._cache[key] if (hit) { if (self._maxAge && (Date.now() - hit.now > self._maxAge)) { del(self, hit) if (!self._allowStale) hit = undefined } else { if (doUse) use(self, hit) } if (hit) hit = hit.value } return hit } function use (self, hit) { shiftLU(self, hit) hit.lu = self._mru ++ self._lruList[hit.lu] = hit } function trim (self) { while (self._lru < self._mru && self._length > self._max) del(self, self._lruList[self._lru]) } function shiftLU (self, hit) { delete self._lruList[ hit.lu ] while (self._lru < self._mru && !self._lruList[self._lru]) self._lru ++ } function del (self, hit) { if (hit) { if (self._dispose) self._dispose(hit.key, hit.value) self._length -= hit.length self._itemCount -- delete self._cache[ hit.key ] shiftLU(self, hit) } } // classy, since V8 prefers predictable objects. function Entry (key, value, lu, length, now) { this.key = key this.value = value this.lu = lu this.length = length this.now = now } })() �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lockfile/LICENSE���������������������������000644 �000766 �000024 �00000002436 12455173731 026161� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lockfile/lockfile.js�����������������������000644 �000766 �000024 �00000020042 12455173731 027273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var wx = 'wx' if (process.version.match(/^v0\.[0-6]/)) { var c = require('constants') wx = c.O_TRUNC | c.O_CREAT | c.O_WRONLY | c.O_EXCL } var os = require('os') exports.filetime = 'ctime' if (os.platform() == "win32") { exports.filetime = 'mtime' } var debug var util = require('util') if (util.debuglog) debug = util.debuglog('LOCKFILE') else if (/\blockfile\b/i.test(process.env.NODE_DEBUG)) debug = function() { var msg = util.format.apply(util, arguments) console.error('LOCKFILE %d %s', process.pid, msg) } else debug = function() {} var locks = {} function hasOwnProperty (obj, prop) { return Object.prototype.hasOwnProperty.call(obj, prop) } process.on('exit', function () { debug('exit listener') // cleanup Object.keys(locks).forEach(exports.unlockSync) }) // XXX https://github.com/joyent/node/issues/3555 // Remove when node 0.8 is deprecated. if (/^v0\.[0-8]\./.test(process.version)) { debug('uncaughtException, version = %s', process.version) process.on('uncaughtException', function H (er) { debug('uncaughtException') var l = process.listeners('uncaughtException').filter(function (h) { return h !== H }) if (!l.length) { // cleanup try { Object.keys(locks).forEach(exports.unlockSync) } catch (e) {} process.removeListener('uncaughtException', H) throw er } }) } exports.unlock = function (path, cb) { debug('unlock', path) // best-effort. unlocking an already-unlocked lock is a noop delete locks[path] fs.unlink(path, function (unlinkEr) { cb() }) } exports.unlockSync = function (path) { debug('unlockSync', path) // best-effort. unlocking an already-unlocked lock is a noop try { fs.unlinkSync(path) } catch (er) {} delete locks[path] } // if the file can be opened in readonly mode, then it's there. // if the error is something other than ENOENT, then it's not. 
exports.check = function (path, opts, cb) { if (typeof opts === 'function') cb = opts, opts = {} debug('check', path, opts) fs.open(path, 'r', function (er, fd) { if (er) { if (er.code !== 'ENOENT') return cb(er) return cb(null, false) } if (!opts.stale) { return fs.close(fd, function (er) { return cb(er, true) }) } fs.fstat(fd, function (er, st) { if (er) return fs.close(fd, function (er2) { return cb(er) }) fs.close(fd, function (er) { var age = Date.now() - st[exports.filetime].getTime() return cb(er, age <= opts.stale) }) }) }) } exports.checkSync = function (path, opts) { opts = opts || {} debug('checkSync', path, opts) if (opts.wait) { throw new Error('opts.wait not supported sync for obvious reasons') } try { var fd = fs.openSync(path, 'r') } catch (er) { if (er.code !== 'ENOENT') throw er return false } if (!opts.stale) { try { fs.closeSync(fd) } catch (er) {} return true } // file exists. however, might be stale if (opts.stale) { try { var st = fs.fstatSync(fd) } finally { fs.closeSync(fd) } var age = Date.now() - st[exports.filetime].getTime() return (age <= opts.stale) } } var req = 1 exports.lock = function (path, opts, cb) { if (typeof opts === 'function') cb = opts, opts = {} opts.req = opts.req || req++ debug('lock', path, opts) opts.start = opts.start || Date.now() if (typeof opts.retries === 'number' && opts.retries > 0) { debug('has retries', opts.retries) var retries = opts.retries opts.retries = 0 cb = (function (orig) { return function cb (er, fd) { debug('retry-mutated callback') retries -= 1 if (!er || retries < 0) return orig(er, fd) debug('lock retry', path, opts) if (opts.retryWait) setTimeout(retry, opts.retryWait) else retry() function retry () { opts.start = Date.now() debug('retrying', opts.start) exports.lock(path, opts, cb) } }})(cb) } // try to engage the lock. // if this succeeds, then we're in business. fs.open(path, wx, function (er, fd) { if (!er) { debug('locked', path, fd) locks[path] = fd return fs.close(fd, function () { return cb() }) } // something other than "currently locked" // maybe eperm or something. if (er.code !== 'EEXIST') return cb(er) // someone's got this one. see if it's valid. if (!opts.stale) return notStale(er, path, opts, cb) return maybeStale(er, path, opts, false, cb) }) } // Staleness checking algorithm // 1. acquire $lock, fail // 2. stat $lock, find that it is stale // 3. acquire $lock.STALE // 4. stat $lock, assert that it is still stale // 5. unlink $lock // 6. link $lock.STALE $lock // 7. unlink $lock.STALE // On any failure, clean up whatever we've done, and raise the error. function maybeStale (originalEr, path, opts, hasStaleLock, cb) { fs.stat(path, function (statEr, st) { if (statEr) { if (statEr.code === 'ENOENT') { // expired already! opts.stale = false debug('lock stale enoent retry', path, opts) exports.lock(path, opts, cb) return } return cb(statEr) } var age = Date.now() - st[exports.filetime].getTime() if (age <= opts.stale) return notStale(originalEr, path, opts, cb) debug('lock stale', path, opts) if (hasStaleLock) { exports.unlock(path, function (er) { if (er) return cb(er) debug('lock stale retry', path, opts) fs.link(path + '.STALE', path, function (er) { fs.unlink(path + '.STALE', function () { // best effort. if the unlink fails, oh well. 
cb(er) }) }) }) } else { debug('acquire .STALE file lock', opts) exports.lock(path + '.STALE', opts, function (er) { if (er) return cb(er) maybeStale(originalEr, path, opts, true, cb) }) } }) } function notStale (er, path, opts, cb) { debug('notStale', path, opts) // if we can't wait, then just call it a failure if (typeof opts.wait !== 'number' || opts.wait <= 0) return cb(er) // poll for some ms for the lock to clear var now = Date.now() var start = opts.start || now var end = start + opts.wait if (end <= now) return cb(er) debug('now=%d, wait until %d (delta=%d)', start, end, end-start) var wait = Math.min(end - start, opts.pollPeriod || 100) var timer = setTimeout(poll, wait) function poll () { debug('notStale, polling', path, opts) exports.lock(path, opts, cb) } } exports.lockSync = function (path, opts) { opts = opts || {} opts.req = opts.req || req++ debug('lockSync', path, opts) if (opts.wait || opts.retryWait) { throw new Error('opts.wait not supported sync for obvious reasons') } try { var fd = fs.openSync(path, wx) locks[path] = fd try { fs.closeSync(fd) } catch (er) {} debug('locked sync!', path, fd) return } catch (er) { if (er.code !== 'EEXIST') return retryThrow(path, opts, er) if (opts.stale) { var st = fs.statSync(path) var ct = st[exports.filetime].getTime() if (!(ct % 1000) && (opts.stale % 1000)) { // probably don't have subsecond resolution. // round up the staleness indicator. // Yes, this will be wrong 1/1000 times on platforms // with subsecond stat precision, but that's acceptable // in exchange for not mistakenly removing locks on // most other systems. opts.stale = 1000 * Math.ceil(opts.stale / 1000) } var age = Date.now() - ct if (age > opts.stale) { debug('lockSync stale', path, opts, age) exports.unlockSync(path) return exports.lockSync(path, opts) } } // failed to lock! debug('failed to lock', path, opts, er) return retryThrow(path, opts, er) } } function retryThrow (path, opts, er) { if (typeof opts.retries === 'number' && opts.retries > 0) { var newRT = opts.retries - 1 debug('retryThrow', path, opts, newRT) opts.retries = newRT return exports.lockSync(path, opts) } throw er } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lockfile/package.json����������������������000644 �000766 �000024 �00000002732 12455173731 027441� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "lockfile", "version": "1.0.0", "main": "lockfile.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.2.5", "touch": "0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/lockfile" }, "keywords": [ "lockfile", "lock", "file", "fs", "O_EXCL" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "BSD", "description": "A very polite lock file utility, which endeavors to not litter, and to wait patiently for others.", "gitHead": "9590c6f02521eb1bb154ddc3ca9a7e84ce770c45", "bugs": { "url": "https://github.com/isaacs/lockfile/issues" }, "homepage": "https://github.com/isaacs/lockfile", "_id": "lockfile@1.0.0", "_shasum": "b3a7609dda6012060083bacb0ab0ecbca58e9203", "_from": "lockfile@1.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "trevorburnham", "email": "trevorburnham@gmail.com" }, { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "b3a7609dda6012060083bacb0ab0ecbca58e9203", "tarball": "http://registry.npmjs.org/lockfile/-/lockfile-1.0.0.tgz" }, "_resolved": "https://registry.npmjs.org/lockfile/-/lockfile-1.0.0.tgz", "readme": "ERROR: No README data found!" } ��������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/lockfile/README.md�������������������������000644 �000766 �000024 �00000004042 12455173731 026426� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# lockfile A very polite lock file utility, which endeavors to not litter, and to wait patiently for others. ## Usage ```javascript var lockFile = require('lockfile') // opts is optional, and defaults to {} lockFile.lock('some-file.lock', opts, function (er) { // if the er happens, then it failed to acquire a lock. // if there was not an error, then the file was created, // and won't be deleted until we unlock it. // do my stuff, free of interruptions // then, some time later, do: lockFile.unlock('some-file.lock', function (er) { // er means that an error happened, and is probably bad. }) }) ``` ## Methods Sync methods return the value/throw the error, others don't. Standard node fs stuff. All known locks are removed when the process exits. Of course, it's possible for certain types of failures to cause this to fail, but a best effort is made to not be a litterbug. ### lockFile.lock(path, [opts], cb) Acquire a file lock on the specified path ### lockFile.lockSync(path, [opts]) Acquire a file lock on the specified path ### lockFile.unlock(path, cb) Close and unlink the lockfile. ### lockFile.unlockSync(path) Close and unlink the lockfile. ### lockFile.check(path, [opts], cb) Check if the lockfile is locked and not stale. Returns boolean. ### lockFile.checkSync(path, [opts], cb) Check if the lockfile is locked and not stale. Callback is called with `cb(error, isLocked)`. ## Options ### opts.wait A number of milliseconds to wait for locks to expire before giving up. Only used by lockFile.lock. Poll for `opts.wait` ms. If the lock is not cleared by the time the wait expires, then it returns with the original error. ### opts.pollPeriod When using `opts.wait`, this is the period in ms in which it polls to check if the lock has expired. Defaults to `100`. ### opts.stale A number of milliseconds before locks are considered to have expired. ### opts.retries Used by lock and lockSync. Retry `n` number of times before giving up. ### opts.retryWait Used by lock. Wait `n` milliseconds before retrying. 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/.npmignore���������������000644 �000766 �000024 �00000000030 12455173731 030652� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/ .eslintrc ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/default-input.js���������000644 �000766 �000024 �00000012455 12455173731 032010� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path') var glob = require('glob') // more popular packages should go here, maybe? function isTestPkg (p) { return !!p.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/) } function niceName (n) { return n.replace(/^node-|[.-]js$/g, '') } function readDeps (test) { return function (cb) { fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length if (n === 0) return cb(null, deps) dir.forEach(function (d) { if (d.match(/^\./)) return next() if (test !== isTestPkg(d)) return next() var dp = path.join(dirname, 'node_modules', d, 'package.json') fs.readFile(dp, 'utf8', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() deps[d] = config.get('save-prefix') + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }} var name = package.name || basename exports.name = yes ? name : prompt('name', name) var version = package.version || config.get('init.version') || config.get('init-version') || '1.0.0' exports.version = yes ? version : prompt('version', version) if (!package.description) { exports.description = yes ? 
'' : prompt('description') } if (!package.main) { exports.main = function (cb) { fs.readdir(dirname, function (er, f) { if (er) f = [] f = f.filter(function (f) { return f.match(/\.js$/) }) if (f.indexOf('index.js') !== -1) f = 'index.js' else if (f.indexOf('main.js') !== -1) f = 'main.js' else if (f.indexOf(basename + '.js') !== -1) f = basename + '.js' else f = f[0] var index = f || 'index.js' return cb(null, yes ? index : prompt('entry point', index)) }) } } if (!package.bin) { exports.bin = function (cb) { fs.readdir(path.resolve(dirname, 'bin'), function (er, d) { // no bins if (er) return cb() // just take the first js file we find there, or nada return cb(null, d.filter(function (f) { return f.match(/\.js$/) })[0]) }) } } exports.directories = function (cb) { fs.readdir(dirname, function (er, dirs) { if (er) return cb(er) var res = {} dirs.forEach(function (d) { switch (d) { case 'example': case 'examples': return res.example = d case 'test': case 'tests': return res.test = d case 'doc': case 'docs': return res.doc = d case 'man': return res.man = d } }) if (Object.keys(res).length === 0) res = undefined return cb(null, res) }) } if (!package.dependencies) { exports.dependencies = readDeps(false) } if (!package.devDependencies) { exports.devDependencies = readDeps(true) } // MUST have a test script! var s = package.scripts || {} var notest = 'echo "Error: no test specified" && exit 1' if (!package.scripts) { exports.scripts = function (cb) { fs.readdir(path.join(dirname, 'node_modules'), function (er, d) { setupScripts(d || [], cb) }) } } function setupScripts (d, cb) { // check to see what framework is in use, if any function tx (test) { return test || notest } if (!s.test || s.test === notest) { var commands = { 'tap':'tap test/*.js' , 'expresso':'expresso test' , 'mocha':'mocha' } var command Object.keys(commands).forEach(function (k) { if (d.indexOf(k) !== -1) command = commands[k] }) var ps = 'test command' if (yes) { s.test = command || notest } else { s.test = command ? prompt(ps, command, tx) : prompt(ps, tx) } } return cb(null, s) } if (!package.repository) { exports.repository = function (cb) { fs.readFile('.git/config', 'utf8', function (er, gconf) { if (er || !gconf) { return cb(null, yes ? '' : prompt('git repository')) } gconf = gconf.split(/\r?\n/) var i = gconf.indexOf('[remote "origin"]') if (i !== -1) { var u = gconf[i + 1] if (!u.match(/^\s*url =/)) u = gconf[i + 2] if (!u.match(/^\s*url =/)) u = null else u = u.replace(/^\s*url = /, '') } if (u && u.match(/^git@github.com:/)) u = u.replace(/^git@github.com:/, 'https://github.com/') return cb(null, yes ? u : prompt('git repository', u)) }) } } if (!package.keywords) { exports.keywords = yes ? '' : prompt('keywords', function (s) { if (!s) return undefined if (Array.isArray(s)) s = s.join(' ') if (typeof s !== 'string') return s return s.split(/[\s,]+/) }) } if (!package.author) { exports.author = config.get('init.author.name') || config.get('init-author-name') ? { "name" : config.get('init.author.name') || config.get('init-author-name'), "email" : config.get('init.author.email') || config.get('init-author-email'), "url" : config.get('init.author.url') || config.get('init-author-url') } : prompt('author') } var license = package.license || config.get('init.license') || config.get('init-license') || 'ISC' exports.license = yes ? 
license : prompt('license', license) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/example/�����������������000755 �000766 �000024 �00000000000 12456115117 030310� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/init-package-json.js�����000644 �000766 �000024 �00000007461 12455173731 032533� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = init module.exports.yes = yes var PZ = require('promzard').PromZard var path = require('path') var def = require.resolve('./default-input.js') var fs = require('fs') var semver = require('semver') var read = require('read') // to validate the data object at the end as a worthwhile package // and assign default values for things. // readJson.extras(file, data, cb) var readJson = require('read-package-json') function yes (conf) { return !!( conf.get('yes') || conf.get('y') || conf.get('force') || conf.get('f') ) } function init (dir, input, config, cb) { if (typeof config === 'function') cb = config, config = {} // accept either a plain-jane object, or a config object // with a "get" method. if (typeof config.get !== 'function') { var data = config config = { get: function (k) { return data[k] }, toJSON: function () { return data } } } var package = path.resolve(dir, 'package.json') input = path.resolve(input) var pkg var ctx = { yes: yes(config) } var es = readJson.extraSet readJson.extraSet = es.filter(function (fn) { return fn.name !== 'authors' && fn.name !== 'mans' }) readJson(package, function (er, d) { readJson.extraSet = es if (er) pkg = {} else pkg = d ctx.filename = package ctx.dirname = path.dirname(package) ctx.basename = path.basename(ctx.dirname) if (!pkg.version || !semver.valid(pkg.version)) delete pkg.version ctx.package = pkg ctx.config = config || {} // make sure that the input is valid. // if not, use the default var pz = new PZ(input, ctx) pz.backupFile = def pz.on('error', cb) pz.on('data', function (data) { Object.keys(data).forEach(function (k) { if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k] }) // only do a few of these. // no need for mans or contributors if they're in the files var es = readJson.extraSet readJson.extraSet = es.filter(function (fn) { return fn.name !== 'authors' && fn.name !== 'mans' }) readJson.extras(package, pkg, function (er, pkg) { readJson.extraSet = es if (er) return cb(er, pkg) pkg = unParsePeople(pkg) // no need for the readme now. delete pkg.readme delete pkg.readmeFilename // really don't want to have this lying around in the file delete pkg._id // ditto delete pkg.gitHead // if the repo is empty, remove it. 
if (!pkg.repository) delete pkg.repository var d = JSON.stringify(pkg, null, 2) + '\n' function write (yes) { fs.writeFile(package, d, 'utf8', function (er) { if (!er && yes) console.log('Wrote to %s:\n\n%s\n', package, d) return cb(er, pkg) }) } if (ctx.yes) { return write(true) } console.log('About to write to %s:\n\n%s\n', package, d) read({prompt:'Is this ok? ', default: 'yes'}, function (er, ok) { if (!ok || ok.toLowerCase().charAt(0) !== 'y') { console.log('Aborted.') } else { return write() } }) }) }) }) } // turn the objects into somewhat more humane strings. function unParsePeople (data) { if (data.author) data.author = unParsePerson(data.author) ;["maintainers", "contributors"].forEach(function (set) { if (!Array.isArray(data[set])) return; data[set] = data[set].map(unParsePerson) }) return data } function unParsePerson (person) { if (typeof person === "string") return person var name = person.name || "" var u = person.url || person.web var url = u ? (" ("+u+")") : "" var e = person.email || person.mail var email = e ? (" <"+e+">") : "" return name+email+url } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/LICENSE������������������000644 �000766 �000024 �00000001354 12455173731 027672� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
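The `init()` flow in `init-package-json.js` above only asks "Is this ok?" when the resolved config does not set a `yes`/`y`/`force`/`f` flag (see the `yes(conf)` helper and the `ctx.yes` check). A minimal sketch of that non-interactive path, assuming a plain config object and an init-file path that may not exist (in which case the bundled `default-input.js` is used as the backup file):

```javascript
var init = require('init-package-json')

// Illustrative path and flag only: a missing init file falls back to the
// module's default-input.js via pz.backupFile, and { yes: true } makes
// init() write package.json without prompting for confirmation.
var dir = process.cwd()
init(dir, '/path/that/does/not/exist', { yes: true }, function (er, data) {
  if (er) throw er
  // data is the package object that was just written to {dir}/package.json
  console.log('wrote package.json without prompting:', data.name)
})
```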
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/node_modules/������������000755 �000766 �000024 �00000000000 12456115117 031332� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/package.json�������������000644 �000766 �000024 �00000004563 12455173731 031160� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "init-package-json", "version": "1.1.3", "main": "init-package-json.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/init-package-json" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "description": "A node module to get your node module started", "dependencies": { "glob": "^4.0.2", "promzard": "~0.2.0", "read": "~1.0.1", "read-package-json": "1", "semver": "2.x || 3.x || 4" }, "devDependencies": { "npm": "^2.1.4", "rimraf": "^2.1.4", "tap": "^0.4.13" }, "keywords": [ "init", "package.json", "package", "helper", "wizard", "wizerd", "prompt", "start" ], "readme": "# init-package-json\n\nA node module to get your node module started.\n\n## Usage\n\n```javascript\nvar init = require('init-package-json')\nvar path = require('path')\n\n// a path to a promzard module. In the event that this file is\n// not found, one will be provided for you.\nvar initFile = path.resolve(process.env.HOME, '.npm-init')\n\n// the dir where we're doin stuff.\nvar dir = process.cwd()\n\n// extra stuff that gets put into the PromZard module's context.\n// In npm, this is the resolved config object. Exposed as 'config'\n// Optional.\nvar configData = { some: 'extra stuff' }\n\n// Any existing stuff from the package.json file is also exposed in the\n// PromZard module as the `package` object. 
There will also be free\n// vars for:\n// * `filename` path to the package.json file\n// * `basename` the tip of the package dir\n// * `dirname` the parent of the package dir\n\ninit(dir, initFile, configData, function (er, data) {\n // the data's already been written to {dir}/package.json\n // now you can do stuff with it\n})\n```\n\nOr from the command line:\n\n```\n$ npm-init\n```\n\nSee [PromZard](https://github.com/isaacs/promzard) for details about\nwhat can go in the config file.\n", "readmeFilename": "README.md", "gitHead": "b766900b2d615ddc43c452e251b8c5543538e832", "bugs": { "url": "https://github.com/isaacs/init-package-json/issues" }, "homepage": "https://github.com/isaacs/init-package-json", "_id": "init-package-json@1.1.3", "_shasum": "1d633c151a4909891afc8ee13cace8b336c0c9c2", "_from": "init-package-json@>=1.1.3 <1.2.0" } ���������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/README.md����������������000644 �000766 �000024 �00000002221 12455173731 030136� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# init-package-json A node module to get your node module started. ## Usage ```javascript var init = require('init-package-json') var path = require('path') // a path to a promzard module. In the event that this file is // not found, one will be provided for you. var initFile = path.resolve(process.env.HOME, '.npm-init') // the dir where we're doin stuff. var dir = process.cwd() // extra stuff that gets put into the PromZard module's context. // In npm, this is the resolved config object. Exposed as 'config' // Optional. var configData = { some: 'extra stuff' } // Any existing stuff from the package.json file is also exposed in the // PromZard module as the `package` object. There will also be free // vars for: // * `filename` path to the package.json file // * `basename` the tip of the package dir // * `dirname` the parent of the package dir init(dir, initFile, configData, function (er, data) { // the data's already been written to {dir}/package.json // now you can do stuff with it }) ``` Or from the command line: ``` $ npm-init ``` See [PromZard](https://github.com/isaacs/promzard) for details about what can go in the config file. 
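For illustration, a hypothetical `~/.npm-init` file could combine the injected `prompt()` with the free vars the README lists (`package`, `basename`, `config`); the fields and defaults below are made up for the sketch and are not the shipped defaults:

```javascript
// Hypothetical ~/.npm-init promzard module; field choices and defaults
// here are illustrative only.
exports.name = prompt('name', package.name || basename)
exports.version = prompt('version', package.version || '1.0.0')
// `config` is the resolved config object that init() passes through
exports.author = config.get('init.author.name') || prompt('author')
exports.license = prompt('license', 'ISC')
```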
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/���000755 �000766 �000024 �00000000000 12456115117 033170� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/.npmignore����������������000644 �000766 �000024 �00000000036 12455173731 035114� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������example/npm-init/package.json ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/������������������000755 �000766 �000024 �00000000000 12456115117 034544� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/LICENSE�������������������000644 �000766 �000024 �00000001354 12455173731 034126� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/package.json��������������000644 �000766 �000024 �00000002136 12455173731 035406� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "promzard", "description": "prompting wizardly", "version": "0.2.2", "repository": { "url": "git://github.com/isaacs/promzard" }, "dependencies": { "read": "1" }, "devDependencies": { "tap": "~0.2.5" }, "main": "promzard.js", "scripts": { "test": "tap test/*.js" }, "license": "ISC", "bugs": { "url": "https://github.com/isaacs/promzard/issues" }, "homepage": "https://github.com/isaacs/promzard", "_id": "promzard@0.2.2", "_shasum": "918b9f2b29458cb001781a8856502e4a79b016e0", "_from": "promzard@>=0.2.0 <0.3.0", "_npmVersion": "1.4.10", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "918b9f2b29458cb001781a8856502e4a79b016e0", "tarball": "http://registry.npmjs.org/promzard/-/promzard-0.2.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/promzard/-/promzard-0.2.2.tgz" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/promzard.js���������������000644 �000766 �000024 �00000012631 12455173731 035315� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = promzard promzard.PromZard = PromZard var fs = require('fs') var vm = require('vm') var util = require('util') var files = {} var crypto = require('crypto') var EventEmitter = require('events').EventEmitter var read = require('read') var Module = require('module').Module var path = require('path') function promzard (file, ctx, cb) { if (typeof ctx === 'function') cb = ctx, ctx = null; if (!ctx) ctx = {}; var pz = new PromZard(file, ctx) pz.on('error', cb) pz.on('data', function (data) { cb(null, data) }) 
} function PromZard (file, ctx) { if (!(this instanceof PromZard)) return new PromZard(file, ctx) EventEmitter.call(this) this.file = file this.ctx = ctx this.unique = crypto.randomBytes(8).toString('hex') this.load() } PromZard.prototype = Object.create( EventEmitter.prototype, { constructor: { value: PromZard, readable: true, configurable: true, writable: true, enumerable: false } } ) PromZard.prototype.load = function () { if (files[this.file]) return this.loaded() fs.readFile(this.file, 'utf8', function (er, d) { if (er && this.backupFile) { this.file = this.backupFile delete this.backupFile return this.load() } if (er) return this.emit('error', this.error = er) files[this.file] = d this.loaded() }.bind(this)) } PromZard.prototype.loaded = function () { this.ctx.prompt = this.makePrompt() this.ctx.__filename = this.file this.ctx.__dirname = path.dirname(this.file) this.ctx.__basename = path.basename(this.file) var mod = this.ctx.module = this.makeModule() this.ctx.require = function (path) { return mod.require(path) } this.ctx.require.resolve = function(path) { return Module._resolveFilename(path, mod); } this.ctx.exports = mod.exports this.script = this.wrap(files[this.file]) var fn = vm.runInThisContext(this.script, this.file) var args = Object.keys(this.ctx).map(function (k) { return this.ctx[k] }.bind(this)) try { var res = fn.apply(this.ctx, args) } catch (er) { this.emit('error', er) } if (res && typeof res === 'object' && exports === mod.exports && Object.keys(exports).length === 1) { this.result = res } else { this.result = mod.exports } this.walk() } PromZard.prototype.makeModule = function () { var mod = new Module(this.file, module) mod.loaded = true mod.filename = this.file mod.id = this.file mod.paths = Module._nodeModulePaths(path.dirname(this.file)) return mod } PromZard.prototype.wrap = function (body) { var s = '(function( %s ) { %s\n })' var args = Object.keys(this.ctx).join(', ') return util.format(s, args, body) } PromZard.prototype.makePrompt = function () { this.prompts = [] return prompt.bind(this) function prompt () { var p, d, t for (var i = 0; i < arguments.length; i++) { var a = arguments[i] if (typeof a === 'string' && p) d = a else if (typeof a === 'string') p = a else if (typeof a === 'function') t = a else if (a && typeof a === 'object') { p = a.prompt || p d = a.default || d t = a.transform || t } } try { return this.unique + '-' + this.prompts.length } finally { this.prompts.push([p, d, t]) } } } PromZard.prototype.walk = function (o, cb) { o = o || this.result cb = cb || function (er, res) { if (er) return this.emit('error', this.error = er) this.result = res return this.emit('data', res) } cb = cb.bind(this) var keys = Object.keys(o) var i = 0 var len = keys.length L.call(this) function L () { if (this.error) return while (i < len) { var k = keys[i] var v = o[k] i++ if (v && typeof v === 'object') { return this.walk(v, function (er, res) { if (er) return cb(er) o[k] = res L.call(this) }.bind(this)) } else if (v && typeof v === 'string' && v.indexOf(this.unique) === 0) { var n = +v.substr(this.unique.length + 1) var prompt = this.prompts[n] if (isNaN(n) || !prompt) continue // default to the key if (undefined === prompt[0]) prompt[0] = k // default to the ctx value, if there is one if (undefined === prompt[1]) prompt[1] = this.ctx[k] return this.prompt(prompt, function (er, res) { if (er) return this.emit('error', this.error = er); o[k] = res L.call(this) }.bind(this)) } else if (typeof v === 'function') { try { return v.call(this.ctx, function (er, 
res) { if (er) return this.emit('error', this.error = er) o[k] = res // back up so that we process this one again. // this is because it might return a prompt() call in the cb. i -- L.call(this) }.bind(this)) } catch (er) { this.emit('error', er) } } } // made it to the end of the loop, maybe if (i >= len) return cb(null, o) } } PromZard.prototype.prompt = function (pdt, cb) { var prompt = pdt[0] var def = pdt[1] var tx = pdt[2] if (tx) { cb = function (cb) { return function (er, data) { try { return cb(er, tx(data)) } catch (er) { this.emit('error', er) } }}(cb).bind(this) } read({ prompt: prompt + ':' , default: def }, cb) } �������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/README.md�����������������000644 �000766 �000024 �00000010572 12455173731 034402� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# promzard A prompting wizard for building files from specialized PromZard modules. Used by `npm init`. A reimplementation of @SubStack's [prompter](https://github.com/substack/node-prompter), which does not use AST traversal. From another point of view, it's a reimplementation of [@Marak](https://github.com/marak)'s [wizard](https://github.com/Marak/wizard) which doesn't use schemas. The goal is a nice drop-in enhancement for `npm init`. ## Usage ```javascript var promzard = require('promzard') promzard(inputFile, optionalContextAdditions, function (er, data) { // .. you know what you doing .. }) ``` In the `inputFile` you can have something like this: ```javascript var fs = require('fs') module.exports = { "greeting": prompt("Who shall you greet?", "world", function (who) { return "Hello, " + who }), "filename": __filename, "directory": function (cb) { fs.readdir(__dirname, cb) } } ``` When run, promzard will display the prompts and resolve the async functions in order, and then either give you an error, or the resolved data, ready to be dropped into a JSON file or some other place. ### promzard(inputFile, ctx, callback) The inputFile is just a node module. You can require() things, set module.exports, etc. Whatever that module exports is the result, and it is walked over to call any functions as described below. The only caveat is that you must give PromZard the full absolute path to the module (you can get this via Node's `require.resolve`.) Also, the `prompt` function is injected into the context object, so watch out. Whatever you put in that `ctx` will of course also be available in the module. You can get quite fancy with this, passing in existing configs and so on. ### Class: promzard.PromZard(file, ctx) Just like the `promzard` function, but the EventEmitter that makes it all happen. Emits either a `data` event with the data, or a `error` event if it blows up. If `error` is emitted, then `data` never will be. ### prompt(...) In the promzard input module, you can call the `prompt` function. This prompts the user to input some data. The arguments are interpreted based on type: 1. `string` The first string encountered is the prompt. The second is the default value. 2. `function` A transformer function which receives the data and returns something else. 
More than meets the eye. 3. `object` The `prompt` member is the prompt, the `default` member is the default value, and the `transform` is the transformer. Whatever the final value is, that's what will be put on the resulting object. ### Functions If there are any functions on the promzard input module's exports, then promzard will call each of them with a callback. This way, your module can do asynchronous actions if necessary to validate or ascertain whatever needs verification. The functions are called in the context of the ctx object, and are given a single argument, which is a callback that should be called with either an error, or the result to assign to that spot. In the async function, you can also call prompt() and return the result of the prompt in the callback. For example, this works fine in a promzard module: ``` exports.asyncPrompt = function (cb) { fs.stat(someFile, function (er, st) { // if there's an error, no prompt, just error // otherwise prompt and use the actual file size as the default cb(er, prompt('file size', st.size)) }) } ``` You can also return other async functions in the async function callback. Though that's a bit silly, it could be a handy way to reuse functionality in some cases. ### Sync vs Async The `prompt()` function is not synchronous, though it appears that way. It just returns a token that is swapped out when the data object is walked over asynchronously later, and returns a token. For that reason, prompt() calls whose results don't end up on the data object are never shown to the user. For example, this will only prompt once: ``` exports.promptThreeTimes = prompt('prompt me once', 'shame on you') exports.promptThreeTimes = prompt('prompt me twice', 'um....') exports.promptThreeTimes = prompt('you cant prompt me again') ``` ### Isn't this exactly the sort of 'looks sync' that you said was bad about other libraries? Yeah, sorta. I wouldn't use promzard for anything more complicated than a wizard that spits out prompts to set up a config file or something. Maybe there are other use cases I haven't considered. 
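The `PromZard` class described above is just the EventEmitter behind the `promzard()` helper; a small sketch of using it directly (the input path and context key are illustrative):

```javascript
var PromZard = require('promzard').PromZard
var path = require('path')

// The constructor starts reading the file immediately and emits exactly
// one of 'data' (the resolved exports) or 'error'.
var pz = new PromZard(path.resolve(__dirname, 'my-input.js'), { greetee: 'world' })
pz.on('error', function (er) { console.error('promzard failed:', er) })
pz.on('data', function (data) { console.log(JSON.stringify(data, null, 2)) })
```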
��������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/index.js����������000644 �000766 �000024 �00000000432 12455173731 036215� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var pz = require('../promzard') var path = require('path') var file = path.resolve(__dirname, 'substack-input.js') var ctx = { basename: path.basename(path.dirname(file)) } pz(file, ctx, function (er, res) { if (er) throw er console.error(JSON.stringify(res, null, 2)) }) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/���������000755 �000766 �000024 �00000000000 12456115117 036277� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/substack-input.js�000644 �000766 �000024 �00000003413 12455173731 040064� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������module.exports = { "name" : basename.replace(/^node-/, ''), "version" : "0.0.0", "description" : (function (cb) { var fs = require('fs'); var value; try { var src = fs.readFileSync('README.markdown', 'utf8'); value = src.split('\n').filter(function (line) { return /\s+/.test(line) && line.trim() !== basename.replace(/^node-/, '') ; })[0] .trim() .replace(/^./, function (c) { return c.toLowerCase() }) .replace(/\.$/, '') ; } catch (e) {} return prompt('description', value); })(), "main" : prompt('entry point', 'index.js'), "bin" : function (cb) { var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; exists('bin/cmd.js', function (ex) { var bin if (ex) { var bin = {} bin[basename.replace(/^node-/, '')] = 'bin/cmd.js' } cb(null, bin); }); }, "directories" : { "example" : "example", "test" : "test" }, "dependencies" : {}, "devDependencies" : { "tap" : "~0.2.5" }, "scripts" : { "test" : "tap test/*.js" }, "repository" : { "type" : "git", "url" : "git://github.com/substack/" + basename + ".git" }, "homepage" : "https://github.com/substack/" + basename, "keywords" : prompt(function (s) { return s.split(/\s+/) }), "author" : { "name" : "James Halliday", "email" : "mail@substack.net", "url" : "http://substack.net" }, "license" : "MIT", "engine" : { "node" : ">=0.6" } } 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init-input.js000644 �000766 �000024 �00000013652 12455173731 040751� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path'); module.exports = { "name" : prompt('name', typeof name === 'undefined' ? basename.replace(/^node-|[.-]js$/g, ''): name), "version" : prompt('version', typeof version !== "undefined" ? version : '0.0.0'), "description" : (function () { if (typeof description !== 'undefined' && description) { return description } var value; try { var src = fs.readFileSync('README.md', 'utf8'); value = src.split('\n').filter(function (line) { return /\s+/.test(line) && line.trim() !== basename.replace(/^node-/, '') && !line.trim().match(/^#/) ; })[0] .trim() .replace(/^./, function (c) { return c.toLowerCase() }) .replace(/\.$/, '') ; } catch (e) { try { // Wouldn't it be nice if that file mattered? var d = fs.readFileSync('.git/description', 'utf8') } catch (e) {} if (d.trim() && !value) value = d } return prompt('description', value); })(), "main" : (function () { var f try { f = fs.readdirSync(dirname).filter(function (f) { return f.match(/\.js$/) }) if (f.indexOf('index.js') !== -1) f = 'index.js' else if (f.indexOf('main.js') !== -1) f = 'main.js' else if (f.indexOf(basename + '.js') !== -1) f = basename + '.js' else f = f[0] } catch (e) {} return prompt('entry point', f || 'index.js') })(), "bin" : function (cb) { fs.readdir(dirname + '/bin', function (er, d) { // no bins if (er) return cb() // just take the first js file we find there, or nada return cb(null, d.filter(function (f) { return f.match(/\.js$/) })[0]) }) }, "directories" : function (cb) { fs.readdir('.', function (er, dirs) { if (er) return cb(er) var res = {} dirs.forEach(function (d) { switch (d) { case 'example': case 'examples': return res.example = d case 'test': case 'tests': return res.test = d case 'doc': case 'docs': return res.doc = d case 'man': return res.man = d } }) if (Object.keys(res).length === 0) res = undefined return cb(null, res) }) }, "dependencies" : typeof dependencies !== 'undefined' ? dependencies : function (cb) { fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length dir.forEach(function (d) { if (d.match(/^\./)) return next() if (d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)) return next() fs.readFile('node_modules/' + d + '/package.json', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() deps[d] = '~' + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }, "devDependencies" : typeof devDependencies !== 'undefined' ? 
devDependencies : function (cb) { // same as dependencies but for dev deps fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length dir.forEach(function (d) { if (d.match(/^\./)) return next() if (!d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)) return next() fs.readFile('node_modules/' + d + '/package.json', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() deps[d] = '~' + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }, "scripts" : (function () { // check to see what framework is in use, if any try { var d = fs.readdirSync('node_modules') } catch (e) { d = [] } var s = typeof scripts === 'undefined' ? {} : scripts if (d.indexOf('coffee-script') !== -1) s.prepublish = prompt('build command', s.prepublish || 'coffee src/*.coffee -o lib') var notest = 'echo "Error: no test specified" && exit 1' function tx (test) { return test || notest } if (!s.test || s.test === notest) { if (d.indexOf('tap') !== -1) s.test = prompt('test command', 'tap test/*.js', tx) else if (d.indexOf('expresso') !== -1) s.test = prompt('test command', 'expresso test', tx) else if (d.indexOf('mocha') !== -1) s.test = prompt('test command', 'mocha', tx) else s.test = prompt('test command', tx) } return s })(), "repository" : (function () { try { var gconf = fs.readFileSync('.git/config') } catch (e) { gconf = null } if (gconf) { gconf = gconf.split(/\r?\n/) var i = gconf.indexOf('[remote "origin"]') if (i !== -1) { var u = gconf[i + 1] if (!u.match(/^\s*url =/)) u = gconf[i + 2] if (!u.match(/^\s*url =/)) u = null else u = u.replace(/^\s*url = /, '') } if (u && u.match(/^git@github.com:/)) u = u.replace(/^git@github.com:/, 'git://github.com/') } return prompt('git repository', u) })(), "keywords" : prompt(function (s) { if (!s) return undefined if (Array.isArray(s)) s = s.join(' ') if (typeof s !== 'string') return s return s.split(/[\s,]+/) }), "author" : config['init.author.name'] ? 
{ "name" : config['init.author.name'], "email" : config['init.author.email'], "url" : config['init.author.url'] } : undefined, "license" : prompt('license', 'BSD') } ��������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/init.js��000644 �000766 �000024 �00000002010 12455173731 037576� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var PZ = require('../../promzard').PromZard var path = require('path') var input = path.resolve(__dirname, 'init-input.js') var fs = require('fs') var package = path.resolve(__dirname, 'package.json') var pkg fs.readFile(package, 'utf8', function (er, d) { if (er) ctx = {} try { ctx = JSON.parse(d); pkg = JSON.parse(d) } catch (e) { ctx = {} } ctx.dirname = path.dirname(package) ctx.basename = path.basename(ctx.dirname) if (!ctx.version) ctx.version = undefined // this should be replaced with the npm conf object ctx.config = {} console.error('ctx=', ctx) var pz = new PZ(input, ctx) pz.on('data', function (data) { console.error('pz data', data) if (!pkg) pkg = {} Object.keys(data).forEach(function (k) { if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k] }) console.error('package data %s', JSON.stringify(data, null, 2)) fs.writeFile(package, JSON.stringify(pkg, null, 2), function (er) { if (er) throw er console.log('ok') }) }) }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/package.json�000644 �000766 �000024 �00000000263 12455173731 040573� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-init", "version": "0.0.0", "description": "an initter you init wit, innit?", "main": "index.js", "scripts": { "test": "asdf" }, "license": "BSD" }���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/node_modules/promzard/example/npm-init/README.md000644 �000766 �000024 �00000000234 12455173731 037562� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# npm-init An initter you init wit, innit? ## More stuff here Blerp derp herp lerg borgle pop munch efemerate baz foo a gandt synergy jorka chatt slurm. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/example/example-basic.js�000644 �000766 �000024 �00000000367 12455173731 033373� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var init = require('../init-package-json.js') var path = require('path') var dir = process.cwd() var initFile = require.resolve('./init/basic-init.js') init(dir, initFile, function (err, data) { if (!err) console.log('written successfully') }) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/example/example-default.js����������������������000644 �000766 �000024 �00000000322 12455173731 033646� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������var init = require('../init-package-json.js') var path = require('path') var dir = process.cwd() init(dir, 'file that does not exist', function (err, data) { if (!err) console.log('written successfully') }) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/example/example-npm.js���000644 �000766 �000024 �00000000477 12455173731 033106� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var init = require('../init-package-json.js') var path = require('path') var dir = process.cwd() var npm = require('npm') 
npm.load(function (er, npm) { if (er) throw er init(dir, npm.config.get('init-module'), npm.config, function (er, data) { if (er) throw er console.log('written successfully') }) }) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/init-package-json/example/init/������������000755 �000766 �000024 �00000000000 12456115117 031253� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/init-package-json/example/init/basic-init.js����������������������000644 �000766 �000024 �00000000135 12455173731 033560� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������exports.flavor = prompt("what's your favorite flavor of ice cream buddy?", "I LIKE THEM ALL")�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/.npmignore�����������������������������000644 �000766 �000024 �00000000015 12455173731 026131� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/ini.js���������������������������������000644 �000766 �000024 �00000011320 12455173731 025250� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ exports.parse = exports.decode = decode exports.stringify = exports.encode = encode exports.safe = safe 
exports.unsafe = unsafe var eol = process.platform === "win32" ? "\r\n" : "\n" function encode (obj, opt) { var children = [] , out = "" if (typeof opt === "string") { opt = { section: opt, whitespace: false } } else { opt = opt || {} opt.whitespace = opt.whitespace === true } var separator = opt.whitespace ? " = " : "=" Object.keys(obj).forEach(function (k, _, __) { var val = obj[k] if (val && Array.isArray(val)) { val.forEach(function(item) { out += safe(k + "[]") + separator + safe(item) + "\n" }) } else if (val && typeof val === "object") { children.push(k) } else { out += safe(k) + separator + safe(val) + eol } }) if (opt.section && out.length) { out = "[" + safe(opt.section) + "]" + eol + out } children.forEach(function (k, _, __) { var nk = dotSplit(k).join('\\.') var section = (opt.section ? opt.section + "." : "") + nk var child = encode(obj[k], { section: section, whitespace: opt.whitespace }) if (out.length && child.length) { out += eol } out += child }) return out } function dotSplit (str) { return str.replace(/\1/g, '\u0002LITERAL\\1LITERAL\u0002') .replace(/\\\./g, '\u0001') .split(/\./).map(function (part) { return part.replace(/\1/g, '\\.') .replace(/\2LITERAL\\1LITERAL\2/g, '\u0001') }) } function decode (str) { var out = {} , p = out , section = null , state = "START" // section |key = value , re = /^\[([^\]]*)\]$|^([^=]+)(=(.*))?$/i , lines = str.split(/[\r\n]+/g) , section = null lines.forEach(function (line, _, __) { if (!line || line.match(/^\s*[;#]/)) return var match = line.match(re) if (!match) return if (match[1] !== undefined) { section = unsafe(match[1]) p = out[section] = out[section] || {} return } var key = unsafe(match[2]) , value = match[3] ? unsafe((match[4] || "")) : true switch (value) { case 'true': case 'false': case 'null': value = JSON.parse(value) } // Convert keys with '[]' suffix to an array if (key.length > 2 && key.slice(-2) === "[]") { key = key.substring(0, key.length - 2) if (!p[key]) { p[key] = [] } else if (!Array.isArray(p[key])) { p[key] = [p[key]] } } // safeguard against resetting a previously defined // array by accidentally forgetting the brackets if (Array.isArray(p[key])) { p[key].push(value) } else { p[key] = value } }) // {a:{y:1},"a.b":{x:2}} --> {a:{y:1,b:{x:2}}} // use a filter to return the keys that have to be deleted. Object.keys(out).filter(function (k, _, __) { if (!out[k] || typeof out[k] !== "object" || Array.isArray(out[k])) return false // see if the parent section is also an object. // if so, add it to that, and mark this one for deletion var parts = dotSplit(k) , p = out , l = parts.pop() , nl = l.replace(/\\\./g, '.') parts.forEach(function (part, _, __) { if (!p[part] || typeof p[part] !== "object") p[part] = {} p = p[part] }) if (p === out && nl === l) return false p[nl] = out[k] return true }).forEach(function (del, _, __) { delete out[del] }) return out } function isQuoted (val) { return (val.charAt(0) === "\"" && val.slice(-1) === "\"") || (val.charAt(0) === "'" && val.slice(-1) === "'") } function safe (val) { return ( typeof val !== "string" || val.match(/[\r\n]/) || val.match(/^\[/) || (val.length > 1 && isQuoted(val)) || val !== val.trim() ) ? 
JSON.stringify(val) : val.replace(/;/g, '\\;').replace(/#/g, "\\#") } function unsafe (val, doUnesc) { val = (val || "").trim() if (isQuoted(val)) { // remove the single quotes before calling JSON.parse if (val.charAt(0) === "'") { val = val.substr(1, val.length - 2); } try { val = JSON.parse(val) } catch (_) {} } else { // walk the val to find the first not-escaped ; character var esc = false var unesc = ""; for (var i = 0, l = val.length; i < l; i++) { var c = val.charAt(i) if (esc) { if ("\\;#".indexOf(c) !== -1) unesc += c else unesc += "\\" + c esc = false } else if (";#".indexOf(c) !== -1) { break } else if (c === "\\") { esc = true } else { unesc += c } } if (esc) unesc += "\\" return unesc } return val } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/LICENSE��������������������������������000644 �000766 �000024 �00000001375 12455173731 025151� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/package.json���������������������������000644 �000766 �000024 �00000002261 12455173731 026425� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "ini", "description": "An ini encoder/decoder for node", "version": "1.3.2", "repository": { "type": "git", "url": "git://github.com/isaacs/ini.git" }, "main": "ini.js", "scripts": { "test": "tap test/*.js" }, "engines": { "node": "*" }, "dependencies": {}, "devDependencies": { "tap": "~0.4.0" }, "license": "ISC", "gitHead": "bbe4a8bb09afa58f724c04ce43a49037cabeadfb", "bugs": { "url": "https://github.com/isaacs/ini/issues" }, "homepage": "https://github.com/isaacs/ini", "_id": "ini@1.3.2", "_shasum": "9ebf4a44daf9d89acd07aab9f89a083d887f6dec", "_from": "ini@>=1.3.2 <1.4.0", "_npmVersion": "2.1.9", "_nodeVersion": "0.10.16", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "9ebf4a44daf9d89acd07aab9f89a083d887f6dec", "tarball": "http://registry.npmjs.org/ini/-/ini-1.3.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ini/-/ini-1.3.2.tgz" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ini/README.md������������������������������000644 �000766 �000024 �00000005236 12455173731 025423� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������An ini format parser and serializer for node. Sections are treated as nested objects. Items before the first heading are saved on the object directly. ## Usage Consider an ini-file `config.ini` that looks like this: ; this comment is being ignored scope = global [database] user = dbuser password = dbpassword database = use_this_database [paths.default] datadir = /var/lib/data array[] = first value array[] = second value array[] = third value You can read, manipulate and write the ini-file like so: var fs = require('fs') , ini = require('ini') var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8')) config.scope = 'local' config.database.database = 'use_another_database' config.paths.default.tmpdir = '/tmp' delete config.paths.default.datadir config.paths.default.array.push('fourth value') fs.writeFileSync('./config_modified.ini', ini.stringify(config, { section: 'section' })) This will result in a file called `config_modified.ini` being written to the filesystem with the following content: [section] scope=local [section.database] user=dbuser password=dbpassword database=use_another_database [section.paths.default] tmpdir=/tmp array[]=first value array[]=second value array[]=third value array[]=fourth value ## API ### decode(inistring) Decode the ini-style formatted `inistring` into a nested object. ### parse(inistring) Alias for `decode(inistring)` ### encode(object, [options]) Encode the object `object` into an ini-style formatted string. If the optional parameter `section` is given, then all top-level properties of the object are put into this section and the `section`-string is prepended to all sub-sections, see the usage example above. 
The `options` object may contain the following: * `section` A string which will be the first `section` in the encoded ini data. Defaults to none. * `whitespace` Boolean to specify whether to put whitespace around the `=` character. By default, whitespace is omitted, to be friendly to some persnickety old parsers that don't tolerate it well. But some find that it's more human-readable and pretty with the whitespace. For backwards compatibility reasons, if a `string` options is passed in, then it is assumed to be the `section` value. ### stringify(object, [options]) Alias for `encode(object, [options])` ### safe(val) Escapes the string `val` such that it is safe to be used as a key or value in an ini-file. Basically escapes quotes. For example ini.safe('"unsafe string"') would result in "\"unsafe string\"" ### unsafe(val) Unescapes the string `val` ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/inherits.js�����������������������000644 �000766 �000024 �00000000052 12455173731 027364� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require('util').inherits ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/inherits_browser.js���������������000644 �000766 �000024 �00000001240 12455173731 031127� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������if (typeof Object.create === 'function') { // implementation from standard node.js 'util' module module.exports = function inherits(ctor, superCtor) { ctor.super_ = superCtor ctor.prototype = Object.create(superCtor.prototype, { constructor: { value: ctor, enumerable: false, writable: true, configurable: true } }); }; } else { // old school shim for old browsers module.exports = function inherits(ctor, superCtor) { ctor.super_ = superCtor var TempCtor = function () {} TempCtor.prototype = superCtor.prototype ctor.prototype = new TempCtor() ctor.prototype.constructor = ctor } } 
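A compact sketch of the API calls documented above (the object and section name are illustrative; line endings in the encoded string follow the platform):

```javascript
var ini = require('ini')

// encode with a top-level section and whitespace around "="
var text = ini.encode(
  { scope: 'local', database: { user: 'dbuser' } },
  { section: 'app', whitespace: true }
)
// text looks like:
//   [app]
//   scope = local
//
//   [app.database]
//   user = dbuser

var parsed = ini.decode(text)
// parsed => { app: { scope: 'local', database: { user: 'dbuser' } } }

ini.safe('"unsafe string"')
// => "\"unsafe string\"" (quotes escaped, as the safe() docs above describe)
```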
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/LICENSE���������������������������000644 �000766 �000024 �00000001355 12455173731 026215� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/package.json����������������������000644 �000766 �000024 �00000005510 12455173731 027473� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "inherits", "description": "Browser-friendly inheritance fully compatible with standard node.js inherits()", "version": "2.0.1", "keywords": [ "inheritance", "class", "klass", "oop", "object-oriented", "inherits", "browser", "browserify" ], "main": "./inherits.js", "browser": "./inherits_browser.js", "repository": { "type": "git", "url": "git://github.com/isaacs/inherits" }, "license": "ISC", "scripts": { "test": "node test" }, "readme": "Browser-friendly inheritance fully compatible with standard node.js\n[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).\n\nThis package exports standard `inherits` from node.js `util` module in\nnode environment, but also provides alternative browser-friendly\nimplementation through [browser\nfield](https://gist.github.com/shtylman/4339901). Alternative\nimplementation is a literal copy of standard one located in standalone\nmodule to avoid requiring of `util`. 
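Both implementations above also attach the superclass as `ctor.super_`; a minimal usage sketch (the `Reader` name is illustrative):

```javascript
var inherits = require('inherits')
var EventEmitter = require('events').EventEmitter

function Reader () {
  // Reader.super_ === EventEmitter once inherits() below has run
  Reader.super_.call(this)
}
inherits(Reader, EventEmitter)

var r = new Reader()
console.log(r instanceof EventEmitter) // true
```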
It also has a shim for old\nbrowsers with no `Object.create` support.\n\nWhile keeping you sure you are using standard `inherits`\nimplementation in node.js environment, it allows bundlers such as\n[browserify](https://github.com/substack/node-browserify) to not\ninclude full `util` package to your client code if all you need is\njust `inherits` function. It worth, because browser shim for `util`\npackage is large and `inherits` is often the single function you need\nfrom it.\n\nIt's recommended to use this package instead of\n`require('util').inherits` for any code that has chances to be used\nnot only in node.js but in browser too.\n\n## usage\n\n```js\nvar inherits = require('inherits');\n// then use exactly as the standard one\n```\n\n## note on version ~1.0\n\nVersion ~1.0 had completely different motivation and is not compatible\nneither with 2.0 nor with standard node.js `inherits`.\n\nIf you are using version ~1.0 and planning to switch to ~2.0, be\ncareful:\n\n* new version uses `super_` instead of `super` for referencing\n superclass\n* new version overwrites current prototype while old one preserves any\n existing fields on it\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/inherits/issues" }, "_id": "inherits@2.0.1", "dist": { "shasum": "b17d08d326b4423e568eff719f91b0b1cbdf69f1", "tarball": "http://registry.npmjs.org/inherits/-/inherits-2.0.1.tgz" }, "_from": "inherits@latest", "_npmVersion": "1.3.8", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "b17d08d326b4423e568eff719f91b0b1cbdf69f1", "_resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.1.tgz", "homepage": "https://github.com/isaacs/inherits" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/README.md

Browser-friendly inheritance fully compatible with standard node.js
[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).

This package exports the standard `inherits` from the node.js `util` module in
a node environment, but also provides an alternative browser-friendly
implementation through the [browser
field](https://gist.github.com/shtylman/4339901). The alternative
implementation is a literal copy of the standard one, located in a standalone
module to avoid requiring `util`. It also has a shim for old browsers with no
`Object.create` support.

While guaranteeing that you are using the standard `inherits` implementation in
a node.js environment, it allows bundlers such as
[browserify](https://github.com/substack/node-browserify) to leave the full
`util` package out of your client code when all you need is the `inherits`
function. This is worthwhile because the browser shim for `util` is large, and
`inherits` is often the only function you need from it.

It's recommended to use this package instead of `require('util').inherits` for
any code that has a chance of being used not only in node.js but in the
browser too.

## usage

```js
var inherits = require('inherits');
// then use exactly as the standard one
```

## note on version ~1.0

Version ~1.0 had a completely different motivation and is compatible neither
with 2.0 nor with the standard node.js `inherits`.

If you are using version ~1.0 and planning to switch to ~2.0, be careful:

* the new version uses `super_` instead of `super` for referencing the
  superclass
* the new version overwrites the current prototype, while the old one preserves
  any existing fields on it

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inherits/test.js

var inherits = require('./inherits.js')
var assert = require('assert')

function test(c) {
  assert(c.constructor === Child)
  assert(c.constructor.super_ === Parent)
  assert(Object.getPrototypeOf(c) === Child.prototype)
  assert(Object.getPrototypeOf(Object.getPrototypeOf(c)) === Parent.prototype)
  assert(c instanceof Child)
  assert(c instanceof Parent)
}

function Child() {
  Parent.call(this)
  test(this)
}

function Parent() {}

inherits(Child, Parent)
var c = new Child
test(c)
console.log('ok')

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/.eslintrc

{
  "env" : {
    "node" : true
  },
  "rules" : {
    "semi": [2, "never"],
    "strict": 0,
    "quotes": [1, "single", "avoid-escape"],
    "no-use-before-define": 0,
    "curly": 0,
    "no-underscore-dangle": 0,
    "no-lonely-if": 1,
    "no-unused-vars": [2, {"vars" : "all", "args" : "after-used"}],
    "no-mixed-requires": 0,
    "space-infix-ops": 0
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/inflight.js

var
wrappy = require('wrappy') var reqs = Object.create(null) var once = require('once') module.exports = wrappy(inflight) function inflight (key, cb) { if (reqs[key]) { reqs[key].push(cb) return null } else { reqs[key] = [cb] return makeres(key) } } function makeres (key) { return once(function RES () { var cbs = reqs[key] var len = cbs.length var args = slice(arguments) for (var i = 0; i < len; i++) { cbs[i].apply(null, args) } if (cbs.length > len) { // added more in the interim. // de-zalgo, just in case, but don't call again. cbs.splice(0, len) process.nextTick(function () { RES.apply(null, args) }) } else { delete reqs[key] } }) } function slice (args) { var length = args.length var array = [] for (var i = 0; i < length; i++) array[i] = args[i] return array } �������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/LICENSE���������������������������000644 �000766 �000024 �00000001354 12455173731 026173� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/package.json����������������������000644 �000766 �000024 �00000003572 12455173731 027460� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "inflight", "version": "1.0.4", "description": "Add callbacks to requests in flight to avoid async duplication", "main": "inflight.js", "dependencies": { "once": "^1.3.0", "wrappy": "1" }, "devDependencies": { "tap": "^0.4.10" }, "scripts": { "test": "tap test.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/inflight" }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/isaacs/inflight/issues" }, "homepage": "https://github.com/isaacs/inflight", "license": "ISC", "readme": "# inflight\n\nAdd callbacks to requests in flight to avoid async duplication\n\n## USAGE\n\n```javascript\nvar inflight = require('inflight')\n\n// some request that does some stuff\nfunction req(key, callback) {\n // key is any random string. like a url or filename or whatever.\n //\n // will return either a falsey value, indicating that the\n // request for this key is already in flight, or a new callback\n // which when called will call all callbacks passed to inflight\n // with the same key\n callback = inflight(key, callback)\n\n // If we got a falsey value back, then there's already a req going\n if (!callback) return\n\n // this is where you'd fetch the url or whatever\n // callback is also once()-ified, so it can safely be assigned\n // to multiple events etc. First call wins.\n setTimeout(function() {\n callback(null, key)\n }, 100)\n}\n\n// only assigns a single setTimeout\n// when it dings, all cbs get called\nreq('foo', cb1)\nreq('foo', cb2)\nreq('foo', cb3)\nreq('foo', cb4)\n```\n", "readmeFilename": "README.md", "gitHead": "c7b5531d572a867064d4a1da9e013e8910b7d1ba", "_id": "inflight@1.0.4", "_shasum": "6cbb4521ebd51ce0ec0a936bfd7657ef7e9b172a", "_from": "inflight@>=1.0.4 <1.1.0" }

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/README.md

# inflight Add callbacks to requests in flight to avoid async duplication ## USAGE ```javascript var inflight = require('inflight') // some request that does some stuff function req(key, callback) { // key is any random string. like a url or filename or whatever. // // will return either a falsey value, indicating that the // request for this key is already in flight, or a new callback // which when called will call all callbacks passed to inflight // with the same key callback = inflight(key, callback) // If we got a falsey value back, then there's already a req going if (!callback) return // this is where you'd fetch the url or whatever // callback is also once()-ified, so it can safely be assigned // to multiple events etc. First call wins.
setTimeout(function() { callback(null, key) }, 100) } // only assigns a single setTimeout // when it dings, all cbs get called req('foo', cb1) req('foo', cb2) req('foo', cb3) req('foo', cb4) ``` ���������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/inflight/test.js���������������������������000644 �000766 �000024 �00000003373 12455173731 026506� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var test = require('tap').test var inf = require('./inflight.js') function req (key, cb) { cb = inf(key, cb) if (cb) setTimeout(function () { cb(key) cb(key) }) return cb } test('basic', function (t) { var calleda = false var a = req('key', function (k) { t.notOk(calleda) calleda = true t.equal(k, 'key') if (calledb) t.end() }) t.ok(a, 'first returned cb function') var calledb = false var b = req('key', function (k) { t.notOk(calledb) calledb = true t.equal(k, 'key') if (calleda) t.end() }) t.notOk(b, 'second should get falsey inflight response') }) test('timing', function (t) { var expect = [ 'method one', 'start one', 'end one', 'two', 'tick', 'three' ] var i = 0 function log (m) { t.equal(m, expect[i], m + ' === ' + expect[i]) ++i if (i === expect.length) t.end() } function method (name, cb) { log('method ' + name) process.nextTick(cb) } var one = inf('foo', function () { log('start one') var three = inf('foo', function () { log('three') }) if (three) method('three', three) log('end one') }) method('one', one) var two = inf('foo', function () { log('two') }) if (two) method('one', two) process.nextTick(log.bind(null, 'tick')) }) test('parameters', function (t) { t.plan(8) var a = inf('key', function (first, second, third) { t.equal(first, 1) t.equal(second, 2) t.equal(third, 3) }) t.ok(a, 'first returned cb function') var b = inf('key', function (first, second, third) { t.equal(first, 1) t.equal(second, 2) t.equal(third, 3) }) t.notOk(b, 'second should get falsey inflight response') setTimeout(function () { a(1, 2, 3) }) }) ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/.npmignore���������������������000644 �000766 �000024 �00000000016 12455173731 027551� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/ 
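The README and test above show the core contract of `inflight`: the first caller for a key gets a wrapped callback back, later callers for the same key get a falsey value, and every queued callback fires when the single underlying operation completes. A small sketch of that pattern, assuming a hypothetical `loadFile` helper built on `fs.readFile`:

```js
var inflight = require('inflight')
var fs = require('fs')

function loadFile (path, cb) {
  // Collapse concurrent reads of the same path into one fs.readFile call.
  cb = inflight('read:' + path, cb)
  if (!cb) return               // another read for this path is already in flight
  fs.readFile(path, 'utf8', cb) // every queued callback receives this result
}

// Only one read of config.json is issued; both callbacks get the same data.
loadFile('config.json', function (er, data) { /* first caller */ })
loadFile('config.json', function (er, data) { /* piggybacks on the first read */ })
```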
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/fs.js��������������������������000644 �000766 �000024 �00000000572 12455173731 026527� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// eeeeeevvvvviiiiiiillllll // more evil than monkey-patching the native builtin? // Not sure. var mod = require("module") var pre = '(function (exports, require, module, __filename, __dirname) { ' var post = '});' var src = pre + process.binding('natives').fs + post var vm = require('vm') var fn = vm.runInThisContext(src) fn(exports, require, module, __filename, __dirname) ��������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/graceful-fs.js�����������������000644 �000766 �000024 �00000006005 12455173731 030312� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Monkey-patching the fs module. // It's ugly, but there is simply no other way to do this. 
var fs = module.exports = require('./fs.js') var assert = require('assert') // fix up some busted stuff, mostly on windows and old nodes require('./polyfills.js') var util = require('util') function noop () {} var debug = noop if (util.debuglog) debug = util.debuglog('gfs') else if (/\bgfs\b/i.test(process.env.NODE_DEBUG || '')) debug = function() { var m = util.format.apply(util, arguments) m = 'GFS: ' + m.split(/\n/).join('\nGFS: ') console.error(m) } if (/\bgfs\b/i.test(process.env.NODE_DEBUG || '')) { process.on('exit', function() { debug('fds', fds) debug(queue) assert.equal(queue.length, 0) }) } var originalOpen = fs.open fs.open = open function open(path, flags, mode, cb) { if (typeof mode === "function") cb = mode, mode = null if (typeof cb !== "function") cb = noop new OpenReq(path, flags, mode, cb) } function OpenReq(path, flags, mode, cb) { this.path = path this.flags = flags this.mode = mode this.cb = cb Req.call(this) } util.inherits(OpenReq, Req) OpenReq.prototype.process = function() { originalOpen.call(fs, this.path, this.flags, this.mode, this.done) } var fds = {} OpenReq.prototype.done = function(er, fd) { debug('open done', er, fd) if (fd) fds['fd' + fd] = this.path Req.prototype.done.call(this, er, fd) } var originalReaddir = fs.readdir fs.readdir = readdir function readdir(path, cb) { if (typeof cb !== "function") cb = noop new ReaddirReq(path, cb) } function ReaddirReq(path, cb) { this.path = path this.cb = cb Req.call(this) } util.inherits(ReaddirReq, Req) ReaddirReq.prototype.process = function() { originalReaddir.call(fs, this.path, this.done) } ReaddirReq.prototype.done = function(er, files) { if (files && files.sort) files = files.sort() Req.prototype.done.call(this, er, files) onclose() } var originalClose = fs.close fs.close = close function close (fd, cb) { debug('close', fd) if (typeof cb !== "function") cb = noop delete fds['fd' + fd] originalClose.call(fs, fd, function(er) { onclose() cb(er) }) } var originalCloseSync = fs.closeSync fs.closeSync = closeSync function closeSync (fd) { try { return originalCloseSync(fd) } finally { onclose() } } // Req class function Req () { // start processing this.done = this.done.bind(this) this.failures = 0 this.process() } Req.prototype.done = function (er, result) { var tryAgain = false if (er) { var code = er.code var tryAgain = code === "EMFILE" if (process.platform === "win32") tryAgain = tryAgain || code === "OK" } if (tryAgain) { this.failures ++ enqueue(this) } else { var cb = this.cb cb(er, result) } } var queue = [] function enqueue(req) { queue.push(req) debug('enqueue %d %s', queue.length, req.constructor.name, req) } function onclose() { var req = queue.shift() if (req) { debug('process', req.constructor.name, req) req.process() } } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/LICENSE������������������������000644 �000766 �000024 �00000002436 12455173731 026567� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/package.json�������������������000644 �000766 �000024 �00000003152 12455173731 030044� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "graceful-fs", "description": "A drop-in replacement for fs, making various improvements.", "version": "3.0.5", "repository": { "type": "git", "url": "git://github.com/isaacs/node-graceful-fs.git" }, "main": "graceful-fs.js", "engines": { "node": ">=0.4.0" }, "directories": { "test": "test" }, "scripts": { "test": "tap test/*.js" }, "keywords": [ "fs", "module", "reading", "retry", "retries", "queue", "error", "errors", "handling", "EMFILE", "EAGAIN", "EINVAL", "EPERM", "EACCESS" ], "license": "BSD", "devDependencies": { "mkdirp": "^0.5.0", "rimraf": "^2.2.8", "tap": "^0.4.13" }, "gitHead": "a6cd37cff01ac3af8d0ab2fd180290538fabd326", "bugs": { "url": "https://github.com/isaacs/node-graceful-fs/issues" }, "homepage": "https://github.com/isaacs/node-graceful-fs", "_id": "graceful-fs@3.0.5", "_shasum": "4a880474bdeb716fe3278cf29792dec38dfac418", "_from": "graceful-fs@>=3.0.5 <3.1.0", "_npmVersion": "2.1.9", "_nodeVersion": "0.10.16", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "4a880474bdeb716fe3278cf29792dec38dfac418", "tarball": "http://registry.npmjs.org/graceful-fs/-/graceful-fs-3.0.5.tgz" }, "_resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-3.0.5.tgz", "readme": "ERROR: No README data found!" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/polyfills.js�������������������000644 �000766 �000024 �00000014526 12455173731 030140� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('./fs.js') var constants = require('constants') var origCwd = process.cwd var cwd = null process.cwd = function() { if (!cwd) cwd = origCwd.call(process) return cwd } var chdir = process.chdir process.chdir = function(d) { cwd = null chdir.call(process, d) } // (re-)implement some things that are known busted or missing. // lchmod, broken prior to 0.6.2 // back-port the fix here. if (constants.hasOwnProperty('O_SYMLINK') && process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) { fs.lchmod = function (path, mode, callback) { callback = callback || noop fs.open( path , constants.O_WRONLY | constants.O_SYMLINK , mode , function (err, fd) { if (err) { callback(err) return } // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. fs.fchmod(fd, mode, function (err) { fs.close(fd, function(err2) { callback(err || err2) }) }) }) } fs.lchmodSync = function (path, mode) { var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode) // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. 
var err, err2 try { var ret = fs.fchmodSync(fd, mode) } catch (er) { err = er } try { fs.closeSync(fd) } catch (er) { err2 = er } if (err || err2) throw (err || err2) return ret } } // lutimes implementation, or no-op if (!fs.lutimes) { if (constants.hasOwnProperty("O_SYMLINK")) { fs.lutimes = function (path, at, mt, cb) { fs.open(path, constants.O_SYMLINK, function (er, fd) { cb = cb || noop if (er) return cb(er) fs.futimes(fd, at, mt, function (er) { fs.close(fd, function (er2) { return cb(er || er2) }) }) }) } fs.lutimesSync = function (path, at, mt) { var fd = fs.openSync(path, constants.O_SYMLINK) , err , err2 , ret try { var ret = fs.futimesSync(fd, at, mt) } catch (er) { err = er } try { fs.closeSync(fd) } catch (er) { err2 = er } if (err || err2) throw (err || err2) return ret } } else if (fs.utimensat && constants.hasOwnProperty("AT_SYMLINK_NOFOLLOW")) { // maybe utimensat will be bound soonish? fs.lutimes = function (path, at, mt, cb) { fs.utimensat(path, at, mt, constants.AT_SYMLINK_NOFOLLOW, cb) } fs.lutimesSync = function (path, at, mt) { return fs.utimensatSync(path, at, mt, constants.AT_SYMLINK_NOFOLLOW) } } else { fs.lutimes = function (_a, _b, _c, cb) { process.nextTick(cb) } fs.lutimesSync = function () {} } } // https://github.com/isaacs/node-graceful-fs/issues/4 // Chown should not fail on einval or eperm if non-root. // It should not fail on enosys ever, as this just indicates // that a fs doesn't support the intended operation. fs.chown = chownFix(fs.chown) fs.fchown = chownFix(fs.fchown) fs.lchown = chownFix(fs.lchown) fs.chmod = chownFix(fs.chmod) fs.fchmod = chownFix(fs.fchmod) fs.lchmod = chownFix(fs.lchmod) fs.chownSync = chownFixSync(fs.chownSync) fs.fchownSync = chownFixSync(fs.fchownSync) fs.lchownSync = chownFixSync(fs.lchownSync) fs.chmodSync = chownFix(fs.chmodSync) fs.fchmodSync = chownFix(fs.fchmodSync) fs.lchmodSync = chownFix(fs.lchmodSync) function chownFix (orig) { if (!orig) return orig return function (target, uid, gid, cb) { return orig.call(fs, target, uid, gid, function (er, res) { if (chownErOk(er)) er = null cb(er, res) }) } } function chownFixSync (orig) { if (!orig) return orig return function (target, uid, gid) { try { return orig.call(fs, target, uid, gid) } catch (er) { if (!chownErOk(er)) throw er } } } // ENOSYS means that the fs doesn't support the op. Just ignore // that, because it doesn't matter. // // if there's no getuid, or if getuid() is something other // than 0, and the error is EINVAL or EPERM, then just ignore // it. // // This specific case is a silent failure in cp, install, tar, // and most other unix tools that manage permissions. // // When running as root, or if other types of errors are // encountered, then it's strict. function chownErOk (er) { if (!er) return true if (er.code === "ENOSYS") return true var nonroot = !process.getuid || process.getuid() !== 0 if (nonroot) { if (er.code === "EINVAL" || er.code === "EPERM") return true } return false } // if lchmod/lchown do not exist, then make them no-ops if (!fs.lchmod) { fs.lchmod = function (path, mode, cb) { process.nextTick(cb) } fs.lchmodSync = function () {} } if (!fs.lchown) { fs.lchown = function (path, uid, gid, cb) { process.nextTick(cb) } fs.lchownSync = function () {} } // on Windows, A/V software can lock the directory, causing this // to fail with an EACCES or EPERM if the directory contains newly // created files. Try again on failure, for up to 1 second. 
if (process.platform === "win32") { var rename_ = fs.rename fs.rename = function rename (from, to, cb) { var start = Date.now() rename_(from, to, function CB (er) { if (er && (er.code === "EACCES" || er.code === "EPERM") && Date.now() - start < 1000) { return rename_(from, to, CB) } cb(er) }) } } // if read() returns EAGAIN, then just try it again. var read = fs.read fs.read = function (fd, buffer, offset, length, position, callback_) { var callback if (callback_ && typeof callback_ === 'function') { var eagCounter = 0 callback = function (er, _, __) { if (er && er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ return read.call(fs, fd, buffer, offset, length, position, callback) } callback_.apply(this, arguments) } } return read.call(fs, fd, buffer, offset, length, position, callback) } var readSync = fs.readSync fs.readSync = function (fd, buffer, offset, length, position) { var eagCounter = 0 while (true) { try { return readSync.call(fs, fd, buffer, offset, length, position) } catch (er) { if (er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ continue } throw er } } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/graceful-fs/README.md����������������������000644 �000766 �000024 �00000002162 12455173731 027035� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# graceful-fs graceful-fs functions as a drop-in replacement for the fs module, making various improvements. The improvements are meant to normalize behavior across different platforms and environments, and to make filesystem access more resilient to errors. ## Improvements over [fs module](http://api.nodejs.org/fs.html) graceful-fs: * Queues up `open` and `readdir` calls, and retries them once something closes if there is an EMFILE error from too many file descriptors. * fixes `lchmod` for Node versions prior to 0.6.2. * implements `fs.lutimes` if possible. Otherwise it becomes a noop. * ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or `lchown` if the user isn't root. * makes `lchmod` and `lchown` become noops, if not available. * retries reading a file if `read` results in EAGAIN error. On Windows, it retries renaming a file for up to one second if `EACCESS` or `EPERM` error occurs, likely because antivirus software has locked the directory. ## USAGE ```javascript // use just like fs var fs = require('graceful-fs') // now go and do stuff with it... 
fs.readFileSync('some-file-or-whatever') ``` ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/common.js�����������������������������000644 �000766 �000024 �00000010240 12455173731 026125� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������exports.alphasort = alphasort exports.alphasorti = alphasorti exports.isAbsolute = process.platform === "win32" ? absWin : absUnix exports.setopts = setopts exports.ownProp = ownProp exports.makeAbs = makeAbs exports.finish = finish exports.mark = mark function ownProp (obj, field) { return Object.prototype.hasOwnProperty.call(obj, field) } var path = require("path") var minimatch = require("minimatch") var Minimatch = minimatch.Minimatch function absWin (p) { if (absUnix(p)) return true // pull off the device/UNC bit from a windows path. // from node's lib/path.js var splitDeviceRe = /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/]+[^\\\/]+)?([\\\/])?([\s\S]*?)$/ var result = splitDeviceRe.exec(p) var device = result[1] || '' var isUnc = device && device.charAt(1) !== ':' var isAbsolute = !!result[2] || isUnc // UNC paths are always absolute return isAbsolute } function absUnix (p) { return p.charAt(0) === "/" || p === "" } function alphasorti (a, b) { return a.toLowerCase().localeCompare(b.toLowerCase()) } function alphasort (a, b) { return a.localeCompare(b) } function setopts (self, pattern, options) { if (!options) options = {} // base-matching: just use globstar for that. if (options.matchBase && -1 === pattern.indexOf("/")) { if (options.noglobstar) { throw new Error("base matching requires globstar") } pattern = "**/" + pattern } self.pattern = pattern self.strict = options.strict !== false self.dot = !!options.dot self.mark = !!options.mark self.nodir = !!options.nodir if (self.nodir) self.mark = true self.sync = !!options.sync self.nounique = !!options.nounique self.nonull = !!options.nonull self.nosort = !!options.nosort self.nocase = !!options.nocase self.stat = !!options.stat self.noprocess = !!options.noprocess self.maxLength = options.maxLength || Infinity self.cache = options.cache || Object.create(null) self.statCache = options.statCache || Object.create(null) self.symlinks = options.symlinks || Object.create(null) self.changedCwd = false var cwd = process.cwd() if (!ownProp(options, "cwd")) self.cwd = cwd else { self.cwd = options.cwd self.changedCwd = path.resolve(options.cwd) !== cwd } self.root = options.root || path.resolve(self.cwd, "/") self.root = path.resolve(self.root) if (process.platform === "win32") self.root = self.root.replace(/\\/g, "/") self.nomount = !!options.nomount self.minimatch = new Minimatch(pattern, options) self.options = self.minimatch.options } function finish (self) { var nou = self.nounique var all = nou ? 
[] : Object.create(null) for (var i = 0, l = self.matches.length; i < l; i ++) { var matches = self.matches[i] if (!matches) { if (self.nonull) { // do like the shell, and spit out the literal glob var literal = self.minimatch.globSet[i] if (nou) all.push(literal) else all[literal] = true } } else { // had matches var m = Object.keys(matches) if (nou) all.push.apply(all, m) else m.forEach(function (m) { all[m] = true }) } } if (!nou) all = Object.keys(all) if (!self.nosort) all = all.sort(self.nocase ? alphasorti : alphasort) // at *some* point we statted all of these if (self.mark) { for (var i = 0; i < all.length; i++) { all[i] = self._mark(all[i]) } if (self.nodir) { all = all.filter(function (e) { return !(/\/$/.test(e)) }) } } self.found = all } function mark (self, p) { var c = self.cache[p] var m = p if (c) { var isDir = c === 'DIR' || Array.isArray(c) var slash = p.slice(-1) === '/' if (isDir && !slash) m += '/' else if (!isDir && slash) m = m.slice(0, -1) if (m !== p) { self.statCache[m] = self.statCache[p] self.cache[m] = self.cache[p] } } return m } // lotta situps... function makeAbs (self, f) { var abs = f if (f.charAt(0) === "/") { abs = path.join(self.root, f) } else if (exports.isAbsolute(f)) { abs = f } else if (self.changedCwd) { abs = path.resolve(self.cwd, f) } return abs } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/glob.js�������������������������������000644 �000766 �000024 �00000037474 12455173731 025602� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Approach: // // 1. Get the minimatch set // 2. For each pattern in the set, PROCESS(pattern, false) // 3. Store matches per-set, then uniq them // // PROCESS(pattern, inGlobStar) // Get the first [n] items from pattern that are all strings // Join these together. This is PREFIX. // If there is no more remaining, then stat(PREFIX) and // add to matches if it succeeds. END. // // If inGlobStar and PREFIX is symlink and points to dir // set ENTRIES = [] // else readdir(PREFIX) as ENTRIES // If fail, END // // with ENTRIES // If pattern[n] is GLOBSTAR // // handle the case where the globstar match is empty // // by pruning it out, and testing the resulting pattern // PROCESS(pattern[0..n] + pattern[n+1 .. $], false) // // handle other cases. // for ENTRY in ENTRIES (not dotfiles) // // attach globstar + tail onto the entry // // Mark that this entry is a globstar match // PROCESS(pattern[0..n] + ENTRY + pattern[n .. $], true) // // else // not globstar // for ENTRY in ENTRIES (not dotfiles, unless pattern[n] is dot) // Test ENTRY against pattern[n] // If fails, continue // If passes, PROCESS(pattern[0..n] + item + pattern[n+1 .. $]) // // Caveat: // Cache all stats and readdirs results to minimize syscall. Since all // we ever care about is existence and directory-ness, we can just keep // `true` for files, and [children,...] 
for directories, or `false` for // things that don't exist. module.exports = glob var fs = require("fs") var minimatch = require("minimatch") var Minimatch = minimatch.Minimatch var inherits = require("inherits") var EE = require("events").EventEmitter var path = require("path") var assert = require("assert") var globSync = require("./sync.js") var common = require("./common.js") var alphasort = common.alphasort var isAbsolute = common.isAbsolute var setopts = common.setopts var ownProp = common.ownProp var inflight = require("inflight") var util = require("util") var once = require("once") function glob (pattern, options, cb) { if (typeof options === "function") cb = options, options = {} if (!options) options = {} if (options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return globSync(pattern, options) } return new Glob(pattern, options, cb) } glob.sync = globSync var GlobSync = glob.GlobSync = globSync.GlobSync // old api surface glob.glob = glob glob.hasMagic = function (pattern, options_) { var options = util._extend({}, options_) options.noprocess = true var g = new Glob(pattern, options) var set = g.minimatch.set if (set.length > 1) return true for (var j = 0; j < set[0].length; j++) { if (typeof set[0][j] !== 'string') return true } return false } glob.Glob = Glob inherits(Glob, EE) function Glob (pattern, options, cb) { if (typeof options === "function") { cb = options options = null } if (options && options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return new GlobSync(pattern, options) } if (!(this instanceof Glob)) return new Glob(pattern, options, cb) setopts(this, pattern, options) // process each pattern in the minimatch set var n = this.minimatch.set.length // The matches are stored as {<filename>: true,...} so that // duplicates are automagically pruned. // Later, we do an Object.keys() on these. // Keep them as a list so we can fill in when nonull is set. 
this.matches = new Array(n) if (typeof cb === "function") { cb = once(cb) this.on("error", cb) this.on("end", function (matches) { cb(null, matches) }) } var self = this var n = this.minimatch.set.length this._processing = 0 this.matches = new Array(n) this._emitQueue = [] this._processQueue = [] this.paused = false if (this.noprocess) return this if (n === 0) return done() for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false, done) } function done () { --self._processing if (self._processing <= 0) self._finish() } } Glob.prototype._finish = function () { assert(this instanceof Glob) if (this.aborted) return //console.error('FINISH', this.matches) common.finish(this) this.emit("end", this.found) } Glob.prototype._mark = function (p) { return common.mark(this, p) } Glob.prototype._makeAbs = function (f) { return common.makeAbs(this, f) } Glob.prototype.abort = function () { this.aborted = true this.emit("abort") } Glob.prototype.pause = function () { if (!this.paused) { this.paused = true this.emit("pause") } } Glob.prototype.resume = function () { if (this.paused) { this.emit("resume") this.paused = false if (this._emitQueue.length) { var eq = this._emitQueue.slice(0) this._emitQueue.length = 0 for (var i = 0; i < eq.length; i ++) { var e = eq[i] this._emitMatch(e[0], e[1]) } } if (this._processQueue.length) { var pq = this._processQueue.slice(0) this._processQueue.length = 0 for (var i = 0; i < pq.length; i ++) { var p = pq[i] this._processing-- this._process(p[0], p[1], p[2], p[3]) } } } } Glob.prototype._process = function (pattern, index, inGlobStar, cb) { assert(this instanceof Glob) assert(typeof cb === 'function') if (this.aborted) return this._processing++ if (this.paused) { this._processQueue.push([pattern, index, inGlobStar, cb]) return } //console.error("PROCESS %d", this._processing, pattern) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === "string") { n ++ } // now n is the index of the first one that is *not* a string. // see if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index, cb) return case 0: // pattern *starts* with some non-trivial item. // going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. // whatever it starts with, whether that's "absolute" like /foo/bar, // or "relative" like "../baz" prefix = pattern.slice(0, n).join("/") break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = "." else if (isAbsolute(prefix) || isAbsolute(pattern.join("/"))) { if (!prefix || !isAbsolute(prefix)) prefix = "/" + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar, cb) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar, cb) } Glob.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { return self._processReaddir2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processReaddir2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { // if the abs isn't a dir, then nothing can match! 
if (!entries) return cb() // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === "." var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== "." || dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } //console.error('prd2', prefix, entries, remain[0]._glob, matchedEntries) var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return cb() // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix !== "/") e = prefix + "/" + e else e = prefix + e } if (e.charAt(0) === "/" && !this.nomount) { e = path.join(this.root, e) } this._emitMatch(index, e) } // This was the last one, and no stats were needed return cb() } // now test all matched entries as stand-ins for that part // of the pattern. remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) { if (prefix !== "/") e = prefix + "/" + e else e = prefix + e } this._process([e].concat(remain), index, inGlobStar, cb) } cb() } Glob.prototype._emitMatch = function (index, e) { if (this.aborted) return if (!this.matches[index][e]) { if (this.paused) { this._emitQueue.push([index, e]) return } if (this.nodir) { var c = this.cache[this._makeAbs(e)] if (c === 'DIR' || Array.isArray(c)) return } this.matches[index][e] = true if (!this.stat && !this.mark) return this.emit("match", e) var self = this this._stat(this._makeAbs(e), function (er, c, st) { self.emit("stat", e, st) self.emit("match", e) }) } } Glob.prototype._readdirInGlobStar = function (abs, cb) { if (this.aborted) return var lstatkey = "lstat\0" + abs var self = this var lstatcb = inflight(lstatkey, lstatcb_) if (lstatcb) fs.lstat(abs, lstatcb) function lstatcb_ (er, lstat) { if (er) return cb() var isSym = lstat.isSymbolicLink() self.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. if (!isSym && !lstat.isDirectory()) { self.cache[abs] = 'FILE' cb() } else self._readdir(abs, false, cb) } } Glob.prototype._readdir = function (abs, inGlobStar, cb) { if (this.aborted) return cb = inflight("readdir\0"+abs+"\0"+inGlobStar, cb) if (!cb) return //console.error("RD %j %j", +inGlobStar, abs) if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs, cb) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return cb() if (Array.isArray(c)) return cb(null, c) } var self = this fs.readdir(abs, readdirCb(this, abs, cb)) } function readdirCb (self, abs, cb) { return function (er, entries) { if (er) self._readdirError(abs, er, cb) else self._readdirEntries(abs, entries.sort(alphasort), cb) } } Glob.prototype._readdirEntries = function (abs, entries, cb) { if (this.aborted) return // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. 
if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === "/") e = abs + e else e = abs + "/" + e this.cache[e] = true } } this.cache[abs] = entries return cb(null, entries) } Glob.prototype._readdirError = function (f, er, cb) { if (this.aborted) return // handle errors, and cache the information switch (er.code) { case "ENOTDIR": // totally normal. means it *does* exist. this.cache[f] = 'FILE' break case "ENOENT": // not terribly unusual case "ELOOP": case "ENAMETOOLONG": case "UNKNOWN": this.cache[f] = false break default: // some unusual error. Treat as failure. this.cache[f] = false if (this.strict) return this.emit("error", er) if (!this.silent) console.error("glob error", er) break } return cb() } Glob.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { self._processGlobStar2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processGlobStar2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { //console.error("pgs2", prefix, remain[0], entries) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return cb() // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false, cb) var isSym = this.symlinks[abs] var len = entries.length // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return cb() for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === "." && !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true, cb) var below = gspref.concat(entries[i], remain) this._process(below, index, true, cb) } cb() } Glob.prototype._processSimple = function (prefix, index, cb) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? 
var self = this this._stat(prefix, function (er, exists) { self._processSimple2(prefix, index, er, exists, cb) }) } Glob.prototype._processSimple2 = function (prefix, index, er, exists, cb) { //console.error("ps2", prefix, exists) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return cb() if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === "/") { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === "win32") prefix = prefix.replace(/\\/g, "/") // Mark this as a match this._emitMatch(index, prefix) cb() } // Returns either 'DIR', 'FILE', or false Glob.prototype._stat = function (f, cb) { var abs = f if (f.charAt(0) === "/") abs = path.join(this.root, f) else if (this.changedCwd) abs = path.resolve(this.cwd, f) if (f.length > this.maxLength) return cb() if (!this.stat && ownProp(this.cache, f)) { var c = this.cache[f] if (Array.isArray(c)) c = 'DIR' // It exists, but not how we need it if (abs.slice(-1) === "/" && c !== 'DIR') return cb() return cb(null, c) } var exists var stat = this.statCache[abs] if (stat !== undefined) { if (stat === false) return cb(null, stat) else return cb(null, stat.isDirectory() ? 'DIR' : 'FILE', stat) } var self = this var statcb = inflight("stat\0" + abs, statcb_) if (statcb) fs.stat(abs, statcb) function statcb_ (er, stat) { self._stat2(f, abs, er, stat, cb) } } Glob.prototype._stat2 = function (f, abs, er, stat, cb) { if (er) { this.statCache[abs] = false return cb() } this.statCache[abs] = stat if (abs.slice(-1) === "/" && !stat.isDirectory()) return cb(null, false, stat) var c = stat.isDirectory() ? 'DIR' : 'FILE' this.cache[f] = this.cache[f] || c return cb(null, c, stat) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/LICENSE�������������������������������000644 �000766 �000024 �00000001375 12455173731 025315� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
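As the comments at the top of glob.js note, the walker caches every stat and readdir result so it never has to repeat a syscall. Those caches live on the Glob instance (`cache`, `statCache`, `symlinks`) and can be passed back in as options for a second, related search; the option names here are the ones read by `setopts()` in common.js above, and the `lib/**` patterns are only placeholders. A rough sketch, not an official recipe:

```js
var glob = require('glob')
var Glob = glob.Glob

var first = new Glob('lib/**/*.js', function (er, jsFiles) {
  if (er) throw er
  // Reuse the caches built by the first walk for a second, related pattern,
  // so already-visited directories are not statted or read again.
  glob('lib/**/*.json', {
    cache: first.cache,
    statCache: first.statCache,
    symlinks: first.symlinks
  }, function (er, jsonFiles) {
    if (er) throw er
    console.log(jsFiles.length + ' js files, ' + jsonFiles.length + ' json files')
  })
})
```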
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/package.json��������������������������000644 �000766 �000024 �00000003373 12455173731 026576� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "glob", "description": "a little globber", "version": "4.3.2", "repository": { "type": "git", "url": "git://github.com/isaacs/node-glob.git" }, "main": "glob.js", "files": [ "glob.js", "sync.js", "common.js" ], "engines": { "node": "*" }, "dependencies": { "inflight": "^1.0.4", "inherits": "2", "minimatch": "^2.0.1", "once": "^1.3.0" }, "devDependencies": { "mkdirp": "0", "rimraf": "^2.2.8", "tap": "~0.4.0", "tick": "0.0.6" }, "scripts": { "prepublish": "npm run benchclean", "profclean": "rm -f v8.log profile.txt", "test": "npm run profclean && tap test/*.js", "test-regen": "npm run profclean && TEST_REGEN=1 node test/00-setup.js", "bench": "bash benchmark.sh", "prof": "bash prof.sh && cat profile.txt", "benchclean": "bash benchclean.sh" }, "license": "ISC", "gitHead": "941d53c8ab6216f43a6f5e8e01245364ba90cfe9", "bugs": { "url": "https://github.com/isaacs/node-glob/issues" }, "homepage": "https://github.com/isaacs/node-glob", "_id": "glob@4.3.2", "_shasum": "351ec7dafc29256b253ad86cd6b48c5a3404b76d", "_from": "glob@>=4.3.2 <4.4.0", "_npmVersion": "2.1.14", "_nodeVersion": "0.10.33", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "351ec7dafc29256b253ad86cd6b48c5a3404b76d", "tarball": "http://registry.npmjs.org/glob/-/glob-4.3.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/glob/-/glob-4.3.2.tgz", "readme": "ERROR: No README data found!" 
} ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/README.md�����������������������������000644 �000766 �000024 �00000034424 12455173731 025570� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������[![Build Status](https://travis-ci.org/isaacs/node-glob.svg?branch=master)](https://travis-ci.org/isaacs/node-glob/) [![Dependency Status](https://david-dm.org/isaacs/node-glob.svg)](https://david-dm.org/isaacs/node-glob) [![devDependency Status](https://david-dm.org/isaacs/node-glob/dev-status.svg)](https://david-dm.org/isaacs/node-glob#info=devDependencies) [![optionalDependency Status](https://david-dm.org/isaacs/node-glob/optional-status.svg)](https://david-dm.org/isaacs/node-glob#info=optionalDependencies) # Glob Match files using the patterns the shell uses, like stars and stuff. This is a glob implementation in JavaScript. It uses the `minimatch` library to do its matching. ![](oh-my-glob.gif) ## Usage ```javascript var glob = require("glob") // options is optional glob("**/*.js", options, function (er, files) { // files is an array of filenames. // If the `nonull` option is set, and nothing // was found, then files is ["**/*.js"] // er is an error object or null. }) ``` ## Glob Primer "Globs" are the patterns you type when you do stuff like `ls *.js` on the command line, or put `build/*` in a `.gitignore` file. Before parsing the path part patterns, braced sections are expanded into a set. Braced sections start with `{` and end with `}`, with any number of comma-delimited sections within. Braced sections may contain slash characters, so `a{/b/c,bcd}` would expand into `a/b/c` and `abcd`. The following characters have special magic meaning when used in a path portion: * `*` Matches 0 or more characters in a single path portion * `?` Matches 1 character * `[...]` Matches a range of characters, similar to a RegExp range. If the first character of the range is `!` or `^` then it matches any character not in the range. * `!(pattern|pattern|pattern)` Matches anything that does not match any of the patterns provided. * `?(pattern|pattern|pattern)` Matches zero or one occurrence of the patterns provided. * `+(pattern|pattern|pattern)` Matches one or more occurrences of the patterns provided. * `*(a|b|c)` Matches zero or more occurrences of the patterns provided * `@(pattern|pat*|pat?erN)` Matches exactly one of the patterns provided * `**` If a "globstar" is alone in a path portion, then it matches zero or more directories and subdirectories searching for matches. It does not crawl symlinked directories. ### Dots If a file or directory path portion has a `.` as the first character, then it will not match any glob pattern unless that pattern's corresponding path part also has a `.` as its first character. For example, the pattern `a/.*/c` would match the file at `a/.b/c`. However the pattern `a/*/c` would not, because `*` does not start with a dot character. 
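To make the dot rule concrete, here is a small sketch; the directory layout (files at `a/.b/c` and `a/x/c`) is hypothetical and only illustrates the README's own example.

```javascript
var glob = require("glob")

// Assuming a tree containing a/.b/c and a/x/c:
glob("a/.*/c", function (er, files) {
  console.log(files) // [ 'a/.b/c' ] -- the ".*" portion itself starts with a dot
})

glob("a/*/c", function (er, files) {
  console.log(files) // [ 'a/x/c' ]  -- "*" does not match the dot directory a/.b
})
```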
You can make glob treat dots as normal characters by setting `dot:true` in the options. ### Basename Matching If you set `matchBase:true` in the options, and the pattern has no slashes in it, then it will look for any file anywhere in the tree with a matching basename. For example, `*.js` would match `test/simple/basic.js`. ### Negation The intent for negation would be for a pattern starting with `!` to match everything that *doesn't* match the supplied pattern. However, the implementation is weird, and for the time being, this should be avoided. The behavior will change or be deprecated in version 5. ### Empty Sets If no matching files are found, then an empty array is returned. This differs from the shell, where the pattern itself is returned. For example: $ echo a*s*d*f a*s*d*f To get the bash-style behavior, set `nonull:true` in the options. ### See Also: * `man sh` * `man bash` (Search for "Pattern Matching") * `man 3 fnmatch` * `man 5 gitignore` * [minimatch documentation](https://github.com/isaacs/minimatch) ## glob.hasMagic(pattern, [options]) Returns `true` if there are any special characters in the pattern, and `false` otherwise. Note that the options affect the results. If `noext:true` is set in the options object, then `+(a|b)` will not be considered a magic pattern. If the pattern has a brace expansion, like `a/{b/c,x/y}`, then that is considered magical, unless `nobrace:true` is set in the options. ## glob(pattern, [options], cb) * `pattern` {String} Pattern to be matched * `options` {Object} * `cb` {Function} * `err` {Error | null} * `matches` {Array<String>} filenames found matching the pattern Perform an asynchronous glob search. ## glob.sync(pattern, [options]) * `pattern` {String} Pattern to be matched * `options` {Object} * return: {Array<String>} filenames found matching the pattern Perform a synchronous glob search. ## Class: glob.Glob Create a Glob object by instantiating the `glob.Glob` class. ```javascript var Glob = require("glob").Glob var mg = new Glob(pattern, options, cb) ``` It's an EventEmitter, and starts walking the filesystem to find matches immediately. ### new glob.Glob(pattern, [options], [cb]) * `pattern` {String} pattern to search for * `options` {Object} * `cb` {Function} Called when an error occurs, or matches are found * `err` {Error | null} * `matches` {Array<String>} filenames found matching the pattern Note that if the `sync` flag is set in the options, then matches will be immediately available on the `g.found` member. ### Properties * `minimatch` The minimatch object that the glob uses. * `options` The options object passed in. * `aborted` Boolean which is set to true when calling `abort()`. There is no way at this time to continue a glob search after aborting, but you can re-use the statCache to avoid having to duplicate syscalls. * `cache` Convenience object. Each field has the following possible values: * `false` - Path does not exist * `true` - Path exists * `'DIR'` - Path exists, and is a directory * `'FILE'` - Path exists, and is not a directory * `[file, entries, ...]` - Path exists, is a directory, and the array value is the results of `fs.readdir` * `statCache` Cache of `fs.stat` results, to prevent statting the same path multiple times. * `symlinks` A record of which paths are symbolic links, which is relevant in resolving `**` patterns. ### Events * `end` When the matching is finished, this is emitted with all the matches found.
If the `nonull` option is set, and no match was found, then the `matches` list contains the original pattern. The matches are sorted, unless the `nosort` flag is set. * `match` Every time a match is found, this is emitted with the matched path. * `error` Emitted when an unexpected error is encountered, or whenever any fs error occurs if `options.strict` is set. * `abort` When `abort()` is called, this event is raised. ### Methods * `pause` Temporarily stop the search * `resume` Resume the search * `abort` Stop the search forever ### Options All the options that can be passed to Minimatch can also be passed to Glob to change pattern matching behavior. Also, some have been added, or have glob-specific ramifications. All options are false by default, unless otherwise noted. All options are added to the Glob object, as well. If you are running many `glob` operations, you can pass a Glob object as the `options` argument to a subsequent operation to shortcut some `stat` and `readdir` calls. At the very least, you may pass in shared `symlinks`, `statCache`, and `cache` options, so that parallel glob operations will be sped up by sharing information about the filesystem. * `cwd` The current working directory in which to search. Defaults to `process.cwd()`. * `root` The place where patterns starting with `/` will be mounted onto. Defaults to `path.resolve(options.cwd, "/")` (`/` on Unix systems, and `C:\` or some such on Windows.) * `dot` Include `.dot` files in normal matches and `globstar` matches. Note that an explicit dot in a portion of the pattern will always match dot files. * `nomount` By default, a pattern starting with a forward-slash will be "mounted" onto the root setting, so that a valid filesystem path is returned. Set this flag to disable that behavior. * `mark` Add a `/` character to directory matches. Note that this requires additional stat calls. * `nosort` Don't sort the results. * `stat` Set to true to stat *all* results. This reduces performance somewhat, and is completely unnecessary, unless `readdir` is presumed to be an untrustworthy indicator of file existence. * `silent` When an unusual error is encountered when attempting to read a directory, a warning will be printed to stderr. Set the `silent` option to true to suppress these warnings. * `strict` When an unusual error is encountered when attempting to read a directory, the process will just continue on in search of other matches. Set the `strict` option to raise an error in these cases. * `cache` See `cache` property above. Pass in a previously generated cache object to save some fs calls. * `statCache` A cache of results of filesystem information, to prevent unnecessary stat calls. While it should not normally be necessary to set this, you may pass the statCache from one glob() call to the options object of another, if you know that the filesystem will not change between calls. (See "Race Conditions" below.) * `symlinks` A cache of known symbolic links. You may pass in a previously generated `symlinks` object to save `lstat` calls when resolving `**` matches. * `sync` Perform a synchronous glob search. * `nounique` In some cases, brace-expanded patterns can result in the same file showing up multiple times in the result set. By default, this implementation prevents duplicates in the result set. Set this flag to disable that behavior. * `nonull` Set to never return an empty set, instead returning a set containing the pattern itself. This is the default in glob(3). * `nocase` Perform a case-insensitive match.
Note that case-insensitive filesystems will sometimes result in glob returning results that are case-insensitively matched anyway, since readdir and stat will not raise an error; for the same reason, non-magic patterns will match regardless of case by default. * `debug` Set to enable debug logging in minimatch and glob. * `nobrace` Do not expand `{a,b}` and `{1..3}` brace sets. * `noglobstar` Do not match `**` against multiple filenames. (That is, treat it as a normal `*` instead.) * `noext` Do not match `+(a|b)` "extglob" patterns. * `matchBase` Perform a basename-only match if the pattern does not contain any slash characters. That is, `*.js` would be treated as equivalent to `**/*.js`, matching all js files in all directories. * `nonegate` Suppress `negate` behavior. (See below.) * `nocomment` Suppress `comment` behavior. (See below.) * `nodir` Do not match directories, only files. ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between node-glob and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.3, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. Note that symlinked directories are not crawled as part of a `**`, though their contents may match against subsequent portions of the pattern. This prevents infinite loops and duplicates and the like. If an escaped pattern has no matches, and the `nonull` flag is set, then glob returns the pattern as-provided, rather than interpreting the character escapes. For example, `glob.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. ## Windows **Please only use forward-slashes in glob expressions.** Though Windows uses either `/` or `\` as its path separator, only `/` characters are used by this glob implementation. You must use forward-slashes **only** in glob expressions. Back-slashes will always be interpreted as escape characters, not path separators. Results from absolute patterns such as `/foo/*` are mounted onto the root setting using `path.join`. On Windows, this will by default result in `/foo/*` matching `C:\foo\bar.txt`.
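Tying the sections above together, here is a hedged sketch of an asynchronous search using a few of the documented options; the pattern and the `cwd` value are hypothetical.

```javascript
var glob = require("glob")

glob("**/*.js", { cwd: "/some/project", nosort: true, nonull: true }, function (er, files) {
  if (er) throw er
  // Because nonull is set, files is ["**/*.js"] when nothing matched;
  // otherwise it is the unsorted (nosort) list of matching paths,
  // resolved relative to the cwd option. Use forward-slashes in the
  // pattern even on Windows.
  console.log(files)
})
```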
## Race Conditions Glob searching, by its very nature, is susceptible to race conditions, since it relies on directory walking and such. As a result, it is possible that a file that exists when glob looks for it may have been deleted or modified by the time it returns the result. As part of its internal implementation, this program caches all stat and readdir calls that it makes, in order to cut down on system overhead. However, this also makes it even more susceptible to races, especially if the cache or statCache objects are reused between glob calls. Users are thus advised not to use a glob result as a guarantee of filesystem state in the face of rapid changes. For the vast majority of operations, this is never a problem. ## Contributing Any change to behavior (including bugfixes) must come with a test. Patches that fail tests or reduce performance will be rejected. ``` # to run tests npm test # to re-generate test fixtures npm run test-regen # to benchmark against bash/zsh npm run bench # to profile javascript npm run prof ``` ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/glob/sync.js�������������������������������000644 �000766 �000024 �00000023577 12455173731 025632� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = globSync globSync.GlobSync = GlobSync var fs = require("fs") var minimatch = require("minimatch") var Minimatch = minimatch.Minimatch var Glob = require("./glob.js").Glob var util = require("util") var path = require("path") var assert = require("assert") var common = require("./common.js") var alphasort = common.alphasort var isAbsolute = common.isAbsolute var setopts = common.setopts var ownProp = common.ownProp function globSync (pattern, options) { if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob') return new GlobSync(pattern, options).found } function GlobSync (pattern, options) { if (!pattern) throw new Error("must provide pattern") if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob') if (!(this instanceof GlobSync)) return new GlobSync(pattern, options) setopts(this, pattern, options) if (this.noprocess) return this var n = this.minimatch.set.length this.matches = new Array(n) for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false) } this._finish() } GlobSync.prototype._finish = function () { assert(this instanceof GlobSync) common.finish(this) } GlobSync.prototype._process = function (pattern, index, inGlobStar) { assert(this instanceof GlobSync) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === "string") { n ++ } // now n is the index of the first one that is *not* a string. // See if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index) return case 0: // pattern *starts* with some non-trivial item. 
// going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. // whatever it starts with, whether that's "absolute" like /foo/bar, // or "relative" like "../baz" prefix = pattern.slice(0, n).join("/") break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = "." else if (isAbsolute(prefix) || isAbsolute(pattern.join("/"))) { if (!prefix || !isAbsolute(prefix)) prefix = "/" + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar) } GlobSync.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // if the abs isn't a dir, then nothing can match! if (!entries) return // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === "." var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== "." || dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix.slice(-1) !== "/") e = prefix + "/" + e else e = prefix + e } if (e.charAt(0) === "/" && !this.nomount) { e = path.join(this.root, e) } this.matches[index][e] = true } // This was the last one, and no stats were needed return } // now test all matched entries as stand-ins for that part // of the pattern. remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) newPattern = [prefix, e] else newPattern = [e] this._process(newPattern.concat(remain), index, inGlobStar) } } GlobSync.prototype._emitMatch = function (index, e) { if (!this.matches[index][e]) { if (this.nodir) { var c = this.cache[this._makeAbs(e)] if (c === 'DIR' || Array.isArray(c)) return } this.matches[index][e] = true if (this.stat || this.mark) this._stat(this._makeAbs(e)) } } GlobSync.prototype._readdirInGlobStar = function (abs) { var entries var lstat var stat try { lstat = fs.lstatSync(abs) } catch (er) { // lstat failed, doesn't exist return null } var isSym = lstat.isSymbolicLink() this.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. 
if (!isSym && !lstat.isDirectory()) this.cache[abs] = 'FILE' else entries = this._readdir(abs, false) return entries } GlobSync.prototype._readdir = function (abs, inGlobStar) { var entries if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return null if (Array.isArray(c)) return c } try { return this._readdirEntries(abs, fs.readdirSync(abs).sort(alphasort)) } catch (er) { this._readdirError(abs, er) return null } } GlobSync.prototype._readdirEntries = function (abs, entries) { // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === "/") e = abs + e else e = abs + "/" + e this.cache[e] = true } } this.cache[abs] = entries // mark and cache dir-ness return entries } GlobSync.prototype._readdirError = function (f, er) { // handle errors, and cache the information switch (er.code) { case "ENOTDIR": // totally normal. means it *does* exist. this.cache[f] = 'FILE' break case "ENOENT": // not terribly unusual case "ELOOP": case "ENAMETOOLONG": case "UNKNOWN": this.cache[f] = false break default: // some unusual error. Treat as failure. this.cache[f] = false if (this.strict) throw er if (!this.silent) console.error("glob error", er) break } } GlobSync.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false) var len = entries.length var isSym = this.symlinks[abs] // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === "." && !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true) var below = gspref.concat(entries[i], remain) this._process(below, index, true) } } GlobSync.prototype._processSimple = function (prefix, index) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? 
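  // (Editorial note, not part of the original source.) _processSimple handles a
  // pattern that contained no magic characters at all: the joined string is
  // treated as a literal path, checked once with _stat, mounted onto the root
  // when it is absolute (unless nomount is set), and recorded in matches if it
  // exists.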
var exists = this._stat(prefix) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === "/") { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === "win32") prefix = prefix.replace(/\\/g, "/") // Mark this as a match this.matches[index][prefix] = true } // Returns either 'DIR', 'FILE', or false GlobSync.prototype._stat = function (f) { var abs = f if (f.charAt(0) === "/") abs = path.join(this.root, f) else if (this.changedCwd) abs = path.resolve(this.cwd, f) if (f.length > this.maxLength) return false if (!this.stat && ownProp(this.cache, f)) { var c = this.cache[f] if (Array.isArray(c)) c = 'DIR' // It exists, but not how we need it if (abs.slice(-1) === "/" && c !== 'DIR') return false return c } var exists var stat = this.statCache[abs] if (!stat) { try { stat = fs.statSync(abs) } catch (er) { return false } } this.statCache[abs] = stat if (abs.slice(-1) === "/" && !stat.isDirectory()) return false var c = stat.isDirectory() ? 'DIR' : 'FILE' this.cache[f] = this.cache[f] || c return c } GlobSync.prototype._mark = function (p) { return common.mark(this, p) } GlobSync.prototype._makeAbs = function (f) { return common.makeAbs(this, f) } ���������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/.npmignore���000644 �000766 �000024 �00000000145 12455173731 033161� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������*.swp .*.swp .DS_Store *~ .project .settings npm-debug.log coverage.html .idea lib-cov node_modules���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/.travis.yml��000644 �000766 �000024 �00000000057 12455173731 033275� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.8" - 
"0.10"���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/index.js�����000644 �000766 �000024 �00000001421 12455173731 032625� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = getUrl function getUrl (r, forBrowser) { if (!r) return null // Regex taken from https://github.com/npm/npm-package-arg/commit/01dce583c64afae07b66a2a8a6033aeba871c3cd // Note: This does not fully test the git ref format. // See https://www.kernel.org/pub/software/scm/git/docs/git-check-ref-format.html // // The only way to do this properly would be to shell out to // git-check-ref-format, and as this is a fast sync function, // we don't want to do that. Just let git fail if it turns // out that the commit-ish is invalid. // GH usernames cannot start with . or - if (/^[^@%\/\s\.-][^:@%\/\s]*\/[^@\s\/%]+(?:#.*)?$/.test(r)) { if (forBrowser) r = r.replace("#", "/tree/") return "https://github.com/" + r } return null } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/LICENSE������000644 �000766 �000024 �00000002432 12455173731 032170� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Robert Kowalski ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/package.json�000644 �000766 �000024 �00000003224 12455173731 033451� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "github-url-from-username-repo", "version": "1.0.2", "description": "Create urls from username/repo", "main": "index.js", "scripts": { "test": "mocha -R spec" }, "devDependencies": { "mocha": "~1.13.0" }, "repository": { "type": "git", "url": "git@github.com:robertkowalski/github-url-from-username-repo.git" }, "author": { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, "license": "BSD-2-Clause", "bugs": { "url": "https://github.com/robertkowalski/github-url-from-username-repo/issues" }, "keywords": [ "git", "github", "repo" ], "readme": "[![Build Status](https://travis-ci.org/robertkowalski/github-url-from-username-repo.png?branch=master)](https://travis-ci.org/robertkowalski/github-url-from-username-repo)\n[![Dependency Status](https://gemnasium.com/robertkowalski/github-url-from-username-repo.png)](https://gemnasium.com/robertkowalski/github-url-from-username-repo)\n\n\n# github-url-from-username-repo\n\n## API\n\n### getUrl(url, [forBrowser])\n\nGet's the url normalized for npm.\nIf `forBrowser` is true, return a GitHub url that is usable in a webbrowser.\n\n## Usage\n\n```javascript\n\nvar getUrl = require(\"github-url-from-username-repo\")\ngetUrl(\"visionmedia/express\") // https://github.com/visionmedia/express\n\n```\n", "readmeFilename": "README.md", "gitHead": "d404a13f7f04edaed0e2f068a43b81230b8c7aee", "homepage": "https://github.com/robertkowalski/github-url-from-username-repo", "_id": "github-url-from-username-repo@1.0.2", "_shasum": "7dd79330d2abe69c10c2cef79714c97215791dfa", "_from": "github-url-from-username-repo@>=1.0.2-0 <2.0.0-0" } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-username-repo/README.md����000644 �000766 �000024 �00000001245 12455173731 032443� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������[![Build Status](https://travis-ci.org/robertkowalski/github-url-from-username-repo.png?branch=master)](https://travis-ci.org/robertkowalski/github-url-from-username-repo) [![Dependency Status](https://gemnasium.com/robertkowalski/github-url-from-username-repo.png)](https://gemnasium.com/robertkowalski/github-url-from-username-repo) # github-url-from-username-repo ## API ### getUrl(url, [forBrowser]) Get's the url normalized for npm. If `forBrowser` is true, return a GitHub url that is usable in a webbrowser. ## Usage ```javascript var getUrl = require("github-url-from-username-repo") getUrl("visionmedia/express") // https://github.com/visionmedia/express ``` �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/.npmignore�������������000644 �000766 �000024 �00000000015 12455173731 031156� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/index.js���������������000644 �000766 �000024 �00000001643 12455173731 030634� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// convert git:// form url to github URL, e.g., // git://github.com/bcoe/foo.git // https://github.com/bcoe/foo. function githubUrlFromGit(url, opts){ try { var m = re(opts).exec(url.replace(/\.git(#.*)?$/, '')); var host = m[1]; var path = m[2]; return 'https://' + host + '/' + path; } catch (err) { // ignore } }; // generate the git:// parsing regex // with options, e.g., the ability // to specify multiple GHE domains. function re(opts) { opts = opts || {}; // whitelist of URLs that should be treated as GitHub repos. var baseUrls = ['gist.github.com', 'github.com'].concat(opts.extraBaseUrls || []); // build regex from whitelist. 
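  // (Editorial note, not part of the original source.) The expression below is
  // assembled from three pieces: an optional protocol (https, git, git+ssh,
  // git+https) plus an optional "user@" prefix, a capture group over the
  // whitelisted hosts joined with "|", and a ":" or "/" separator followed by
  // an "owner/repo" path (or a bare numeric gist id) captured as the path.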
return new RegExp( /^(?:https?:\/\/|git:\/\/|git\+ssh:\/\/|git\+https:\/\/)?(?:[^@]+@)?/.source + '(' + baseUrls.join('|') + ')' + /[:\/]([^\/]+\/[^\/]+?|[0-9]+)$/.source ); } githubUrlFromGit.re = re(); module.exports = githubUrlFromGit; ���������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/LICENSE����������������000644 �000766 �000024 �00000002112 12455173731 030164� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������(The MIT License) Copyright (c) 2013 TJ Holowaychuk <tj@vision-media.ca> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
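Below is a brief usage sketch for the github-url-from-git module shown above; the example URLs mirror the module's own tests, and the output comments are assumptions based on those tests.

```javascript
var githubUrlFromGit = require("github-url-from-git")

githubUrlFromGit("git://github.com/treygriffith/cellar.git")
// => 'https://github.com/treygriffith/cellar'

githubUrlFromGit("git@gist.github.com:3135914.git")
// => 'https://gist.github.com/3135914'

// GitHub Enterprise hosts can be whitelisted with extraBaseUrls:
githubUrlFromGit("git://github.example.com/treygriffith/cellar.git",
                 { extraBaseUrls: ["github.example.com"] })
// => 'https://github.example.com/treygriffith/cellar'
```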
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/Makefile���������������000644 �000766 �000024 �00000000132 12455173731 030617� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ test: @./node_modules/.bin/mocha test.js --reporter spec --require should .PHONY: test ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/package.json�����������000644 �000766 �000024 �00000002773 12455173731 031462� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "github-url-from-git", "version": "1.4.0", "description": "Parse a github git url and return the github repo url", "main": "index.js", "scripts": { "test": "mocha test.js --reporter spec --require should" }, "repository": { "type": "git", "url": "https://github.com/visionmedia/node-github-url-from-git.git" }, "keywords": [ "github", "git", "url", "parser" ], "author": { "name": "TJ Holowaychuk" }, "license": "MIT", "devDependencies": { "better-assert": "~1.0.0", "mocha": "~1.9.0", "should": "~1.2.2" }, "gitHead": "154df00b0b590c29be5d2a5822e7b2e160b75345", "bugs": { "url": "https://github.com/visionmedia/node-github-url-from-git/issues" }, "homepage": "https://github.com/visionmedia/node-github-url-from-git", "_id": "github-url-from-git@1.4.0", "_shasum": "285e6b520819001bde128674704379e4ff03e0de", "_from": "github-url-from-git@>=1.4.0-0 <2.0.0-0", "_npmVersion": "2.0.0-alpha.7", "_npmUser": { "name": "bcoe", "email": "bencoe@gmail.com" }, "maintainers": [ { "name": "tjholowaychuk", "email": "tj@vision-media.ca" }, { "name": "bcoe", "email": "bencoe@gmail.com" } ], "dist": { "shasum": "285e6b520819001bde128674704379e4ff03e0de", "tarball": "http://registry.npmjs.org/github-url-from-git/-/github-url-from-git-1.4.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/github-url-from-git/-/github-url-from-git-1.4.0.tgz" } �����iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/Readme.md��������������000644 �000766 �000024 �00000006156 12455173731 030712� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ # github-url-from-git ```js describe('parse(url)', function(){ it('should support git://*', function(){ var url = 'git://github.com/jamesor/mongoose-versioner'; parse(url).should.equal('https://github.com/jamesor/mongoose-versioner'); }) it('should support git://*.git', function(){ var url = 'git://github.com/treygriffith/cellar.git'; parse(url).should.equal('https://github.com/treygriffith/cellar'); }) it('should support https://*', function(){ var url = 'https://github.com/Empeeric/i18n-node'; parse(url).should.equal('https://github.com/Empeeric/i18n-node'); }) it('should support https://*.git', function(){ var url = 'https://jpillora@github.com/banchee/tranquil.git'; parse(url).should.equal('https://github.com/banchee/tranquil'); }) it('should return undefined on failure', function(){ var url = 'git://github.com/justgord/.git'; assert(null == parse(url)); }) it('should parse git@github.com:bcoe/thumbd.git', function() { var url = 'git@github.com:bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git@github.com:bcoe/thumbd.git#2.7.0', function() { var url = 'git@github.com:bcoe/thumbd.git#2.7.0'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git+https://github.com/bcoe/thumbd.git', function() { var url = 'git+https://github.com/bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git+ssh://github.com/bcoe/thumbd.git', function() { var url = 'git+ssh://github.com/bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse https://EastCloud@github.com/EastCloud/node-websockets.git', function() { var url = 'https://EastCloud@github.com/EastCloud/node-websockets.git'; parse(url).should.eql('https://github.com/EastCloud/node-websockets'); }) // gist urls. it('should parse git@gist urls', function() { var url = 'git@gist.github.com:3135914.git'; parse(url).should.equal('https://gist.github.com/3135914') }) it('should parse https://gist urls', function() { var url = 'https://gist.github.com/3135914.git'; parse(url).should.equal('https://gist.github.com/3135914') }) // Handle arbitrary GitHub Enterprise domains. 
it('should parse parse extra GHE urls provided', function() { var url = 'git://github.example.com/treygriffith/cellar.git'; parse( url, {extraBaseUrls: ['github.example.com']} ).should.equal('https://github.example.com/treygriffith/cellar'); }); it('should parse GHE urls with multiple subdomains', function() { var url = 'git://github.internal.example.com/treygriffith/cellar.git'; parse( url, {extraBaseUrls: ['github.internal.example.com']} ).should.equal('https://github.internal.example.com/treygriffith/cellar'); }); }) describe('re', function() { it('should expose GitHub url parsing regex', function() { parse.re.source.should.equal( /^(?:https?:\/\/|git:\/\/)?(?:[^@]+@)?(gist.github.com|github.com)[:\/]([^\/]+\/[^\/]+?|[0-9]+)$/.source ) }); }) ``` ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/github-url-from-git/test.js����������������000644 �000766 �000024 �00000006255 12455173731 030510� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var parse = require('./'); var assert = require('better-assert'); describe('parse(url)', function(){ it('should support git://*', function(){ var url = 'git://github.com/jamesor/mongoose-versioner'; parse(url).should.equal('https://github.com/jamesor/mongoose-versioner'); }) it('should support git://*.git', function(){ var url = 'git://github.com/treygriffith/cellar.git'; parse(url).should.equal('https://github.com/treygriffith/cellar'); }) it('should support https://*', function(){ var url = 'https://github.com/Empeeric/i18n-node'; parse(url).should.equal('https://github.com/Empeeric/i18n-node'); }) it('should support https://*.git', function(){ var url = 'https://jpillora@github.com/banchee/tranquil.git'; parse(url).should.equal('https://github.com/banchee/tranquil'); }) it('should return undefined on failure', function(){ var url = 'git://github.com/justgord/.git'; assert(null == parse(url)); }) it('should parse git@github.com:bcoe/thumbd.git', function() { var url = 'git@github.com:bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git@github.com:bcoe/thumbd.git#2.7.0', function() { var url = 'git@github.com:bcoe/thumbd.git#2.7.0'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git+https://github.com/bcoe/thumbd.git', function() { var url = 'git+https://github.com/bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse git+ssh://github.com/bcoe/thumbd.git', function() { var url = 'git+ssh://github.com/bcoe/thumbd.git'; parse(url).should.eql('https://github.com/bcoe/thumbd'); }) it('should parse https://EastCloud@github.com/EastCloud/node-websockets.git', function() { var url = 'https://EastCloud@github.com/EastCloud/node-websockets.git'; parse(url).should.eql('https://github.com/EastCloud/node-websockets'); }) // gist urls. 
it('should parse git@gist urls', function() { var url = 'git@gist.github.com:3135914.git'; parse(url).should.equal('https://gist.github.com/3135914') }) it('should parse https://gist urls', function() { var url = 'https://gist.github.com/3135914.git'; parse(url).should.equal('https://gist.github.com/3135914') }) // Handle arbitrary GitHub Enterprise domains. it('should parse parse extra GHE urls provided', function() { var url = 'git://github.example.com/treygriffith/cellar.git'; parse( url, {extraBaseUrls: ['github.example.com']} ).should.equal('https://github.example.com/treygriffith/cellar'); }); it('should parse GHE urls with multiple subdomains', function() { var url = 'git://github.internal.example.com/treygriffith/cellar.git'; parse( url, {extraBaseUrls: ['github.internal.example.com']} ).should.equal('https://github.internal.example.com/treygriffith/cellar'); }); }) describe('re', function() { it('should expose GitHub url parsing regex', function() { parse.re.source.should.equal( /^(?:https?:\/\/|git:\/\/|git\+ssh:\/\/|git\+https:\/\/)?(?:[^@]+@)?(gist.github.com|github.com)[:\/]([^\/]+\/[^\/]+?|[0-9]+)$/.source ) }); }) ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/.npmignore���������������������000644 �000766 �000024 �00000000101 12455173731 027577� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# ignore the output junk from the example scripts example/output ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/�����������������������000755 �000766 �000024 �00000000000 12456115117 027236� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/fstream-npm.js�����������������000644 �000766 �000024 �00000022505 12455173731 030403� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var Ignore = 
require("fstream-ignore") , inherits = require("inherits") , path = require("path") , fs = require("fs") module.exports = Packer inherits(Packer, Ignore) function Packer (props) { if (!(this instanceof Packer)) { return new Packer(props) } if (typeof props === "string") { props = { path: props } } props.ignoreFiles = props.ignoreFiles || [ ".npmignore", ".gitignore", "package.json" ] Ignore.call(this, props) this.bundled = props.bundled this.bundleLinks = props.bundleLinks this.package = props.package // only do the magic bundling stuff for the node_modules folder that // lives right next to a package.json file. this.bundleMagic = this.parent && this.parent.packageRoot && this.basename === "node_modules" // in a node_modules folder, resolve symbolic links to // bundled dependencies when creating the package. props.follow = this.follow = this.bundleMagic // console.error("follow?", this.path, props.follow) if (this === this.root || this.parent && this.parent.bundleMagic && this.basename.charAt(0) !== ".") { this.readBundledLinks() } this.on("entryStat", function (entry, props) { // files should *always* get into tarballs // in a user-writable state, even if they're // being installed from some wackey vm-mounted // read-only filesystem. entry.mode = props.mode = props.mode | 0200 }) } Packer.prototype.readBundledLinks = function () { if (this._paused) { this.once("resume", this.addIgnoreFiles) return } this.pause() fs.readdir(this.path + "/node_modules", function (er, list) { // no harm if there's no bundle var l = list && list.length if (er || l === 0) return this.resume() var errState = null , then = function then (er) { if (errState) return if (er) return errState = er, this.resume() if (-- l === 0) return this.resume() }.bind(this) list.forEach(function (pkg) { if (pkg.charAt(0) === ".") return then() var pd = this.path + "/node_modules/" + pkg fs.realpath(pd, function (er, rp) { if (er) return then() this.bundleLinks = this.bundleLinks || {} this.bundleLinks[pkg] = rp then() }.bind(this)) }, this) }.bind(this)) } Packer.prototype.applyIgnores = function (entry, partial, entryObj) { // package.json files can never be ignored. if (entry === "package.json") return true // readme files should never be ignored. if (entry.match(/^readme(\.[^\.]*)$/i)) return true // license files should never be ignored. if (entry.match(/^(license|licence)(\.[^\.]*)?$/i)) return true // changelogs should never be ignored. if (entry.match(/^(changes|changelog|history)(\.[^\.]*)?$/i)) return true // special rules. see below. if (entry === "node_modules" && this.packageRoot) return true // some files are *never* allowed under any circumstances if (entry === ".git" || entry === ".lock-wscript" || entry.match(/^\.wafpickle-[0-9]+$/) || entry === "CVS" || entry === ".svn" || entry === ".hg" || entry.match(/^\..*\.swp$/) || entry === ".DS_Store" || entry.match(/^\._/) || entry === "npm-debug.log" ) { return false } // in a node_modules folder, we only include bundled dependencies // also, prevent packages in node_modules from being affected // by rules set in the containing package, so that // bundles don't get busted. // Also, once in a bundle, everything is installed as-is // To prevent infinite cycles in the case of cyclic deps that are // linked with npm link, even in a bundle, deps are only bundled // if they're not already present at a higher level. if (this.bundleMagic) { // bubbling up. stop here and allow anything the bundled pkg allows if (entry.indexOf("/") !== -1) return true // never include the .bin. 
It's typically full of platform-specific // stuff like symlinks and .cmd files anyway. if (entry === ".bin") return false var shouldBundle = false // the package root. var p = this.parent // the package before this one. var pp = p && p.parent // if this entry has already been bundled, and is a symlink, // and it is the *same* symlink as this one, then exclude it. if (pp && pp.bundleLinks && this.bundleLinks && pp.bundleLinks[entry] && pp.bundleLinks[entry] === this.bundleLinks[entry]) { return false } // since it's *not* a symbolic link, if we're *already* in a bundle, // then we should include everything. if (pp && pp.package && pp.basename === "node_modules") { return true } // only include it at this point if it's a bundleDependency var bd = this.package && this.package.bundleDependencies var shouldBundle = bd && bd.indexOf(entry) !== -1 // if we're not going to bundle it, then it doesn't count as a bundleLink // if (this.bundleLinks && !shouldBundle) delete this.bundleLinks[entry] return shouldBundle } // if (this.bundled) return true return Ignore.prototype.applyIgnores.call(this, entry, partial, entryObj) } Packer.prototype.addIgnoreFiles = function () { var entries = this.entries // if there's a .npmignore, then we do *not* want to // read the .gitignore. if (-1 !== entries.indexOf(".npmignore")) { var i = entries.indexOf(".gitignore") if (i !== -1) { entries.splice(i, 1) } } this.entries = entries Ignore.prototype.addIgnoreFiles.call(this) } Packer.prototype.readRules = function (buf, e) { if (e !== "package.json") { return Ignore.prototype.readRules.call(this, buf, e) } buf = buf.toString().trim() if (buf.length === 0) return [] try { var p = this.package = JSON.parse(buf) } catch (er) { // just pretend it's a normal old file, not magic at all. return [] } if (this === this.root) { this.bundleLinks = this.bundleLinks || {} this.bundleLinks[p.name] = this._path } this.packageRoot = true this.emit("package", p) // make bundle deps predictable if (p.bundledDependencies && !p.bundleDependencies) { p.bundleDependencies = p.bundledDependencies delete p.bundledDependencies } if (!p.files || !Array.isArray(p.files)) return [] // ignore everything except what's in the files array. return ["*"].concat(p.files.map(function (f) { return "!" + f })).concat(p.files.map(function (f) { return "!" + f.replace(/\/+$/, "") + "/**" })) } Packer.prototype.getChildProps = function (stat) { var props = Ignore.prototype.getChildProps.call(this, stat) props.package = this.package props.bundled = this.bundled && this.bundled.slice(0) props.bundleLinks = this.bundleLinks && Object.create(this.bundleLinks) // Directories have to be read as Packers // otherwise fstream.Reader will create a DirReader instead. if (stat.isDirectory()) { props.type = this.constructor } // only follow symbolic links directly in the node_modules folder. 
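  // (Editorial note, not part of the original source.) The constructor only
  // enables `follow` for the node_modules directory that sits next to a
  // package.json (the bundleMagic case); child entries created from these
  // props get follow = false, so symlinks deeper in the tree are not traversed.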
props.follow = false
  return props
}

var order =
  [ "package.json"
  , ".npmignore"
  , ".gitignore"
  , /^README(\.md)?$/
  , "LICENCE"
  , "LICENSE"
  , /\.js$/
  ]

Packer.prototype.sort = function (a, b) {
  for (var i = 0, l = order.length; i < l; i ++) {
    var o = order[i]
    if (typeof o === "string") {
      if (a === o) return -1
      if (b === o) return 1
    } else {
      if (a.match(o)) return -1
      if (b.match(o)) return 1
    }
  }

  // deps go in the back
  if (a === "node_modules") return 1
  if (b === "node_modules") return -1
  return Ignore.prototype.sort.call(this, a, b)
}

Packer.prototype.emitEntry = function (entry) {
  if (this._paused) {
    this.once("resume", this.emitEntry.bind(this, entry))
    return
  }

  // if there is a .gitignore, then we're going to
  // rename it to .npmignore in the output.
  if (entry.basename === ".gitignore") {
    entry.basename = ".npmignore"
    entry.path = path.resolve(entry.dirname, entry.basename)
  }

  // all *.gyp files are renamed to binding.gyp for node-gyp
  // but only when they are in the same folder as a package.json file.
  if (entry.basename.match(/\.gyp$/) &&
      this.entries.indexOf("package.json") !== -1) {
    entry.basename = "binding.gyp"
    entry.path = path.resolve(entry.dirname, entry.basename)
  }

  // skip over symbolic links
  if (entry.type === "SymbolicLink") {
    entry.abort()
    return
  }

  if (entry.type !== "Directory") {
    // make it so that the folder in the tarball is named "package"
    var h = path.dirname((entry.root || entry).path)
      , t = entry.path.substr(h.length + 1).replace(/^[^\/\\]+/, "package")
      , p = h + "/" + t

    entry.path = p
    entry.dirname = path.dirname(p)
    return Ignore.prototype.emitEntry.call(this, entry)
  }

  // we don't want empty directories to show up in package
  // tarballs.
  // don't emit entry events for dirs, but still walk through
  // and read them.  This means that we need to proxy up their
  // entry events so that those entries won't be missed, since
  // .pipe() doesn't do anything special with "child" events, only
  // with "entry" events.
  var me = this
  entry.on("entry", function (e) {
    if (e.parent === entry) {
      e.parent = me
      me.emit("entry", e)
    }
  })
  entry.on("package", this.emit.bind(this, "package"))
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/LICENSE

The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS.
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/package.json

{
  "author": {
    "name": "Isaac Z. Schlueter",
    "email": "i@izs.me",
    "url": "http://blog.izs.me/"
  },
  "name": "fstream-npm",
  "description": "fstream class for creating npm packages",
  "version": "1.0.1",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/fstream-npm.git"
  },
  "main": "./fstream-npm.js",
  "dependencies": {
    "fstream-ignore": "^1.0.0",
    "inherits": "2"
  },
  "license": "ISC",
  "gitHead": "4a95e1903f93dc122320349bb55e367ddd08ad6b",
  "bugs": {
    "url": "https://github.com/isaacs/fstream-npm/issues"
  },
  "homepage": "https://github.com/isaacs/fstream-npm",
  "_id": "fstream-npm@1.0.1",
  "scripts": {},
  "_shasum": "1e35c77f0fa24f5d6367e6d447ae7d6ddb482db2",
  "_from": "fstream-npm@1.0.1",
  "_npmVersion": "2.1.3",
  "_nodeVersion": "0.10.31",
  "_npmUser": {
    "name": "isaacs",
    "email": "i@izs.me"
  },
  "maintainers": [
    {
      "name": "isaacs",
      "email": "i@izs.me"
    }
  ],
  "dist": {
    "shasum": "1e35c77f0fa24f5d6367e6d447ae7d6ddb482db2",
    "tarball": "http://registry.npmjs.org/fstream-npm/-/fstream-npm-1.0.1.tgz"
  },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/fstream-npm/-/fstream-npm-1.0.1.tgz"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/README.md

# fstream-npm

This is an
fstream DirReader class that will read a directory and filter things
according to the semantics of what goes in an npm package.

For example:

```javascript
// This will print out all the files that would be included
// by 'npm publish' or 'npm install' of this directory.

var FN = require("fstream-npm")
FN({ path: "./" })
  .on("child", function (e) {
    console.error(e.path.substr(e.root.path.length + 1))
  })
```

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/.npmignore

test/fixtures

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/example/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/ignore.js

// Essentially, this is a fstream.DirReader class, but with a
// bit of special logic to read the specified sort of ignore files,
// and a filter that prevents it from picking up anything excluded
// by those files.
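//
// A minimal usage sketch: this is roughly how the reader gets wired up.
// The ignore file names below mirror the bundled examples and README;
// they are illustrative assumptions, not defaults built into this file.
//
//   var IgnoreReader = require("fstream-ignore")
//   IgnoreReader({ path: "/some/dir"
//                , ignoreFiles: [".ignore", ".gitignore"] })
//     .on("child", function (c) {
//       // one event per entry that survived the ignore rules
//       console.error(c.path.substr(c.root.path.length + 1))
//     })
//     .on("error", console.error)
//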
var Minimatch = require("minimatch").Minimatch , fstream = require("fstream") , DirReader = fstream.DirReader , inherits = require("inherits") , path = require("path") , fs = require("fs") module.exports = IgnoreReader inherits(IgnoreReader, DirReader) function IgnoreReader (props) { if (!(this instanceof IgnoreReader)) { return new IgnoreReader(props) } // must be a Directory type if (typeof props === "string") { props = { path: path.resolve(props) } } props.type = "Directory" props.Directory = true if (!props.ignoreFiles) props.ignoreFiles = [".ignore"] this.ignoreFiles = props.ignoreFiles this.ignoreRules = null // ensure that .ignore files always show up at the top of the list // that way, they can be read before proceeding to handle other // entries in that same folder if (props.sort) { this._sort = props.sort === "alpha" ? alphasort : props.sort props.sort = null } this.on("entries", function () { // if there are any ignore files in the list, then // pause and add them. // then, filter the list based on our ignoreRules var hasIg = this.entries.some(this.isIgnoreFile, this) if (!hasIg) return this.filterEntries() this.addIgnoreFiles() }) // we filter entries before we know what they are. // however, directories have to be re-tested against // rules with a "/" appended, because "a/b/" will only // match if "a/b" is a dir, and not otherwise. this.on("_entryStat", function (entry, props) { var t = entry.basename if (!this.applyIgnores(entry.basename, entry.type === "Directory", entry)) { entry.abort() } }.bind(this)) DirReader.call(this, props) } IgnoreReader.prototype.addIgnoreFiles = function () { if (this._paused) { this.once("resume", this.addIgnoreFiles) return } if (this._ignoreFilesAdded) return this._ignoreFilesAdded = true var newIg = this.entries.filter(this.isIgnoreFile, this) , count = newIg.length , errState = null if (!count) return this.pause() var then = function (er) { if (errState) return if (er) return this.emit("error", errState = er) if (-- count === 0) { this.filterEntries() this.resume() } else { this.addIgnoreFile(newIg[newIg.length - count], then) } }.bind(this) this.addIgnoreFile(newIg[0], then) } IgnoreReader.prototype.isIgnoreFile = function (e) { return e !== "." && e !== ".." && -1 !== this.ignoreFiles.indexOf(e) } IgnoreReader.prototype.getChildProps = function (stat) { var props = DirReader.prototype.getChildProps.call(this, stat) props.ignoreFiles = this.ignoreFiles // Directories have to be read as IgnoreReaders // otherwise fstream.Reader will create a DirReader instead. if (stat.isDirectory()) { props.type = this.constructor } return props } IgnoreReader.prototype.addIgnoreFile = function (e, cb) { // read the file, and then call addIgnoreRules // if there's an error, then tell the cb about it. var ig = path.resolve(this.path, e) fs.readFile(ig, function (er, data) { if (er) return cb(er) this.emit("ignoreFile", e, data) var rules = this.readRules(data, e) this.addIgnoreRules(rules, e) cb() }.bind(this)) } IgnoreReader.prototype.readRules = function (buf, e) { return buf.toString().split(/\r?\n/) } // Override this to do fancier things, like read the // "files" array from a package.json file or something. IgnoreReader.prototype.addIgnoreRules = function (set, e) { // filter out anything obvious set = set.filter(function (s) { s = s.trim() return s && !s.match(/^#/) }) // no rules to add! if (!set.length) return // now get a minimatch object for each one of these. 
// Note that we need to allow dot files by default, and // not switch the meaning of their exclusion var mmopt = { matchBase: true, dot: true, flipNegate: true } , mm = set.map(function (s) { var m = new Minimatch(s, mmopt) m.ignoreFile = e return m }) if (!this.ignoreRules) this.ignoreRules = [] this.ignoreRules.push.apply(this.ignoreRules, mm) } IgnoreReader.prototype.filterEntries = function () { // this exclusion is at the point where we know the list of // entries in the dir, but don't know what they are. since // some of them *might* be directories, we have to run the // match in dir-mode as well, so that we'll pick up partials // of files that will be included later. Anything included // at this point will be checked again later once we know // what it is. this.entries = this.entries.filter(function (entry) { // at this point, we don't know if it's a dir or not. return this.applyIgnores(entry) || this.applyIgnores(entry, true) }, this) } IgnoreReader.prototype.applyIgnores = function (entry, partial, obj) { var included = true // this = /a/b/c // entry = d // parent /a/b sees c/d if (this.parent && this.parent.applyIgnores) { var pt = this.basename + "/" + entry included = this.parent.applyIgnores(pt, partial) } // Negated Rules // Since we're *ignoring* things here, negating means that a file // is re-included, if it would have been excluded by a previous // rule. So, negated rules are only relevant if the file // has been excluded. // // Similarly, if a file has been excluded, then there's no point // trying it against rules that have already been applied // // We're using the "flipnegate" flag here, which tells minimatch // to set the "negate" for our information, but still report // whether the core pattern was a hit or a miss. if (!this.ignoreRules) { return included } this.ignoreRules.forEach(function (rule) { // negation means inclusion if (rule.negate && included || !rule.negate && !included) { // unnecessary return } // first, match against /foo/bar var match = rule.match("/" + entry) if (!match) { // try with the leading / trimmed off the test // eg: foo/bar instead of /foo/bar match = rule.match(entry) } // if the entry is a directory, then it will match // with a trailing slash. eg: /foo/bar/ or foo/bar/ if (!match && partial) { match = rule.match("/" + entry + "/") || rule.match(entry + "/") } // When including a file with a negated rule, it's // relevant if a directory partially matches, since // it may then match a file within it. // Eg, if you ignore /a, but !/a/b/c if (!match && rule.negate && partial) { match = rule.match("/" + entry, true) || rule.match(entry, true) } if (match) { included = rule.negate } }, this) return included } IgnoreReader.prototype.sort = function (a, b) { var aig = this.ignoreFiles.indexOf(a) !== -1 , big = this.ignoreFiles.indexOf(b) !== -1 if (aig && !big) return -1 if (big && !aig) return 1 return this._sort(a, b) } IgnoreReader.prototype._sort = function (a, b) { return 0 } function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 
1 : -1
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/LICENSE

The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/package.json

{
  "author": {
    "name": "Isaac Z. Schlueter",
    "email": "i@izs.me",
    "url": "http://blog.izs.me/"
  },
  "name": "fstream-ignore",
  "description": "A thing for ignoring files based on globs",
  "version": "1.0.2",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/fstream-ignore.git"
  },
  "main": "ignore.js",
  "scripts": {
    "test": "tap test/*.js"
  },
  "dependencies": {
    "fstream": "^1.0.0",
    "inherits": "2",
    "minimatch": "^2.0.1"
  },
  "devDependencies": {
    "tap": "",
    "rimraf": "",
    "mkdirp": ""
  },
  "license": "ISC",
  "gitHead": "20363d39660671c0de746bd07a0d07de7090d085",
  "bugs": {
    "url": "https://github.com/isaacs/fstream-ignore/issues"
  },
  "homepage": "https://github.com/isaacs/fstream-ignore",
  "_id": "fstream-ignore@1.0.2",
  "_shasum": "18c891db01b782a74a7bff936a0f24997741c7ab",
  "_from": "fstream-ignore@>=1.0.0 <2.0.0",
  "_npmVersion": "2.1.11",
  "_nodeVersion": "0.10.16",
  "_npmUser": {
    "name": "isaacs",
    "email": "i@izs.me"
  },
  "maintainers": [
    {
      "name": "isaacs",
      "email": "i@izs.me"
    }
  ],
  "dist": {
    "shasum": "18c891db01b782a74a7bff936a0f24997741c7ab",
    "tarball": "http://registry.npmjs.org/fstream-ignore/-/fstream-ignore-1.0.2.tgz"
  },
  "directories": {},
  "_resolved": "https://registry.npmjs.org/fstream-ignore/-/fstream-ignore-1.0.2.tgz"
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/README.md

# fstream-ignore

A fstream DirReader that filters out files that match globs in `.ignore`
files throughout the tree, like how git ignores files based on a
`.gitignore` file.

Here's an example:

```javascript
var fs = require("fs")
var tar = require("tar")
var Ignore = require("fstream-ignore")

Ignore({ path: __dirname
       , ignoreFiles: [".ignore", ".gitignore"]
       })
  .on("child", function (c) {
    console.error(c.path.substr(c.root.path.length + 1))
  })
  .pipe(tar.Pack())
  .pipe(fs.createWriteStream("foo.tar"))
```

This will tar up the files in __dirname into `foo.tar`, ignoring
anything matched by the globs in any .ignore or .gitignore file.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/node_modules/fstream-ignore/example/basic.js

var Ignore = require("../")
Ignore({ path: __dirname
       , ignoreFiles: [".ignore", ".gitignore"]
       })
  .on("child", function (c) {
    console.error(c.path.substr(c.root.path.length + 1))
    c.on("ignoreFile", onIgnoreFile)
  })
  .on("ignoreFile", onIgnoreFile)

function onIgnoreFile (e) {
  console.error("adding ignore file", e.path)
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/bundle.js

// this example will bundle every dependency
var P = require("../")
P({ path: "./" })
  .on("package", bundleIt)
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1))
    e.on("package", bundleIt)
  })

function bundleIt (p) {
  p.bundleDependencies = Object.keys(p.dependencies || {})
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/dir-tar.js

// this will show what ends up in the fstream-npm package
var P = require("fstream").DirReader
var tar = require("tar")
function f (entry) {
  return entry.basename !== ".git"
}

new P({ path: "./", type: "Directory", Directory: true, filter: f })
  .on("package", function (p) { console.error("package", p) })
  .on("ignoreFile", function (e) { console.error("ignoreFile", e) })
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path.substr(e.root.path.length + 1))
  })
  .pipe(tar.Pack())
  .pipe(process.stdout)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/dir.js

// this will show what ends up in the fstream-npm package
var P = require("../")
var DW = require("fstream").DirWriter

var target = new DW({ Directory: true, type: "Directory",
                      path: __dirname + "/output"})

function f (entry) {
  return entry.basename !== ".git"
}

P({ path: "./", type: "Directory", isDirectory: true, filter: f })
  .on("package", function (p) { console.error("package", p) })
  .on("ignoreFile", function (e) { console.error("ignoreFile", e) })
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path)
  })
  .pipe(target)
  .on("end", function () {
    console.error("ended")
  })

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/example.js

// this will show what ends up in the fstream-npm package
var P = require("../")
P({ path: "./" })
  .on("package", function (p) { console.error("package", p) })
  .on("ignoreFile", function (e) { console.error("ignoreFile", e) })
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1))
  })

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/ig-tar.js

// this will show what ends up in the fstream-npm package
var P = require("fstream-ignore")
var tar = require("tar")
function f (entry) {
  return entry.basename !== ".git"
}

new P({ path: "./", type: "Directory", Directory: true, filter: f })
  .on("package", function (p) { console.error("package", p) })
  .on("ignoreFile", function (e) { console.error("ignoreFile", e) })
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path.substr(e.root.path.length + 1))
  })
  .pipe(tar.Pack())
  .pipe(process.stdout)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream-npm/example/tar.js

// this will show what ends up in the fstream-npm package
var P = require("../")
var tar = require("tar")
function f () { return true }
// function f (entry) {
//   return entry.basename !== ".git"
// }

new P({ path: "./", type: "Directory", isDirectory: true, filter: f })
  .on("package", function (p) { console.error("package", p) })
  .on("ignoreFile", function (e) { console.error("ignoreFile", e) })
  .on("entry", function (e) {
    console.error(e.constructor.name, e.path)
  })
  .on("end", function () {
    console.error("ended")
  })
  .pipe(tar.Pack())
  .pipe(process.stdout)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/.npmignore

.*.swp
node_modules/
examples/deep-copy/
examples/path/
examples/filter-copy/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/.travis.yml

language: node_js
node_js:
  - 0.6

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/examples/

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/fstream.js

exports.Abstract = require("./lib/abstract.js")
exports.Reader = require("./lib/reader.js")
exports.Writer = require("./lib/writer.js")

exports.File =
  { Reader: require("./lib/file-reader.js")
  , Writer: require("./lib/file-writer.js") }

exports.Dir =
  { Reader : require("./lib/dir-reader.js")
  , Writer : require("./lib/dir-writer.js") }

exports.Link =
  { Reader : require("./lib/link-reader.js")
  , Writer : require("./lib/link-writer.js") }

exports.Proxy =
  { Reader : require("./lib/proxy-reader.js")
  , Writer : require("./lib/proxy-writer.js") }

exports.Reader.Dir = exports.DirReader = exports.Dir.Reader
exports.Reader.File = exports.FileReader = exports.File.Reader
exports.Reader.Link = exports.LinkReader = exports.Link.Reader
exports.Reader.Proxy = exports.ProxyReader = exports.Proxy.Reader

exports.Writer.Dir = exports.DirWriter = exports.Dir.Writer
exports.Writer.File = exports.FileWriter = exports.File.Writer
exports.Writer.Link = exports.LinkWriter = exports.Link.Writer
exports.Writer.Proxy = exports.ProxyWriter = exports.Proxy.Writer

exports.collect = require("./lib/collect.js")

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/
�000024 �00000000000 12456115117 025561� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/LICENSE����������������������������000644 �000766 �000024 �00000002436 12455173731 026032� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/package.json�����������������������000644 �000766 �000024 �00000006501 12455173731 027310� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "fstream", "description": "Advanced file system stream things", "version": "1.0.3", "repository": { "type": "git", "url": "git://github.com/isaacs/fstream.git" }, "main": "fstream.js", "engines": { "node": ">=0.6" }, "dependencies": { "graceful-fs": "3", "inherits": "~2.0.0", "mkdirp": ">=0.5 0", "rimraf": "2" }, "devDependencies": { "tap": "" }, "scripts": { "test": "tap examples/*.js" }, "license": "BSD", "readme": "Like FS streams, but with stat on them, and supporting directories and\nsymbolic links, as well as normal files. Also, you can use this to set\nthe stats on a file, even if you don't change its contents, or to create\na symlink, etc.\n\nSo, for example, you can \"write\" a directory, and it'll call `mkdir`. You\ncan specify a uid and gid, and it'll call `chown`. You can specify a\n`mtime` and `atime`, and it'll call `utimes`. You can call it a symlink\nand provide a `linkpath` and it'll call `symlink`.\n\nNote that it won't automatically resolve symbolic links. So, if you\ncall `fstream.Reader('/some/symlink')` then you'll get an object\nthat stats and then ends immediately (since it has no data). To follow\nsymbolic links, do this: `fstream.Reader({path:'/some/symlink', follow:\ntrue })`.\n\nThere are various checks to make sure that the bytes emitted are the\nsame as the intended size, if the size is set.\n\n## Examples\n\n```javascript\nfstream\n .Writer({ path: \"path/to/file\"\n , mode: 0755\n , size: 6\n })\n .write(\"hello\\n\")\n .end()\n```\n\nThis will create the directories if they're missing, and then write\n`hello\\n` into the file, chmod it to 0755, and assert that 6 bytes have\nbeen written when it's done.\n\n```javascript\nfstream\n .Writer({ path: \"path/to/file\"\n , mode: 0755\n , size: 6\n , flags: \"a\"\n })\n .write(\"hello\\n\")\n .end()\n```\n\nYou can pass flags in, if you want to append to a file.\n\n```javascript\nfstream\n .Writer({ path: \"path/to/symlink\"\n , linkpath: \"./file\"\n , SymbolicLink: true\n , mode: \"0755\" // octal strings supported\n })\n .end()\n```\n\nIf isSymbolicLink is a function, it'll be called, and if it returns\ntrue, then it'll treat it as a symlink. If it's not a function, then\nany truish value will make a symlink, or you can set `type:\n'SymbolicLink'`, which does the same thing.\n\nNote that the linkpath is relative to the symbolic link location, not\nthe parent dir or cwd.\n\n```javascript\nfstream\n .Reader(\"path/to/dir\")\n .pipe(fstream.Writer(\"path/to/other/dir\"))\n```\n\nThis will do like `cp -Rp path/to/dir path/to/other/dir`. If the other\ndir exists and isn't a directory, then it'll emit an error. It'll also\nset the uid, gid, mode, etc. to be identical. 
In this way, it's more\nlike `rsync -a` than simply a copy.\n", "readmeFilename": "README.md", "gitHead": "d205397b27d93eee5314e9d2d87693e82b560106", "bugs": { "url": "https://github.com/isaacs/fstream/issues" }, "homepage": "https://github.com/isaacs/fstream", "_id": "fstream@1.0.3", "_shasum": "5ce69767710d7a39c8cd9232470d9426790195da", "_from": "fstream@>=1.0.3 <1.1.0" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/README.md��������������������������000644 �000766 �000024 �00000004451 12455173731 026303� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Like FS streams, but with stat on them, and supporting directories and symbolic links, as well as normal files. Also, you can use this to set the stats on a file, even if you don't change its contents, or to create a symlink, etc. So, for example, you can "write" a directory, and it'll call `mkdir`. You can specify a uid and gid, and it'll call `chown`. You can specify a `mtime` and `atime`, and it'll call `utimes`. You can call it a symlink and provide a `linkpath` and it'll call `symlink`. Note that it won't automatically resolve symbolic links. So, if you call `fstream.Reader('/some/symlink')` then you'll get an object that stats and then ends immediately (since it has no data). To follow symbolic links, do this: `fstream.Reader({path:'/some/symlink', follow: true })`. There are various checks to make sure that the bytes emitted are the same as the intended size, if the size is set. ## Examples ```javascript fstream .Writer({ path: "path/to/file" , mode: 0755 , size: 6 }) .write("hello\n") .end() ``` This will create the directories if they're missing, and then write `hello\n` into the file, chmod it to 0755, and assert that 6 bytes have been written when it's done. ```javascript fstream .Writer({ path: "path/to/file" , mode: 0755 , size: 6 , flags: "a" }) .write("hello\n") .end() ``` You can pass flags in, if you want to append to a file. ```javascript fstream .Writer({ path: "path/to/symlink" , linkpath: "./file" , SymbolicLink: true , mode: "0755" // octal strings supported }) .end() ``` If isSymbolicLink is a function, it'll be called, and if it returns true, then it'll treat it as a symlink. If it's not a function, then any truish value will make a symlink, or you can set `type: 'SymbolicLink'`, which does the same thing. Note that the linkpath is relative to the symbolic link location, not the parent dir or cwd. ```javascript fstream .Reader("path/to/dir") .pipe(fstream.Writer("path/to/other/dir")) ``` This will do like `cp -Rp path/to/dir path/to/other/dir`. If the other dir exists and isn't a directory, then it'll emit an error. It'll also set the uid, gid, mode, etc. to be identical. In this way, it's more like `rsync -a` than simply a copy. 
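A further sketch along the same lines (assuming the `filter` property
used by the bundled examples and the writer's `"close"` event; the paths
here are placeholders): a filtered recursive copy that skips `.git`
directories while preserving stats.

```javascript
var fstream = require("fstream")

// Copy a tree like `cp -Rp`, but skip any entry named ".git".
var writer = fstream.Writer({ path: "path/to/other/dir"
                            , type: "Directory"
                            , Directory: true
                            })

fstream
  .Reader({ path: "path/to/dir"
          , filter: function (entry) {
              return entry.basename !== ".git"
            }
          })
  .pipe(writer)

writer.on("close", function () {
  console.error("done copying")
})
```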
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/abstract.js��������������������000644 �000766 �000024 �00000004242 12455173731 027731� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// the parent class for all fstreams. module.exports = Abstract var Stream = require("stream").Stream , inherits = require("inherits") function Abstract () { Stream.call(this) } inherits(Abstract, Stream) Abstract.prototype.on = function (ev, fn) { if (ev === "ready" && this.ready) { process.nextTick(fn.bind(this)) } else { Stream.prototype.on.call(this, ev, fn) } return this } Abstract.prototype.abort = function () { this._aborted = true this.emit("abort") } Abstract.prototype.destroy = function () {} Abstract.prototype.warn = function (msg, code) { var me = this , er = decorate(msg, code, me) if (!me.listeners("warn")) { console.error("%s %s\n" + "path = %s\n" + "syscall = %s\n" + "fstream_type = %s\n" + "fstream_path = %s\n" + "fstream_unc_path = %s\n" + "fstream_class = %s\n" + "fstream_stack =\n%s\n", code || "UNKNOWN", er.stack, er.path, er.syscall, er.fstream_type, er.fstream_path, er.fstream_unc_path, er.fstream_class, er.fstream_stack.join("\n")) } else { me.emit("warn", er) } } Abstract.prototype.info = function (msg, code) { this.emit("info", msg, code) } Abstract.prototype.error = function (msg, code, th) { var er = decorate(msg, code, this) if (th) throw er else this.emit("error", er) } function decorate (er, code, me) { if (!(er instanceof Error)) er = new Error(er) er.code = er.code || code er.path = er.path || me.path er.fstream_type = er.fstream_type || me.type er.fstream_path = er.fstream_path || me.path if (me._path !== me.path) { er.fstream_unc_path = er.fstream_unc_path || me._path } if (me.linkpath) { er.fstream_linkpath = er.fstream_linkpath || me.linkpath } er.fstream_class = er.fstream_class || me.constructor.name er.fstream_stack = er.fstream_stack || new Error().stack.split(/\n/).slice(3).map(function (s) { return s.replace(/^ at /, "") }) return er } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/collect.js���������������������000644 �000766 �000024 �00000003164 12455173731 027555� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = collect function collect (stream) { if (stream._collected) return stream._collected = true stream.pause() stream.on("data", save) 
stream.on("end", save) var buf = [] function save (b) { if (typeof b === "string") b = new Buffer(b) if (Buffer.isBuffer(b) && !b.length) return buf.push(b) } stream.on("entry", saveEntry) var entryBuffer = [] function saveEntry (e) { collect(e) entryBuffer.push(e) } stream.on("proxy", proxyPause) function proxyPause (p) { p.pause() } // replace the pipe method with a new version that will // unlock the buffered stuff. if you just call .pipe() // without a destination, then it'll re-play the events. stream.pipe = (function (orig) { return function (dest) { // console.error(" === open the pipes", dest && dest.path) // let the entries flow through one at a time. // Once they're all done, then we can resume completely. var e = 0 ;(function unblockEntry () { var entry = entryBuffer[e++] // console.error(" ==== unblock entry", entry && entry.path) if (!entry) return resume() entry.on("end", unblockEntry) if (dest) dest.add(entry) else stream.emit("entry", entry) })() function resume () { stream.removeListener("entry", saveEntry) stream.removeListener("data", save) stream.removeListener("end", save) stream.pipe = orig if (dest) stream.pipe(dest) buf.forEach(function (b) { if (b) stream.emit("data", b) else stream.emit("end") }) stream.resume() } return dest }})(stream.pipe) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/dir-reader.js������������������000644 �000766 �000024 �00000014445 12455173731 030152� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// A thing that emits "entry" events with Reader objects // Pausing it causes it to stop emitting entry events, and also // pauses the current entry if there is one. module.exports = DirReader var fs = require("graceful-fs") , fstream = require("../fstream.js") , Reader = fstream.Reader , inherits = require("inherits") , mkdir = require("mkdirp") , path = require("path") , Reader = require("./reader.js") , assert = require("assert").ok inherits(DirReader, Reader) function DirReader (props) { var me = this if (!(me instanceof DirReader)) throw new Error( "DirReader must be called as constructor.") // should already be established as a Directory type if (props.type !== "Directory" || !props.Directory) { throw new Error("Non-directory type "+ props.type) } me.entries = null me._index = -1 me._paused = false me._length = -1 if (props.sort) { this.sort = props.sort } Reader.call(this, props) } DirReader.prototype._getEntries = function () { var me = this // race condition. might pause() before calling _getEntries, // and then resume, and try to get them a second time. 
if (me._gotEntries) return me._gotEntries = true fs.readdir(me._path, function (er, entries) { if (er) return me.error(er) me.entries = entries me.emit("entries", entries) if (me._paused) me.once("resume", processEntries) else processEntries() function processEntries () { me._length = me.entries.length if (typeof me.sort === "function") { me.entries = me.entries.sort(me.sort.bind(me)) } me._read() } }) } // start walking the dir, and emit an "entry" event for each one. DirReader.prototype._read = function () { var me = this if (!me.entries) return me._getEntries() if (me._paused || me._currentEntry || me._aborted) { // console.error("DR paused=%j, current=%j, aborted=%j", me._paused, !!me._currentEntry, me._aborted) return } me._index ++ if (me._index >= me.entries.length) { if (!me._ended) { me._ended = true me.emit("end") me.emit("close") } return } // ok, handle this one, then. // save creating a proxy, by stat'ing the thing now. var p = path.resolve(me._path, me.entries[me._index]) assert(p !== me._path) assert(me.entries[me._index]) // set this to prevent trying to _read() again in the stat time. me._currentEntry = p fs[ me.props.follow ? "stat" : "lstat" ](p, function (er, stat) { if (er) return me.error(er) var who = me._proxy || me stat.path = p stat.basename = path.basename(p) stat.dirname = path.dirname(p) var childProps = me.getChildProps.call(who, stat) childProps.path = p childProps.basename = path.basename(p) childProps.dirname = path.dirname(p) var entry = Reader(childProps, stat) // console.error("DR Entry", p, stat.size) me._currentEntry = entry // "entry" events are for direct entries in a specific dir. // "child" events are for any and all children at all levels. // This nomenclature is not completely final. entry.on("pause", function (who) { if (!me._paused && !entry._disowned) { me.pause(who) } }) entry.on("resume", function (who) { if (me._paused && !entry._disowned) { me.resume(who) } }) entry.on("stat", function (props) { me.emit("_entryStat", entry, props) if (entry._aborted) return if (entry._paused) entry.once("resume", function () { me.emit("entryStat", entry, props) }) else me.emit("entryStat", entry, props) }) entry.on("ready", function EMITCHILD () { // console.error("DR emit child", entry._path) if (me._paused) { // console.error(" DR emit child - try again later") // pause the child, and emit the "entry" event once we drain. // console.error("DR pausing child entry") entry.pause(me) return me.once("resume", EMITCHILD) } // skip over sockets. they can't be piped around properly, // so there's really no sense even acknowledging them. // if someone really wants to see them, they can listen to // the "socket" events. if (entry.type === "Socket") { me.emit("socket", entry) } else { me.emitEntry(entry) } }) var ended = false entry.on("close", onend) entry.on("disown", onend) function onend () { if (ended) return ended = true me.emit("childEnd", entry) me.emit("entryEnd", entry) me._currentEntry = null if (!me._paused) { me._read() } } // XXX Remove this. Works in node as of 0.6.2 or so. // Long filenames should not break stuff. entry.on("error", function (er) { if (entry._swallowErrors) { me.warn(er) entry.emit("end") entry.emit("close") } else { me.emit("error", er) } }) // proxy up some events. 
; [ "child" , "childEnd" , "warn" ].forEach(function (ev) { entry.on(ev, me.emit.bind(me, ev)) }) }) } DirReader.prototype.disown = function (entry) { entry.emit("beforeDisown") entry._disowned = true entry.parent = entry.root = null if (entry === this._currentEntry) { this._currentEntry = null } entry.emit("disown") } DirReader.prototype.getChildProps = function (stat) { return { depth: this.depth + 1 , root: this.root || this , parent: this , follow: this.follow , filter: this.filter , sort: this.props.sort , hardlinks: this.props.hardlinks } } DirReader.prototype.pause = function (who) { var me = this if (me._paused) return who = who || me me._paused = true if (me._currentEntry && me._currentEntry.pause) { me._currentEntry.pause(who) } me.emit("pause", who) } DirReader.prototype.resume = function (who) { var me = this if (!me._paused) return who = who || me me._paused = false // console.error("DR Emit Resume", me._path) me.emit("resume", who) if (me._paused) { // console.error("DR Re-paused", me._path) return } if (me._currentEntry) { if (me._currentEntry.resume) me._currentEntry.resume(who) } else me._read() } DirReader.prototype.emitEntry = function (entry) { this.emit("entry", entry) this.emit("child", entry) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/dir-writer.js������������������000644 �000766 �000024 �00000010630 12455173731 030214� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// It is expected that, when .add() returns false, the consumer // of the DirWriter will pause until a "drain" event occurs. Note // that this is *almost always going to be the case*, unless the // thing being written is some sort of unsupported type, and thus // skipped over. module.exports = DirWriter var fs = require("graceful-fs") , fstream = require("../fstream.js") , Writer = require("./writer.js") , inherits = require("inherits") , mkdir = require("mkdirp") , path = require("path") , collect = require("./collect.js") inherits(DirWriter, Writer) function DirWriter (props) { var me = this if (!(me instanceof DirWriter)) me.error( "DirWriter must be called as constructor.", null, true) // should already be established as a Directory type if (props.type !== "Directory" || !props.Directory) { me.error("Non-directory type "+ props.type + " " + JSON.stringify(props), null, true) } Writer.call(this, props) } DirWriter.prototype._create = function () { var me = this mkdir(me._path, Writer.dirmode, function (er) { if (er) return me.error(er) // ready to start getting entries! me.ready = true me.emit("ready") me._process() }) } // a DirWriter has an add(entry) method, but its .write() doesn't // do anything. Why a no-op rather than a throw? Because this // leaves open the door for writing directory metadata for // gnu/solaris style dumpdirs. 
DirWriter.prototype.write = function () { return true } DirWriter.prototype.end = function () { this._ended = true this._process() } DirWriter.prototype.add = function (entry) { var me = this // console.error("\tadd", entry._path, "->", me._path) collect(entry) if (!me.ready || me._currentEntry) { me._buffer.push(entry) return false } // create a new writer, and pipe the incoming entry into it. if (me._ended) { return me.error("add after end") } me._buffer.push(entry) me._process() return 0 === this._buffer.length } DirWriter.prototype._process = function () { var me = this // console.error("DW Process p=%j", me._processing, me.basename) if (me._processing) return var entry = me._buffer.shift() if (!entry) { // console.error("DW Drain") me.emit("drain") if (me._ended) me._finish() return } me._processing = true // console.error("DW Entry", entry._path) me.emit("entry", entry) // ok, add this entry // // don't allow recursive copying var p = entry do { var pp = p._path || p.path if (pp === me.root._path || pp === me._path || (pp && pp.indexOf(me._path) === 0)) { // console.error("DW Exit (recursive)", entry.basename, me._path) me._processing = false if (entry._collected) entry.pipe() return me._process() } } while (p = p.parent) // console.error("DW not recursive") // chop off the entry's root dir, replace with ours var props = { parent: me , root: me.root || me , type: entry.type , depth: me.depth + 1 } var p = entry._path || entry.path || entry.props.path if (entry.parent) { p = p.substr(entry.parent._path.length + 1) } // get rid of any ../../ shenanigans props.path = path.join(me.path, path.join("/", p)) // if i have a filter, the child should inherit it. props.filter = me.filter // all the rest of the stuff, copy over from the source. Object.keys(entry.props).forEach(function (k) { if (!props.hasOwnProperty(k)) { props[k] = entry.props[k] } }) // not sure at this point what kind of writer this is. var child = me._currentChild = new Writer(props) child.on("ready", function () { // console.error("DW Child Ready", child.type, child._path) // console.error(" resuming", entry._path) entry.pipe(child) entry.resume() }) // XXX Make this work in node. // Long filenames should not break stuff. child.on("error", function (er) { if (child._swallowErrors) { me.warn(er) child.emit("end") child.emit("close") } else { me.emit("error", er) } }) // we fire _end internally *after* end, so that we don't move on // until any "end" listeners have had their chance to do stuff. 
child.on("close", onend) var ended = false function onend () { if (ended) return ended = true // console.error("* DW Child end", child.basename) me._currentChild = null me._processing = false me._process() } } ��������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/file-reader.js�����������������000644 �000766 �000024 �00000007531 12455173731 030311� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Basically just a wrapper around an fs.ReadStream module.exports = FileReader var fs = require("graceful-fs") , fstream = require("../fstream.js") , Reader = fstream.Reader , inherits = require("inherits") , mkdir = require("mkdirp") , Reader = require("./reader.js") , EOF = {EOF: true} , CLOSE = {CLOSE: true} inherits(FileReader, Reader) function FileReader (props) { // console.error(" FR create", props.path, props.size, new Error().stack) var me = this if (!(me instanceof FileReader)) throw new Error( "FileReader must be called as constructor.") // should already be established as a File type // XXX Todo: preserve hardlinks by tracking dev+inode+nlink, // with a HardLinkReader class. if (!((props.type === "Link" && props.Link) || (props.type === "File" && props.File))) { throw new Error("Non-file type "+ props.type) } me._buffer = [] me._bytesEmitted = 0 Reader.call(me, props) } FileReader.prototype._getStream = function () { var me = this , stream = me._stream = fs.createReadStream(me._path, me.props) if (me.props.blksize) { stream.bufferSize = me.props.blksize } stream.on("open", me.emit.bind(me, "open")) stream.on("data", function (c) { // console.error("\t\t%d %s", c.length, me.basename) me._bytesEmitted += c.length // no point saving empty chunks if (!c.length) return else if (me._paused || me._buffer.length) { me._buffer.push(c) me._read() } else me.emit("data", c) }) stream.on("end", function () { if (me._paused || me._buffer.length) { // console.error("FR Buffering End", me._path) me._buffer.push(EOF) me._read() } else { me.emit("end") } if (me._bytesEmitted !== me.props.size) { me.error("Didn't get expected byte count\n"+ "expect: "+me.props.size + "\n" + "actual: "+me._bytesEmitted) } }) stream.on("close", function () { if (me._paused || me._buffer.length) { // console.error("FR Buffering Close", me._path) me._buffer.push(CLOSE) me._read() } else { // console.error("FR close 1", me._path) me.emit("close") } }) stream.on("error", function (e) { me.emit("error", e); }); me._read() } FileReader.prototype._read = function () { var me = this // console.error("FR _read", me._path) if (me._paused) { // console.error("FR _read paused", me._path) return } if (!me._stream) { // console.error("FR _getStream calling", me._path) return me._getStream() } // clear out the buffer, if there is one. 
if (me._buffer.length) { // console.error("FR _read has buffer", me._buffer.length, me._path) var buf = me._buffer for (var i = 0, l = buf.length; i < l; i ++) { var c = buf[i] if (c === EOF) { // console.error("FR Read emitting buffered end", me._path) me.emit("end") } else if (c === CLOSE) { // console.error("FR Read emitting buffered close", me._path) me.emit("close") } else { // console.error("FR Read emitting buffered data", me._path) me.emit("data", c) } if (me._paused) { // console.error("FR Read Re-pausing at "+i, me._path) me._buffer = buf.slice(i) return } } me._buffer.length = 0 } // console.error("FR _read done") // that's about all there is to it. } FileReader.prototype.pause = function (who) { var me = this // console.error("FR Pause", me._path) if (me._paused) return who = who || me me._paused = true if (me._stream) me._stream.pause() me.emit("pause", who) } FileReader.prototype.resume = function (who) { var me = this // console.error("FR Resume", me._path) if (!me._paused) return who = who || me me.emit("resume", who) me._paused = false if (me._stream) me._stream.resume() me._read() } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/file-writer.js�����������������000644 �000766 �000024 �00000004614 12455173731 030362� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = FileWriter var fs = require("graceful-fs") , mkdir = require("mkdirp") , Writer = require("./writer.js") , inherits = require("inherits") , EOF = {} inherits(FileWriter, Writer) function FileWriter (props) { var me = this if (!(me instanceof FileWriter)) throw new Error( "FileWriter must be called as constructor.") // should already be established as a File type if (props.type !== "File" || !props.File) { throw new Error("Non-file type "+ props.type) } me._buffer = [] me._bytesWritten = 0 Writer.call(this, props) } FileWriter.prototype._create = function () { var me = this if (me._stream) return var so = {} if (me.props.flags) so.flags = me.props.flags so.mode = Writer.filemode if (me._old && me._old.blksize) so.bufferSize = me._old.blksize me._stream = fs.createWriteStream(me._path, so) me._stream.on("open", function (fd) { // console.error("FW open", me._buffer, me._path) me.ready = true me._buffer.forEach(function (c) { if (c === EOF) me._stream.end() else me._stream.write(c) }) me.emit("ready") // give this a kick just in case it needs it. me.emit("drain") }) me._stream.on("drain", function () { me.emit("drain") }) me._stream.on("close", function () { // console.error("\n\nFW Stream Close", me._path, me.size) me._finish() }) } FileWriter.prototype.write = function (c) { var me = this me._bytesWritten += c.length if (!me.ready) { if (!Buffer.isBuffer(c) && typeof c !== 'string') throw new Error('invalid write data') me._buffer.push(c) return false } var ret = me._stream.write(c) // console.error("\t-- fw wrote, _stream says", ret, me._stream._queue.length) // allow 2 buffered writes, because otherwise there's just too // much stop and go bs. 
if (ret === false && me._stream._queue) { return me._stream._queue.length <= 2; } else { return ret; } } FileWriter.prototype.end = function (c) { var me = this if (c) me.write(c) if (!me.ready) { me._buffer.push(EOF) return false } return me._stream.end() } FileWriter.prototype._finish = function () { var me = this if (typeof me.size === "number" && me._bytesWritten != me.size) { me.error( "Did not get expected byte count.\n" + "expect: " + me.size + "\n" + "actual: " + me._bytesWritten) } Writer.prototype._finish.call(me) } ��������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/get-type.js��������������������000644 �000766 �000024 �00000001170 12455173731 027661� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = getType function getType (st) { var types = [ "Directory" , "File" , "SymbolicLink" , "Link" // special for hardlinks from tarballs , "BlockDevice" , "CharacterDevice" , "FIFO" , "Socket" ] , type if (st.type && -1 !== types.indexOf(st.type)) { st[st.type] = true return st.type } for (var i = 0, l = types.length; i < l; i ++) { type = types[i] var is = st[type] || st["is" + type] if (typeof is === "function") is = is.call(st) if (is) { st[type] = true st.type = type return type } } return null } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/link-reader.js�����������������000644 �000766 �000024 �00000003006 12455173731 030320� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Basically just a wrapper around an fs.readlink // // XXX: Enhance this to support the Link type, by keeping // a lookup table of {<dev+inode>:<path>}, so that hardlinks // can be preserved in tarballs. module.exports = LinkReader var fs = require("graceful-fs") , fstream = require("../fstream.js") , inherits = require("inherits") , mkdir = require("mkdirp") , Reader = require("./reader.js") inherits(LinkReader, Reader) function LinkReader (props) { var me = this if (!(me instanceof LinkReader)) throw new Error( "LinkReader must be called as constructor.") if (!((props.type === "Link" && props.Link) || (props.type === "SymbolicLink" && props.SymbolicLink))) { throw new Error("Non-link type "+ props.type) } Reader.call(me, props) } // When piping a LinkReader into a LinkWriter, we have to // already have the linkpath property set, so that has to // happen *before* the "ready" event, which means we need to // override the _stat method. 
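// ---------------------------------------------------------------------------
// Illustrative sketch (not part of the original source): because the link is
// resolved during _stat, props.linkpath is already populated by the time
// "ready" fires. The path "some/symlink" is hypothetical.
function exampleLinkpathUsage () {
  var r = fstream.Reader({ path: "some/symlink" })
  r.on("linkpath", function (lp) {
    // emitted as soon as fs.readlink() completes, before "ready"
    console.log("symlink target:", lp)
  })
  r.on("ready", function () {
    console.log("linkpath at ready:", r.props.linkpath)
  })
}
// ---------------------------------------------------------------------------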
LinkReader.prototype._stat = function (currentStat) { var me = this fs.readlink(me._path, function (er, linkpath) { if (er) return me.error(er) me.linkpath = me.props.linkpath = linkpath me.emit("linkpath", linkpath) Reader.prototype._stat.call(me, currentStat) }) } LinkReader.prototype._read = function () { var me = this if (me._paused) return // basically just a no-op, since we got all the info we need // from the _stat method if (!me._ended) { me.emit("end") me.emit("close") me._ended = true } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/link-writer.js�����������������000644 �000766 �000024 �00000005344 12455173731 030401� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = LinkWriter var fs = require("graceful-fs") , Writer = require("./writer.js") , inherits = require("inherits") , path = require("path") , rimraf = require("rimraf") inherits(LinkWriter, Writer) function LinkWriter (props) { var me = this if (!(me instanceof LinkWriter)) throw new Error( "LinkWriter must be called as constructor.") // should already be established as a Link type if (!((props.type === "Link" && props.Link) || (props.type === "SymbolicLink" && props.SymbolicLink))) { throw new Error("Non-link type "+ props.type) } if (props.linkpath === "") props.linkpath = "." if (!props.linkpath) { me.error("Need linkpath property to create " + props.type) } Writer.call(this, props) } LinkWriter.prototype._create = function () { // console.error(" LW _create") var me = this , hard = me.type === "Link" || process.platform === "win32" , link = hard ? "link" : "symlink" , lp = hard ? path.resolve(me.dirname, me.linkpath) : me.linkpath // can only change the link path by clobbering // For hard links, let's just assume that's always the case, since // there's no good way to read them if we don't already know. if (hard) return clobber(me, lp, link) fs.readlink(me._path, function (er, p) { // only skip creation if it's exactly the same link if (p && p === lp) return finish(me) clobber(me, lp, link) }) } function clobber (me, lp, link) { rimraf(me._path, function (er) { if (er) return me.error(er) create(me, lp, link) }) } function create (me, lp, link) { fs[link](lp, me._path, function (er) { // if this is a hard link, and we're in the process of writing out a // directory, it's very possible that the thing we're linking to // doesn't exist yet (especially if it was intended as a symlink), // so swallow ENOENT errors here and just soldier in. // Additionally, an EPERM or EACCES can happen on win32 if it's trying // to make a link to a directory. Again, just skip it. // A better solution would be to have fs.symlink be supported on // windows in some nice fashion. 
if (er) { if ((er.code === "ENOENT" || er.code === "EACCES" || er.code === "EPERM" ) && process.platform === "win32") { me.ready = true me.emit("ready") me.emit("end") me.emit("close") me.end = me._finish = function () {} } else return me.error(er) } finish(me) }) } function finish (me) { me.ready = true me.emit("ready") if (me._ended && !me._finished) me._finish() } LinkWriter.prototype.end = function () { // console.error("LW finish in end") this._ended = true if (this.ready) { this._finished = true this._finish() } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/proxy-reader.js����������������000644 �000766 �000024 �00000003656 12455173731 030557� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// A reader for when we don't yet know what kind of thing // the thing is. module.exports = ProxyReader var Reader = require("./reader.js") , getType = require("./get-type.js") , inherits = require("inherits") , fs = require("graceful-fs") inherits(ProxyReader, Reader) function ProxyReader (props) { var me = this if (!(me instanceof ProxyReader)) throw new Error( "ProxyReader must be called as constructor.") me.props = props me._buffer = [] me.ready = false Reader.call(me, props) } ProxyReader.prototype._stat = function () { var me = this , props = me.props // stat the thing to see what the proxy should be. , stat = props.follow ? "stat" : "lstat" fs[stat](props.path, function (er, current) { var type if (er || !current) { type = "File" } else { type = getType(current) } props[type] = true props.type = me.type = type me._old = current me._addProxy(Reader(props, current)) }) } ProxyReader.prototype._addProxy = function (proxy) { var me = this if (me._proxyTarget) { return me.error("proxy already set") } me._proxyTarget = proxy proxy._proxy = me ; [ "error" , "data" , "end" , "close" , "linkpath" , "entry" , "entryEnd" , "child" , "childEnd" , "warn" , "stat" ].forEach(function (ev) { // console.error("~~ proxy event", ev, me.path) proxy.on(ev, me.emit.bind(me, ev)) }) me.emit("proxy", proxy) proxy.on("ready", function () { // console.error("~~ proxy is ready!", me.path) me.ready = true me.emit("ready") }) var calls = me._buffer me._buffer.length = 0 calls.forEach(function (c) { proxy[c[0]].apply(proxy, c[1]) }) } ProxyReader.prototype.pause = function () { return this._proxyTarget ? this._proxyTarget.pause() : false } ProxyReader.prototype.resume = function () { return this._proxyTarget ? 
this._proxyTarget.resume() : false } ����������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/proxy-writer.js����������������000644 �000766 �000024 �00000004551 12455173731 030624� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// A writer for when we don't know what kind of thing // the thing is. That is, it's not explicitly set, // so we're going to make it whatever the thing already // is, or "File" // // Until then, collect all events. module.exports = ProxyWriter var Writer = require("./writer.js") , getType = require("./get-type.js") , inherits = require("inherits") , collect = require("./collect.js") , fs = require("fs") inherits(ProxyWriter, Writer) function ProxyWriter (props) { var me = this if (!(me instanceof ProxyWriter)) throw new Error( "ProxyWriter must be called as constructor.") me.props = props me._needDrain = false Writer.call(me, props) } ProxyWriter.prototype._stat = function () { var me = this , props = me.props // stat the thing to see what the proxy should be. , stat = props.follow ? "stat" : "lstat" fs[stat](props.path, function (er, current) { var type if (er || !current) { type = "File" } else { type = getType(current) } props[type] = true props.type = me.type = type me._old = current me._addProxy(Writer(props, current)) }) } ProxyWriter.prototype._addProxy = function (proxy) { // console.error("~~ set proxy", this.path) var me = this if (me._proxy) { return me.error("proxy already set") } me._proxy = proxy ; [ "ready" , "error" , "close" , "pipe" , "drain" , "warn" ].forEach(function (ev) { proxy.on(ev, me.emit.bind(me, ev)) }) me.emit("proxy", proxy) var calls = me._buffer calls.forEach(function (c) { // console.error("~~ ~~ proxy buffered call", c[0], c[1]) proxy[c[0]].apply(proxy, c[1]) }) me._buffer.length = 0 if (me._needsDrain) me.emit("drain") } ProxyWriter.prototype.add = function (entry) { // console.error("~~ proxy add") collect(entry) if (!this._proxy) { this._buffer.push(["add", [entry]]) this._needDrain = true return false } return this._proxy.add(entry) } ProxyWriter.prototype.write = function (c) { // console.error("~~ proxy write") if (!this._proxy) { this._buffer.push(["write", [c]]) this._needDrain = true return false } return this._proxy.write(c) } ProxyWriter.prototype.end = function (c) { // console.error("~~ proxy end") if (!this._proxy) { this._buffer.push(["end", [c]]) return false } return this._proxy.end(c) } �������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/reader.js����������������������000644 �000766 �000024 �00000015631 12455173731 027374� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = Reader var fs = require("graceful-fs") , Stream = 
require("stream").Stream , inherits = require("inherits") , path = require("path") , getType = require("./get-type.js") , hardLinks = Reader.hardLinks = {} , Abstract = require("./abstract.js") // Must do this *before* loading the child classes inherits(Reader, Abstract) var DirReader = require("./dir-reader.js") , FileReader = require("./file-reader.js") , LinkReader = require("./link-reader.js") , SocketReader = require("./socket-reader.js") , ProxyReader = require("./proxy-reader.js") function Reader (props, currentStat) { var me = this if (!(me instanceof Reader)) return new Reader(props, currentStat) if (typeof props === "string") { props = { path: props } } if (!props.path) { me.error("Must provide a path", null, true) } // polymorphism. // call fstream.Reader(dir) to get a DirReader object, etc. // Note that, unlike in the Writer case, ProxyReader is going // to be the *normal* state of affairs, since we rarely know // the type of a file prior to reading it. var type , ClassType if (props.type && typeof props.type === "function") { type = props.type ClassType = type } else { type = getType(props) ClassType = Reader } if (currentStat && !type) { type = getType(currentStat) props[type] = true props.type = type } switch (type) { case "Directory": ClassType = DirReader break case "Link": // XXX hard links are just files. // However, it would be good to keep track of files' dev+inode // and nlink values, and create a HardLinkReader that emits // a linkpath value of the original copy, so that the tar // writer can preserve them. // ClassType = HardLinkReader // break case "File": ClassType = FileReader break case "SymbolicLink": ClassType = LinkReader break case "Socket": ClassType = SocketReader break case null: ClassType = ProxyReader break } if (!(me instanceof ClassType)) { return new ClassType(props) } Abstract.call(me) me.readable = true me.writable = false me.type = type me.props = props me.depth = props.depth = props.depth || 0 me.parent = props.parent || null me.root = props.root || (props.parent && props.parent.root) || me me._path = me.path = path.resolve(props.path) if (process.platform === "win32") { me.path = me._path = me.path.replace(/\?/g, "_") if (me._path.length >= 260) { // how DOES one create files on the moon? // if the path has spaces in it, then UNC will fail. me._swallowErrors = true //if (me._path.indexOf(" ") === -1) { me._path = "\\\\?\\" + me.path.replace(/\//g, "\\") //} } } me.basename = props.basename = path.basename(me.path) me.dirname = props.dirname = path.dirname(me.path) // these have served their purpose, and are now just noisy clutter props.parent = props.root = null // console.error("\n\n\n%s setting size to", props.path, props.size) me.size = props.size me.filter = typeof props.filter === "function" ? props.filter : null if (props.sort === "alpha") props.sort = alphasort // start the ball rolling. // this will stat the thing, and then call me._read() // to start reading whatever it is. // console.error("calling stat", props.path, currentStat) me._stat(currentStat) } function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 1 : -1 } Reader.prototype._stat = function (currentStat) { var me = this , props = me.props , stat = props.follow ? 
"stat" : "lstat" // console.error("Reader._stat", me._path, currentStat) if (currentStat) process.nextTick(statCb.bind(null, null, currentStat)) else fs[stat](me._path, statCb) function statCb (er, props_) { // console.error("Reader._stat, statCb", me._path, props_, props_.nlink) if (er) return me.error(er) Object.keys(props_).forEach(function (k) { props[k] = props_[k] }) // if it's not the expected size, then abort here. if (undefined !== me.size && props.size !== me.size) { return me.error("incorrect size") } me.size = props.size var type = getType(props) var handleHardlinks = props.hardlinks !== false // special little thing for handling hardlinks. if (handleHardlinks && type !== "Directory" && props.nlink && props.nlink > 1) { var k = props.dev + ":" + props.ino // console.error("Reader has nlink", me._path, k) if (hardLinks[k] === me._path || !hardLinks[k]) hardLinks[k] = me._path else { // switch into hardlink mode. type = me.type = me.props.type = "Link" me.Link = me.props.Link = true me.linkpath = me.props.linkpath = hardLinks[k] // console.error("Hardlink detected, switching mode", me._path, me.linkpath) // Setting __proto__ would arguably be the "correct" // approach here, but that just seems too wrong. me._stat = me._read = LinkReader.prototype._read } } if (me.type && me.type !== type) { me.error("Unexpected type: " + type) } // if the filter doesn't pass, then just skip over this one. // still have to emit end so that dir-walking can move on. if (me.filter) { var who = me._proxy || me // special handling for ProxyReaders if (!me.filter.call(who, who, props)) { if (!me._disowned) { me.abort() me.emit("end") me.emit("close") } return } } // last chance to abort or disown before the flow starts! var events = ["_stat", "stat", "ready"] var e = 0 ;(function go () { if (me._aborted) { me.emit("end") me.emit("close") return } if (me._paused && me.type !== "Directory") { me.once("resume", go) return } var ev = events[e ++] if (!ev) { return me._read() } me.emit(ev, props) go() })() } } Reader.prototype.pipe = function (dest, opts) { var me = this if (typeof dest.add === "function") { // piping to a multi-compatible, and we've got directory entries. me.on("entry", function (entry) { var ret = dest.add(entry) if (false === ret) { me.pause() } }) } // console.error("R Pipe apply Stream Pipe") return Stream.prototype.pipe.apply(this, arguments) } Reader.prototype.pause = function (who) { this._paused = true who = who || this this.emit("pause", who) if (this._stream) this._stream.pause(who) } Reader.prototype.resume = function (who) { this._paused = false who = who || this this.emit("resume", who) if (this._stream) this._stream.resume(who) this._read() } Reader.prototype._read = function () { this.error("Cannot read unknown type: "+this.type) } �������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/socket-reader.js���������������000644 �000766 �000024 �00000001740 12455173731 030656� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// Just get the stats, and then don't do anything. // You can't really "read" from a socket. You "connect" to it. 
// Mostly, this is here so that reading a dir with a socket in it // doesn't blow up. module.exports = SocketReader var fs = require("graceful-fs") , fstream = require("../fstream.js") , inherits = require("inherits") , mkdir = require("mkdirp") , Reader = require("./reader.js") inherits(SocketReader, Reader) function SocketReader (props) { var me = this if (!(me instanceof SocketReader)) throw new Error( "SocketReader must be called as constructor.") if (!(props.type === "Socket" && props.Socket)) { throw new Error("Non-socket type "+ props.type) } Reader.call(me, props) } SocketReader.prototype._read = function () { var me = this if (me._paused) return // basically just a no-op, since we got all the info we have // from the _stat method if (!me._ended) { me.emit("end") me.emit("close") me._ended = true } } ��������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/lib/writer.js����������������������000644 �000766 �000024 �00000024746 12455173731 027455� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = Writer var fs = require("graceful-fs") , inherits = require("inherits") , rimraf = require("rimraf") , mkdir = require("mkdirp") , path = require("path") , umask = process.platform === "win32" ? 0 : process.umask() , getType = require("./get-type.js") , Abstract = require("./abstract.js") // Must do this *before* loading the child classes inherits(Writer, Abstract) Writer.dirmode = 0777 & (~umask) Writer.filemode = 0666 & (~umask) var DirWriter = require("./dir-writer.js") , LinkWriter = require("./link-writer.js") , FileWriter = require("./file-writer.js") , ProxyWriter = require("./proxy-writer.js") // props is the desired state. current is optionally the current stat, // provided here so that subclasses can avoid statting the target // more than necessary. function Writer (props, current) { var me = this if (typeof props === "string") { props = { path: props } } if (!props.path) me.error("Must provide a path", null, true) // polymorphism. // call fstream.Writer(dir) to get a DirWriter object, etc. var type = getType(props) , ClassType = Writer switch (type) { case "Directory": ClassType = DirWriter break case "File": ClassType = FileWriter break case "Link": case "SymbolicLink": ClassType = LinkWriter break case null: // Don't know yet what type to create, so we wrap in a proxy. ClassType = ProxyWriter break } if (!(me instanceof ClassType)) return new ClassType(props) // now get down to business. Abstract.call(me) // props is what we want to set. // set some convenience properties as well. me.type = props.type me.props = props me.depth = props.depth || 0 me.clobber = false === props.clobber ? 
props.clobber : true me.parent = props.parent || null me.root = props.root || (props.parent && props.parent.root) || me me._path = me.path = path.resolve(props.path) if (process.platform === "win32") { me.path = me._path = me.path.replace(/\?/g, "_") if (me._path.length >= 260) { me._swallowErrors = true me._path = "\\\\?\\" + me.path.replace(/\//g, "\\") } } me.basename = path.basename(props.path) me.dirname = path.dirname(props.path) me.linkpath = props.linkpath || null props.parent = props.root = null // console.error("\n\n\n%s setting size to", props.path, props.size) me.size = props.size if (typeof props.mode === "string") { props.mode = parseInt(props.mode, 8) } me.readable = false me.writable = true // buffer until ready, or while handling another entry me._buffer = [] me.ready = false me.filter = typeof props.filter === "function" ? props.filter: null // start the ball rolling. // this checks what's there already, and then calls // me._create() to call the impl-specific creation stuff. me._stat(current) } // Calling this means that it's something we can't create. // Just assert that it's already there, otherwise raise a warning. Writer.prototype._create = function () { var me = this fs[me.props.follow ? "stat" : "lstat"](me._path, function (er, current) { if (er) { return me.warn("Cannot create " + me._path + "\n" + "Unsupported type: "+me.type, "ENOTSUP") } me._finish() }) } Writer.prototype._stat = function (current) { var me = this , props = me.props , stat = props.follow ? "stat" : "lstat" , who = me._proxy || me if (current) statCb(null, current) else fs[stat](me._path, statCb) function statCb (er, current) { if (me.filter && !me.filter.call(who, who, current)) { me._aborted = true me.emit("end") me.emit("close") return } // if it's not there, great. We'll just create it. // if it is there, then we'll need to change whatever differs if (er || !current) { return create(me) } me._old = current var currentType = getType(current) // if it's a type change, then we need to clobber or error. // if it's not a type change, then let the impl take care of it. if (currentType !== me.type) { return rimraf(me._path, function (er) { if (er) return me.error(er) me._old = null create(me) }) } // otherwise, just handle in the app-specific way // this creates a fs.WriteStream, or mkdir's, or whatever create(me) } } function create (me) { // console.error("W create", me._path, Writer.dirmode) // XXX Need to clobber non-dirs that are in the way, // unless { clobber: false } in the props. mkdir(path.dirname(me._path), Writer.dirmode, function (er, made) { // console.error("W created", path.dirname(me._path), er) if (er) return me.error(er) // later on, we have to set the mode and owner for these me._madeDir = made return me._create() }) } function endChmod (me, want, current, path, cb) { var wantMode = want.mode , chmod = want.follow || me.type !== "SymbolicLink" ? "chmod" : "lchmod" if (!fs[chmod]) return cb() if (typeof wantMode !== "number") return cb() var curMode = current.mode & 0777 wantMode = wantMode & 0777 if (wantMode === curMode) return cb() fs[chmod](path, wantMode, cb) } function endChown (me, want, current, path, cb) { // Don't even try it unless root. Too easy to EPERM. if (process.platform === "win32") return cb() if (!process.getuid || !process.getuid() === 0) return cb() if (typeof want.uid !== "number" && typeof want.gid !== "number" ) return cb() if (current.uid === want.uid && current.gid === want.gid) return cb() var chown = (me.props.follow || me.type !== "SymbolicLink") ? 
"chown" : "lchown" if (!fs[chown]) return cb() if (typeof want.uid !== "number") want.uid = current.uid if (typeof want.gid !== "number") want.gid = current.gid fs[chown](path, want.uid, want.gid, cb) } function endUtimes (me, want, current, path, cb) { if (!fs.utimes || process.platform === "win32") return cb() var utimes = (want.follow || me.type !== "SymbolicLink") ? "utimes" : "lutimes" if (utimes === "lutimes" && !fs[utimes]) { utimes = "utimes" } if (!fs[utimes]) return cb() var curA = current.atime , curM = current.mtime , meA = want.atime , meM = want.mtime if (meA === undefined) meA = curA if (meM === undefined) meM = curM if (!isDate(meA)) meA = new Date(meA) if (!isDate(meM)) meA = new Date(meM) if (meA.getTime() === curA.getTime() && meM.getTime() === curM.getTime()) return cb() fs[utimes](path, meA, meM, cb) } // XXX This function is beastly. Break it up! Writer.prototype._finish = function () { var me = this if (me._finishing) return me._finishing = true // console.error(" W Finish", me._path, me.size) // set up all the things. // At this point, we're already done writing whatever we've gotta write, // adding files to the dir, etc. var todo = 0 var errState = null var done = false if (me._old) { // the times will almost *certainly* have changed. // adds the utimes syscall, but remove another stat. me._old.atime = new Date(0) me._old.mtime = new Date(0) // console.error(" W Finish Stale Stat", me._path, me.size) setProps(me._old) } else { var stat = me.props.follow ? "stat" : "lstat" // console.error(" W Finish Stating", me._path, me.size) fs[stat](me._path, function (er, current) { // console.error(" W Finish Stated", me._path, me.size, current) if (er) { // if we're in the process of writing out a // directory, it's very possible that the thing we're linking to // doesn't exist yet (especially if it was intended as a symlink), // so swallow ENOENT errors here and just soldier on. if (er.code === "ENOENT" && (me.type === "Link" || me.type === "SymbolicLink") && process.platform === "win32") { me.ready = true me.emit("ready") me.emit("end") me.emit("close") me.end = me._finish = function () {} return } else return me.error(er) } setProps(me._old = current) }) } return function setProps (current) { todo += 3 endChmod(me, me.props, current, me._path, next("chmod")) endChown(me, me.props, current, me._path, next("chown")) endUtimes(me, me.props, current, me._path, next("utimes")) } function next (what) { return function (er) { // console.error(" W Finish", what, todo) if (errState) return if (er) { er.fstream_finish_call = what return me.error(errState = er) } if (--todo > 0) return if (done) return done = true // we may still need to set the mode/etc. on some parent dirs // that were created previously. delay end/close until then. if (!me._madeDir) return end() else endMadeDir(me, me._path, end) function end (er) { if (er) { er.fstream_finish_call = "setupMadeDir" return me.error(er) } // all the props have been set, so we're completely done. me.emit("end") me.emit("close") } } } } function endMadeDir (me, p, cb) { var made = me._madeDir // everything *between* made and path.dirname(me._path) // needs to be set up. Note that this may just be one dir. var d = path.dirname(p) endMadeDir_(me, d, function (er) { if (er) return cb(er) if (d === made) { return cb() } endMadeDir(me, d, cb) }) } function endMadeDir_ (me, p, cb) { var dirProps = {} Object.keys(me.props).forEach(function (k) { dirProps[k] = me.props[k] // only make non-readable dirs if explicitly requested. 
if (k === "mode" && me.type !== "Directory") { dirProps[k] = dirProps[k] | 0111 } }) var todo = 3 , errState = null fs.stat(p, function (er, current) { if (er) return cb(errState = er) endChmod(me, dirProps, current, p, next) endChown(me, dirProps, current, p, next) endUtimes(me, dirProps, current, p, next) }) function next (er) { if (errState) return if (er) return cb(errState = er) if (-- todo === 0) return cb() } } Writer.prototype.pipe = function () { this.error("Can't pipe from writable stream") } Writer.prototype.add = function () { this.error("Cannot add to non-Directory type") } Writer.prototype.write = function () { return true } function objectToString (d) { return Object.prototype.toString.call(d) } function isDate(d) { return typeof d === 'object' && objectToString(d) === '[object Date]'; } ��������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/examples/filter-pipe.js������������000644 �000766 �000024 �00000007536 12455173731 031427� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fstream = require("../fstream.js") var path = require("path") var r = fstream.Reader({ path: path.dirname(__dirname) , filter: function () { return !this.basename.match(/^\./) && !this.basename.match(/^node_modules$/) && !this.basename.match(/^deep-copy$/) && !this.basename.match(/^filter-copy$/) } }) // this writer will only write directories var w = fstream.Writer({ path: path.resolve(__dirname, "filter-copy") , type: "Directory" , filter: function () { return this.type === "Directory" } }) var indent = "" var escape = {} r.on("entry", appears) r.on("ready", function () { console.error("ready to begin!", r.path) }) function appears (entry) { console.error(indent + "a %s appears!", entry.type, entry.basename, typeof entry.basename) if (foggy) { console.error("FOGGY!") var p = entry do { console.error(p.depth, p.path, p._paused) } while (p = p.parent) throw new Error("\033[mshould not have entries while foggy") } indent += "\t" entry.on("data", missile(entry)) entry.on("end", runaway(entry)) entry.on("entry", appears) } var foggy function missile (entry) { if (entry.type === "Directory") { var ended = false entry.once("end", function () { ended = true }) return function (c) { // throw in some pathological pause()/resume() behavior // just for extra fun. process.nextTick(function () { if (!foggy && !ended) { // && Math.random() < 0.3) { console.error(indent +"%s casts a spell", entry.basename) console.error("\na slowing fog comes over the battlefield...\n\033[32m") entry.pause() entry.once("resume", liftFog) foggy = setTimeout(liftFog, 1000) function liftFog (who) { if (!foggy) return if (who) { console.error("%s breaks the spell!", who && who.path) } else { console.error("the spell expires!") } console.error("\033[mthe fog lifts!\n") clearTimeout(foggy) foggy = null if (entry._paused) entry.resume() } } }) } } return function (c) { var e = Math.random() < 0.5 console.error(indent + "%s %s for %d damage!", entry.basename, e ? "is struck" : "fires a chunk", c.length) } } function runaway (entry) { return function () { var e = Math.random() < 0.5 console.error(indent + "%s %s", entry.basename, e ? 
"turns to flee" : "is vanquished!") indent = indent.slice(0, -1) }} w.on("entry", attacks) //w.on("ready", function () { attacks(w) }) function attacks (entry) { console.error(indent + "%s %s!", entry.basename, entry.type === "Directory" ? "calls for backup" : "attacks") entry.on("entry", attacks) } ended = false var i = 1 r.on("end", function () { if (foggy) clearTimeout(foggy) console.error("\033[mIT'S OVER!!") console.error("A WINNAR IS YOU!") console.log("ok " + (i ++) + " A WINNAR IS YOU") ended = true // now go through and verify that everything in there is a dir. var p = path.resolve(__dirname, "filter-copy") var checker = fstream.Reader({ path: p }) checker.checker = true checker.on("child", function (e) { var ok = e.type === "Directory" console.log((ok ? "" : "not ") + "ok " + (i ++) + " should be a dir: " + e.path.substr(checker.path.length + 1)) }) }) process.on("exit", function () { console.log((ended ? "" : "not ") + "ok " + (i ++) + " ended") }) r.pipe(w) ������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/examples/pipe.js�������������������000644 �000766 �000024 �00000006246 12455173731 030141� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fstream = require("../fstream.js") var path = require("path") var r = fstream.Reader({ path: path.dirname(__dirname) , filter: function () { return !this.basename.match(/^\./) && !this.basename.match(/^node_modules$/) && !this.basename.match(/^deep-copy$/) } }) var w = fstream.Writer({ path: path.resolve(__dirname, "deep-copy") , type: "Directory" }) var indent = "" var escape = {} r.on("entry", appears) r.on("ready", function () { console.error("ready to begin!", r.path) }) function appears (entry) { console.error(indent + "a %s appears!", entry.type, entry.basename, typeof entry.basename, entry) if (foggy) { console.error("FOGGY!") var p = entry do { console.error(p.depth, p.path, p._paused) } while (p = p.parent) throw new Error("\033[mshould not have entries while foggy") } indent += "\t" entry.on("data", missile(entry)) entry.on("end", runaway(entry)) entry.on("entry", appears) } var foggy function missile (entry) { if (entry.type === "Directory") { var ended = false entry.once("end", function () { ended = true }) return function (c) { // throw in some pathological pause()/resume() behavior // just for extra fun. process.nextTick(function () { if (!foggy && !ended) { // && Math.random() < 0.3) { console.error(indent +"%s casts a spell", entry.basename) console.error("\na slowing fog comes over the battlefield...\n\033[32m") entry.pause() entry.once("resume", liftFog) foggy = setTimeout(liftFog, 10) function liftFog (who) { if (!foggy) return if (who) { console.error("%s breaks the spell!", who && who.path) } else { console.error("the spell expires!") } console.error("\033[mthe fog lifts!\n") clearTimeout(foggy) foggy = null if (entry._paused) entry.resume() } } }) } } return function (c) { var e = Math.random() < 0.5 console.error(indent + "%s %s for %d damage!", entry.basename, e ? 
"is struck" : "fires a chunk", c.length) } } function runaway (entry) { return function () { var e = Math.random() < 0.5 console.error(indent + "%s %s", entry.basename, e ? "turns to flee" : "is vanquished!") indent = indent.slice(0, -1) }} w.on("entry", attacks) //w.on("ready", function () { attacks(w) }) function attacks (entry) { console.error(indent + "%s %s!", entry.basename, entry.type === "Directory" ? "calls for backup" : "attacks") entry.on("entry", attacks) } ended = false r.on("end", function () { if (foggy) clearTimeout(foggy) console.error("\033[mIT'S OVER!!") console.error("A WINNAR IS YOU!") console.log("ok 1 A WINNAR IS YOU") ended = true }) process.on("exit", function () { console.log((ended ? "" : "not ") + "ok 2 ended") }) r.pipe(w) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/examples/reader.js�����������������000644 �000766 �000024 �00000003070 12455173731 030436� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fstream = require("../fstream.js") var tap = require("tap") var fs = require("fs") var path = require("path") var dir = path.dirname(__dirname) tap.test("reader test", function (t) { var children = -1 var gotReady = false var ended = false var r = fstream.Reader({ path: dir , filter: function () { // return this.parent === r return this.parent === r || this === r } }) r.on("ready", function () { gotReady = true children = fs.readdirSync(dir).length console.error("Setting expected children to "+children) t.equal(r.type, "Directory", "should be a directory") }) r.on("entry", function (entry) { children -- if (!gotReady) { t.fail("children before ready!") } t.equal(entry.dirname, r.path, "basename is parent dir") }) r.on("error", function (er) { t.fail(er) t.end() process.exit(1) }) r.on("end", function () { t.equal(children, 0, "should have seen all children") ended = true }) var closed = false r.on("close", function () { t.ok(ended, "saw end before close") t.notOk(closed, "close should only happen once") closed = true t.end() }) }) tap.test("reader error test", function (t) { // assumes non-root on a *nix system var r = fstream.Reader({ path: '/etc/shadow' }) r.once("error", function (er) { t.ok(true); t.end() }) r.on("end", function () { t.fail("reader ended without error"); t.end() }) }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fstream/examples/symlink-write.js����������000644 �000766 �000024 �00000001300 12455173731 032004� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fstream = require("../fstream.js") , closed = false fstream .Writer({ path: "path/to/symlink" , linkpath: "./file" , isSymbolicLink: true , mode: "0755" // octal strings supported }) .on("close", function () { closed = true var fs = require("fs") var s = fs.lstatSync("path/to/symlink") var isSym = s.isSymbolicLink() console.log((isSym?"":"not ") +"ok 1 should be symlink") var t = fs.readlinkSync("path/to/symlink") var isTarget = t === "./file" console.log((isTarget?"":"not ") +"ok 2 should link to ./file") }) .end() process.on("exit", function () { console.log((closed?"":"not ")+"ok 3 should be closed") }) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-write-stream-atomic/index.js������������000644 �000766 �000024 �00000004716 12455173731 031337� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('graceful-fs') var util = require('util') var crypto = require('crypto') function md5hex () { var hash = crypto.createHash('md5') for (var ii=0; ii<arguments.length; ++ii) { hash.update(''+arguments[ii]) } return hash.digest('hex') } var invocations = 0; function getTmpname (filename) { return filename + "." + md5hex(__filename, process.pid, ++invocations) } module.exports = WriteStream util.inherits(WriteStream, fs.WriteStream) function WriteStream (path, options) { if (!options) options = {} if (!(this instanceof WriteStream)) return new WriteStream(path, options) this.__atomicTarget = path this.__atomicChown = options.chown this.__atomicDidStuff = false this.__atomicTmp = getTmpname(path) fs.WriteStream.call(this, this.__atomicTmp, options) } function cleanup (er) { fs.unlink(this.__atomicTmp, function () { fs.WriteStream.prototype.emit.call(this, 'error', er) }.bind(this)) } function cleanupSync (er) { try { fs.unlinkSync(this.__atomicTmp) } finally { return fs.WriteStream.prototype.emit.call(this, 'error', er) } } // When we *would* emit 'close' or 'finish', instead do our stuff WriteStream.prototype.emit = function (ev) { if (ev === 'error') return cleanupSync(this) if (ev !== 'close' && ev !== 'finish') return fs.WriteStream.prototype.emit.apply(this, arguments) // We handle emitting finish and close after the rename. 
if (ev === 'close' || ev === 'finish') { if (!this.__atomicDidStuff) { atomicDoStuff.call(this, function (er) { if (er) cleanup.call(this, er) }.bind(this)) } } } function atomicDoStuff(cb) { if (this.__atomicDidStuff) throw new Error('Already did atomic move-into-place') this.__atomicDidStuff = true if (this.__atomicChown) { var uid = this.__atomicChown.uid var gid = this.__atomicChown.gid return fs.chown(this.__atomicTmp, uid, gid, function (er) { if (er) return cb(er) moveIntoPlace.call(this, cb) }.bind(this)) } else { moveIntoPlace.call(this, cb) } } function moveIntoPlace (cb) { fs.rename(this.__atomicTmp, this.__atomicTarget, function (er) { cb(er) // emit finish, and then close on the next tick // This makes finish/close consistent across Node versions also. fs.WriteStream.prototype.emit.call(this, 'finish') process.nextTick(function() { fs.WriteStream.prototype.emit.call(this, 'close') }.bind(this)) }.bind(this)) } ��������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-write-stream-atomic/LICENSE�������������000644 �000766 �000024 �00000001375 12455173731 030675� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-write-stream-atomic/package.json��������000644 �000766 �000024 �00000002606 12455173731 032154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "fs-write-stream-atomic", "version": "1.0.2", "description": "Like `fs.createWriteStream(...)`, but atomic.", "main": "index.js", "directories": { "test": "test" }, "dependencies": { "graceful-fs": "^3.0.2" }, "devDependencies": { "tap": "^0.4.12" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/fs-write-stream-atomic" }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/fs-write-stream-atomic/issues" }, "homepage": "https://github.com/npm/fs-write-stream-atomic", "gitHead": "86b111ed1d5db84e8a9680986ef7917036b9c97b", "_id": "fs-write-stream-atomic@1.0.2", "_shasum": "fe0c6cec75256072b2fef8180d97e309fe3f5efb", "_from": "fs-write-stream-atomic@>=1.0.2 <1.1.0", "_npmVersion": "2.1.0", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "fe0c6cec75256072b2fef8180d97e309fe3f5efb", "tarball": "http://registry.npmjs.org/fs-write-stream-atomic/-/fs-write-stream-atomic-1.0.2.tgz" }, "_resolved": "https://registry.npmjs.org/fs-write-stream-atomic/-/fs-write-stream-atomic-1.0.2.tgz" } ��������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-write-stream-atomic/README.md�����������000644 �000766 �000024 �00000002011 12455173731 031133� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# fs-write-stream-atomic Like `fs.createWriteStream(...)`, but atomic. Writes to a tmp file and does an atomic `fs.rename` to move it into place when it's done. First rule of debugging: **It's always a race condition.** ## USAGE ```javascript var fsWriteStreamAtomic = require('fs-write-stream-atomic') // options are optional. var write = fsWriteStreamAtomic('output.txt', options) var read = fs.createReadStream('input.txt') read.pipe(write) // When the write stream emits a 'finish' or 'close' event, // you can be sure that it is moved into place, and contains // all the bytes that were written to it, even if something else // was writing to `output.txt` at the same time. 
``` ### `fsWriteStreamAtomic(filename, [options])` * `filename` {String} The file we want to write to * `options` {Object} * `chown` {Object} User and group to set ownership after write * `uid` {Number} * `gid` {Number} * `encoding` {String} default = 'utf8' * `mode` {Number} default = `0666` * `flags` {String} default = `'w'` �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/.eslintrc������������������������000644 �000766 �000024 �00000000645 12455173731 027116� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "env" : { "node" : true }, "rules" : { "curly" : 0, "no-lonely-if" : 1, "no-mixed-requires" : 0, "no-underscore-dangle" : 0, "no-unused-vars" : [2, {"vars" : "all", "args" : "after-used"}], "no-use-before-define" : [2, "nofunc"], "quotes" : [1, "double", "avoid-escape"], "semi" : [2, "never"], "space-after-keywords" : 1, "space-infix-ops" : 0, "strict" : 0 } } �������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/.npmignore�����������������������000644 �000766 �000024 �00000000015 12455173731 027260� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/package.json���������������������000644 �000766 �000024 �00000004310 12455173731 027551� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "fs-vacuum", "version": "1.2.5", "description": "recursively remove empty directories -- to a point", "main": "vacuum.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": 
"https://github.com/npm/fs-vacuum.git" }, "keywords": [ "rm", "rimraf", "clean" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/fs-vacuum/issues" }, "homepage": "https://github.com/npm/fs-vacuum", "devDependencies": { "mkdirp": "^0.5.0", "tap": "^0.4.11", "tmp": "0.0.24" }, "dependencies": { "graceful-fs": "^3.0.2", "path-is-inside": "^1.0.1", "rimraf": "^2.2.8" }, "readme": "# fs-vacuum\n\nRemove the empty branches of a directory tree, optionally up to (but not\nincluding) a specified base directory. Optionally nukes the leaf directory.\n\n## Usage\n\n```javascript\nvar logger = require(\"npmlog\");\nvar vacuum = require(\"fs-vacuum\");\n\nvar options = {\n base : \"/path/to/my/tree/root\",\n purge : true,\n log : logger.silly.bind(logger, \"myCleanup\")\n};\n\n/* Assuming there are no other files or directories in \"out\", \"to\", or \"my\",\n * the final path will just be \"/path/to/my/tree/root\".\n */\nvacuum(\"/path/to/my/tree/root/out/to/my/files\", function (error) {\n if (error) console.error(\"Unable to cleanly vacuum:\", error.message);\n});\n```\n# vacuum(directory, options, callback)\n\n* `directory` {String} Leaf node to remove. **Must be a directory, symlink, or file.**\n* `options` {Object}\n * `base` {String} No directories at or above this level of the filesystem will be removed.\n * `purge` {Boolean} If set, nuke the whole leaf directory, including its contents.\n * `log` {Function} A logging function that takes `npmlog`-compatible argument lists.\n* `callback` {Function} Function to call once vacuuming is complete.\n * `error` {Error} What went wrong along the way, if anything.\n", "readmeFilename": "README.md", "gitHead": "4911a38a65b6a6cb19fc980d18304e1cfca91fbf", "_id": "fs-vacuum@1.2.5", "_shasum": "a5cbaa844b4b3a7cff139f3cc90e7f7007e5fbb8", "_from": "fs-vacuum@~1.2.5" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/README.md������������������������000644 �000766 �000024 �00000002265 12455173731 026551� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# fs-vacuum Remove the empty branches of a directory tree, optionally up to (but not including) a specified base directory. Optionally nukes the leaf directory. ## Usage ```javascript var logger = require("npmlog"); var vacuum = require("fs-vacuum"); var options = { base : "/path/to/my/tree/root", purge : true, log : logger.silly.bind(logger, "myCleanup") }; /* Assuming there are no other files or directories in "out", "to", or "my", * the final path will just be "/path/to/my/tree/root". */ vacuum("/path/to/my/tree/root/out/to/my/files", function (error) { if (error) console.error("Unable to cleanly vacuum:", error.message); }); ``` # vacuum(directory, options, callback) * `directory` {String} Leaf node to remove. 
**Must be a directory, symlink, or file.** * `options` {Object} * `base` {String} No directories at or above this level of the filesystem will be removed. * `purge` {Boolean} If set, nuke the whole leaf directory, including its contents. * `log` {Function} A logging function that takes `npmlog`-compatible argument lists. * `callback` {Function} Function to call once vacuuming is complete. * `error` {Error} What went wrong along the way, if anything. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/fs-vacuum/vacuum.js������������������������000644 �000766 �000024 �00000006114 12455173731 027125� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") var dirname = require("path").dirname var resolve = require("path").resolve var isInside = require("path-is-inside") var rimraf = require("rimraf") var lstat = require("graceful-fs").lstat var readdir = require("graceful-fs").readdir var rmdir = require("graceful-fs").rmdir var unlink = require("graceful-fs").unlink module.exports = vacuum function vacuum(leaf, options, cb) { assert(typeof leaf === "string", "must pass in path to remove") assert(typeof cb === "function", "must pass in callback") if (!options) options = {} assert(typeof options === "object", "options must be an object") var log = options.log ? options.log : function () {} leaf = leaf && resolve(leaf) var base = options.base && resolve(options.base) if (base && !isInside(leaf, base)) { return cb(new Error(leaf + " is not a child of " + base)) } lstat(leaf, function (error, stat) { if (error) { if (error.code === "ENOENT") return cb(null) log(error.stack) return cb(error) } if (!(stat && (stat.isDirectory() || stat.isSymbolicLink() || stat.isFile()))) { log(leaf, "is not a directory, file, or link") return cb(new Error(leaf + " is not a directory, file, or link")) } if (options.purge) { log("purging", leaf) rimraf(leaf, function (error) { if (error) return cb(error) next(dirname(leaf)) }) } else if (!stat.isDirectory()) { log("removing", leaf) unlink(leaf, function (error) { if (error) return cb(error) next(dirname(leaf)) }) } else { next(leaf) } }) function next(branch) { branch = branch && resolve(branch) // either we've reached the base or we've reached the root if ((base && branch === base) || branch === dirname(branch)) { log("finished vacuuming up to", branch) return cb(null) } readdir(branch, function (error, files) { if (error) { if (error.code === "ENOENT") return cb(null) log("unable to check directory", branch, "due to", error.message) return cb(error) } if (files.length > 0) { log("quitting because other entries in", branch) return cb(null) } log("removing", branch) lstat(branch, function (error, stat) { if (error) { if (error.code === "ENOENT") return cb(null) log("unable to lstat", branch, "due to", error.message) return cb(error) } var remove = stat.isDirectory() ? 
rmdir : unlink remove(branch, function (error) { if (error) { if (error.code === "ENOENT") { log("quitting because lost the race to remove", branch) return cb(null) } if (error.code === "ENOTEMPTY") { log("quitting because new (racy) entries in", branch) return cb(null) } log("unable to remove", branch, "due to", error.message) return cb(error) } next(dirname(branch)) }) }) }) } } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/example/����������������������������000755 �000766 �000024 �00000000000 12456115117 026273� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/index.js����������������������������000644 �000766 �000024 �00000001123 12455173731 026307� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var spawn = require('child_process').spawn; module.exports = function (file, opts, cb) { if (typeof opts === 'function') { cb = opts; opts = {}; } if (!opts) opts = {}; var ed = /^win/.test(process.platform) ? 
'notepad' : 'vim'; var editor = opts.editor || process.env.VISUAL || process.env.EDITOR || ed; var args = editor.split(/\s+/); var bin = args.shift(); var ps = spawn(bin, args.concat([ file ]), { stdio: 'inherit' }); ps.on('exit', function (code, sig) { if (typeof cb === 'function') cb(code, sig) }); }; ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/LICENSE�����������������������������000644 �000766 �000024 �00000002161 12455173731 025652� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2013 James Halliday (mail@substack.net) This project is free software released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/package.json������������������������000644 �000766 �000024 �00000002417 12455173731 027137� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "editor", "version": "0.1.0", "description": "launch $EDITOR in your program", "main": "index.js", "directories": { "example": "example", "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.4.4" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/node-editor.git" }, "homepage": "https://github.com/substack/node-editor", "keywords": [ "text", "edit", "shell" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "engine": { "node": ">=0.6" }, "bugs": { "url": "https://github.com/substack/node-editor/issues" }, "_id": "editor@0.1.0", "dist": { "shasum": "542f4662c6a8c88e862fc11945e204e51981b9a1", "tarball": "http://registry.npmjs.org/editor/-/editor-0.1.0.tgz" }, "_from": "editor@latest", "_npmVersion": "1.3.21", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "_shasum": "542f4662c6a8c88e862fc11945e204e51981b9a1", "_resolved": "https://registry.npmjs.org/editor/-/editor-0.1.0.tgz" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/README.markdown���������������������000644 �000766 �000024 �00000001215 12455173731 027345� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������editor ====== Launch $EDITOR in your program. example ======= ``` js var editor = require('editor'); editor('beep.json', function (code, sig) { console.log('finished editing with code ' + code); }); ``` *** ``` $ node edit.js ``` ![editor](http://substack.net/images/screenshots/editor.png) ``` finished editing with code 0 ``` methods ======= ``` js var editor = require('editor') ``` editor(file, opts={}, cb) ------------------------- Launch the `$EDITOR` (or `opts.editor`) for `file`. When the editor exits, `cb(code, sig)` fires. 
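For example, a minimal sketch (the `nano` editor and the `notes.txt` filename are placeholders, not from this README) that picks the editor via `opts.editor` instead of the environment:

``` js
var editor = require('editor');

// opts.editor takes precedence over $VISUAL and $EDITOR
editor('notes.txt', { editor: 'nano' }, function (code, sig) {
    console.log('finished editing with code ' + code);
});
```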
install ======= With [npm](http://npmjs.org) do: ``` npm install editor ``` license ======= MIT �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/example/beep.json�������������������000644 �000766 �000024 �00000000044 12455173731 030104� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "a" : 3, "b" : 4, "c" : 5 } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/editor/example/edit.js���������������������000644 �000766 �000024 �00000000220 12455173731 027555� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var editor = require('../'); editor(__dirname + '/beep.json', function (code, sig) { console.log('finished editing with code ' + code); }); ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/dezalgo.js�������������������������000644 �000766 �000024 �00000000560 12455173731 026770� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var wrappy = require('wrappy') module.exports = wrappy(dezalgo) var asap = require('asap') function dezalgo (cb) { var sync = true asap(function () { sync = false }) return function zalgoSafe() { var args = arguments var me = this if (sync) asap(function() { cb.apply(me, args) }) else cb.apply(me, args) } } 
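// Note on the wrapper above: `sync` starts out true and is flipped to false
// by the asap() task scheduled at wrap time. If zalgoSafe is invoked while
// `sync` is still true (i.e. in the same tick as the wrap), the real
// callback is re-queued through asap(); otherwise it is called immediately.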
������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/����������������������000755 �000766 �000024 �00000000000 12456115117 027454� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/package.json�����������������������000644 �000766 �000024 �00000006046 12455173731 027300� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "dezalgo", "version": "1.0.1", "description": "Contain async insanity so that the dark pony lord doesn't eat souls", "main": "dezalgo.js", "directories": { "test": "test" }, "dependencies": { "asap": "^1.0.0", "wrappy": "1" }, "devDependencies": { "tap": "^0.4.11" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/npm/dezalgo" }, "keywords": [ "async", "zalgo", "the dark pony", "he comes", "asynchrony of all holy and good", "T̯̪ͅo̯͖̹ ̻̮̖̲͢i̥̖n̢͈͇̝͍v͏͉ok̭̬̝ͅe̞͍̩̫͍̩͝ ̩̮̖̟͇͉́t͔͔͎̗h͏̗̟e̘͉̰̦̠̞͓ ͕h͉̟͎̪̠̱͠ḭ̮̩v̺͉͇̩e̵͖-̺̪m͍i̜n̪̲̲̲̮d̷ ̢r̠̼̯̹̦̦͘ͅe͓̳͓̙p̺̗̫͙͘ͅr͔̰͜e̴͓̞s͉̩̩͟ͅe͏̣n͚͇̗̭̺͍tì͙̣n͏̖̥̗͎̰̪g̞͓̭̱̯̫̕ ̣̱͜ͅc̦̰̰̠̮͎͙̀hao̺̜̻͍͙ͅs͉͓̘.͎̼̺̼͕̹͘", "̠̞̱̰I͖͇̝̻n̦̰͍̰̟v̤̺̫̳̭̼̗͘ò̹̟̩̩͚k̢̥̠͍͉̦̬i̖͓͔̮̱̻͘n̶̳͙̫͎g̖̯̣̲̪͉ ̞͎̗͕͚ͅt̲͕̘̺̯̗̦h̘̦̲̜̻e̳͎͉̬͙ ̴̞̪̲̥f̜̯͓͓̭̭͢e̱̘͔̮e̜̤l̺̱͖̯͓͙͈͢i̵̦̬͉͔̫͚͕n͉g̨͖̙̙̹̹̟̤ ͉̪o̞̠͍̪̰͙ͅf̬̲̺ ͔͕̲͕͕̲̕c̙͉h̝͔̩̙̕ͅa̲͖̻̗̹o̥̼̫s̝̖̜̝͚̫̟.̺͚ ̸̱̲W̶̥̣͖̦i͏̤̬̱̳̣ͅt͉h̗̪̪ ̷̱͚̹̪ǫ͕̗̣̳̦͎u̼̦͔̥̮̕ţ͖͎̻͔͉ ̴͎̩òr̹̰̖͉͈͝d̷̲̦̖͓e̲͓̠r", "̧͚̜͓̰̭̭Ṯ̫̹̜̮̟̮͝h͚̘̩̘̖̰́e ̥̘͓͉͔͙̼N̟̜̣̘͔̪e̞̞̤͢z̰̖̘͇p̠͟e̺̱̣͍͙̝ṛ̘̬͔̙͇̠d͝ḭ̯̱̥̗̩a̛ͅn͏̦ ̷̥hi̥v̖̳̹͉̮̱͝e̹̪̘̖̰̟-̴͙͓͚̜̻mi̗̺̻͙̺ͅn̪̯͈d ͏̘͓̫̳ͅơ̹͔̳̖̣͓f͈̹̘ ͕ͅc̗̤̠̜̮̥̥h̡͍̩̭̫͚̱a̤͉̤͔͜os͕̤̼͍̲̀ͅ.̡̱ ̦Za̯̱̗̭͍̣͚l̗͉̰̤g͏̣̭̬̗̲͖ͅo̶̭̩̳̟͈.̪̦̰̳", "H̴̱̦̗̬̣͓̺e̮ ͉̠̰̞͎̖͟ẁh̛̺̯ͅo̖̫͡ ̢Ẁa̡̗i̸t͖̣͉̀ş͔̯̩ ̤̦̮͇̞̦̲B͎̭͇̦̼e̢hin͏͙̟̪d̴̰͓̻̣̮͕ͅ T͖̮̕h͖e̘̺̰̙͘ ̥Ẁ̦͔̻͚a̞͖̪͉l̪̠̻̰̣̠l̲͎͞", "Z̘͍̼͎̣͔͝Ą̲̜̱̱̹̤͇L̶̝̰̭͔G͍̖͍O̫͜ͅ!̼̤ͅ", "H̝̪̜͓̀̌̂̒E̢̙̠̣ ̴̳͇̥̟̠͍̐C̹̓̑̐̆͝Ó̶̭͓̚M̬̼Ĕ̖̤͔͔̟̹̽̿̊ͥ̍ͫS̻̰̦̻̖̘̱̒ͪ͌̅͟" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/dezalgo/issues" }, "homepage": "https://github.com/npm/dezalgo", "gitHead": "0a5eee75c179611f8b67f663015d68bb517e57d2", "_id": "dezalgo@1.0.1", "_shasum": "12bde135060807900d5a7aebb607c2abb7c76937", "_from": "dezalgo@latest", "_npmVersion": "2.0.0", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "12bde135060807900d5a7aebb607c2abb7c76937", "tarball": "http://registry.npmjs.org/dezalgo/-/dezalgo-1.0.1.tgz" }, "_resolved": "https://registry.npmjs.org/dezalgo/-/dezalgo-1.0.1.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/README.md��������������������������000644 �000766 �000024 �00000001213 12455173731 026260� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# dezalgo Contain async insanity so that the dark pony lord doesn't eat souls See [this blog post](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony). ## USAGE Pass a callback to `dezalgo` and it will ensure that it is *always* called in a future tick, and never in this tick. 
```javascript var dz = require('dezalgo') var cache = {} function maybeSync(arg, cb) { cb = dz(cb) // this will actually defer to nextTick if (cache[arg]) cb(null, cache[arg]) fs.readFile(arg, function (er, data) { // since this is *already* defered, it will call immediately if (er) cb(er) cb(null, cache[arg] = data) }) } ``` �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/asap/�����������������000755 �000766 �000024 �00000000000 12456115117 030400� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/asap/asap.js����������000644 �000766 �000024 �00000005471 12455173731 031676� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // Use the fastest possible means to execute a task in a future turn // of the event loop. // linked list of tasks (single, with head node) var head = {task: void 0, next: null}; var tail = head; var flushing = false; var requestFlush = void 0; var isNodeJS = false; function flush() { /* jshint loopfunc: true */ while (head.next) { head = head.next; var task = head.task; head.task = void 0; var domain = head.domain; if (domain) { head.domain = void 0; domain.enter(); } try { task(); } catch (e) { if (isNodeJS) { // In node, uncaught exceptions are considered fatal errors. // Re-throw them synchronously to interrupt flushing! // Ensure continuation if the uncaught exception is suppressed // listening "uncaughtException" events (as domains does). // Continue in next event to avoid tick recursion. if (domain) { domain.exit(); } setTimeout(flush, 0); if (domain) { domain.enter(); } throw e; } else { // In browsers, uncaught exceptions are not fatal. // Re-throw them asynchronously to avoid slow-downs. setTimeout(function() { throw e; }, 0); } } if (domain) { domain.exit(); } } flushing = false; } if (typeof process !== "undefined" && process.nextTick) { // Node.js before 0.9. Note that some fake-Node environments, like the // Mocha test runner, introduce a `process` global without a `nextTick`. 
isNodeJS = true; requestFlush = function () { process.nextTick(flush); }; } else if (typeof setImmediate === "function") { // In IE10, Node.js 0.9+, or https://github.com/NobleJS/setImmediate if (typeof window !== "undefined") { requestFlush = setImmediate.bind(window, flush); } else { requestFlush = function () { setImmediate(flush); }; } } else if (typeof MessageChannel !== "undefined") { // modern browsers // http://www.nonblocking.io/2011/06/windownexttick.html var channel = new MessageChannel(); channel.port1.onmessage = flush; requestFlush = function () { channel.port2.postMessage(0); }; } else { // old browsers requestFlush = function () { setTimeout(flush, 0); }; } function asap(task) { tail = tail.next = { task: task, domain: isNodeJS && process.domain, next: null }; if (!flushing) { flushing = true; requestFlush(); } }; module.exports = asap; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/asap/LICENSE.md�������000644 �000766 �000024 �00000002072 12455173731 032012� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ Copyright 2009–2013 Contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/asap/package.json�����000644 �000766 �000024 �00000010142 12455173731 032671� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "asap", "version": "1.0.0", "description": "High-priority task queue for Node.js and browsers", "keywords": [ "event", "task", "queue" ], "licenses": [ { "type": "MIT", "url": "https://github.com/kriskowal/asap/raw/master/LICENSE.md" } ], "main": "asap", "readme": "\n# ASAP\n\nThis `asap` CommonJS package contains a single `asap` module that\nexports a single `asap` function that executes a function **as soon as\npossible**.\n\n```javascript\nasap(function () {\n // ...\n});\n```\n\nMore formally, ASAP provides a fast event queue that will execute tasks\nuntil it is empty before yielding to the JavaScript engine's underlying\nevent-loop. When the event queue becomes non-empty, ASAP schedules a\nflush event, preferring for that event to occur before the JavaScript\nengine has an opportunity to perform IO tasks or rendering, thus making\nthe first task and subsequent tasks semantically indistinguishable.\nASAP uses a variety of techniques to preserve this invariant on\ndifferent versions of browsers and NodeJS.\n\nBy design, ASAP can starve the event loop on the theory that, if there\nis enough work to be done synchronously, albeit in separate events, long\nenough to starve input or output, it is a strong indicator that the\nprogram needs to push back on scheduling more work.\n\nTake care. ASAP can sustain infinite recursive calls indefinitely\nwithout warning. This is behaviorally equivalent to an infinite loop.\nIt will not halt from a stack overflow, but it *will* chew through\nmemory (which is an oddity I cannot explain at this time). Just as with\ninfinite loops, you can monitor a Node process for this behavior with a\nheart-beat signal. As with infinite loops, a very small amount of\ncaution goes a long way to avoiding problems.\n\n```javascript\nfunction loop() {\n asap(loop);\n}\nloop();\n```\n\nASAP is distinct from `setImmediate` in that it does not suffer the\noverhead of returning a handle and being possible to cancel. For a\n`setImmediate` shim, consider [setImmediate][].\n\n[setImmediate]: https://github.com/noblejs/setimmediate\n\nIf a task throws an exception, it will not interrupt the flushing of\nhigh-priority tasks. 
The exception will be postponed to a later,\nlow-priority event to avoid slow-downs, when the underlying JavaScript\nengine will treat it as it does any unhandled exception.\n\n## Heritage\n\nASAP has been factored out of the [Q][] asynchronous promise library.\nIt originally had a naïve implementation in terms of `setTimeout`, but\n[Malte Ubl][NonBlocking] provided an insight that `postMessage` might be\nuseful for creating a high-priority, no-delay event dispatch hack.\nSince then, Internet Explorer proposed and implemented `setImmediate`.\nRobert Kratić began contributing to Q by measuring the performance of\nthe internal implementation of `asap`, paying particular attention to\nerror recovery. Domenic, Robert, and I collectively settled on the\ncurrent strategy of unrolling the high-priority event queue internally\nregardless of what strategy we used to dispatch the potentially\nlower-priority flush event. Domenic went on to make ASAP cooperate with\nNodeJS domains.\n\n[Q]: https://github.com/kriskowal/q\n[NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html\n\nFor further reading, Nicholas Zakas provided a thorough article on [The\nCase for setImmediate][NCZ].\n\n[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/\n\n## License\n\nCopyright 2009-2013 by Contributors\nMIT License (enclosed)\n\n", "readmeFilename": "README.md", "_id": "asap@1.0.0", "dist": { "shasum": "b2a45da5fdfa20b0496fc3768cc27c12fa916a7d", "tarball": "http://registry.npmjs.org/asap/-/asap-1.0.0.tgz" }, "_from": "asap@>=1.0.0 <2.0.0", "_npmVersion": "1.2.15", "_npmUser": { "name": "kriskowal", "email": "kris.kowal@cixar.com" }, "maintainers": [ { "name": "kriskowal", "email": "kris.kowal@cixar.com" } ], "directories": {}, "_shasum": "b2a45da5fdfa20b0496fc3768cc27c12fa916a7d", "_resolved": "https://registry.npmjs.org/asap/-/asap-1.0.0.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/dezalgo/node_modules/asap/README.md��������000644 �000766 �000024 �00000006176 12455173731 031676� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ # ASAP This `asap` CommonJS package contains a single `asap` module that exports a single `asap` function that executes a function **as soon as possible**. ```javascript asap(function () { // ... }); ``` More formally, ASAP provides a fast event queue that will execute tasks until it is empty before yielding to the JavaScript engine's underlying event-loop. When the event queue becomes non-empty, ASAP schedules a flush event, preferring for that event to occur before the JavaScript engine has an opportunity to perform IO tasks or rendering, thus making the first task and subsequent tasks semantically indistinguishable. ASAP uses a variety of techniques to preserve this invariant on different versions of browsers and NodeJS. 
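For instance, a small ordering sketch (assuming Node.js, where the flush is scheduled with `process.nextTick`): every queued ASAP task drains before a zero-delay `setTimeout` callback gets to run.

```javascript
var asap = require("asap");

setTimeout(function () { console.log("timeout"); }, 0);

asap(function () { console.log("asap: first"); });
asap(function () { console.log("asap: second"); });

// Expected order: "asap: first", "asap: second", "timeout".
```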
By design, ASAP can starve the event loop on the theory that, if there is enough work to be done synchronously, albeit in separate events, long enough to starve input or output, it is a strong indicator that the program needs to push back on scheduling more work. Take care. ASAP can sustain infinite recursive calls indefinitely without warning. This is behaviorally equivalent to an infinite loop. It will not halt from a stack overflow, but it *will* chew through memory (which is an oddity I cannot explain at this time). Just as with infinite loops, you can monitor a Node process for this behavior with a heart-beat signal. As with infinite loops, a very small amount of caution goes a long way to avoiding problems. ```javascript function loop() { asap(loop); } loop(); ``` ASAP is distinct from `setImmediate` in that it does not suffer the overhead of returning a handle and being possible to cancel. For a `setImmediate` shim, consider [setImmediate][]. [setImmediate]: https://github.com/noblejs/setimmediate If a task throws an exception, it will not interrupt the flushing of high-priority tasks. The exception will be postponed to a later, low-priority event to avoid slow-downs, when the underlying JavaScript engine will treat it as it does any unhandled exception. ## Heritage ASAP has been factored out of the [Q][] asynchronous promise library. It originally had a naïve implementation in terms of `setTimeout`, but [Malte Ubl][NonBlocking] provided an insight that `postMessage` might be useful for creating a high-priority, no-delay event dispatch hack. Since then, Internet Explorer proposed and implemented `setImmediate`. Robert Kratić began contributing to Q by measuring the performance of the internal implementation of `asap`, paying particular attention to error recovery. Domenic, Robert, and I collectively settled on the current strategy of unrolling the high-priority event queue internally regardless of what strategy we used to dispatch the potentially lower-priority flush event. Domenic went on to make ASAP cooperate with NodeJS domains. [Q]: https://github.com/kriskowal/q [NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html For further reading, Nicholas Zakas provided a thorough article on [The Case for setImmediate][NCZ]. 
[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/ ## License Copyright 2009-2013 by Contributors MIT License (enclosed) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/.npmignore��������������������000644 �000766 �000024 �00000000052 12455173731 027700� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules node_modules/* npm_debug.log ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/index.js����������������������000755 �000766 �000024 �00000016050 12455173731 027356� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var ProtoList = require('proto-list') , path = require('path') , fs = require('fs') , ini = require('ini') , EE = require('events').EventEmitter , url = require('url') , http = require('http') var exports = module.exports = function () { var args = [].slice.call(arguments) , conf = new ConfigChain() while(args.length) { var a = args.shift() if(a) conf.push ( 'string' === typeof a ? json(a) : a ) } return conf } //recursively find a file... var find = exports.find = function () { var rel = path.join.apply(null, [].slice.call(arguments)) function find(start, rel) { var file = path.join(start, rel) try { fs.statSync(file) return file } catch (err) { if(path.dirname(start) !== start) // root return find(path.dirname(start), rel) } } return find(__dirname, rel) } var parse = exports.parse = function (content, file, type) { content = '' + content // if we don't know what it is, try json and fall back to ini // if we know what it is, then it must be that. 
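// (i.e. with no type: try JSON.parse and fall back to ini.parse; with an
// explicit 'json' type: JSON.parse, routing parse errors through 'error'
// when an emitter is available; any other type: ini.parse)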
if (!type) { try { return JSON.parse(content) } catch (er) { return ini.parse(content) } } else if (type === 'json') { if (this.emit) { try { return JSON.parse(content) } catch (er) { this.emit('error', er) } } else { return JSON.parse(content) } } else { return ini.parse(content) } } var json = exports.json = function () { var args = [].slice.call(arguments).filter(function (arg) { return arg != null }) var file = path.join.apply(null, args) var content try { content = fs.readFileSync(file,'utf-8') } catch (err) { return } return parse(content, file, 'json') } var env = exports.env = function (prefix, env) { env = env || process.env var obj = {} var l = prefix.length for(var k in env) { if(k.indexOf(prefix) === 0) obj[k.substring(l)] = env[k] } return obj } exports.ConfigChain = ConfigChain function ConfigChain () { EE.apply(this) ProtoList.apply(this, arguments) this._awaiting = 0 this._saving = 0 this.sources = {} } // multi-inheritance-ish var extras = { constructor: { value: ConfigChain } } Object.keys(EE.prototype).forEach(function (k) { extras[k] = Object.getOwnPropertyDescriptor(EE.prototype, k) }) ConfigChain.prototype = Object.create(ProtoList.prototype, extras) ConfigChain.prototype.del = function (key, where) { // if not specified where, then delete from the whole chain, scorched // earth style if (where) { var target = this.sources[where] target = target && target.data if (!target) { return this.emit('error', new Error('not found '+where)) } delete target[key] } else { for (var i = 0, l = this.list.length; i < l; i ++) { delete this.list[i][key] } } return this } ConfigChain.prototype.set = function (key, value, where) { var target if (where) { target = this.sources[where] target = target && target.data if (!target) { return this.emit('error', new Error('not found '+where)) } } else { target = this.list[0] if (!target) { return this.emit('error', new Error('cannot set, no confs!')) } } target[key] = value return this } ConfigChain.prototype.get = function (key, where) { if (where) { where = this.sources[where] if (where) where = where.data if (where && Object.hasOwnProperty.call(where, key)) return where[key] return undefined } return this.list[0][key] } ConfigChain.prototype.save = function (where, type, cb) { if (typeof type === 'function') cb = type, type = null var target = this.sources[where] if (!target || !(target.path || target.source) || !target.data) { // TODO: maybe save() to a url target could be a PUT or something? 
// would be easy to swap out with a reddis type thing, too return this.emit('error', new Error('bad save target: '+where)) } if (target.source) { var pref = target.prefix || '' Object.keys(target.data).forEach(function (k) { target.source[pref + k] = target.data[k] }) return this } var type = type || target.type var data = target.data if (target.type === 'json') { data = JSON.stringify(data) } else { data = ini.stringify(data) } this._saving ++ fs.writeFile(target.path, data, 'utf8', function (er) { this._saving -- if (er) { if (cb) return cb(er) else return this.emit('error', er) } if (this._saving === 0) { if (cb) cb() this.emit('save') } }.bind(this)) return this } ConfigChain.prototype.addFile = function (file, type, name) { name = name || file var marker = {__source__:name} this.sources[name] = { path: file, type: type } this.push(marker) this._await() fs.readFile(file, 'utf8', function (er, data) { if (er) this.emit('error', er) this.addString(data, file, type, marker) }.bind(this)) return this } ConfigChain.prototype.addEnv = function (prefix, env, name) { name = name || 'env' var data = exports.env(prefix, env) this.sources[name] = { data: data, source: env, prefix: prefix } return this.add(data, name) } ConfigChain.prototype.addUrl = function (req, type, name) { this._await() var href = url.format(req) name = name || href var marker = {__source__:name} this.sources[name] = { href: href, type: type } this.push(marker) http.request(req, function (res) { var c = [] var ct = res.headers['content-type'] if (!type) { type = ct.indexOf('json') !== -1 ? 'json' : ct.indexOf('ini') !== -1 ? 'ini' : href.match(/\.json$/) ? 'json' : href.match(/\.ini$/) ? 'ini' : null marker.type = type } res.on('data', c.push.bind(c)) .on('end', function () { this.addString(Buffer.concat(c), href, type, marker) }.bind(this)) .on('error', this.emit.bind(this, 'error')) }.bind(this)) .on('error', this.emit.bind(this, 'error')) .end() return this } ConfigChain.prototype.addString = function (data, file, type, marker) { data = this.parse(data, file, type) this.add(data, marker) return this } ConfigChain.prototype.add = function (data, marker) { if (marker && typeof marker === 'object') { var i = this.list.indexOf(marker) if (i === -1) { return this.emit('error', new Error('bad marker')) } this.splice(i, 1, data) marker = marker.__source__ this.sources[marker] = this.sources[marker] || {} this.sources[marker].data = data // we were waiting for this. maybe emit 'load' this._resolve() } else { if (typeof marker === 'string') { this.sources[marker] = this.sources[marker] || {} this.sources[marker].data = data } // trigger the load event if nothing was already going to do so. 
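// (_await() bumps the pending-source counter; _resolve() on the next tick
// decrements it and emits 'load' once the counter is back to zero)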
this._await() this.push(data) process.nextTick(this._resolve.bind(this)) } return this } ConfigChain.prototype.parse = exports.parse ConfigChain.prototype._await = function () { this._awaiting++ } ConfigChain.prototype._resolve = function () { this._awaiting-- if (this._awaiting === 0) this.emit('load', this) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/LICENCE�����������������������000644 �000766 �000024 �00000002056 12455173731 026674� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) 2011 Dominic Tarr Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/node_modules/�����������������000755 �000766 �000024 �00000000000 12456115117 030354� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/package.json������������������000644 �000766 �000024 �00000015573 12455173731 030205� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "config-chain", "version": "1.1.8", "description": "HANDLE CONFIGURATION ONCE AND FOR ALL", "homepage": "http://github.com/dominictarr/config-chain", "repository": { "type": "git", "url": "https://github.com/dominictarr/config-chain.git" }, "dependencies": { "proto-list": "~1.2.1", "ini": "1" }, "devDependencies": { "tap": "0.3.0" }, "author": { "name": "Dominic Tarr", "email": "dominic.tarr@gmail.com", "url": "http://dominictarr.com" }, "scripts": { "test": "tap test/" }, "readme": "#config-chain\n\nUSE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS\n\n``` js\n\n //npm install config-chain\n\n var cc = require('config-chain')\n , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS.\n , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS.\n\n // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN\n // EARLIER ITEMS OVERIDE LATER ITEMS\n // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST!\n\n //strings are interpereted as filenames.\n //will be loaded synchronously\n\n var conf =\n cc(\n //OVERRIDE SETTINGS WITH COMMAND LINE OPTS\n opts,\n\n //ENV VARS IF PREFIXED WITH 'myApp_'\n\n cc.env('myApp_'), //myApp_foo = 'like this'\n\n //FILE NAMED BY ENV\n path.join(__dirname, 'config.' + env + '.json'),\n\n //IF `env` is PRODUCTION\n env === 'prod'\n ? 
path.join(__dirname, 'special.json') //load a special file\n : null //NULL IS IGNORED!\n\n //SUBDIR FOR ENV CONFIG\n path.join(__dirname, 'config', env, 'config.json'),\n\n //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE\n cc.find('config.json'),\n\n //PUT DEFAULTS LAST\n {\n host: 'localhost'\n port: 8000\n })\n\n var host = conf.get('host')\n\n // or\n\n var host = conf.store.host\n\n```\n\nFINALLY, EASY FLEXIBLE CONFIGURATIONS!\n\n##see also: [proto-list](https://github.com/isaacs/proto-list/)\n\nWHATS THAT YOU SAY?\n\nYOU WANT A \"CLASS\" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS?\n\nEXTEND WITH YOUR OWN FUNCTIONALTY!?\n\n## CONFIGCHAIN LIVES TO SERVE ONLY YOU!\n\n```javascript\nvar cc = require('config-chain')\n\n// all the stuff you did before\nvar config = cc({\n some: 'object'\n },\n cc.find('config.json'),\n cc.env('myApp_')\n )\n // CONFIGS AS A SERVICE, aka \"CaaS\", aka EVERY DEVOPS DREAM OMG!\n .addUrl('http://configurator:1234/my-configs')\n // ASYNC FTW!\n .addFile('/path/to/file.json')\n\n // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT\n // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST\n // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE\n .add({ another: 'object' })\n\n // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!!\n .on('error', function (er) {\n // IF ONLY THERE WAS SOMETHIGN HARDER THAN THROW\n // MY SORROW COULD BE ADEQUATELY EXPRESSED. /o\\\n throw er\n })\n\n // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!!\n .on('load', function (config) {\n console.awesome('HOLY SHIT!')\n })\n```\n\n# BORING API DOCS\n\n## cc(...args)\n\nMAKE A CHAIN AND ADD ALL THE ARGS.\n\nIf the arg is a STRING, then it shall be a JSON FILENAME.\n\nSYNC I/O!\n\nRETURN THE CHAIN!\n\n## cc.json(...args)\n\nJoin the args INTO A JSON FILENAME!\n\nSYNC I/O!\n\n## cc.find(relativePath)\n\nSEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES.\n\nRETURN THE FOUND PATH!\n\nSYNC I/O!\n\n## cc.parse(content, file, type)\n\nParse the content string, and guess the type from either the\nspecified type or the filename.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.env(prefix, env=process.env)\n\nGet all the keys on the provided env object (or process.env) which are\nprefixed by the specified prefix, and put the values on a new object.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.ConfigChain()\n\nThe ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING!\n\nOne of these is returned by the main exported function, as well.\n\nIt inherits (prototypically) from\n[ProtoList](https://github.com/isaacs/proto-list/), and also inherits\n(parasitically) from\n[EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter)\n\nIt has all the methods from both, and except where noted, they are\nunchanged.\n\n### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain.\n\n## chain.sources\n\nA list of all the places where it got stuff. The keys are the names\npassed to addFile or addUrl etc, and the value is an object with some\ninfo about the data source.\n\n## chain.addFile(filename, type, [name=filename])\n\nFilename is the name of the file. Name is an arbitrary string to be\nused later if you desire. 
Type is either 'ini' or 'json', and will\ntry to guess intelligently if omitted.\n\nLoaded files can be saved later.\n\n## chain.addUrl(url, type, [name=url])\n\nSame as the filename thing, but with a url.\n\nCan't be saved later.\n\n## chain.addEnv(prefix, env, [name='env'])\n\nAdd all the keys from the env object that start with the prefix.\n\n## chain.addString(data, file, type, [name])\n\nParse the string and add it to the set. (Mainly used internally.)\n\n## chain.add(object, [name])\n\nAdd the object to the set.\n\n## chain.root {Object}\n\nThe root from which all the other config objects in the set descend\nprototypically.\n\nPut your defaults here.\n\n## chain.set(key, value, name)\n\nSet the key to the value on the named config object. If name is\nunset, then set it on the first config object in the set. (That is,\nthe one with the highest priority, which was added first.)\n\n## chain.get(key, [name])\n\nGet the key from the named config object explicitly, or from the\nresolved configs if not specified.\n\n## chain.save(name, type)\n\nWrite the named config object back to its origin.\n\nCurrently only supported for env and file config types.\n\nFor files, encode the data according to the type.\n\n## chain.on('save', function () {})\n\nWhen one or more files are saved, emits `save` event when they're all\nsaved.\n\n## chain.on('load', function (chain) {})\n\nWhen the config chain has loaded all the specified files and urls and\nsuch, the 'load' event fires.\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/dominictarr/config-chain/issues" }, "_id": "config-chain@1.1.8", "dist": { "shasum": "0943d0b7227213a20d4eaff4434f4a1c0a052cad", "tarball": "http://registry.npmjs.org/config-chain/-/config-chain-1.1.8.tgz" }, "_from": "config-chain@^1.1.8", "_npmVersion": "1.3.6", "_npmUser": { "name": "dominictarr", "email": "dominic.tarr@gmail.com" }, "maintainers": [ { "name": "dominictarr", "email": "dominic.tarr@gmail.com" }, { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "0943d0b7227213a20d4eaff4434f4a1c0a052cad", "_resolved": "https://registry.npmjs.org/config-chain/-/config-chain-1.1.8.tgz" } �������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/readme.markdown���������������000644 �000766 �000024 �00000012513 12455173731 030707� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#config-chain USE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS ``` js //npm install config-chain var cc = require('config-chain') , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS. , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS. // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN // EARLIER ITEMS OVERIDE LATER ITEMS // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST! //strings are interpereted as filenames. //will be loaded synchronously var conf = cc( //OVERRIDE SETTINGS WITH COMMAND LINE OPTS opts, //ENV VARS IF PREFIXED WITH 'myApp_' cc.env('myApp_'), //myApp_foo = 'like this' //FILE NAMED BY ENV path.join(__dirname, 'config.' 
+ env + '.json'), //IF `env` is PRODUCTION env === 'prod' ? path.join(__dirname, 'special.json') //load a special file : null //NULL IS IGNORED! //SUBDIR FOR ENV CONFIG path.join(__dirname, 'config', env, 'config.json'), //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE cc.find('config.json'), //PUT DEFAULTS LAST { host: 'localhost' port: 8000 }) var host = conf.get('host') // or var host = conf.store.host ``` FINALLY, EASY FLEXIBLE CONFIGURATIONS! ##see also: [proto-list](https://github.com/isaacs/proto-list/) WHATS THAT YOU SAY? YOU WANT A "CLASS" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS? EXTEND WITH YOUR OWN FUNCTIONALTY!? ## CONFIGCHAIN LIVES TO SERVE ONLY YOU! ```javascript var cc = require('config-chain') // all the stuff you did before var config = cc({ some: 'object' }, cc.find('config.json'), cc.env('myApp_') ) // CONFIGS AS A SERVICE, aka "CaaS", aka EVERY DEVOPS DREAM OMG! .addUrl('http://configurator:1234/my-configs') // ASYNC FTW! .addFile('/path/to/file.json') // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE .add({ another: 'object' }) // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!! .on('error', function (er) { // IF ONLY THERE WAS SOMETHIGN HARDER THAN THROW // MY SORROW COULD BE ADEQUATELY EXPRESSED. /o\ throw er }) // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!! .on('load', function (config) { console.awesome('HOLY SHIT!') }) ``` # BORING API DOCS ## cc(...args) MAKE A CHAIN AND ADD ALL THE ARGS. If the arg is a STRING, then it shall be a JSON FILENAME. SYNC I/O! RETURN THE CHAIN! ## cc.json(...args) Join the args INTO A JSON FILENAME! SYNC I/O! ## cc.find(relativePath) SEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES. RETURN THE FOUND PATH! SYNC I/O! ## cc.parse(content, file, type) Parse the content string, and guess the type from either the specified type or the filename. RETURN THE RESULTING OBJECT! NO I/O! ## cc.env(prefix, env=process.env) Get all the keys on the provided env object (or process.env) which are prefixed by the specified prefix, and put the values on a new object. RETURN THE RESULTING OBJECT! NO I/O! ## cc.ConfigChain() The ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING! One of these is returned by the main exported function, as well. It inherits (prototypically) from [ProtoList](https://github.com/isaacs/proto-list/), and also inherits (parasitically) from [EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter) It has all the methods from both, and except where noted, they are unchanged. ### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain. ## chain.sources A list of all the places where it got stuff. The keys are the names passed to addFile or addUrl etc, and the value is an object with some info about the data source. ## chain.addFile(filename, type, [name=filename]) Filename is the name of the file. Name is an arbitrary string to be used later if you desire. Type is either 'ini' or 'json', and will try to guess intelligently if omitted. Loaded files can be saved later. ## chain.addUrl(url, type, [name=url]) Same as the filename thing, but with a url. Can't be saved later. ## chain.addEnv(prefix, env, [name='env']) Add all the keys from the env object that start with the prefix. ## chain.addString(data, file, type, [name]) Parse the string and add it to the set. (Mainly used internally.) ## chain.add(object, [name]) Add the object to the set. 
## chain.root {Object} The root from which all the other config objects in the set descend prototypically. Put your defaults here. ## chain.set(key, value, name) Set the key to the value on the named config object. If name is unset, then set it on the first config object in the set. (That is, the one with the highest priority, which was added first.) ## chain.get(key, [name]) Get the key from the named config object explicitly, or from the resolved configs if not specified. ## chain.save(name, type) Write the named config object back to its origin. Currently only supported for env and file config types. For files, encode the data according to the type. ## chain.on('save', function () {}) When one or more files are saved, emits `save` event when they're all saved. ## chain.on('load', function (chain) {}) When the config chain has loaded all the specified files and urls and such, the 'load' event fires. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/config-chain/node_modules/proto-list/������000755 �000766 �000024 �00000000000 12456115117 032470� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/config-chain/node_modules/proto-list/LICENSE����������������������000644 �000766 �000024 �00000002104 12455173731 033420� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
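The config-chain API docs above stay fairly abstract, so here is a minimal, hypothetical sketch of building and querying a chain with the documented `cc()`, `chain.get`, `chain.set`, `chain.addFile` and `'load'` pieces. The file names (`config.json`, `extra.json`) and the `myApp_` env prefix are illustrative only, not part of the module.

```javascript
// Sketch only: assumes a config.json exists somewhere up the directory
// tree and that env vars such as myApp_host may be set (both hypothetical).
var cc = require('config-chain')
var path = require('path')

var chain = cc(
  cc.env('myApp_'),                  // env vars prefixed with 'myApp_'
  cc.find('config.json'),            // nearest config.json up the tree (sync I/O)
  { host: 'localhost', port: 8000 }  // defaults go last (lowest priority)
)

// Earlier sources override later ones, so the defaults only fill gaps.
console.log(chain.get('host'))

// set() without a name writes to the first (highest-priority) config object.
chain.set('host', '0.0.0.0')

// Files added after construction are reported via the 'load' event.
chain
  .addFile(path.join(__dirname, 'extra.json'), 'json', 'extra')
  .on('error', function (er) { throw er })
  .on('load', function (chain) {
    console.log(chain.snapshot)      // flattened view, inherited from proto-list
  })
```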
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/config-chain/node_modules/proto-list/package.json�����������������000644 �000766 �000024 �00000002451 12455173731 034706� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "proto-list", "version": "1.2.3", "description": "A utility for managing a prototype chain", "main": "./proto-list.js", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/isaacs/proto-list" }, "license": { "type": "MIT", "url": "https://github.com/isaacs/proto-list/blob/master/LICENSE" }, "devDependencies": { "tap": "0" }, "gitHead": "44d76897176861d176a53ed3f3fc5e05cdee7643", "bugs": { "url": "https://github.com/isaacs/proto-list/issues" }, "homepage": "https://github.com/isaacs/proto-list", "_id": "proto-list@1.2.3", "_shasum": "6235554a1bca1f0d15e3ca12ca7329d5def42bd9", "_from": "proto-list@~1.2.1", "_npmVersion": "1.4.14", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "6235554a1bca1f0d15e3ca12ca7329d5def42bd9", "tarball": "http://registry.npmjs.org/proto-list/-/proto-list-1.2.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/proto-list/-/proto-list-1.2.3.tgz", "readme": "ERROR: No README data found!" 
} �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/config-chain/node_modules/proto-list/proto-list.js����������������000644 �000766 �000024 �00000004343 12455173731 035074� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64������������������������������������������������������������������������������������������������������������������������������������������������� module.exports = ProtoList function setProto(obj, proto) { if (typeof Object.setPrototypeOf === "function") return Object.setPrototypeOf(obj, proto) else obj.__proto__ = proto } function ProtoList () { this.list = [] var root = null Object.defineProperty(this, 'root', { get: function () { return root }, set: function (r) { root = r if (this.list.length) { setProto(this.list[this.list.length - 1], r) } }, enumerable: true, configurable: true }) } ProtoList.prototype = { get length () { return this.list.length } , get keys () { var k = [] for (var i in this.list[0]) k.push(i) return k } , get snapshot () { var o = {} this.keys.forEach(function (k) { o[k] = this.get(k) }, this) return o } , get store () { return this.list[0] } , push : function (obj) { if (typeof obj !== "object") obj = {valueOf:obj} if (this.list.length >= 1) { setProto(this.list[this.list.length - 1], obj) } setProto(obj, this.root) return this.list.push(obj) } , pop : function () { if (this.list.length >= 2) { setProto(this.list[this.list.length - 2], this.root) } return this.list.pop() } , unshift : function (obj) { setProto(obj, this.list[0] || this.root) return this.list.unshift(obj) } , shift : function () { if (this.list.length === 1) { setProto(this.list[0], this.root) } return this.list.shift() } , get : function (key) { return this.list[0][key] } , set : function (key, val, save) { if (!this.length) this.push({}) if (save && this.list[0].hasOwnProperty(key)) this.push({}) return this.list[0][key] = val } , forEach : function (fn, thisp) { for (var key in this.list[0]) fn.call(thisp, key, this.list[0][key]) } , slice : function () { return this.list.slice.apply(this.list, arguments) } , splice : function () { // handle injections var ret = this.list.splice.apply(this.list, arguments) for (var i = 0, l = this.list.length; i < l; i++) { setProto(this.list[i], this.list[i + 1] || this.root) } return ret } } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/config-chain/node_modules/proto-list/README.md��������������������000644 �000766 �000024 �00000000120 12455173731 033666� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������A list of objects, bound by their prototype chain. Used in npm's config stuff. 
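proto-list's README is a single sentence, so here is a brief sketch, grounded in the proto-list.js source shown above, of how the prototype-chain ordering behaves; the keys used are made up for illustration.

```javascript
// Sketch only: illustrative keys, behaviour follows proto-list.js above.
var ProtoList = require('proto-list')

var list = new ProtoList()
list.push({ port: 8000, host: 'localhost' }) // first push = highest priority
list.push({ port: 9000, debug: true })       // later pushes become prototype fallbacks

// get() reads the first object; missing keys fall through the prototype chain.
console.log(list.get('port'))   // 8000 (own property of the first object wins)
console.log(list.get('debug'))  // true (inherited from the second object)

// snapshot flattens everything visible from the head of the chain.
console.log(list.snapshot)      // { port: 8000, host: 'localhost', debug: true }

// pop() removes the last (lowest-priority) object and re-links the chain.
list.pop()
console.log(list.get('debug'))  // undefined
```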
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/index.js�������������������������000644 �000766 �000024 �00000021016 12455173731 027031� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict" var wcwidth = require('./width') var utils = require('./utils') var padRight = utils.padRight var padCenter = utils.padCenter var padLeft = utils.padLeft var splitIntoLines = utils.splitIntoLines var splitLongWords = utils.splitLongWords var truncateString = utils.truncateString var DEFAULT_HEADING_TRANSFORM = function(key) { return key.toUpperCase() } var DEFAULT_DATA_TRANSFORM = function(cell, column, index) { return cell } var DEFAULTS = { maxWidth: Infinity, minWidth: 0, columnSplitter: ' ', truncate: false, truncateMarker: '…', preserveNewLines: false, paddingChr: ' ', showHeaders: true, headingTransform: DEFAULT_HEADING_TRANSFORM, dataTransform: DEFAULT_DATA_TRANSFORM } module.exports = function(items, options) { options = options || {} var columnConfigs = options.config || {} delete options.config // remove config so doesn't appear on every column. var maxLineWidth = options.maxLineWidth || Infinity delete options.maxLineWidth // this is a line control option, don't pass it to column // Option defaults inheritance: // options.config[columnName] => options => DEFAULTS options = mixin(options, DEFAULTS) options.config = options.config || Object.create(null) options.spacing = options.spacing || '\n' // probably useless options.preserveNewLines = !!options.preserveNewLines options.showHeaders = !!options.showHeaders; options.columns = options.columns || options.include // alias include/columns, prefer columns if supplied var columnNames = options.columns || [] // optional user-supplied columns to include items = toArray(items, columnNames) // if not suppled column names, automatically determine columns from data keys if (!columnNames.length) { items.forEach(function(item) { for (var columnName in item) { if (columnNames.indexOf(columnName) === -1) columnNames.push(columnName) } }) } // initialize column defaults (each column inherits from options.config) var columns = columnNames.reduce(function(columns, columnName) { var column = Object.create(options) columns[columnName] = mixin(column, columnConfigs[columnName]) return columns }, Object.create(null)) // sanitize column settings columnNames.forEach(function(columnName) { var column = columns[columnName] column.name = columnName column.maxWidth = Math.ceil(column.maxWidth) column.minWidth = Math.ceil(column.minWidth) column.truncate = !!column.truncate column.align = column.align || 'left' }) // sanitize data items = items.map(function(item) { var result = Object.create(null) columnNames.forEach(function(columnName) { // null/undefined -> '' result[columnName] = item[columnName] != null ? 
item[columnName] : '' // toString everything result[columnName] = '' + result[columnName] if (columns[columnName].preserveNewLines) { // merge non-newline whitespace chars result[columnName] = result[columnName].replace(/[^\S\n]/gmi, ' ') } else { // merge all whitespace chars result[columnName] = result[columnName].replace(/\s/gmi, ' ') } }) return result }) // transform data cells columnNames.forEach(function(columnName) { var column = columns[columnName] items = items.map(function(item, index) { var col = Object.create(column) item[columnName] = column.dataTransform(item[columnName], col, index) var changedKeys = Object.keys(col) // disable default heading transform if we wrote to column.name if (changedKeys.indexOf('name') !== -1) { if (column.headingTransform !== DEFAULT_HEADING_TRANSFORM) return column.headingTransform = function(heading) {return heading} } changedKeys.forEach(function(key) { column[key] = col[key] }) return item }) }) // add headers var headers = {} if(options.showHeaders) { columnNames.forEach(function(columnName) { var column = columns[columnName] headers[columnName] = column.headingTransform(column.name) }) items.unshift(headers) } // get actual max-width between min & max // based on length of data in columns columnNames.forEach(function(columnName) { var column = columns[columnName] column.width = items.map(function(item) { return item[columnName] }).reduce(function(min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))) }, 0) }) // split long words so they can break onto multiple lines columnNames.forEach(function(columnName) { var column = columns[columnName] items = items.map(function(item) { item[columnName] = splitLongWords(item[columnName], column.width, column.truncateMarker) return item }) }) // wrap long lines. each item is now an array of lines. columnNames.forEach(function(columnName) { var column = columns[columnName] items = items.map(function(item, index) { var cell = item[columnName] item[columnName] = splitIntoLines(cell, column.width) // if truncating required, only include first line + add truncation char if (column.truncate && item[columnName].length > 1) { item[columnName] = splitIntoLines(cell, column.width - wcwidth(column.truncateMarker)) var firstLine = item[columnName][0] if (!endsWith(firstLine, column.truncateMarker)) item[columnName][0] += column.truncateMarker item[columnName] = item[columnName].slice(0, 1) } return item }) }) // recalculate column widths from truncated output/lines columnNames.forEach(function(columnName) { var column = columns[columnName] column.width = items.map(function(item) { return item[columnName].reduce(function(min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))) }, 0) }).reduce(function(min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, cur))) }, 0) }) var rows = createRows(items, columns, columnNames, options.paddingChr) // merge lines into rows // conceive output return rows.reduce(function(output, row) { return output.concat(row.reduce(function(rowOut, line) { return rowOut.concat(line.join(options.columnSplitter)) }, [])) }, []).map(function(line) { return truncateString(line, maxLineWidth) }).join(options.spacing) } /** * Convert wrapped lines into rows with padded values. 
* * @param Array items data to process * @param Array columns column width settings for wrapping * @param Array columnNames column ordering * @return Array items wrapped in arrays, corresponding to lines */ function createRows(items, columns, columnNames, paddingChr) { return items.map(function(item) { var row = [] var numLines = 0 columnNames.forEach(function(columnName) { numLines = Math.max(numLines, item[columnName].length) }) // combine matching lines of each rows for (var i = 0; i < numLines; i++) { row[i] = row[i] || [] columnNames.forEach(function(columnName) { var column = columns[columnName] var val = item[columnName][i] || '' // || '' ensures empty columns get padded if (column.align === 'right') row[i].push(padLeft(val, column.width, paddingChr)) else if (column.align === 'center' || column.align === 'centre') row[i].push(padCenter(val, column.width, paddingChr)) else row[i].push(padRight(val, column.width, paddingChr)) }) } return row }) } /** * Generic source->target mixin. * Copy properties from `source` into `target` if target doesn't have them. * Destructive. Modifies `target`. * * @param target Object target for mixin properties. * @param source Object source of mixin properties. * @return Object `target` after mixin applied. */ function mixin(target, source) { source = source || {} for (var key in source) { if (target.hasOwnProperty(key)) continue target[key] = source[key] } return target } /** * Adapted from String.prototype.endsWith polyfill. */ function endsWith(target, searchString, position) { position = position || target.length; position = position - searchString.length; var lastIndex = target.lastIndexOf(searchString); return lastIndex !== -1 && lastIndex === position; } function toArray(items, columnNames) { if (Array.isArray(items)) return items var rows = [] for (var key in items) { var item = {} item[columnNames[0] || 'key'] = key item[columnNames[1] || 'value'] = items[key] rows.push(item) } return rows } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/LICENSE��������������������������000644 �000766 �000024 �00000002064 12455173731 026373� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The MIT License (MIT) Copyright (c) 2013 Tim Oxley Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or 
substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/��������������������000755 �000766 �000024 �00000000000 12456115117 030034� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/package.json���������������������000644 �000766 �000024 �00000002775 12455173731 027665� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "columnify", "version": "1.3.2", "description": "Render data in text columns. 
supports in-column text-wrap.", "main": "index.js", "scripts": { "pretest": "npm prune", "test": "tape test/*.js | tap-spec", "bench": "npm test && node bench" }, "author": { "name": "Tim Oxley" }, "license": "MIT", "devDependencies": { "chalk": "^0.5.1", "tap-spec": "^2.1.1", "tape": "^3.0.3" }, "repository": { "type": "git", "url": "git://github.com/timoxley/columnify.git" }, "keywords": [ "column", "text", "ansi", "console", "terminal", "wrap", "table" ], "bugs": { "url": "https://github.com/timoxley/columnify/issues" }, "homepage": "https://github.com/timoxley/columnify", "dependencies": { "strip-ansi": "^2.0.0", "wcwidth": "^1.0.0" }, "directories": { "test": "test" }, "gitHead": "5c7d4363a8d6178f0d415e8bdaf692281fe71975", "_id": "columnify@1.3.2", "_shasum": "61bd578a9269ae6fd949ce36fff589f3702c7867", "_from": "columnify@>=1.3.2 <1.4.0", "_npmVersion": "2.1.10", "_nodeVersion": "0.10.33", "_npmUser": { "name": "timoxley", "email": "secoif@gmail.com" }, "maintainers": [ { "name": "timoxley", "email": "secoif@gmail.com" } ], "dist": { "shasum": "61bd578a9269ae6fd949ce36fff589f3702c7867", "tarball": "http://registry.npmjs.org/columnify/-/columnify-1.3.2.tgz" }, "_resolved": "https://registry.npmjs.org/columnify/-/columnify-1.3.2.tgz" } ���iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/Readme.md������������������������000644 �000766 �000024 �00000017354 12455173731 027115� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# columnify [![Build Status](https://travis-ci.org/timoxley/columnify.png?branch=master)](https://travis-ci.org/timoxley/columnify) Create text-based columns suitable for console output from objects or arrays of objects. Columns are automatically resized to fit the content of the largest cell. Each cell will be padded with spaces to fill the available space and ensure column contents are left-aligned. Designed to [handle sensible wrapping in npm search results](https://github.com/isaacs/npm/pull/2328). `npm search` before & after integrating columnify: ![npm-tidy-search](https://f.cloud.github.com/assets/43438/1848959/ae02ad04-76a1-11e3-8255-4781debffc26.gif) ## Installation & Update ``` $ npm install --save columnify@latest ``` ## Usage ```javascript var columnify = require('columnify') var columns = columnify(data, options) console.log(columns) ``` ## Examples ### Columnify Objects Objects are converted to a list of key/value pairs: ```javascript var data = { "commander@0.6.1": 1, "minimatch@0.2.14": 3, "mkdirp@0.3.5": 2, "sigmund@1.0.0": 3 } console.log(columnify(data)) ``` #### Output: ``` KEY VALUE commander@0.6.1 1 minimatch@0.2.14 3 mkdirp@0.3.5 2 sigmund@1.0.0 3 ``` ### Custom Column Names ```javascript var data = { "commander@0.6.1": 1, "minimatch@0.2.14": 3, "mkdirp@0.3.5": 2, "sigmund@1.0.0": 3 } console.log(columnify(data, {columns: ['MODULE', 'COUNT']})) ``` #### Output: ``` MODULE COUNT commander@0.6.1 1 minimatch@0.2.14 3 mkdirp@0.3.5 2 sigmund@1.0.0 3 ``` ### Columnify Arrays of Objects Column headings are extracted from the keys in supplied objects. 
```javascript var columnify = require('columnify') var columns = columnify([{ name: 'mod1', version: '0.0.1' }, { name: 'module2', version: '0.2.0' }]) console.log(columns) ``` #### Output: ``` NAME VERSION mod1 0.0.1 module2 0.2.0 ``` ### Wrapping Column Cells You can define the maximum width before wrapping for individual cells in columns. Minimum width is also supported. Wrapping will happen at word boundaries. Empty cells or those which do not fill the max/min width will be padded with spaces. ```javascript var columns = columnify([{ name: 'mod1', description: 'some description which happens to be far larger than the max', version: '0.0.1', }, { name: 'module-two', description: 'another description larger than the max', version: '0.2.0', }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description which happens 0.0.1 to be far larger than the max module-two another description larger 0.2.0 than the max ``` ### Truncating Column Cells You can disable wrapping and instead truncate content at the maximum column width. Truncation respects word boundaries. A truncation marker, `…` will appear next to the last word in any truncated line. ```javascript var columns = columnify(data, { truncate: true, config: { description: { maxWidth: 20 } } }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description… 0.0.1 module-two another description… 0.2.0 ``` ### Filtering & Ordering Columns By default, all properties are converted into columns, whether or not they exist on every object or not. To explicitly specify which columns to include, and in which order, supply a "columns" or "include" array ("include" is just an alias). ```javascript var data = [{ name: 'module1', description: 'some description', version: '0.0.1', }, { name: 'module2', description: 'another description', version: '0.2.0', }] var columns = columnify(data, { columns: ['name', 'version'] // note description not included }) console.log(columns) ``` #### Output: ``` NAME VERSION module1 0.0.1 module2 0.2.0 ``` ## Other Configuration Options ### Align Right/Center ```js var data = { "mocha@1.18.2": 1, "commander@2.0.0": 1, "debug@0.8.1": 1 } columnify(data, {config: {value: {align: 'right'}}}) ``` #### Output: ``` KEY VALUE mocha@1.18.2 1 commander@2.0.0 1 debug@0.8.1 1 ``` Align Center works in a similar way. ### Padding ```js var data = { "shortKey": "veryVeryVeryVeryVeryLongVal", "veryVeryVeryVeryVeryLongKey": "shortVal" } columnify(data, { paddingChr: '.'}) ``` #### Output: ``` KEY........................ VALUE...................... shortKey................... veryVeryVeryVeryVeryLongVal veryVeryVeryVeryVeryLongKey shortVal................... ``` ### Preserve existing newlines By default, `columnify` sanitises text by replacing any occurance of 1 or more whitespace characters with a single space. `columnify` can be configured to respect existing new line characters using the `preserveNewLines` option. Note this will still collapse all other whitespace. 
```javascript var data = [{ name: "glob@3.2.9", paths: [ "node_modules/tap/node_modules/glob", "node_modules/tape/node_modules/glob" ].join('\n') }, { name: "nopt@2.2.1", paths: [ "node_modules/tap/node_modules/nopt" ] }, { name: "runforcover@0.0.2", paths: "node_modules/tap/node_modules/runforcover" }] console.log(columnify(data, {preserveNewLines: true})) ``` #### Output: ``` NAME PATHS glob@3.2.9 node_modules/tap/node_modules/glob node_modules/tape/node_modules/glob nopt@2.2.1 node_modules/tap/node_modules/nopt runforcover@0.0.2 node_modules/tap/node_modules/runforcover ``` Compare this with output without `preserveNewLines`: ```javascript console.log(columnify(data, {preserveNewLines: false})) // or just console.log(columnify(data)) ``` ``` NAME PATHS glob@3.2.9 node_modules/tap/node_modules/glob node_modules/tape/node_modules/glob nopt@2.2.1 node_modules/tap/node_modules/nopt runforcover@0.0.2 node_modules/tap/node_modules/runforcover ``` ### Custom Truncation Marker You can change the truncation marker to something other than the default `…`. ```javascript var columns = columnify(data, { truncate: true, truncateMarker: '>', widths: { description: { maxWidth: 20 } } }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description> 0.0.1 module-two another description> 0.2.0 ``` ### Custom Column Splitter If your columns need some bling, you can split columns with custom characters. ```javascript var columns = columnify(data, { columnSplitter: ' | ' }) console.log(columns) ``` #### Output: ``` NAME | DESCRIPTION | VERSION mod1 | some description which happens to be far larger than the max | 0.0.1 module-two | another description larger than the max | 0.2.0 ``` ## Multibyte Character Support `columnify` uses [mycoboco/wcwidth.js](https://github.com/mycoboco/wcwidth.js) to calculate length of multibyte characters: ```javascript var data = [{ name: 'module-one', description: 'some description', version: '0.0.1', }, { name: '这是一个很长的名字的模块', description: '这真的是一个描述的内容这个描述很长', version: "0.3.3" }] console.log(columnify(data)) ``` #### Without multibyte handling: i.e. before columnify added this feature ``` NAME DESCRIPTION VERSION module-one some description 0.0.1 这是一个很长的名字的模块 这真的是一个描述的内容这个描述很长 0.3.3 ``` #### With multibyte handling: ``` NAME DESCRIPTION VERSION module-one some description 0.0.1 这是一个很长的名字的模块 这真的是一个描述的内容这个描述很长 0.3.3 ``` ## License MIT ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/utils.js�������������������������000644 �000766 �000024 �00000011224 12455173731 027062� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict" var wcwidth = require('./width') /** * repeat string `str` up to total length of `len` * * @param String str string to repeat * @param Number len total length of output string */ function repeatString(str, len) { return Array.apply(null, {length: len + 1}).join(str).slice(0, len) } /** * Pad `str` up to total length `max` with `chr`. 
* If `str` is longer than `max`, padRight will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padRight(str, max, chr) { str = str != null ? str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str return str + repeatString(chr || ' ', length) } /** * Pad `str` up to total length `max` with `chr`. * If `str` is longer than `max`, padCenter will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padCenter(str, max, chr) { str = str != null ? str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str var lengthLeft = Math.floor(length/2) var lengthRight = length - lengthLeft return repeatString(chr || ' ', lengthLeft) + str + repeatString(chr || ' ', lengthRight) } /** * Pad `str` up to total length `max` with `chr`, on the left. * If `str` is longer than `max`, padRight will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padLeft(str, max, chr) { str = str != null ? str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str return repeatString(chr || ' ', length) + str } /** * Split a String `str` into lines of maxiumum length `max`. * Splits on word boundaries. Preserves existing new lines. * * @param String str string to split * @param Number max length of each line * @return Array Array containing lines. */ function splitIntoLines(str, max) { function _splitIntoLines(str, max) { return str.trim().split(' ').reduce(function(lines, word) { var line = lines[lines.length - 1] if (line && wcwidth(line.join(' ')) + wcwidth(word) < max) { lines[lines.length - 1].push(word) // add to line } else lines.push([word]) // new line return lines }, []).map(function(l) { return l.join(' ') }) } return str.split('\n').map(function(str) { return _splitIntoLines(str, max) }).reduce(function(lines, line) { return lines.concat(line) }, []) } /** * Add spaces and `truncationChar` between words of * `str` which are longer than `max`. * * @param String str string to split * @param Number max length of each line * @param Number truncationChar character to append to split words * @return String */ function splitLongWords(str, max, truncationChar, result) { str = str.trim() result = result || [] if (!str) return result.join(' ') || '' var words = str.split(' ') var word = words.shift() || str if (wcwidth(word) > max) { // slice is based on length no wcwidth var i = 0 var wwidth = 0 var limit = max - wcwidth(truncationChar) while (i < word.length) { var w = wcwidth(word.charAt(i)) if(w + wwidth > limit) break wwidth += w ++i } var remainder = word.slice(i) // get remainder words.unshift(remainder) // save remainder for next loop word = word.slice(0, i) // grab truncated word word += truncationChar // add trailing … or whatever } result.push(word) return splitLongWords(words.join(' '), max, truncationChar, result) } /** * Truncate `str` into total width `max` * If `str` is shorter than `max`, will return `str` unaltered. 
* * @param String str string to truncated * @param Number max total wcwidth of output string * @return String truncated str */ function truncateString(str, max) { str = str != null ? str : '' str = String(str) if(max == Infinity) return str var i = 0 var wwidth = 0 while (i < str.length) { var w = wcwidth(str.charAt(i)) if(w + wwidth > max) break wwidth += w ++i } return str.slice(0, i) } /** * Exports */ module.exports.padRight = padRight module.exports.padCenter = padCenter module.exports.padLeft = padLeft module.exports.splitIntoLines = splitIntoLines module.exports.splitLongWords = splitLongWords module.exports.truncateString = truncateString ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/width.js�������������������������000644 �000766 �000024 �00000000214 12455173731 027036� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var stripAnsi = require('strip-ansi') var wcwidth = require('wcwidth') module.exports = function(str) { return wcwidth(stripAnsi(str)) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/���������000755 �000766 �000024 �00000000000 12456115117 032125� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/������������000755 �000766 �000024 �00000000000 12456115117 031505� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/.npmignore��000644 �000766 �000024 �00000000015 12455173731 033505� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/combining.js000644 �000766 �000024 �00000006006 12455173731 034017� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = [ [ 0x0300, 0x036F ], [ 0x0483, 0x0486 ], [ 0x0488, 0x0489 ], [ 0x0591, 0x05BD ], [ 0x05BF, 0x05BF ], [ 0x05C1, 0x05C2 ], [ 0x05C4, 0x05C5 ], [ 0x05C7, 0x05C7 ], [ 0x0600, 0x0603 ], [ 0x0610, 0x0615 ], [ 0x064B, 0x065E ], [ 0x0670, 0x0670 ], [ 0x06D6, 0x06E4 ], [ 0x06E7, 0x06E8 ], [ 0x06EA, 0x06ED ], [ 0x070F, 0x070F ], [ 0x0711, 0x0711 ], [ 0x0730, 0x074A ], [ 0x07A6, 0x07B0 ], [ 0x07EB, 0x07F3 ], [ 0x0901, 0x0902 ], [ 0x093C, 0x093C ], [ 0x0941, 0x0948 ], [ 0x094D, 0x094D ], [ 0x0951, 0x0954 ], [ 0x0962, 0x0963 ], [ 0x0981, 0x0981 ], [ 0x09BC, 0x09BC ], [ 0x09C1, 0x09C4 ], [ 0x09CD, 0x09CD ], [ 0x09E2, 0x09E3 ], [ 0x0A01, 0x0A02 ], [ 0x0A3C, 0x0A3C ], [ 0x0A41, 0x0A42 ], [ 0x0A47, 0x0A48 ], [ 0x0A4B, 0x0A4D ], [ 0x0A70, 0x0A71 ], [ 0x0A81, 0x0A82 ], [ 0x0ABC, 0x0ABC ], [ 0x0AC1, 0x0AC5 ], [ 0x0AC7, 0x0AC8 ], [ 0x0ACD, 0x0ACD ], [ 0x0AE2, 0x0AE3 ], [ 0x0B01, 0x0B01 ], [ 0x0B3C, 0x0B3C ], [ 0x0B3F, 0x0B3F ], [ 0x0B41, 0x0B43 ], [ 0x0B4D, 0x0B4D ], [ 0x0B56, 0x0B56 ], [ 0x0B82, 0x0B82 ], [ 0x0BC0, 0x0BC0 ], [ 0x0BCD, 0x0BCD ], [ 0x0C3E, 0x0C40 ], [ 0x0C46, 0x0C48 ], [ 0x0C4A, 0x0C4D ], [ 0x0C55, 0x0C56 ], [ 0x0CBC, 0x0CBC ], [ 0x0CBF, 0x0CBF ], [ 0x0CC6, 0x0CC6 ], [ 0x0CCC, 0x0CCD ], [ 0x0CE2, 0x0CE3 ], [ 0x0D41, 0x0D43 ], [ 0x0D4D, 0x0D4D ], [ 0x0DCA, 0x0DCA ], [ 0x0DD2, 0x0DD4 ], [ 0x0DD6, 0x0DD6 ], [ 0x0E31, 0x0E31 ], [ 0x0E34, 0x0E3A ], [ 0x0E47, 0x0E4E ], [ 0x0EB1, 0x0EB1 ], [ 0x0EB4, 0x0EB9 ], [ 0x0EBB, 0x0EBC ], [ 0x0EC8, 0x0ECD ], [ 0x0F18, 0x0F19 ], [ 0x0F35, 0x0F35 ], [ 0x0F37, 0x0F37 ], [ 0x0F39, 0x0F39 ], [ 0x0F71, 0x0F7E ], [ 0x0F80, 0x0F84 ], [ 0x0F86, 0x0F87 ], [ 0x0F90, 0x0F97 ], [ 0x0F99, 0x0FBC ], [ 0x0FC6, 0x0FC6 ], [ 0x102D, 0x1030 ], [ 0x1032, 0x1032 ], [ 0x1036, 0x1037 ], [ 0x1039, 0x1039 ], [ 0x1058, 0x1059 ], [ 0x1160, 0x11FF ], [ 0x135F, 0x135F ], [ 0x1712, 0x1714 ], [ 0x1732, 0x1734 ], [ 0x1752, 0x1753 ], [ 0x1772, 0x1773 ], [ 0x17B4, 0x17B5 ], [ 0x17B7, 0x17BD ], [ 0x17C6, 0x17C6 ], [ 0x17C9, 0x17D3 ], [ 0x17DD, 0x17DD ], [ 0x180B, 0x180D ], [ 0x18A9, 0x18A9 ], [ 0x1920, 0x1922 ], [ 0x1927, 0x1928 ], [ 0x1932, 0x1932 ], [ 0x1939, 0x193B ], [ 0x1A17, 0x1A18 ], [ 0x1B00, 0x1B03 ], [ 0x1B34, 0x1B34 ], [ 0x1B36, 0x1B3A ], [ 0x1B3C, 0x1B3C ], [ 0x1B42, 0x1B42 ], [ 0x1B6B, 0x1B73 ], [ 0x1DC0, 0x1DCA ], [ 0x1DFE, 0x1DFF ], [ 0x200B, 0x200F ], [ 0x202A, 0x202E ], [ 0x2060, 0x2063 ], [ 
0x206A, 0x206F ], [ 0x20D0, 0x20EF ], [ 0x302A, 0x302F ], [ 0x3099, 0x309A ], [ 0xA806, 0xA806 ], [ 0xA80B, 0xA80B ], [ 0xA825, 0xA826 ], [ 0xFB1E, 0xFB1E ], [ 0xFE00, 0xFE0F ], [ 0xFE20, 0xFE23 ], [ 0xFEFF, 0xFEFF ], [ 0xFFF9, 0xFFFB ], [ 0x10A01, 0x10A03 ], [ 0x10A05, 0x10A06 ], [ 0x10A0C, 0x10A0F ], [ 0x10A38, 0x10A3A ], [ 0x10A3F, 0x10A3F ], [ 0x1D167, 0x1D169 ], [ 0x1D173, 0x1D182 ], [ 0x1D185, 0x1D18B ], [ 0x1D1AA, 0x1D1AD ], [ 0x1D242, 0x1D244 ], [ 0xE0001, 0xE0001 ], [ 0xE0020, 0xE007F ], [ 0xE0100, 0xE01EF ] ] ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/docs/�������000755 �000766 �000024 �00000000000 12456115117 032435� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/index.js����000644 �000766 �000024 �00000006105 12455173731 033161� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������"use strict" var defaults = require('defaults') var combining = require('./combining') var DEFAULTS = { nul: 0, control: 0 } module.exports = function wcwidth(str) { return wcswidth(str, DEFAULTS) } module.exports.config = function(opts) { opts = defaults(opts || {}, DEFAULTS) return function wcwidth(str) { return wcswidth(str, opts) } } /* * The following functions define the column width of an ISO 10646 * character as follows: * - The null character (U+0000) has a column width of 0. * - Other C0/C1 control characters and DEL will lead to a return value * of -1. * - Non-spacing and enclosing combining characters (general category * code Mn or Me in the * Unicode database) have a column width of 0. * - SOFT HYPHEN (U+00AD) has a column width of 1. * - Other format characters (general category code Cf in the Unicode * database) and ZERO WIDTH * SPACE (U+200B) have a column width of 0. * - Hangul Jamo medial vowels and final consonants (U+1160-U+11FF) * have a column width of 0. * - Spacing characters in the East Asian Wide (W) or East Asian * Full-width (F) category as * defined in Unicode Technical Report #11 have a column width of 2. * - All remaining characters (including all printable ISO 8859-1 and * WGL4 characters, Unicode control characters, etc.) have a column * width of 1. * This implementation assumes that characters are encoded in ISO 10646. 
*/ function wcswidth(str, opts) { if (typeof str !== 'string') return wcwidth(str, opts) var s = 0 for (var i = 0; i < str.length; i++) { var n = wcwidth(str.charCodeAt(i), opts) if (n < 0) return -1 s += n } return s } function wcwidth(ucs, opts) { // test for 8-bit control characters if (ucs === 0) return opts.nul if (ucs < 32 || (ucs >= 0x7f && ucs < 0xa0)) return opts.control // binary search in table of non-spacing characters if (bisearch(ucs)) return 0 // if we arrive here, ucs is not a combining or C0/C1 control character return 1 + (ucs >= 0x1100 && (ucs <= 0x115f || // Hangul Jamo init. consonants ucs == 0x2329 || ucs == 0x232a || (ucs >= 0x2e80 && ucs <= 0xa4cf && ucs != 0x303f) || // CJK ... Yi (ucs >= 0xac00 && ucs <= 0xd7a3) || // Hangul Syllables (ucs >= 0xf900 && ucs <= 0xfaff) || // CJK Compatibility Ideographs (ucs >= 0xfe10 && ucs <= 0xfe19) || // Vertical forms (ucs >= 0xfe30 && ucs <= 0xfe6f) || // CJK Compatibility Forms (ucs >= 0xff00 && ucs <= 0xff60) || // Fullwidth Forms (ucs >= 0xffe0 && ucs <= 0xffe6) || (ucs >= 0x20000 && ucs <= 0x2fffd) || (ucs >= 0x30000 && ucs <= 0x3fffd))); } function bisearch(ucs) { var min = 0 var max = combining.length - 1 var mid if (ucs < combining[0][0] || ucs > combining[max][1]) return false while (max >= min) { mid = Math.floor((min + max) / 2) if (ucs > combining[mid][1]) min = mid + 1 else if (ucs < combining[mid][0]) max = mid - 1 else return true } return false } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/LICENSE�����000644 �000766 �000024 �00000003055 12455173731 032522� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������wcwidth.js: JavaScript Portng of Markus Kuhn's wcwidth() Implementation ======================================================================= Copyright (C) 2012 by Jun Woong. This package is a JavaScript porting of `wcwidth()` implementation [by Markus Kuhn](http://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c). Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THIS SOFTWARE IS PROVIDED ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/����������������������000755 �000766 �000024 �00000000000 12456115117 034103� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/package.json000644 �000766 �000024 �00000002407 12455173731 034003� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "wcwidth", "version": "1.0.0", "description": "Port of C's wcwidth() and wcswidth()", "author": { "name": "Tim Oxley" }, "contributors": [ { "name": "Woong Jun", "email": "woong.jun@gmail.com", "url": "http://code.woong.org/" } ], "main": "index.js", "dependencies": { "defaults": "^1.0.0" }, "devDependencies": { "tape": "^2.13.4" }, "license": "MIT", "keywords": [ "wide character", "wc", "wide character string", "wcs", "terminal", "width", "wcwidth", "wcswidth" ], "directories": { "doc": "docs", "test": "test" }, "scripts": { "test": "tape test/*.js" }, "gitHead": "5bc3aafd45c89f233c27b9479c18a23ca91ba660", "_id": "wcwidth@1.0.0", "_shasum": "02d059ff7a8fc741e0f6b5da1e69b2b40daeca6f", "_from": "wcwidth@>=1.0.0 <2.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "timoxley", "email": "secoif@gmail.com" }, "maintainers": [ { "name": "timoxley", "email": "secoif@gmail.com" } ], "dist": { "shasum": "02d059ff7a8fc741e0f6b5da1e69b2b40daeca6f", "tarball": "http://registry.npmjs.org/wcwidth/-/wcwidth-1.0.0.tgz" }, "_resolved": "https://registry.npmjs.org/wcwidth/-/wcwidth-1.0.0.tgz" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/Readme.md���000644 �000766 �000024 �00000001567 12455173731 033242� 
# wcwidth

Determine columns needed for a fixed-size wide-character string

----

wcwidth is a simple JavaScript port of [wcwidth](http://man7.org/linux/man-pages/man3/wcswidth.3.html) implemented in C by Markus Kuhn.

JavaScript port [originally](https://github.com/mycoboco/wcwidth.js) written by Woong Jun <woong.jun@gmail.com> (http://code.woong.org/)

## Example

```js
'한'.length    // => 1
wcwidth('한'); // => 2

'한글'.length    // => 2
wcwidth('한글'); // => 4
```

`wcwidth()` and its string version, `wcswidth()`, are defined by IEEE Std 1003.1-2001, a.k.a. POSIX.1-2001, and return the number of columns used to represent the given wide character and string. Markus's implementation assumes the wide character given to those functions to be encoded in ISO 10646, which is almost true for JavaScript's characters.

[Further explanation here](docs)

## License

MIT

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/.npmignore

node_modules

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/index.js

var clone = require('clone'); module.exports = function(options, defaults) { options = options || {}; Object.keys(defaults).forEach(function(key) { if (typeof
options[key] === 'undefined') { options[key] = clone(defaults[key]); } }); return options; };�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/000755 �000766 �000024 �00000000000 12456115117 040367� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/package.json�000644 �000766 �000024 �00000004101 12455173731 040201� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "defaults", "version": "1.0.0", "description": "merge single level defaults over a config object", "main": "index.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git://github.com/tmpvar/defaults.git" }, "keywords": [ "config", "defaults" ], "author": { "name": "Elijah Insua", "email": "tmpvar@gmail.com" }, "license": "MIT", "readmeFilename": "README.md", "dependencies": { "clone": "~0.1.5" }, "devDependencies": { "tap": "~0.4.0" }, "readme": "# defaults\n\nA simple one level options merge utility\n\n## install\n\n`npm install defaults`\n\n## use\n\n```javascript\n\nvar defaults = require('defaults');\n\nvar handle = function(options, fn) {\n options = defaults(options, {\n timeout: 100\n });\n\n setTimeout(function() {\n fn(options);\n }, options.timeout);\n}\n\nhandle({ timeout: 1000 }, function() {\n // we're here 1000 ms later\n});\n\nhandle({ timeout: 10000 }, function() {\n // we're here 10s later\n});\n\n```\n\n## summary\n\nthis module exports a function that takes 2 arguments: `options` and `defaults`. 
When called, it overrides all of `undefined` properties in `options` with the clones of properties defined in `defaults`\n\nSidecases: if called with a falsy `options` value, options will be initialized to a new object before being merged onto.\n\n## license\n\nMIT", "_id": "defaults@1.0.0", "dist": { "shasum": "3ae25f44416c6c01f9809a25fcdd285912d2a6b1", "tarball": "http://registry.npmjs.org/defaults/-/defaults-1.0.0.tgz" }, "_npmVersion": "1.1.65", "_npmUser": { "name": "tmpvar", "email": "tmpvar@gmail.com" }, "maintainers": [ { "name": "tmpvar", "email": "tmpvar@gmail.com" } ], "directories": {}, "_shasum": "3ae25f44416c6c01f9809a25fcdd285912d2a6b1", "_resolved": "https://registry.npmjs.org/defaults/-/defaults-1.0.0.tgz", "_from": "defaults@>=1.0.0 <2.0.0", "bugs": { "url": "https://github.com/tmpvar/defaults/issues" }, "homepage": "https://github.com/tmpvar/defaults" } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/README.md����000644 �000766 �000024 �00000001457 12455173731 037205� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������# defaults A simple one level options merge utility ## install `npm install defaults` ## use ```javascript var defaults = require('defaults'); var handle = function(options, fn) { options = defaults(options, { timeout: 100 }); setTimeout(function() { fn(options); }, options.timeout); } handle({ timeout: 1000 }, function() { // we're here 1000 ms later }); handle({ timeout: 10000 }, function() { // we're here 10s later }); ``` ## summary this module exports a function that takes 2 arguments: `options` and `defaults`. When called, it overrides all of `undefined` properties in `options` with the clones of properties defined in `defaults` Sidecases: if called with a falsy `options` value, options will be initialized to a new object before being merged onto. 
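To make the two behaviours above concrete, here is a small illustrative sketch; the `port` and `retries` option names are invented for the example:

```javascript
var defaults = require('defaults');

// keys already set in options win; missing keys are filled with clones of the defaults
defaults({ port: 8080 }, { port: 80, retries: 3 });
// => { port: 8080, retries: 3 }

// a falsy options value is replaced by a fresh object before merging
defaults(null, { retries: 3 });
// => { retries: 3 }
```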
## license

MIT

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/test.js

var defaults = require('./'),
    test = require('tap').test;

test("ensure options is an object", function(t) {
  var options = defaults(false, { a : true });
  t.ok(options.a);
  t.end()
});

test("ensure defaults override keys", function(t) {
  var result = defaults({}, { a: false, b: true });
  t.ok(result.b, 'b merges over undefined');
  t.equal(result.a, false, 'a merges over undefined');
  t.end();
});

test("ensure defined keys are not overwritten", function(t) {
  var result = defaults({ b: false }, { a: false, b: true });
  t.equal(result.b, false, 'b not merged');
  t.equal(result.a, false, 'a merges over undefined');
  t.end();
});

test("ensure defaults clone nested objects", function(t) {
  var d = { a: [1,2,3], b: { hello : 'world' } };
  var result = defaults({}, d);
  t.equal(result.a.length, 3, 'objects should be clones');
  t.ok(result.a !== d.a, 'objects should be clones');
  t.equal(Object.keys(result.b).length, 1, 'objects should be clones');
  t.ok(result.b !== d.b, 'objects should be clones');
  t.end();
});

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/.npmignore

node_modules/
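The `clone` dependency bundled below is what lets `defaults` hand each caller its own copy of nested default values rather than a shared reference. A minimal sketch of that interaction, with invented variable names:

```js
var defaults = require('defaults');

var shared = { headers: { accept: 'application/json' } };
var a = defaults({}, shared);
var b = defaults({}, shared);

a.headers.accept = 'text/html';
console.log(b.headers.accept); // 'application/json', because nested defaults are cloned, not referenced
```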
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/.travis.yml000644 �000766 �000024 �00000000064 12455173731 043605� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - 0.6 - 0.8 - 0.10 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/clone.js���000644 �000766 �000024 �00000007603 12455173731 043140� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������'use strict'; function objectToString(o) { return Object.prototype.toString.call(o); } // shim for Node's 'util' package // DO NOT REMOVE THIS! It is required for compatibility with EnderJS (http://enderjs.com/). var util = { isArray: function (ar) { return Array.isArray(ar) || (typeof ar === 'object' && objectToString(ar) === '[object Array]'); }, isDate: function (d) { return typeof d === 'object' && objectToString(d) === '[object Date]'; }, isRegExp: function (re) { return typeof re === 'object' && objectToString(re) === '[object RegExp]'; }, getRegExpFlags: function (re) { var flags = ''; re.global && (flags += 'g'); re.ignoreCase && (flags += 'i'); re.multiline && (flags += 'm'); return flags; } }; if (typeof module === 'object') module.exports = clone; /** * Clones (copies) an Object using deep copying. * * This function supports circular references by default, but if you are certain * there are no circular references in your object, you can save some CPU time * by calling clone(obj, false). * * Caution: if `circular` is false and `parent` contains circular references, * your program may enter an infinite loop and crash. * * @param `parent` - the object to be cloned * @param `circular` - set to true if the object to be cloned may contain * circular references. (optional - true by default) * @param `depth` - set to a number if the object is only to be cloned to * a particular depth. (optional - defaults to Infinity) * @param `prototype` - sets the prototype to be used when cloning an object. 
* (optional - defaults to parent prototype). */ function clone(parent, circular, depth, prototype) { // maintain two arrays for circular references, where corresponding parents // and children have the same index var allParents = []; var allChildren = []; var useBuffer = typeof Buffer != 'undefined'; if (typeof circular == 'undefined') circular = true; if (typeof depth == 'undefined') depth = Infinity; // recurse this function so we don't reset allParents and allChildren function _clone(parent, depth) { // cloning null always returns null if (parent === null) return null; if (depth == 0) return parent; var child; var proto; if (typeof parent != 'object') { return parent; } if (util.isArray(parent)) { child = []; } else if (util.isRegExp(parent)) { child = new RegExp(parent.source, util.getRegExpFlags(parent)); if (parent.lastIndex) child.lastIndex = parent.lastIndex; } else if (util.isDate(parent)) { child = new Date(parent.getTime()); } else if (useBuffer && Buffer.isBuffer(parent)) { child = new Buffer(parent.length); parent.copy(child); return child; } else { if (typeof prototype == 'undefined') { proto = Object.getPrototypeOf(parent); child = Object.create(proto); } else { child = Object.create(prototype); proto = prototype; } } if (circular) { var index = allParents.indexOf(parent); if (index != -1) { return allChildren[index]; } allParents.push(parent); allChildren.push(child); } for (var i in parent) { var attrs; if (proto) { attrs = Object.getOwnPropertyDescriptor(proto, i); } if (attrs && attrs.set == null) { continue; } child[i] = _clone(parent[i], depth - 1); } return child; } return _clone(parent, depth); } /** * Simple flat clone using prototype, accepts only objects, usefull for property * override on FLAT configuration object (no nested props). * * USE WITH CAUTION! This may not behave as you wish if you do not know how this * works. */ clone.clonePrototype = function(parent) { if (parent === null) return null; var c = function () {}; c.prototype = parent; return new c(); }; �����������������������������������������������������������������������������������������������������������������������������npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/LICENSE����000644 �000766 �000024 �00000002056 12455173731 042504� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������Copyright © 2011-2014 Paul Vorbach <paul@vorba.ch> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/package.json���000644 �000766 �000024 �00000005531 12455173731 043766� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules/npm����������������������������������������������������������������������������������������������������������������������������{ "name": "clone", "description": "deep cloning of objects and arrays", "tags": [ "clone", "object", "array", "function", "date" ], "version": "0.1.19", "repository": { "type": "git", "url": "git://github.com/pvorb/node-clone.git" }, "bugs": { "url": "https://github.com/pvorb/node-clone/issues" }, "main": "clone.js", "author": { "name": "Paul Vorbach", "email": "paul@vorba.ch", "url": "http://paul.vorba.ch/" }, "contributors": [ { "name": "Blake Miner", "email": "miner.blake@gmail.com", "url": "http://www.blakeminer.com/" }, { "name": "Tian You", "email": "axqd001@gmail.com", "url": "http://blog.axqd.net/" }, { "name": "George Stagas", "email": "gstagas@gmail.com", "url": "http://stagas.com/" }, { "name": "Tobiasz Cudnik", "email": "tobiasz.cudnik@gmail.com", "url": "https://github.com/TobiaszCudnik" }, { "name": "Pavel Lang", "email": "langpavel@phpskelet.org", "url": "https://github.com/langpavel" }, { "name": "Dan MacTough", "url": "http://yabfog.com/" }, { "name": "w1nk", "url": "https://github.com/w1nk" }, { "name": "Hugh Kennedy", "url": "http://twitter.com/hughskennedy" }, { "name": "Dustin Diaz", "url": "http://dustindiaz.com" }, { "name": "Ilya Shaisultanov", "url": "https://github.com/diversario" }, { "name": "Nathan MacInnes", "email": "nathan@macinn.es", "url": "http://macinn.es/" }, { "name": "Benjamin E. 
Coe", "email": "ben@npmjs.com", "url": "https://twitter.com/benjamincoe" }, { "name": "Nathan Zadoks", "url": "https://github.com/nathan7" }, { "name": "Róbert Oroszi", "email": "robert+gh@oroszi.net", "url": "https://github.com/oroce" } ], "license": "MIT", "engines": { "node": "*" }, "dependencies": {}, "devDependencies": { "underscore": "*", "nodeunit": "*" }, "optionalDependencies": {}, "scripts": { "test": "nodeunit test.js" }, "gitHead": "bb11a43363a0f69e8ac014cb5376ce215ea1f8fd", "homepage": "https://github.com/pvorb/node-clone", "_id": "clone@0.1.19", "_shasum": "613fb68639b26a494ac53253e15b1a6bd88ada85", "_from": "clone@>=0.1.5 <0.2.0", "_npmVersion": "1.4.14", "_npmUser": { "name": "pvorb", "email": "paul@vorba.ch" }, "maintainers": [ { "name": "pvorb", "email": "paul@vorb.de" } ], "dist": { "shasum": "613fb68639b26a494ac53253e15b1a6bd88ada85", "tarball": "http://registry.npmjs.org/clone/-/clone-0.1.19.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/clone/-/clone-0.1.19.tgz", "readme": "ERROR: No README data found!" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/README.md��000644 �000766 �000024 �00000006474 12455173731 042766� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������# clone [![build status](https://secure.travis-ci.org/pvorb/node-clone.png)](http://travis-ci.org/pvorb/node-clone) offers foolproof _deep cloning_ of variables in JavaScript. ## Installation npm install clone or ender build clone ## Example ~~~ javascript var clone = require('clone'); var a, b; a = { foo: { bar: 'baz' } }; // initial value of a b = clone(a); // clone a -> b a.foo.bar = 'foo'; // change a console.log(a); // show a console.log(b); // show b ~~~ This will print: ~~~ javascript { foo: { bar: 'foo' } } { foo: { bar: 'baz' } } ~~~ **clone** masters cloning simple objects (even with custom prototype), arrays, Date objects, and RegExp objects. Everything is cloned recursively, so that you can clone dates in arrays in objects, for example. ## API `clone(val, circular, depth)` * `val` -- the value that you want to clone, any type allowed * `circular` -- boolean Call `clone` with `circular` set to `false` if you are certain that `obj` contains no circular references. This will give better performance if needed. There is no error if `undefined` or `null` is passed as `obj`. * `depth` -- depth to which the object is to be cloned (optional, defaults to infinity) `clone.clonePrototype(obj)` * `obj` -- the object that you want to clone Does a prototype clone as [described by Oran Looney](http://oranlooney.com/functional-javascript/). ## Circular References ~~~ javascript var a, b; a = { hello: 'world' }; a.myself = a; b = clone(a); console.log(b); ~~~ This will print: ~~~ javascript { hello: "world", myself: [Circular] } ~~~ So, `b.myself` points to `b`, not `a`. Neat! ## Test npm test ## Caveat Some special objects like a socket or `process.stdout`/`stderr` are known to not be cloneable. 
If you find other objects that cannot be cloned, please [open an issue](https://github.com/pvorb/node-clone/issues/new). ## Bugs and Issues If you encounter any bugs or issues, feel free to [open an issue at github](https://github.com/pvorb/node-clone/issues) or send me an email to <paul@vorba.ch>. I also always like to hear from you, if you’re using my code. ## License Copyright © 2011-2014 [Paul Vorbach](http://paul.vorba.ch/) and [contributors](https://github.com/pvorb/node-clone/graphs/contributors). Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/test.js����000644 �000766 �000024 �00000013233 12455173731 043013� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib/node_modules��������������������������������������������������������������������������������������������������������������������������������if(module.parent === null) { console.log('Run this test file with nodeunit:'); console.log('$ nodeunit test.js'); } var clone = require('./'); var util = require('util'); var _ = require('underscore'); exports["clone string"] = function(test) { test.expect(2); // how many tests? var a = "foo"; test.strictEqual(clone(a), a); a = ""; test.strictEqual(clone(a), a); test.done(); }; exports["clone number"] = function(test) { test.expect(5); // how many tests? var a = 0; test.strictEqual(clone(a), a); a = 1; test.strictEqual(clone(a), a); a = -1000; test.strictEqual(clone(a), a); a = 3.1415927; test.strictEqual(clone(a), a); a = -3.1415927; test.strictEqual(clone(a), a); test.done(); }; exports["clone date"] = function(test) { test.expect(3); // how many tests? var a = new Date; var c = clone(a); test.ok(a instanceof Date); test.ok(c instanceof Date); test.equal(c.getTime(), a.getTime()); test.done(); }; exports["clone object"] = function(test) { test.expect(2); // how many tests? var a = { foo: { bar: "baz" } }; var b = clone(a); test.ok(_(a).isEqual(b), "underscore equal"); test.deepEqual(b, a); test.done(); }; exports["clone array"] = function(test) { test.expect(2); // how many tests? 
var a = [ { foo: "bar" }, "baz" ]; var b = clone(a); test.ok(_(a).isEqual(b), "underscore equal"); test.deepEqual(b, a); test.done(); }; exports["clone buffer"] = function(test) { test.expect(1); var a = new Buffer("this is a test buffer"); var b = clone(a); // no underscore equal since it has no concept of Buffers test.deepEqual(b, a); test.done(); }; exports["clone regexp"] = function(test) { test.expect(5); var a = /abc123/gi; var b = clone(a); test.deepEqual(b, a); var c = /a/g; test.ok(c.lastIndex === 0); c.exec('123a456a'); test.ok(c.lastIndex === 4); var d = clone(c); test.ok(d.global); test.ok(d.lastIndex === 4); test.done(); }; exports["clone object containing array"] = function(test) { test.expect(2); // how many tests? var a = { arr1: [ { a: '1234', b: '2345' } ], arr2: [ { c: '345', d: '456' } ] }; var b = clone(a); test.ok(_(a).isEqual(b), "underscore equal"); test.deepEqual(b, a); test.done(); }; exports["clone object with circular reference"] = function(test) { test.expect(8); // how many tests? var _ = test.ok; var c = [1, "foo", {'hello': 'bar'}, function() {}, false, [2]]; var b = [c, 2, 3, 4]; var a = {'b': b, 'c': c}; a.loop = a; a.loop2 = a; c.loop = c; c.aloop = a; var aCopy = clone(a); _(a != aCopy); _(a.c != aCopy.c); _(aCopy.c == aCopy.b[0]); _(aCopy.c.loop.loop.aloop == aCopy); _(aCopy.c[0] == a.c[0]); //console.log(util.inspect(aCopy, true, null) ); //console.log("------------------------------------------------------------"); //console.log(util.inspect(a, true, null) ); _(eq(a, aCopy)); aCopy.c[0] = 2; _(!eq(a, aCopy)); aCopy.c = "2"; _(!eq(a, aCopy)); //console.log("------------------------------------------------------------"); //console.log(util.inspect(aCopy, true, null) ); function eq(x, y) { return util.inspect(x, true, null) === util.inspect(y, true, null); } test.done(); }; exports['clonePrototype'] = function(test) { test.expect(3); // how many tests? 
var a = { a: "aaa", x: 123, y: 45.65 }; var b = clone.clonePrototype(a); test.strictEqual(b.a, a.a); test.strictEqual(b.x, a.x); test.strictEqual(b.y, a.y); test.done(); } exports['cloneWithinNewVMContext'] = function(test) { test.expect(3); var vm = require('vm'); var ctx = vm.createContext({ clone: clone }); var script = "clone( {array: [1, 2, 3], date: new Date(), regex: /^foo$/ig} );"; var results = vm.runInContext(script, ctx); test.ok(results.array instanceof Array); test.ok(results.date instanceof Date); test.ok(results.regex instanceof RegExp); test.done(); } exports['cloneObjectWithNoConstructor'] = function(test) { test.expect(3); var n = null; var a = { foo: 'bar' }; a.__proto__ = n; test.ok(typeof a === 'object'); test.ok(typeof a !== null); var b = clone(a); test.ok(a.foo, b.foo); test.done(); } exports['clone object with depth argument'] = function (test) { test.expect(6); var a = { foo: { bar : { baz : 'qux' } } }; var b = clone(a, false, 1); test.deepEqual(b, a); test.notEqual(b, a); test.strictEqual(b.foo, a.foo); b = clone(a, true, 2); test.deepEqual(b, a); test.notEqual(b.foo, a.foo); test.strictEqual(b.foo.bar, a.foo.bar); test.done(); } exports['maintain prototype chain in clones'] = function (test) { test.expect(1); function Constructor() {} var a = new Constructor(); var b = clone(a); test.strictEqual(Object.getPrototypeOf(a), Object.getPrototypeOf(b)); test.done(); } exports['parent prototype is overriden with prototype provided'] = function (test) { test.expect(1); function Constructor() {} var a = new Constructor(); var b = clone(a, true, Infinity, null); test.strictEqual(b.__defineSetter__, undefined); test.done(); } exports['clone object with null children'] = function(test) { test.expect(1); var a = { foo: { bar: null, baz: { qux: false } } }; var b = clone(a); test.deepEqual(b, a); test.done(); } exports['clone instance with getter'] = function(test) { test.expect(1); function Ctor() {}; Object.defineProperty(Ctor.prototype, 'prop', { configurable: true, enumerable: true, get: function() { return 'value'; } }); var a = new Ctor(); var b = clone(a); test.strictEqual(b.prop, 'value'); test.done(); };���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/wcwidth/docs/index.md����������������������000644 �000766 �000024 �00000006222 12455173731 034016� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������### Javascript porting of Markus Kuhn's wcwidth() implementation The following explanation comes from the original C implementation: This is an implementation of wcwidth() and wcswidth() (defined in IEEE Std 1002.1-2001) for Unicode. 
http://www.opengroup.org/onlinepubs/007904975/functions/wcwidth.html http://www.opengroup.org/onlinepubs/007904975/functions/wcswidth.html In fixed-width output devices, Latin characters all occupy a single "cell" position of equal width, whereas ideographic CJK characters occupy two such cells. Interoperability between terminal-line applications and (teletype-style) character terminals using the UTF-8 encoding requires agreement on which character should advance the cursor by how many cell positions. No established formal standards exist at present on which Unicode character shall occupy how many cell positions on character terminals. These routines are a first attempt of defining such behavior based on simple rules applied to data provided by the Unicode Consortium. For some graphical characters, the Unicode standard explicitly defines a character-cell width via the definition of the East Asian FullWidth (F), Wide (W), Half-width (H), and Narrow (Na) classes. In all these cases, there is no ambiguity about which width a terminal shall use. For characters in the East Asian Ambiguous (A) class, the width choice depends purely on a preference of backward compatibility with either historic CJK or Western practice. Choosing single-width for these characters is easy to justify as the appropriate long-term solution, as the CJK practice of displaying these characters as double-width comes from historic implementation simplicity (8-bit encoded characters were displayed single-width and 16-bit ones double-width, even for Greek, Cyrillic, etc.) and not any typographic considerations. Much less clear is the choice of width for the Not East Asian (Neutral) class. Existing practice does not dictate a width for any of these characters. It would nevertheless make sense typographically to allocate two character cells to characters such as for instance EM SPACE or VOLUME INTEGRAL, which cannot be represented adequately with a single-width glyph. The following routines at present merely assign a single-cell width to all neutral characters, in the interest of simplicity. This is not entirely satisfactory and should be reconsidered before establishing a formal standard in this area. At the moment, the decision which Not East Asian (Neutral) characters should be represented by double-width glyphs cannot yet be answered by applying a simple rule from the Unicode database content. Setting up a proper standard for the behavior of UTF-8 character terminals will require a careful analysis not only of each Unicode character, but also of each presentation form, something the author of these routines has avoided to do so far. http://www.unicode.org/unicode/reports/tr11/ Markus Kuhn -- 2007-05-26 (Unicode 5.0) Permission to use, copy, modify, and distribute this software for any purpose and without fee is hereby granted. The author disclaims all warranties with regard to this software. 
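To tie the classification above back to the JavaScript port in this package, the sketch below shows the column counts the bundled export reports for a few representative characters; usage follows the package Readme, and the sample characters are arbitrary:

```js
var wcwidth = require('wcwidth');

wcwidth('a');       // 1: narrow Latin letters occupy a single cell
wcwidth('한');      // 2: Hangul syllables fall in the wide (W) class
wcwidth('한글');    // 4: widths are summed across the string
wcwidth('A\u0301'); // 1: the combining acute accent contributes zero columns
```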
Latest version: http://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/cli.js���000755 �000766 �000024 �00000001523 12455173731 033243� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node 'use strict'; var fs = require('fs'); var pkg = require('./package.json'); var stripAnsi = require('./'); var argv = process.argv.slice(2); var input = argv[0]; function help() { console.log([ '', ' ' + pkg.description, '', ' Usage', ' strip-ansi <input-file> > <output-file>', ' cat <input-file> | strip-ansi > <output-file>', '', ' Example', ' strip-ansi unicorn.txt > unicorn-stripped.txt' ].join('\n')); } function init(data) { process.stdout.write(stripAnsi(data)); } if (argv.indexOf('--help') !== -1) { help(); return; } if (argv.indexOf('--version') !== -1) { console.log(pkg.version); return; } if (process.stdin.isTTY) { if (!input) { help(); return; } init(fs.readFileSync(input, 'utf8')); } else { process.stdin.setEncoding('utf8'); process.stdin.on('data', init); } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/index.js�000644 �000766 �000024 �00000000241 12455173731 033574� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict'; var ansiRegex = require('ansi-regex')(); module.exports = function (str) { return typeof str === 'string' ? 
str.replace(ansiRegex, '') : str; }; ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/node_modules/�������������������000755 �000766 �000024 �00000000000 12456115117 034523� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/package.json��������������������000644 �000766 �000024 �00000003472 12455173731 034347� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "strip-ansi", "version": "2.0.0", "description": "Strip ANSI escape codes", "license": "MIT", "repository": { "type": "git", "url": "git://github.com/sindresorhus/strip-ansi" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "http://sindresorhus.com" }, "bin": { "strip-ansi": "cli.js" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "mocha" }, "files": [ "index.js", "cli.js" ], "keywords": [ "strip", "trim", "remove", "ansi", "styles", "color", "colour", "colors", "terminal", "console", "cli", "string", "tty", "escape", "formatting", "rgb", "256", "shell", "xterm", "log", "logging", "command-line", "text" ], "dependencies": { "ansi-regex": "^1.0.0" }, "devDependencies": { "mocha": "*" }, "gitHead": "c5e780acc07532f5d651cfb6ea035198095c6c74", "bugs": { "url": "https://github.com/sindresorhus/strip-ansi/issues" }, "homepage": "https://github.com/sindresorhus/strip-ansi", "_id": "strip-ansi@2.0.0", "_shasum": "fa8d69432e97674746f55f51d076ae78b18df13f", "_from": "strip-ansi@>=2.0.0 <3.0.0", "_npmVersion": "1.4.14", "_npmUser": { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, "maintainers": [ { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, { "name": "jbnicolai", "email": "jappelman@xebia.com" } ], "dist": { "shasum": "fa8d69432e97674746f55f51d076ae78b18df13f", "tarball": "http://registry.npmjs.org/strip-ansi/-/strip-ansi-2.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-2.0.0.tgz", "readme": "ERROR: No README data found!" 
} ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/readme.md000644 �000766 �000024 �00000001261 12455173731 033711� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# strip-ansi [![Build Status](https://travis-ci.org/sindresorhus/strip-ansi.svg?branch=master)](https://travis-ci.org/sindresorhus/strip-ansi) > Strip [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code) ## Install ```sh $ npm install --save strip-ansi ``` ## Usage ```js var stripAnsi = require('strip-ansi'); stripAnsi('\x1b[4mcake\x1b[0m'); //=> 'cake' ``` ## CLI ```sh $ npm install --global strip-ansi ``` ```sh $ strip-ansi --help Usage $ strip-ansi <input-file> > <output-file> $ cat <input-file> | strip-ansi > <output-file> Example $ strip-ansi unicorn.txt > unicorn-stripped.txt ``` ## License MIT © [Sindre Sorhus](http://sindresorhus.com) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/node_modules/ansi-regex/��������000755 �000766 �000024 �00000000000 12456115117 036565� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������lib/node_modules/npm/node_modules/columnify/node_modules/strip-ansi/node_modules/ansi-regex/index.js000644 �000766 �000024 �00000000221 12455173731 040232� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64�������������������������������������������������������������������������������������������������������������������������������������������������'use strict'; module.exports = function () { return /(?:(?:\u001b\[)|\u009b)(?:(?:[0-9]{1,3})?(?:(?:;[0-9]{0,3})*)?[A-M|f-m])|\u001b[A-M]/g; }; �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/columnify/node_modules/strip-ansi/node_modules/ansi-regex/package.json000644 �000766 �000024 �00000003402 12455173731 041057� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������{ "name": "ansi-regex", "version": "1.1.0", "description": "Regular expression for matching ANSI escape codes", "license": "MIT", "repository": { "type": "git", "url": "git://github.com/sindresorhus/ansi-regex" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "http://sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "mocha test/test.js", "view-supported": "node test/viewCodes.js" }, "files": [ "index.js" ], "keywords": [ "ansi", "styles", "color", "colour", "colors", "terminal", "console", "cli", "string", "tty", "escape", "formatting", "rgb", "256", "shell", "xterm", "command-line", "text", "regex", "regexp", "re", "match", "test", "find", "pattern" ], "devDependencies": { "mocha": "*" }, "bugs": { "url": "https://github.com/sindresorhus/ansi-regex/issues" }, "homepage": "https://github.com/sindresorhus/ansi-regex", "_id": "ansi-regex@1.1.0", "_shasum": "67792c5d6ad05c792d6cd6057ac8f5e69ebf4357", "_from": "ansi-regex@>=1.0.0 <2.0.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, "maintainers": [ { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, { "name": "jbnicolai", "email": "jappelman@xebia.com" } ], "dist": { "shasum": "67792c5d6ad05c792d6cd6057ac8f5e69ebf4357", "tarball": "http://registry.npmjs.org/ansi-regex/-/ansi-regex-1.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-1.1.0.tgz", "readme": "ERROR: No README data found!" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules/npm/node_modules/columnify/node_modules/strip-ansi/node_modules/ansi-regex/readme.md���000644 �000766 �000024 �00000001533 12455173731 040353� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 �iojs-v1.0.2-darwin-x64/lib���������������������������������������������������������������������������������������������������������������������������������������������# ansi-regex [![Build Status](https://travis-ci.org/sindresorhus/ansi-regex.svg?branch=master)](https://travis-ci.org/sindresorhus/ansi-regex) > Regular expression for matching [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code) ## Install ```sh $ npm install --save ansi-regex ``` ## Usage ```js var ansiRegex = require('ansi-regex'); ansiRegex().test('\u001b[4mcake\u001b[0m'); //=> true ansiRegex().test('cake'); //=> false '\u001b[4mcake\u001b[0m'.match(ansiRegex()); //=> ['\u001b[4m', '\u001b[0m'] ``` *It's a function so you can create multiple instances. Regexes with the global flag will have the `.lastIndex` property changed for each call to methods on the instance. 
Therefore reusing the instance with multiple calls will not work as expected for `.test()`.* ## License MIT © [Sindre Sorhus](http://sindresorhus.com) ���������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/.npmignore������������������������000644 �000766 �000024 �00000000142 12455173731 027054� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������lib-cov *.seed *.log *.csv *.dat *.out *.pid *.gz pids logs results npm-debug.log node_modules ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/.travis.yml�����������������������000644 �000766 �000024 �00000000057 12455173731 027173� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������language: node_js node_js: - "0.10" - "0.8"���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/index.js��������������������������000644 �000766 �000024 �00000010752 12455173731 026532� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// On windows, create a .cmd file. // Read the #! in the file to see what it uses. The vast majority // of the time, this will be either: // "#!/usr/bin/env <prog> <args...>" // or: // "#!<prog> <args...>" // // Write a binroot/pkg.bin + ".cmd" file that has this line in it: // @<prog> <args...> %~dp0<target> %* module.exports = cmdShim cmdShim.ifExists = cmdShimIfExists var fs = require("graceful-fs") var mkdir = require("mkdirp") , path = require("path") , shebangExpr = /^#\!\s*(?:\/usr\/bin\/env)?\s*([^ \t]+)(.*)$/ function cmdShimIfExists (from, to, cb) { fs.stat(from, function (er) { if (er) return cb() cmdShim(from, to, cb) }) } // Try to unlink, but ignore errors. 
// Any problems will surface later. function rm (path, cb) { fs.unlink(path, function(er) { cb() }) } function cmdShim (from, to, cb) { fs.stat(from, function (er, stat) { if (er) return cb(er) cmdShim_(from, to, cb) }) } function cmdShim_ (from, to, cb) { var then = times(2, next, cb) rm(to, then) rm(to + ".cmd", then) function next(er) { writeShim(from, to, cb) } } function writeShim (from, to, cb) { // make a cmd file and a sh script // First, check if the bin is a #! of some sort. // If not, then assume it's something that'll be compiled, or some other // sort of script, and just call it directly. mkdir(path.dirname(to), function (er) { if (er) return cb(er) fs.readFile(from, "utf8", function (er, data) { if (er) return writeShim_(from, to, null, null, cb) var firstLine = data.trim().split(/\r*\n/)[0] , shebang = firstLine.match(shebangExpr) if (!shebang) return writeShim_(from, to, null, null, cb) var prog = shebang[1] , args = shebang[2] || "" return writeShim_(from, to, prog, args, cb) }) }) } function writeShim_ (from, to, prog, args, cb) { var shTarget = path.relative(path.dirname(to), from) , target = shTarget.split("/").join("\\") , longProg , shProg = prog && prog.split("\\").join("/") , shLongProg shTarget = shTarget.split("\\").join("/") args = args || "" if (!prog) { prog = "\"%~dp0\\" + target + "\"" shProg = "\"$basedir/" + shTarget + "\"" args = "" target = "" shTarget = "" } else { longProg = "\"%~dp0\\" + prog + ".exe\"" shLongProg = "\"$basedir/" + prog + "\"" target = "\"%~dp0\\" + target + "\"" shTarget = "\"$basedir/" + shTarget + "\"" } // @IF EXIST "%~dp0\node.exe" ( // "%~dp0\node.exe" "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* // ) ELSE ( // SETLOCAL // SET PATHEXT=%PATHEXT:;.JS;=;% // node "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* // ) var cmd if (longProg) { cmd = "@IF EXIST " + longProg + " (\r\n" + " " + longProg + " " + args + " " + target + " %*\r\n" + ") ELSE (\r\n" + " @SETLOCAL\r\n" + " @SET PATHEXT=%PATHEXT:;.JS;=;%\r\n" + " " + prog + " " + args + " " + target + " %*\r\n" + ")" } else { cmd = prog + " " + args + " " + target + " %*\r\n" } // #!/bin/sh // basedir=`dirname "$0"` // // case `uname` in // *CYGWIN*) basedir=`cygpath -w "$basedir"`;; // esac // // if [ -x "$basedir/node.exe" ]; then // "$basedir/node.exe" "$basedir/node_modules/npm/bin/npm-cli.js" "$@" // ret=$? // else // node "$basedir/node_modules/npm/bin/npm-cli.js" "$@" // ret=$? 
// fi // exit $ret var sh = "#!/bin/sh\n" if (shLongProg) { sh = sh + "basedir=`dirname \"$0\"`\n" + "\n" + "case `uname` in\n" + " *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;\n" + "esac\n" + "\n" sh = sh + "if [ -x "+shLongProg+" ]; then\n" + " " + shLongProg + " " + args + " " + shTarget + " \"$@\"\n" + " ret=$?\n" + "else \n" + " " + shProg + " " + args + " " + shTarget + " \"$@\"\n" + " ret=$?\n" + "fi\n" + "exit $ret\n" } else { sh = shProg + " " + args + " " + shTarget + " \"$@\"\n" + "exit $?\n" } var then = times(2, next, cb) fs.writeFile(to + ".cmd", cmd, "utf8", then) fs.writeFile(to, sh, "utf8", then) function next () { chmodShim(to, cb) } } function chmodShim (to, cb) { var then = times(2, cb, cb) fs.chmod(to, 0755, then) fs.chmod(to + ".cmd", 0755, then) } function times(n, ok, cb) { var errState = null return function(er) { if (!errState) { if (er) cb(errState = er) else if (--n === 0) ok() } } } ����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/LICENSE���������������������������000644 �000766 �000024 �00000002436 12455173731 026072� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
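For context on how the shim writer above is typically driven: given a script whose first line is `#!/usr/bin/env node`, the shebang matcher extracts `node` as the program, and `writeShim_` emits two sibling files, a POSIX sh wrapper at `to` and a batch wrapper at `to + ".cmd"`. A hedged usage sketch with invented paths:

```js
var cmdShim = require('cmd-shim');

// Writes ./bin/mytool (sh wrapper) and ./bin/mytool.cmd (batch wrapper), each of
// which re-runs ./node_modules/mytool/cli.js with a node binary found next to the
// shim or, failing that, the node on PATH.
cmdShim('./node_modules/mytool/cli.js', './bin/mytool', function (err) {
  if (err) throw err;
});
```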
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/package.json����������������������000644 �000766 �000024 �00000002331 12455173731 027345� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "cmd-shim", "version": "2.0.1", "description": "Used in npm for command line application support", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/ForbesLindesay/cmd-shim.git" }, "license": "BSD", "dependencies": { "graceful-fs": ">3.0.1 <4.0.0-0", "mkdirp": "~0.5.0" }, "devDependencies": { "tap": "~0.4.11", "rimraf": "~2.2.8" }, "gitHead": "6f53d506be590fe9ac20c9801512cd1a3aad5974", "bugs": { "url": "https://github.com/ForbesLindesay/cmd-shim/issues" }, "homepage": "https://github.com/ForbesLindesay/cmd-shim", "_id": "cmd-shim@2.0.1", "_shasum": "4512a373d2391679aec51ad1d4733559e9b85d4a", "_from": "cmd-shim@>=2.0.1-0 <3.0.0-0", "_npmVersion": "1.5.0-alpha-4", "_npmUser": { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" }, "maintainers": [ { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" } ], "dist": { "shasum": "4512a373d2391679aec51ad1d4733559e9b85d4a", "tarball": "http://registry.npmjs.org/cmd-shim/-/cmd-shim-2.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/cmd-shim/-/cmd-shim-2.0.1.tgz" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/cmd-shim/README.md�������������������������000644 �000766 �000024 �00000002142 12455173731 026336� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# cmd-shim The cmd-shim used in npm to create executable scripts on Windows, since symlinks are not suitable for this purpose there. On Unix systems, you should use a symbolic link instead. [![Build Status](https://img.shields.io/travis/ForbesLindesay/cmd-shim/master.svg)](https://travis-ci.org/ForbesLindesay/cmd-shim) [![Dependency Status](https://img.shields.io/gemnasium/ForbesLindesay/cmd-shim.svg)](https://gemnasium.com/ForbesLindesay/cmd-shim) [![NPM version](https://img.shields.io/npm/v/cmd-shim.svg)](http://badge.fury.io/js/cmd-shim) ## Installation ``` npm install cmd-shim ``` ## API ### cmdShim(from, to, cb) Create a cmd shim at `to` for the command line program at `from`. e.g. 
```javascript
var cmdShim = require('cmd-shim');
cmdShim(__dirname + '/cli.js', '/usr/bin/command-name', function (err) {
  if (err) throw err;
});
```

### cmdShim.ifExists(from, to, cb)

The same as above, but will just continue if the file does not exist.

Source:

```javascript
function cmdShimIfExists (from, to, cb) {
  fs.stat(from, function (er) {
    if (er) return cb()
    cmdShim(from, to, cb)
  })
}
```

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chownr/chownr.js

module.exports = chownr
chownr.sync = chownrSync

var fs = require("fs")
  , path = require("path")

function chownr (p, uid, gid, cb) {
  fs.readdir(p, function (er, children) {
    // any error other than ENOTDIR means it's not readable, or
    // doesn't exist.  give up.
    if (er && er.code !== "ENOTDIR") return cb(er)
    if (er || !children.length) return fs.chown(p, uid, gid, cb)

    var len = children.length
      , errState = null
    children.forEach(function (child) {
      chownr(path.resolve(p, child), uid, gid, then)
    })
    function then (er) {
      if (errState) return
      if (er) return cb(errState = er)
      if (-- len === 0) return fs.chown(p, uid, gid, cb)
    }
  })
}

function chownrSync (p, uid, gid) {
  var children
  try {
    children = fs.readdirSync(p)
  } catch (er) {
    if (er && er.code === "ENOTDIR") return fs.chownSync(p, uid, gid)
    throw er
  }
  if (!children.length) return fs.chownSync(p, uid, gid)

  children.forEach(function (child) {
    chownrSync(path.resolve(p, child), uid, gid)
  })
  return fs.chownSync(p, uid, gid)
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chownr/LICENCE

Copyright (c) Isaac Z. Schlueter
All rights reserved.

The BSD License

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:

1.
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chownr/package.json������������������������000644 �000766 �000024 �00000001522 12455173731 027145� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "chownr", "description": "like `chown -R`", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/isaacs/chownr.git" }, "main": "chownr.js", "devDependencies": { "tap": "0.2", "mkdirp": "0.3", "rimraf": "" }, "scripts": { "test": "tap test/*.js" }, "license": "BSD", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "_id": "chownr@0.0.1", "dependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "_engineSupported": true, "_npmVersion": "1.1.23", "_nodeVersion": "v0.7.10-pre", "_defaultsLoaded": true, "dist": { "shasum": "51d18189d9092d5f8afd623f3288bfd1c6bf1a62" }, "_from": "../chownr" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chownr/README.md���������������������������000644 �000766 �000024 �00000000073 12455173731 026136� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Like `chown -R`. 
Takes the same arguments as `fs.chown()`

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chmodr/chmodr.js

module.exports = chmodr
chmodr.sync = chmodrSync

var fs = require("fs")
  , path = require("path")

function chmodr (p, mode, cb) {
  fs.readdir(p, function (er, children) {
    // any error other than ENOTDIR means it's not readable, or
    // doesn't exist.  give up.
    if (er && er.code !== "ENOTDIR") return cb(er)
    var isDir = !er
    var m = isDir ? dirMode(mode) : mode
    if (er || !children.length) return fs.chmod(p, m, cb)

    var len = children.length
    var errState = null
    children.forEach(function (child) {
      chmodr(path.resolve(p, child), mode, then)
    })
    function then (er) {
      if (errState) return
      if (er) return cb(errState = er)
      if (-- len === 0) return fs.chmod(p, dirMode(mode), cb)
    }
  })
}

function chmodrSync (p, mode) {
  var children
  try {
    children = fs.readdirSync(p)
  } catch (er) {
    if (er && er.code === "ENOTDIR") return fs.chmodSync(p, mode)
    throw er
  }
  if (!children.length) return fs.chmodSync(p, dirMode(mode))

  children.forEach(function (child) {
    chmodrSync(path.resolve(p, child), mode)
  })
  return fs.chmodSync(p, dirMode(mode))
}

// If a party has r, add x
// so that dirs are listable
function dirMode(mode) {
  if (mode & 0400) mode |= 0100
  if (mode & 040) mode |= 010
  if (mode & 04) mode |= 01
  return mode
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chmodr/LICENSE

Copyright (c) Isaac Z. Schlueter ("Author")
All rights reserved.

The BSD License

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:

1. Redistributions of source code must retain the above copyright
   notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright
   notice, this list of conditions and the following disclaimer in the
   documentation and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chmodr/package.json������������������������000644 �000766 �000024 �00000001155 12455173731 027123� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "chmodr", "description": "like `chmod -R`", "version": "0.1.0", "repository": { "type": "git", "url": "git://github.com/isaacs/chmodr.git" }, "main": "chmodr.js", "devDependencies": { "tap": "0.2", "mkdirp": "0.3", "rimraf": "" }, "scripts": { "test": "tap test/*.js" }, "license": "BSD", "readme": "Like `chmod -R`.\n\nTakes the same arguments as `fs.chmod()`\n", "readmeFilename": "README.md", "_id": "chmodr@0.1.0", "_from": "chmodr@latest" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/chmodr/README.md���������������������������000644 �000766 �000024 �00000000073 12455173731 026112� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Like `chmod -R`. 
Takes the same arguments as `fs.chmod()`

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/child-process-close/index.js

var child_process = require('child_process');

// Re-export the child_process module.
module.exports = child_process;

// Only node versions up to v0.7.6 need this hook.
if (!/^v0\.([0-6]\.|7\.[0-6](\D|$))/.test(process.version)) return;

// Do not add the hook if already hooked.
if (child_process.hasOwnProperty('_exit_hook')) return;

// Version the hook in case there is ever the need to release a 0.2.0.
child_process._exit_hook = 1;

function hook(name) {
  var orig = child_process[name];

  // Older node versions may not have all functions, e.g. fork().
  if (!orig) return;

  // Store the unhooked version.
  child_process['_original_' + name] = orig;

  // Do the actual hooking.
  child_process[name] = function() {
    var child = orig.apply(this, arguments);

    child.once('exit', function(code, signal) {
      process.nextTick(function() {
        child.emit('close', code, signal);
      });
    });

    return child;
  }
}

hook('spawn');
hook('fork');
hook('execFile');

// Don't hook 'exec': it calls `exports.execFile` internally, so hooking it
// would trigger the close event twice.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/child-process-close/package.json

{ "name": "child-process-close", "version": "0.1.1", "description": "Make child_process objects emit 'close' events in node v0.6 like they do in v0.8.
This makes it easier to write code that works correctly on both version of node.", "main": "index.js", "scripts": { "test": "node test/test.js" }, "repository": { "type": "git", "url": "git://github.com/piscisaureus/child-process-close.git" }, "keywords": [ "child_process", "spawn", "fork", "exec", "execFile", "close", "exit" ], "author": { "name": "Bert Belder" }, "license": "MIT", "readme": "\n# child-process-close\n\nThis module makes child process objects, (created with `spawn`, `fork`, `exec`\nor `execFile`) emit the `close` event in node v0.6 like they do in node v0.8.\nThis makes it easier to write code that works correctly on both version of\nnode.\n\n\n## Usage\n\nJust make sure to `require('child-process-close')` anywhere. It will patch the `child_process` module.\n\n```js\nrequire('child-process-close');\nvar spawn = require('child_process').spawn;\n\nvar cp = spawn('foo');\ncp.on('close', function(exitCode, signal) {\n // This now works on all node versions.\n});\n```\n\n\n## License\n\nCopyright (C) 2012 Bert Belder\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/piscisaureus/child-process-close/issues" }, "_id": "child-process-close@0.1.1", "dist": { "shasum": "c1909c6c3bbcea623e3bd74493ddb1c94c47c500" }, "_from": "child-process-close@", "_resolved": "https://registry.npmjs.org/child-process-close/-/child-process-close-0.1.1.tgz" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/child-process-close/README.md��������������000644 �000766 �000024 �00000003157 12455173731 030506� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ # child-process-close This module makes child process objects, (created with `spawn`, `fork`, `exec` or `execFile`) emit the `close` event in node v0.6 like they do in node v0.8. 
This makes it easier to write code that works correctly on both version of node. ## Usage Just make sure to `require('child-process-close')` anywhere. It will patch the `child_process` module. ```js require('child-process-close'); var spawn = require('child_process').spawn; var cp = spawn('foo'); cp.on('close', function(exitCode, signal) { // This now works on all node versions. }); ``` ## License Copyright (C) 2012 Bert Belder Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/char-spinner/LICENSE�����������������������000644 �000766 �000024 �00000001354 12455173731 026760� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
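Looking back at `child-process-close/index.js` above: its `hook()` helper wraps an existing factory function so that the objects it returns re-emit a derived event on the next tick. A minimal, self-contained sketch of that wrap-and-re-emit pattern in isolation (the `api`, `makeThing`, and event names here are invented for illustration and are not part of the package):

```javascript
// Sketch (not from child-process-close): wrap a factory so its products
// re-emit 'exit' as 'close' on the next tick. Names below are hypothetical.
var EventEmitter = require("events").EventEmitter

var api = {
  makeThing: function () { return new EventEmitter() }
}

function hook (obj, name) {
  var orig = obj[name]
  if (!orig) return                      // nothing to wrap
  obj["_original_" + name] = orig        // keep the unhooked version around
  obj[name] = function () {
    var thing = orig.apply(this, arguments)
    thing.once("exit", function (code) {
      process.nextTick(function () {
        thing.emit("close", code)        // derived event, one tick later
      })
    })
    return thing
  }
}

hook(api, "makeThing")

var t = api.makeThing()
t.on("close", function (code) { console.log("closed with", code) })
t.emit("exit", 0)                        // logs "closed with 0" on the next tick
```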
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/char-spinner/package.json������������������000644 �000766 �000024 �00000002454 12455173731 030243� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "char-spinner", "version": "1.0.1", "description": "Put a little spinner on process.stderr, as unobtrusively as possible.", "main": "spin.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^0.4.10" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/char-spinner" }, "keywords": [ "char", "spinner" ], "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/isaacs/char-spinner/issues" }, "homepage": "https://github.com/isaacs/char-spinner", "gitHead": "091b2ff5960aa083f68a5619fa93999d072aa152", "_id": "char-spinner@1.0.1", "_shasum": "e6ea67bd247e107112983b7ab0479ed362800081", "_from": "char-spinner@latest", "_npmVersion": "1.4.13", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "e6ea67bd247e107112983b7ab0479ed362800081", "tarball": "http://registry.npmjs.org/char-spinner/-/char-spinner-1.0.1.tgz" }, "_resolved": "https://registry.npmjs.org/char-spinner/-/char-spinner-1.0.1.tgz" } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/char-spinner/README.md���������������������000644 �000766 �000024 �00000001752 12455173731 027234� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# char-spinner Put a little spinner on process.stderr, as unobtrusively as possible. ## USAGE ```javascript var spinner = require("char-spinner") // All options are optional // even the options argument itself is optional spinner(options) ``` ## OPTIONS Usually the defaults are what you want. Mostly they're just configurable for testing purposes. * `stream` Output stream. Default=`process.stderr` * `tty` Only show spinner if output stream has a truish `.isTTY`. Default=`true` * `string` String of chars to spin. Default=`'/-\\|'` * `interval` Number of ms between frames, bigger = slower. Default=`50` * `cleanup` Print `'\r \r'` to stream on process exit. Default=`true` * `unref` Unreference the spinner interval so that the process can exit normally. Default=`true` * `delay` Number of frames to "skip over" before printing the spinner. 
Useful if you want to avoid showing the spinner for very fast actions. Default=`2` Returns the generated interval, if one was created. ����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/char-spinner/spin.js�����������������������000644 �000766 �000024 �00000002252 12455173731 027260� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = spinner function spinner(opt) { opt = opt || {} var str = opt.stream || process.stderr var tty = typeof opt.tty === 'boolean' ? opt.tty : true var string = opt.string || '/-\\|' var ms = typeof opt.interval === 'number' ? opt.interval : 50 if (ms < 0) ms = 0 if (tty && !str.isTTY) return false var CR = str.isTTY ? '\u001b[0G' : '\u000d'; var CLEAR = str.isTTY ? '\u001b[2K' : '\u000d \u000d'; var s = 0 var sprite = string.split('') var wrote = false var delay = typeof opt.delay === 'number' ? opt.delay : 2 var interval = setInterval(function() { if (--delay >= 0) return s = ++s % sprite.length var c = sprite[s] str.write(c + CR) wrote = true }, ms) var unref = typeof opt.unref === 'boolean' ? opt.unref : true if (unref && typeof interval.unref === 'function') { interval.unref() } var cleanup = typeof opt.cleanup === 'boolean' ? opt.cleanup : true if (cleanup) { process.on('exit', function() { if (wrote) { str.write(CLEAR); } }) } module.exports.clear = function () { str.write(CLEAR); }; return interval } module.exports.clear = function () {}; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/bench/������������������������000755 �000766 �000024 �00000000000 12456115117 027014� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/block-stream.js���������������000644 �000766 �000024 �00000014633 12455173731 030672� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// write data to it, and it'll emit data in 512 byte blocks. // if you .end() or .flush(), it'll emit whatever it's got, // padded with nulls to 512 bytes. module.exports = BlockStream var Stream = require("stream").Stream , inherits = require("inherits") , assert = require("assert").ok , debug = process.env.DEBUG ? 
console.error : function () {} function BlockStream (size, opt) { this.writable = this.readable = true this._opt = opt || {} this._chunkSize = size || 512 this._offset = 0 this._buffer = [] this._bufferLength = 0 if (this._opt.nopad) this._zeroes = false else { this._zeroes = new Buffer(this._chunkSize) for (var i = 0; i < this._chunkSize; i ++) { this._zeroes[i] = 0 } } } inherits(BlockStream, Stream) BlockStream.prototype.write = function (c) { // debug(" BS write", c) if (this._ended) throw new Error("BlockStream: write after end") if (c && !Buffer.isBuffer(c)) c = new Buffer(c + "") if (c.length) { this._buffer.push(c) this._bufferLength += c.length } // debug("pushed onto buffer", this._bufferLength) if (this._bufferLength >= this._chunkSize) { if (this._paused) { // debug(" BS paused, return false, need drain") this._needDrain = true return false } this._emitChunk() } return true } BlockStream.prototype.pause = function () { // debug(" BS pausing") this._paused = true } BlockStream.prototype.resume = function () { // debug(" BS resume") this._paused = false return this._emitChunk() } BlockStream.prototype.end = function (chunk) { // debug("end", chunk) if (typeof chunk === "function") cb = chunk, chunk = null if (chunk) this.write(chunk) this._ended = true this.flush() } BlockStream.prototype.flush = function () { this._emitChunk(true) } BlockStream.prototype._emitChunk = function (flush) { // debug("emitChunk flush=%j emitting=%j paused=%j", flush, this._emitting, this._paused) // emit a <chunkSize> chunk if (flush && this._zeroes) { // debug(" BS push zeroes", this._bufferLength) // push a chunk of zeroes var padBytes = (this._bufferLength % this._chunkSize) if (padBytes !== 0) padBytes = this._chunkSize - padBytes if (padBytes > 0) { // debug("padBytes", padBytes, this._zeroes.slice(0, padBytes)) this._buffer.push(this._zeroes.slice(0, padBytes)) this._bufferLength += padBytes // debug(this._buffer[this._buffer.length - 1].length, this._bufferLength) } } if (this._emitting || this._paused) return this._emitting = true // debug(" BS entering loops") var bufferIndex = 0 while (this._bufferLength >= this._chunkSize && (flush || !this._paused)) { // debug(" BS data emission loop", this._bufferLength) var out , outOffset = 0 , outHas = this._chunkSize while (outHas > 0 && (flush || !this._paused) ) { // debug(" BS data inner emit loop", this._bufferLength) var cur = this._buffer[bufferIndex] , curHas = cur.length - this._offset // debug("cur=", cur) // debug("curHas=%j", curHas) // If it's not big enough to fill the whole thing, then we'll need // to copy multiple buffers into one. However, if it is big enough, // then just slice out the part we want, to save unnecessary copying. // Also, need to copy if we've already done some copying, since buffers // can't be joined like cons strings. if (out || curHas < outHas) { out = out || new Buffer(this._chunkSize) cur.copy(out, outOffset, this._offset, this._offset + Math.min(curHas, outHas)) } else if (cur.length === outHas && this._offset === 0) { // shortcut -- cur is exactly long enough, and no offset. out = cur } else { // slice out the piece of cur that we need. out = cur.slice(this._offset, this._offset + outHas) } if (curHas > outHas) { // means that the current buffer couldn't be completely output // update this._offset to reflect how much WAS written this._offset += outHas outHas = 0 } else { // output the entire current chunk. 
// toss it away outHas -= curHas outOffset += curHas bufferIndex ++ this._offset = 0 } } this._bufferLength -= this._chunkSize assert(out.length === this._chunkSize) // debug("emitting data", out) // debug(" BS emitting, paused=%j", this._paused, this._bufferLength) this.emit("data", out) out = null } // debug(" BS out of loops", this._bufferLength) // whatever is left, it's not enough to fill up a block, or we're paused this._buffer = this._buffer.slice(bufferIndex) if (this._paused) { // debug(" BS paused, leaving", this._bufferLength) this._needsDrain = true this._emitting = false return } // if flushing, and not using null-padding, then need to emit the last // chunk(s) sitting in the queue. We know that it's not enough to // fill up a whole block, because otherwise it would have been emitted // above, but there may be some offset. var l = this._buffer.length if (flush && !this._zeroes && l) { if (l === 1) { if (this._offset) { this.emit("data", this._buffer[0].slice(this._offset)) } else { this.emit("data", this._buffer[0]) } } else { var outHas = this._bufferLength , out = new Buffer(outHas) , outOffset = 0 for (var i = 0; i < l; i ++) { var cur = this._buffer[i] , curHas = cur.length - this._offset cur.copy(out, outOffset, this._offset) this._offset = 0 outOffset += curHas this._bufferLength -= curHas } this.emit("data", out) } // truncate this._buffer.length = 0 this._bufferLength = 0 this._offset = 0 } // now either drained or ended // debug("either draining, or ended", this._bufferLength, this._ended) // means that we've flushed out all that we can so far. if (this._needDrain) { // debug("emitting drain", this._bufferLength) this._needDrain = false this.emit("drain") } if ((this._bufferLength === 0) && this._ended && !this._endEmitted) { // debug("emitting end", this._bufferLength) this._endEmitted = true this.emit("end") } this._emitting = false // debug(" BS no longer emitting", flush, this._paused, this._emitting, this._bufferLength, this._chunkSize) } �����������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/LICENCE�����������������������000644 �000766 �000024 �00000002446 12455173731 026735� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright (c) Isaac Z. Schlueter All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/package.json������������������000644 �000766 �000024 �00000002127 12455173731 030232� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "block-stream", "description": "a stream of blocks", "version": "0.0.7", "repository": { "type": "git", "url": "git://github.com/isaacs/block-stream.git" }, "engines": { "node": "0.4 || >=0.5.8" }, "main": "block-stream.js", "dependencies": { "inherits": "~2.0.0" }, "devDependencies": { "tap": "0.x" }, "scripts": { "test": "tap test/" }, "license": "BSD", "readme": "# block-stream\n\nA stream of blocks.\n\nWrite data into it, and it'll output data in buffer blocks the size you\nspecify, padding with zeroes if necessary.\n\n```javascript\nvar block = new BlockStream(512)\nfs.createReadStream(\"some-file\").pipe(block)\nblock.pipe(fs.createWriteStream(\"block-file\"))\n```\n\nWhen `.end()` or `.flush()` is called, it'll pad the block with zeroes.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/block-stream/issues" }, "_id": "block-stream@0.0.7", "_from": "block-stream@latest" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/README.md���������������������000644 �000766 �000024 �00000000561 12455173731 027223� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# block-stream A stream of blocks. Write data into it, and it'll output data in buffer blocks the size you specify, padding with zeroes if necessary. 
```javascript var block = new BlockStream(512) fs.createReadStream("some-file").pipe(block) block.pipe(fs.createWriteStream("block-file")) ``` When `.end()` or `.flush()` is called, it'll pad the block with zeroes. �����������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/bench/block-stream-pause.js���000644 �000766 �000024 �00000004031 12455173731 033053� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var BlockStream = require("../block-stream.js") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation f.pause() setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } f.resume() }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/bench/block-stream.js���������000644 �000766 �000024 �00000003766 12455173731 031756� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������var BlockStream = require("../block-stream.js") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) ����������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/bench/dropper-pause.js��������000644 �000766 �000024 �00000004016 12455173731 032146� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var BlockStream = require("dropper") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation f.pause() setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } f.resume() }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/block-stream/bench/dropper.js��������������000644 �000766 �000024 �00000003753 12455173731 031042� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var BlockStream = require("dropper") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) ���������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/.eslintrc�����������������������000644 �000766 �000024 �00000000645 12455173731 027266� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "env" : { "node" : true }, "rules" : { "curly" : 0, "no-lonely-if" : 1, "no-mixed-requires" : 0, "no-underscore-dangle" : 0, "no-unused-vars" : [2, {"vars" : "all", "args" : "after-used"}], "no-use-before-define" : [2, "nofunc"], "quotes" : [1, "double", "avoid-escape"], "semi" : [2, "never"], "space-after-keywords" : 1, "space-infix-ops" : 0, "strict" : 0 } } �������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/.npmignore����������������������000644 �000766 �000024 �00000000015 12455173731 027430� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/package.json��������������������000644 �000766 �000024 �00000002605 12455173731 027726� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "async-some", "version": "1.0.1", "description": "short-circuited, asynchronous version of Array.protototype.some", "main": "some.js", "scripts": { "test": "tap 
test/*.js" }, "repository": { "type": "git", "url": "https://github.com/othiym23/async-some.git" }, "keywords": [ "async", "some", "array", "collections", "fp" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/othiym23/async-some/issues" }, "homepage": "https://github.com/othiym23/async-some", "dependencies": { "dezalgo": "^1.0.0" }, "devDependencies": { "tap": "^0.4.11" }, "gitHead": "e73d6d1fbc03cca5a0d54f456f39bab294a4c7b7", "_id": "async-some@1.0.1", "_shasum": "8b54f08d46f0f9babc72ea9d646c245d23a4d9e5", "_from": "async-some@>=1.0.1-0 <2.0.0-0", "_npmVersion": "1.5.0-pre", "_npmUser": { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, "maintainers": [ { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "dist": { "shasum": "8b54f08d46f0f9babc72ea9d646c245d23a4d9e5", "tarball": "http://registry.npmjs.org/async-some/-/async-some-1.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/async-some/-/async-some-1.0.1.tgz", "readme": "ERROR: No README data found!" } ���������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/README.md�����������������������000644 �000766 �000024 �00000004127 12455173731 026720� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# some Short-circuited async Array.prototype.some implementation. Serially evaluates a list of values from a JS array or arraylike against an asynchronous predicate, terminating on the first truthy value. If the predicate encounters an error, pass it to the completion callback. Otherwise, pass the truthy value passed by the predicate, or `false` if no truthy value was passed. Is [Zalgo](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony)-proof, browser-safe, and pretty efficient. ## Usage ```javascript var some = require("async-some"); var resolve = require("path").resolve; var stat = require("fs").stat; var readFileSync = require("fs").readFileSync; some(["apple", "seaweed", "ham", "quince"], porkDetector, function (error, match) { if (error) return console.error(error); if (match) return console.dir(JSON.parse(readFileSync(match))); console.error("time to buy more Sporkle™!"); }); var PREFIX = resolve(__dirname, "../pork_store"); function porkDetector(value, cb) { var path = resolve(PREFIX, value + ".json"); stat(path, function (er, stat) { if (er) { if (er.code === "ENOENT") return cb(null, false); return cb(er); } cb(er, path); }); } ``` ### some(list, test, callback) * `list` {Object} An arraylike (either an Array or the arguments arraylike) to be checked. * `test` {Function} The predicate against which the elements of `list` will be tested. Takes two parameters: * `element` {any} The element of the list to be tested. * `callback` {Function} The continuation to be called once the test is complete. Takes (again) two values: * `error` {Error} Any errors that the predicate encountered. * `value` {any} A truthy value. A non-falsy result terminates checking the entire list. 
* `callback` {Function} The callback to invoke when either a value has been found or the entire input list has been processed with no result. Is invoked with the traditional two parameters: * `error` {Error} Errors that were encountered during the evaluation of some(). * `match` {any} Value successfully matched by `test`, if any. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/async-some/some.js�������������������������000644 �000766 �000024 �00000002357 12455173731 026745� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") var dezalgoify = require("dezalgo") module.exports = some /** * short-circuited async Array.prototype.some implementation * * Serially evaluates a list of values from a JS array or arraylike * against an asynchronous predicate, terminating on the first truthy * value. If the predicate encounters an error, pass it to the completion * callback. Otherwise, pass the truthy value passed by the predicate, or * `false` if no truthy value was passed. */ function some (list, test, cb) { assert("length" in list, "array must be arraylike") assert.equal(typeof test, "function", "predicate must be callable") assert.equal(typeof cb, "function", "callback must be callable") var array = slice(list) , index = 0 , length = array.length , hecomes = dezalgoify(cb) map() function map () { if (index >= length) return hecomes(null, false) test(array[index], reduce) } function reduce (er, result) { if (er) return hecomes(er, false) if (result) return hecomes(null, result) index++ map() } } // Array.prototype.slice on arguments arraylike is expensive function slice(args) { var l = args.length, a = [], i for (i = 0; i < l; i++) a[i] = args[i] return a } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/examples/����������������������������000755 �000766 �000024 �00000000000 12456115117 026276� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/index.js�����������������������������000644 �000766 �000024 �00000002164 12455173731 026135� 
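Because async-some's `some` only checks for a `length` property, it also accepts the `arguments` object directly (that is what the `slice` helper at the end of some.js above is for). A minimal sketch under that assumption — the `firstEven`/`findEven` names and values are made up for illustration, not part of the package:

```js
var some = require("async-some")

// hypothetical predicate: resolves truthy on the first even number it sees
function firstEven (value, cb) {
  // defer so the predicate stays asynchronous
  setImmediate(function () { cb(null, value % 2 === 0 ? value : false) })
}

function findEven () {
  // pass the arguments arraylike straight through; some() copies it internally
  some(arguments, firstEven, function (error, match) {
    if (error) return console.error(error)
    console.log(match) // 4 for the call below, false if nothing matched
  })
}

findEven(3, 5, 4, 7)
```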
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = function archy (obj, prefix, opts) { if (prefix === undefined) prefix = ''; if (!opts) opts = {}; var chr = function (s) { var chars = { '│' : '|', '└' : '`', '├' : '+', '─' : '-', '┬' : '-' }; return opts.unicode === false ? chars[s] : s; }; if (typeof obj === 'string') obj = { label : obj }; var nodes = obj.nodes || []; var lines = (obj.label || '').split('\n'); var splitter = '\n' + prefix + (nodes.length ? chr('│') : ' ') + ' '; return prefix + lines.join(splitter) + '\n' + nodes.map(function (node, ix) { var last = ix === nodes.length - 1; var more = node.nodes && node.nodes.length; var prefix_ = prefix + (last ? ' ' : chr('│')) + ' '; return prefix + (last ? chr('└') : chr('├')) + chr('─') + (more ? chr('┬') : chr('─')) + ' ' + archy(node, prefix_, opts).slice(prefix.length + 2) ; }).join('') ; }; ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/LICENSE������������������������������000644 �000766 �000024 �00000002061 12455173731 025471� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
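The `chr` helper in archy's index.js above swaps the box-drawing characters for plain ASCII when `opts.unicode` is `false`. A small sketch of that fallback, using a deliberately tiny tree:

```js
var archy = require('archy');

// ASCII-only rendering: '├──' becomes '+--', '└──' becomes '`--'
var s = archy({
  label : 'beep',
  nodes : [ 'ity', 'boop' ]
}, '', { unicode : false });

console.log(s);
// beep
// +-- ity
// `-- boop
```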
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/package.json�������������������������000644 �000766 �000024 �00000003201 12455173731 026747� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "archy", "version": "1.0.0", "description": "render nested hierarchies `npm ls` style with unicode pipes", "main": "index.js", "devDependencies": { "tap": "~0.3.3", "tape": "~0.1.1" }, "scripts": { "test": "tap test" }, "testling": { "files": "test/*.js", "browsers": { "iexplore": [ "6.0", "7.0", "8.0", "9.0" ], "chrome": [ "20.0" ], "firefox": [ "10.0", "15.0" ], "safari": [ "5.1" ], "opera": [ "12.0" ] } }, "repository": { "type": "git", "url": "http://github.com/substack/node-archy.git" }, "keywords": [ "hierarchy", "npm ls", "unicode", "pretty", "print" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "gitHead": "30223c16191e877bf027b15b12daf077b9b55b84", "bugs": { "url": "https://github.com/substack/node-archy/issues" }, "homepage": "https://github.com/substack/node-archy", "_id": "archy@1.0.0", "_shasum": "f9c8c13757cc1dd7bc379ac77b2c62a5c2868c40", "_from": "archy@>=1.0.0 <2.0.0", "_npmVersion": "1.4.25", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "dist": { "shasum": "f9c8c13757cc1dd7bc379ac77b2c62a5c2868c40", "tarball": "http://registry.npmjs.org/archy/-/archy-1.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/archy/-/archy-1.0.0.tgz" } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/README.markdown����������������������000644 �000766 �000024 �00000003341 12455173731 027167� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# archy Render nested hierarchies `npm ls` style with unicode pipes. 
[![browser support](http://ci.testling.com/substack/node-archy.png)](http://ci.testling.com/substack/node-archy) [![build status](https://secure.travis-ci.org/substack/node-archy.png)](http://travis-ci.org/substack/node-archy) # example ``` js var archy = require('archy'); var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party\ntime!' ] } ] }); console.log(s); ``` output ``` beep ├── ity └─┬ boop ├─┬ o_O │ ├─┬ oh │ │ ├── hello │ │ └── puny │ └── human └── party time! ``` # methods var archy = require('archy') ## archy(obj, prefix='', opts={}) Return a string representation of `obj` with unicode pipe characters like how `npm ls` looks. `obj` should be a tree of nested objects with `'label'` and `'nodes'` fields. `'label'` is a string of text to display at a node level and `'nodes'` is an array of the descendents of the current node. If a node is a string, that string will be used as the `'label'` and an empty array of `'nodes'` will be used. `prefix` gets prepended to all the lines and is used by the algorithm to recursively update. If `'label'` has newlines they will be indented at the present indentation level with the current prefix. To disable unicode results in favor of all-ansi output set `opts.unicode` to `false`. # install With [npm](http://npmjs.org) do: ``` npm install archy ``` # license MIT �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/examples/beep.js���������������������000644 �000766 �000024 �00000000603 12455173731 027553� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var archy = require('../'); var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party\ntime!' ] } ] }); console.log(s); �����������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/archy/examples/multi_line.js���������������000644 �000766 �000024 �00000000636 12455173731 031007� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var archy = require('../'); var s = archy({ label : 'beep\none\ntwo', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O\nwheee', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny\nmeat' ] }, 'creature' ] }, 'party\ntime!' 
] } ] }); console.log(s); ��������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansistyles/ansistyles.js�������������������000644 �000766 �000024 �00000001722 12455173731 030313� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict'; /* * Info: http://www.termsys.demon.co.uk/vtansi.htm#colors * Following caveats * bright - brightens the color (bold-blue is same as brigthtBlue) * dim - nothing on Mac or Linux * italic - nothing on Mac or Linux * underline - underlines string * blink - nothing on Mac or linux * inverse - background becomes foreground and vice versa * * In summary, the only styles that work are: * - bright, underline and inverse * - the others are only included for completeness */ var styleNums = { reset : [0, 22] , bright : [1, 22] , dim : [2, 22] , italic : [3, 23] , underline : [4, 24] , blink : [5, 25] , inverse : [7, 27] } , styles = {} ; Object.keys(styleNums).forEach(function (k) { styles[k] = function (s) { var open = styleNums[k][0] , close = styleNums[k][1]; return '\u001b[' + open + 'm' + s + '\u001b[' + close + 'm'; }; }); module.exports = styles; ����������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansistyles/LICENSE�������������������������000644 �000766 �000024 �00000002066 12455173731 026566� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2013 Thorsten Lorenz. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
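Each function exported by ansistyles.js above simply brackets its argument with the open/close pair from the `styleNums` table. A quick sketch of the raw sequences involved (illustrative, not part of the package):

```js
var styles = require('ansistyles');

// bright wraps with "\u001b[1m" ... "\u001b[22m", underline with "\u001b[4m" ... "\u001b[24m"
console.log(JSON.stringify(styles.bright('hi')));     // "\u001b[1mhi\u001b[22m"
console.log(JSON.stringify(styles.underline('hi')));  // "\u001b[4mhi\u001b[24m"

// because every style has its own distinct close code, styles nest safely
// (unlike reset, which the README below notes is not nestable)
console.log(styles.bright(styles.underline('hi')));
```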
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansistyles/package.json��������������������000644 �000766 �000024 �00000006236 12455173731 030052� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "ansistyles", "version": "0.1.3", "description": "Functions that surround a string with ansistyle codes so it prints in style.", "main": "ansistyles.js", "scripts": { "test": "node test/ansistyles.js" }, "repository": { "type": "git", "url": "git://github.com/thlorenz/ansistyles.git" }, "keywords": [ "ansi", "style", "terminal", "console" ], "author": { "name": "Thorsten Lorenz", "email": "thlorenz@gmx.de", "url": "thlorenz.com" }, "license": "MIT", "readmeFilename": "README.md", "gitHead": "27bf1bc65231bcc7fd109bf13b13601b51f8cd04", "readme": "# ansistyles [![build status](https://secure.travis-ci.org/thlorenz/ansistyles.png)](http://next.travis-ci.org/thlorenz/ansistyles)\n\nFunctions that surround a string with ansistyle codes so it prints in style.\n\nIn case you need colors, like `red`, have a look at [ansicolors](https://github.com/thlorenz/ansicolors).\n\n## Installation\n\n npm install ansistyles\n\n## Usage\n\n```js\nvar styles = require('ansistyles');\n\nconsole.log(styles.bright('hello world')); // prints hello world in 'bright' white\nconsole.log(styles.underline('hello world')); // prints hello world underlined\nconsole.log(styles.inverse('hello world')); // prints hello world black on white\n```\n\n## Combining with ansicolors\n\nGet the ansicolors module:\n\n npm install ansicolors\n\n```js\nvar styles = require('ansistyles')\n , colors = require('ansicolors');\n\n console.log(\n // prints hello world underlined in blue on a green background\n colors.bgGreen(colors.blue(styles.underline('hello world'))) \n );\n```\n\n## Tests\n\nLook at the [tests](https://github.com/thlorenz/ansistyles/blob/master/test/ansistyles.js) to see more examples and/or run them via: \n\n npm explore ansistyles && npm test\n\n## More Styles\n\nAs you can see from [here](https://github.com/thlorenz/ansistyles/blob/master/ansistyles.js#L4-L15), more styles are available,\nbut didn't have any effect on the terminals that I tested on Mac Lion and Ubuntu Linux.\n\nI included them for completeness, but didn't show them in the examples because they seem to have no effect.\n\n### reset\n\nA style reset function is also included, please note however that this is not nestable.\n\nTherefore the below only underlines `hell` only, but not `world`.\n\n```js\nconsole.log(styles.underline('hell' + styles.reset('o') + ' world'));\n```\n\nIt is essentially the same as:\n\n```js\nconsole.log(styles.underline('hell') + styles.reset('') + 'o world');\n```\n\n\n\n## Alternatives\n\n**ansistyles** tries to meet simple use cases with a very simple API. 
However, if you need a more powerful ansi formatting tool, \nI'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js).\n", "bugs": { "url": "https://github.com/thlorenz/ansistyles/issues" }, "homepage": "https://github.com/thlorenz/ansistyles", "_id": "ansistyles@0.1.3", "dist": { "shasum": "b14f315fe763a2b3a88df9d3261a517e666c4615" }, "_from": "ansistyles@0.1.3", "_resolved": "https://registry.npmjs.org/ansistyles/-/ansistyles-0.1.3.tgz" } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansistyles/README.md�����������������������000644 �000766 �000024 �00000004201 12455173731 027031� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# ansistyles [![build status](https://secure.travis-ci.org/thlorenz/ansistyles.png)](http://next.travis-ci.org/thlorenz/ansistyles) Functions that surround a string with ansistyle codes so it prints in style. In case you need colors, like `red`, have a look at [ansicolors](https://github.com/thlorenz/ansicolors). ## Installation npm install ansistyles ## Usage ```js var styles = require('ansistyles'); console.log(styles.bright('hello world')); // prints hello world in 'bright' white console.log(styles.underline('hello world')); // prints hello world underlined console.log(styles.inverse('hello world')); // prints hello world black on white ``` ## Combining with ansicolors Get the ansicolors module: npm install ansicolors ```js var styles = require('ansistyles') , colors = require('ansicolors'); console.log( // prints hello world underlined in blue on a green background colors.bgGreen(colors.blue(styles.underline('hello world'))) ); ``` ## Tests Look at the [tests](https://github.com/thlorenz/ansistyles/blob/master/test/ansistyles.js) to see more examples and/or run them via: npm explore ansistyles && npm test ## More Styles As you can see from [here](https://github.com/thlorenz/ansistyles/blob/master/ansistyles.js#L4-L15), more styles are available, but didn't have any effect on the terminals that I tested on Mac Lion and Ubuntu Linux. I included them for completeness, but didn't show them in the examples because they seem to have no effect. ### reset A style reset function is also included, please note however that this is not nestable. Therefore the below only underlines `hell` only, but not `world`. ```js console.log(styles.underline('hell' + styles.reset('o') + ' world')); ``` It is essentially the same as: ```js console.log(styles.underline('hell') + styles.reset('') + 'o world'); ``` ## Alternatives **ansistyles** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, I'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js). 
�����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansicolors/ansicolors.js�������������������000644 �000766 �000024 �00000003036 12455173731 030247� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// ColorCodes explained: http://www.termsys.demon.co.uk/vtansi.htm 'use strict'; var colorNums = { white : 37 , black : 30 , blue : 34 , cyan : 36 , green : 32 , magenta : 35 , red : 31 , yellow : 33 , brightBlack : 90 , brightRed : 91 , brightGreen : 92 , brightYellow : 93 , brightBlue : 94 , brightMagenta : 95 , brightCyan : 96 , brightWhite : 97 } , backgroundColorNums = { bgBlack : 40 , bgRed : 41 , bgGreen : 42 , bgYellow : 43 , bgBlue : 44 , bgMagenta : 45 , bgCyan : 46 , bgWhite : 47 , bgBrightBlack : 100 , bgBrightRed : 101 , bgBrightGreen : 102 , bgBrightYellow : 103 , bgBrightBlue : 104 , bgBrightMagenta : 105 , bgBrightCyan : 106 , bgBrightWhite : 107 } , open = {} , close = {} , colors = {} ; Object.keys(colorNums).forEach(function (k) { var o = open[k] = '\u001b[' + colorNums[k] + 'm'; var c = close[k] = '\u001b[39m'; colors[k] = function (s) { return o + s + c; }; }); Object.keys(backgroundColorNums).forEach(function (k) { var o = open[k] = '\u001b[' + backgroundColorNums[k] + 'm'; var c = close[k] = '\u001b[49m'; colors[k] = function (s) { return o + s + c; }; }); module.exports = colors; colors.open = open; colors.close = close; ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansicolors/LICENSE�������������������������000644 �000766 �000024 �00000002066 12455173731 026544� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2013 Thorsten Lorenz. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansicolors/package.json��������������������000644 �000766 �000024 �00000005661 12455173731 030031� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "ansicolors", "version": "0.3.2", "description": "Functions that surround a string with ansicolor codes so it prints in color.", "main": "ansicolors.js", "scripts": { "test": "node test/*.js" }, "repository": { "type": "git", "url": "git://github.com/thlorenz/ansicolors.git" }, "keywords": [ "ansi", "colors", "highlight", "string" ], "author": { "name": "Thorsten Lorenz", "email": "thlorenz@gmx.de", "url": "thlorenz.com" }, "license": "MIT", "readmeFilename": "README.md", "gitHead": "858847ca28e8b360d9b70eee0592700fa2ab087d", "readme": "# ansicolors [![build status](https://secure.travis-ci.org/thlorenz/ansicolors.png)](http://next.travis-ci.org/thlorenz/ansicolors)\n\nFunctions that surround a string with ansicolor codes so it prints in color.\n\nIn case you need styles, like `bold`, have a look at [ansistyles](https://github.com/thlorenz/ansistyles).\n\n## Installation\n\n npm install ansicolors\n\n## Usage\n\n```js\nvar colors = require('ansicolors');\n\n// foreground colors\nvar redHerring = colors.red('herring');\nvar blueMoon = colors.blue('moon');\nvar brighBlueMoon = colors.brightBlue('moon');\n\nconsole.log(redHerring); // this will print 'herring' in red\nconsole.log(blueMoon); // this 'moon' in blue\nconsole.log(brightBlueMoon); // I think you got the idea\n\n// background colors\nconsole.log(colors.bgYellow('printed on yellow background'));\nconsole.log(colors.bgBrightBlue('printed on bright blue background'));\n\n// mixing background and foreground colors\n// below two lines have same result (order in which bg and 
fg are combined doesn't matter)\nconsole.log(colors.bgYellow(colors.blue('printed on yellow background in blue')));\nconsole.log(colors.blue(colors.bgYellow('printed on yellow background in blue')));\n```\n\n## Advanced API\n\n**ansicolors** allows you to access opening and closing escape sequences separately.\n\n```js\nvar colors = require('ansicolors');\n\nfunction inspect(obj, depth) {\n return require('util').inspect(obj, false, depth || 5, true);\n}\n\nconsole.log('open blue', inspect(colors.open.blue));\nconsole.log('close bgBlack', inspect(colors.close.bgBlack));\n\n// => open blue '\\u001b[34m'\n// close bgBlack '\\u001b[49m'\n```\n\n## Tests\n\nLook at the [tests](https://github.com/thlorenz/ansicolors/blob/master/test/ansicolors.js) to see more examples and/or run them via: \n\n npm explore ansicolors && npm test\n\n## Alternatives\n\n**ansicolors** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, \nI'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js).\n", "bugs": { "url": "https://github.com/thlorenz/ansicolors/issues" }, "homepage": "https://github.com/thlorenz/ansicolors", "_id": "ansicolors@0.3.2", "_from": "ansicolors@latest" } �������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansicolors/README.md�����������������������000644 �000766 �000024 �00000004073 12455173731 027016� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# ansicolors [![build status](https://secure.travis-ci.org/thlorenz/ansicolors.png)](http://next.travis-ci.org/thlorenz/ansicolors) Functions that surround a string with ansicolor codes so it prints in color. In case you need styles, like `bold`, have a look at [ansistyles](https://github.com/thlorenz/ansistyles). ## Installation npm install ansicolors ## Usage ```js var colors = require('ansicolors'); // foreground colors var redHerring = colors.red('herring'); var blueMoon = colors.blue('moon'); var brighBlueMoon = colors.brightBlue('moon'); console.log(redHerring); // this will print 'herring' in red console.log(blueMoon); // this 'moon' in blue console.log(brightBlueMoon); // I think you got the idea // background colors console.log(colors.bgYellow('printed on yellow background')); console.log(colors.bgBrightBlue('printed on bright blue background')); // mixing background and foreground colors // below two lines have same result (order in which bg and fg are combined doesn't matter) console.log(colors.bgYellow(colors.blue('printed on yellow background in blue'))); console.log(colors.blue(colors.bgYellow('printed on yellow background in blue'))); ``` ## Advanced API **ansicolors** allows you to access opening and closing escape sequences separately. 
```js var colors = require('ansicolors'); function inspect(obj, depth) { return require('util').inspect(obj, false, depth || 5, true); } console.log('open blue', inspect(colors.open.blue)); console.log('close bgBlack', inspect(colors.close.bgBlack)); // => open blue '\u001b[34m' // close bgBlack '\u001b[49m' ``` ## Tests Look at the [tests](https://github.com/thlorenz/ansicolors/blob/master/test/ansicolors.js) to see more examples and/or run them via: npm explore ansicolors && npm test ## Alternatives **ansicolors** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, I'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js). ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/.jshintrc�����������������������������000644 �000766 �000024 �00000000046 12455173731 026136� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "laxcomma": true, "asi": true } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/.npmignore����������������������������000644 �000766 �000024 �00000000015 12455173731 026304� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������node_modules �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/�����������������������������000755 �000766 �000024 �00000000000 12456115117 026122� 
5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/History.md����������������������������000644 �000766 �000024 �00000000606 12455173731 026276� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ 0.3.0 / 2014-05-09 ================== * package: remove "test" script and "devDependencies" * package: remove "engines" section * pacakge: remove "bin" section * package: beautify * examples: remove `starwars` example (#15) * Documented goto, horizontalAbsolute, and eraseLine methods in README.md (#12, @Jammerwoch) * add `.jshintrc` file < 0.3.0 ======= * Prehistoric ��������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/lib/����������������������������������000755 �000766 �000024 �00000000000 12456115117 025052� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/package.json��������������������������000644 �000766 �000024 �00000002412 12455173731 026576� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "ansi", "description": "Advanced ANSI formatting tool for Node.js", "keywords": [ "ansi", "formatting", "cursor", "color", "terminal", "rgb", "256", "stream" ], "version": "0.3.0", "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/ansi.js.git" }, "main": "./lib/ansi.js", "bugs": { "url": "https://github.com/TooTallNate/ansi.js/issues" }, "homepage": "https://github.com/TooTallNate/ansi.js", "_id": "ansi@0.3.0", "_shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "_from": "ansi@latest", "_npmVersion": "1.4.9", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "tarball": "http://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz", "readme": "ERROR: No README data found!" 
} ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/README.md�����������������������������000644 �000766 �000024 �00000006213 12455173731 025572� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ansi.js ========= ### Advanced ANSI formatting tool for Node.js `ansi.js` is a module for Node.js that provides an easy-to-use API for writing ANSI escape codes to `Stream` instances. ANSI escape codes are used to do fancy things in a terminal window, like render text in colors, delete characters, lines, the entire window, or hide and show the cursor, and lots more! #### Features: * 256 color support for the terminal! * Make a beep sound from your terminal! * Works with *any* writable `Stream` instance. * Allows you to move the cursor anywhere on the terminal window. * Allows you to delete existing contents from the terminal window. * Allows you to hide and show the cursor. * Converts CSS color codes and RGB values into ANSI escape codes. * Low-level; you are in control of when escape codes are used, it's not abstracted. Installation ------------ Install with `npm`: ``` bash $ npm install ansi ``` Example ------- ``` js var ansi = require('ansi') , cursor = ansi(process.stdout) // You can chain your calls forever: cursor .red() // Set font color to red .bg.grey() // Set background color to grey .write('Hello World!') // Write 'Hello World!' to stdout .bg.reset() // Reset the bgcolor before writing the trailing \n, // to avoid Terminal glitches .write('\n') // And a final \n to wrap things up // Rendering modes are persistent: cursor.hex('#660000').bold().underline() // You can use the regular logging functions, text will be green: console.log('This is blood red, bold text') // To reset just the foreground color: cursor.fg.reset() console.log('This will still be bold') // to go to a location (x,y) on the console // note: 1-indexed, not 0-indexed: cursor.goto(10, 5).write('Five down, ten over') // to clear the current line: cursor.horizontalAbsolute(0).eraseLine().write('Starting again') // to go to a different column on the current line: cursor.horizontalAbsolute(5).write('column five') // Clean up after yourself! cursor.reset() ``` License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/lib/ansi.js���������������������������000644 �000766 �000024 �00000017434 12455173731 026360� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ /** * References: * * - http://en.wikipedia.org/wiki/ANSI_escape_code * - http://www.termsys.demon.co.uk/vtansi.htm * */ /** * Module dependencies. */ var emitNewlineEvents = require('./newlines') , prefix = '\x1b[' // For all escape codes , suffix = 'm' // Only for color codes /** * The ANSI escape sequences. */ var codes = { up: 'A' , down: 'B' , forward: 'C' , back: 'D' , nextLine: 'E' , previousLine: 'F' , horizontalAbsolute: 'G' , eraseData: 'J' , eraseLine: 'K' , scrollUp: 'S' , scrollDown: 'T' , savePosition: 's' , restorePosition: 'u' , queryPosition: '6n' , hide: '?25l' , show: '?25h' } /** * Rendering ANSI codes. */ var styles = { bold: 1 , italic: 3 , underline: 4 , inverse: 7 } /** * The negating ANSI code for the rendering modes. */ var reset = { bold: 22 , italic: 23 , underline: 24 , inverse: 27 } /** * The standard, styleable ANSI colors. */ var colors = { white: 37 , black: 30 , blue: 34 , cyan: 36 , green: 32 , magenta: 35 , red: 31 , yellow: 33 , grey: 90 , brightBlack: 90 , brightRed: 91 , brightGreen: 92 , brightYellow: 93 , brightBlue: 94 , brightMagenta: 95 , brightCyan: 96 , brightWhite: 97 } /** * Creates a Cursor instance based off the given `writable stream` instance. */ function ansi (stream, options) { if (stream._ansicursor) { return stream._ansicursor } else { return stream._ansicursor = new Cursor(stream, options) } } module.exports = exports = ansi /** * The `Cursor` class. 
*/ function Cursor (stream, options) { if (!(this instanceof Cursor)) { return new Cursor(stream, options) } if (typeof stream != 'object' || typeof stream.write != 'function') { throw new Error('a valid Stream instance must be passed in') } // the stream to use this.stream = stream // when 'enabled' is false then all the functions are no-ops except for write() this.enabled = options && options.enabled if (typeof this.enabled === 'undefined') { this.enabled = stream.isTTY } this.enabled = !!this.enabled // then `buffering` is true, then `write()` calls are buffered in // memory until `flush()` is invoked this.buffering = !!(options && options.buffering) this._buffer = [] // controls the foreground and background colors this.fg = this.foreground = new Colorer(this, 0) this.bg = this.background = new Colorer(this, 10) // defaults this.Bold = false this.Italic = false this.Underline = false this.Inverse = false // keep track of the number of "newlines" that get encountered this.newlines = 0 emitNewlineEvents(stream) stream.on('newline', function () { this.newlines++ }.bind(this)) } exports.Cursor = Cursor /** * Helper function that calls `write()` on the underlying Stream. * Returns `this` instead of the write() return value to keep * the chaining going. */ Cursor.prototype.write = function (data) { if (this.buffering) { this._buffer.push(arguments) } else { this.stream.write.apply(this.stream, arguments) } return this } /** * Buffer `write()` calls into memory. * * @api public */ Cursor.prototype.buffer = function () { this.buffering = true return this } /** * Write out the in-memory buffer. * * @api public */ Cursor.prototype.flush = function () { this.buffering = false var str = this._buffer.map(function (args) { if (args.length != 1) throw new Error('unexpected args length! ' + args.length); return args[0]; }).join(''); this._buffer.splice(0); // empty this.write(str); return this } /** * The `Colorer` class manages both the background and foreground colors. */ function Colorer (cursor, base) { this.current = null this.cursor = cursor this.base = base } exports.Colorer = Colorer /** * Write an ANSI color code, ensuring that the same code doesn't get rewritten. */ Colorer.prototype._setColorCode = function setColorCode (code) { var c = String(code) if (this.current === c) return this.cursor.enabled && this.cursor.write(prefix + c + suffix) this.current = c return this } /** * Set up the positional ANSI codes. */ Object.keys(codes).forEach(function (name) { var code = String(codes[name]) Cursor.prototype[name] = function () { var c = code if (arguments.length > 0) { c = toArray(arguments).map(Math.round).join(';') + code } this.enabled && this.write(prefix + c) return this } }) /** * Set up the functions for the rendering ANSI codes. */ Object.keys(styles).forEach(function (style) { var name = style[0].toUpperCase() + style.substring(1) , c = styles[style] , r = reset[style] Cursor.prototype[style] = function () { if (this[name]) return this.enabled && this.write(prefix + c + suffix) this[name] = true return this } Cursor.prototype['reset' + name] = function () { if (!this[name]) return this.enabled && this.write(prefix + r + suffix) this[name] = false return this } }) /** * Setup the functions for the standard colors. */ Object.keys(colors).forEach(function (color) { var code = colors[color] Colorer.prototype[color] = function () { this._setColorCode(this.base + code) return this.cursor } Cursor.prototype[color] = function () { return this.foreground[color]() } }) /** * Makes a beep sound! 
*/ Cursor.prototype.beep = function () { this.enabled && this.write('\x07') return this } /** * Moves cursor to specific position */ Cursor.prototype.goto = function (x, y) { x = x | 0 y = y | 0 this.enabled && this.write(prefix + y + ';' + x + 'H') return this } /** * Resets the color. */ Colorer.prototype.reset = function () { this._setColorCode(this.base + 39) return this.cursor } /** * Resets all ANSI formatting on the stream. */ Cursor.prototype.reset = function () { this.enabled && this.write(prefix + '0' + suffix) this.Bold = false this.Italic = false this.Underline = false this.Inverse = false this.foreground.current = null this.background.current = null return this } /** * Sets the foreground color with the given RGB values. * The closest match out of the 216 colors is picked. */ Colorer.prototype.rgb = function (r, g, b) { var base = this.base + 38 , code = rgb(r, g, b) this._setColorCode(base + ';5;' + code) return this.cursor } /** * Same as `cursor.fg.rgb(r, g, b)`. */ Cursor.prototype.rgb = function (r, g, b) { return this.foreground.rgb(r, g, b) } /** * Accepts CSS color codes for use with ANSI escape codes. * For example: `#FF000` would be bright red. */ Colorer.prototype.hex = function (color) { return this.rgb.apply(this, hex(color)) } /** * Same as `cursor.fg.hex(color)`. */ Cursor.prototype.hex = function (color) { return this.foreground.hex(color) } // UTIL FUNCTIONS // /** * Translates a 255 RGB value to a 0-5 ANSI RGV value, * then returns the single ANSI color code to use. */ function rgb (r, g, b) { var red = r / 255 * 5 , green = g / 255 * 5 , blue = b / 255 * 5 return rgb5(red, green, blue) } /** * Turns rgb 0-5 values into a single ANSI color code to use. */ function rgb5 (r, g, b) { var red = Math.round(r) , green = Math.round(g) , blue = Math.round(b) return 16 + (red*36) + (green*6) + blue } /** * Accepts a hex CSS color code string (# is optional) and * translates it into an Array of 3 RGB 0-255 values, which * can then be used with rgb(). */ function hex (color) { var c = color[0] === '#' ? color.substring(1) : color , r = c.substring(0, 2) , g = c.substring(2, 4) , b = c.substring(4, 6) return [parseInt(r, 16), parseInt(g, 16), parseInt(b, 16)] } /** * Turns an array-like object into a real array. */ function toArray (a) { var i = 0 , l = a.length , rtn = [] for (; i<l; i++) { rtn.push(a[i]) } return rtn } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/lib/newlines.js�����������������������000644 �000766 �000024 �00000002770 12455173731 027247� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ /** * Accepts any node Stream instance and hijacks its "write()" function, * so that it can count any newlines that get written to the output. * * When a '\n' byte is encountered, then a "newline" event will be emitted * on the stream, with no arguments. It is up to the listeners to determine * any necessary deltas required for their use-case. 
* * Ex: * * var cursor = ansi(process.stdout) * , ln = 0 * process.stdout.on('newline', function () { * ln++ * }) */ /** * Module dependencies. */ var assert = require('assert') var NEWLINE = '\n'.charCodeAt(0) function emitNewlineEvents (stream) { if (stream._emittingNewlines) { // already emitting newline events return } var write = stream.write stream.write = function (data) { // first write the data var rtn = write.apply(stream, arguments) if (stream.listeners('newline').length > 0) { var len = data.length , i = 0 // now try to calculate any deltas if (typeof data == 'string') { for (; i<len; i++) { processByte(stream, data.charCodeAt(i)) } } else { // buffer for (; i<len; i++) { processByte(stream, data[i]) } } } return rtn } stream._emittingNewlines = true } module.exports = emitNewlineEvents /** * Processes an individual byte being written to a stream */ function processByte (stream, b) { assert.equal(typeof b, 'number') if (b === NEWLINE) { stream.emit('newline') } } ��������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/beep/������������������������000755 �000766 �000024 �00000000000 12456115117 027035� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/clear/�����������������������000755 �000766 �000024 �00000000000 12456115117 027210� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/cursorPosition.js������������000755 �000766 �000024 �00000001200 12455173731 031523� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var tty = require('tty') var cursor = require('../')(process.stdout) // listen for the queryPosition report on stdin process.stdin.resume() raw(true) process.stdin.once('data', function (b) { var match = /\[(\d+)\;(\d+)R$/.exec(b.toString()) if (match) { var xy = match.slice(1, 3).reverse().map(Number) console.error(xy) } // cleanup and close stdin raw(false) process.stdin.pause() }) // send the query position request code to stdout cursor.queryPosition() function raw (mode) { if (process.stdin.setRawMode) { process.stdin.setRawMode(mode) } else { tty.setRawMode(mode) } } 
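The cursorPosition.js example above sends the `queryPosition` escape code and parses the terminal's `ESC[row;colR` reply from stdin. Wrapped up as a callback helper it might look like the following sketch — the `getCursorPosition` name and the `{x, y}` result shape are illustrative assumptions, not part of the module:

```js
var tty = require('tty')
var ansi = require('ansi')

// hypothetical helper built on the same query/response round-trip as
// examples/cursorPosition.js above
function getCursorPosition (cb) {
  var cursor = ansi(process.stdout)

  process.stdin.resume()
  raw(true)
  process.stdin.once('data', function (b) {
    raw(false)
    process.stdin.pause()

    // the terminal answers with ESC [ row ; col R
    var match = /\[(\d+);(\d+)R$/.exec(b.toString())
    if (!match) return cb(new Error('unexpected response: ' + b))
    cb(null, { x: Number(match[2]), y: Number(match[1]) })
  })

  // ask the terminal where the cursor currently is
  cursor.queryPosition()
}

function raw (mode) {
  if (process.stdin.setRawMode) process.stdin.setRawMode(mode)
  else tty.setRawMode(mode)
}

getCursorPosition(function (err, pos) {
  if (err) throw err
  console.log('cursor is at column %d, row %d', pos.x, pos.y)
})
```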
������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/progress/��������������������000755 �000766 �000024 �00000000000 12456115117 027766� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/progress/index.js������������000644 �000766 �000024 �00000003274 12455173731 031446� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var assert = require('assert') , ansi = require('../../') function Progress (stream, width) { this.cursor = ansi(stream) this.delta = this.cursor.newlines this.width = width | 0 || 10 this.open = '[' this.close = ']' this.complete = '█' this.incomplete = '_' // initial render this.progress = 0 } Object.defineProperty(Progress.prototype, 'progress', { get: get , set: set , configurable: true , enumerable: true }) function get () { return this._progress } function set (v) { this._progress = Math.max(0, Math.min(v, 100)) var w = this.width - this.complete.length - this.incomplete.length , n = w * (this._progress / 100) | 0 , i = w - n , com = c(this.complete, n) , inc = c(this.incomplete, i) , delta = this.cursor.newlines - this.delta assert.equal(com.length + inc.length, w) if (delta > 0) { this.cursor.up(delta) this.delta = this.cursor.newlines } this.cursor .horizontalAbsolute(0) .eraseLine(2) .fg.white() .write(this.open) .fg.grey() .bold() .write(com) .resetBold() .write(inc) .fg.white() .write(this.close) .fg.reset() .write('\n') } function c (char, length) { return Array.apply(null, Array(length)).map(function () { return char }).join('') } // Usage var width = parseInt(process.argv[2], 10) || process.stdout.getWindowSize()[0] / 2 , p = new Progress(process.stdout, width) ;(function tick () { p.progress += Math.random() * 5 p.cursor .eraseLine(2) .write('Progress: ') .bold().write(p.progress.toFixed(2)) .write('%') .resetBold() .write('\n') if (p.progress < 100) setTimeout(tick, 100) })() ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/clear/index.js���������������000755 �000766 �000024 �00000000544 12455173731 030670� 
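The progress bar above relies on the newline counter installed by lib/newlines.js: it records `cursor.newlines` when it renders, then moves the cursor back up by the delta on the next render so the bar repaints in place. The core of that pattern, pulled out as a sketch (the `redraw` helper is illustrative, not part of the module):

```js
var ansi = require('ansi')
var cursor = ansi(process.stdout)
var last = cursor.newlines

function redraw (text) {
  // lines written to stdout since the previous redraw
  var delta = cursor.newlines - last
  if (delta > 0) {
    cursor.up(delta)          // jump back to where the previous frame started
    last = cursor.newlines
  }
  cursor.horizontalAbsolute(0).eraseLine(2).write(text + '\n')
}

redraw('step 1 of 3')
redraw('step 2 of 3')   // overwrites the previous line instead of appending
redraw('step 3 of 3')
```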
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node /** * Like GNU ncurses "clear" command. * https://github.com/mscdex/node-ncurses/blob/master/deps/ncurses/progs/clear.c */ process.title = 'clear' function lf () { return '\n' } require('../../')(process.stdout) .write(Array.apply(null, Array(process.stdout.getWindowSize()[1])).map(lf).join('')) .eraseData(2) .goto(1, 1) ������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/ansi/examples/beep/index.js����������������000755 �000766 �000024 �00000000512 12455173731 030510� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node /** * Invokes the terminal "beep" sound once per second on every exact second. */ process.title = 'beep' var cursor = require('../../')(process.stdout) function beep () { cursor.beep() setTimeout(beep, 1000 - (new Date()).getMilliseconds()) } setTimeout(beep, 1000 - (new Date()).getMilliseconds()) ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/abbrev.js���������������������������000644 �000766 �000024 �00000003344 12455173731 026423� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = abbrev.abbrev = abbrev abbrev.monkeyPatch = monkeyPatch function monkeyPatch () { Object.defineProperty(Array.prototype, 'abbrev', { value: function () { return abbrev(this) }, enumerable: false, configurable: true, writable: true }) Object.defineProperty(Object.prototype, 'abbrev', { value: function () { return abbrev(Object.keys(this)) }, enumerable: false, configurable: true, writable: true }) } function abbrev (list) { if (arguments.length !== 1 || !Array.isArray(list)) { list = Array.prototype.slice.call(arguments, 0) } for (var i = 0, l = list.length, args = [] ; i < l ; i ++) { args[i] = typeof list[i] === "string" ? 
list[i] : String(list[i]) } // sort them lexicographically, so that they're next to their nearest kin args = args.sort(lexSort) // walk through each, seeing how much it has in common with the next and previous var abbrevs = {} , prev = "" for (var i = 0, l = args.length ; i < l ; i ++) { var current = args[i] , next = args[i + 1] || "" , nextMatches = true , prevMatches = true if (current === next) continue for (var j = 0, cl = current.length ; j < cl ; j ++) { var curChar = current.charAt(j) nextMatches = nextMatches && curChar === next.charAt(j) prevMatches = prevMatches && curChar === prev.charAt(j) if (!nextMatches && !prevMatches) { j ++ break } } prev = current if (j === cl) { abbrevs[current] = current continue } for (var a = current.substr(0, j) ; j <= cl ; j ++) { abbrevs[a] = current a += current.charAt(j) } } return abbrevs } function lexSort (a, b) { return a === b ? 0 : a > b ? 1 : -1 } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/CONTRIBUTING.md���������������������000644 �000766 �000024 �00000000173 12455173731 027052� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ To get started, <a href="http://www.clahub.com/agreements/isaacs/abbrev-js">sign the Contributor License Agreement</a>. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/LICENSE�����������������������������000644 �000766 �000024 �00000002104 12455173731 025622� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/package.json������������������������000644 �000766 �000024 �00000002111 12455173731 027101� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "abbrev", "version": "1.0.5", "description": "Like ruby's abbrev module, but in js", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me" }, "main": "abbrev.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "http://github.com/isaacs/abbrev-js" }, "license": { "type": "MIT", "url": "https://github.com/isaacs/abbrev-js/raw/master/LICENSE" }, "bugs": { "url": "https://github.com/isaacs/abbrev-js/issues" }, "homepage": "https://github.com/isaacs/abbrev-js", "_id": "abbrev@1.0.5", "_shasum": "5d8257bd9ebe435e698b2fa431afde4fe7b10b03", "_from": "abbrev@latest", "_npmVersion": "1.4.7", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "5d8257bd9ebe435e698b2fa431afde4fe7b10b03", "tarball": "http://registry.npmjs.org/abbrev/-/abbrev-1.0.5.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/abbrev/-/abbrev-1.0.5.tgz" } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/README.md���������������������������000644 �000766 �000024 �00000000763 12455173731 026105� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������# abbrev-js Just like [ruby's Abbrev](http://apidock.com/ruby/Abbrev). 
Usage: var abbrev = require("abbrev"); abbrev("foo", "fool", "folding", "flop"); // returns: { fl: 'flop' , flo: 'flop' , flop: 'flop' , fol: 'folding' , fold: 'folding' , foldi: 'folding' , foldin: 'folding' , folding: 'folding' , foo: 'foo' , fool: 'fool' } This is handy for command-line scripts, or other cases where you want to be able to accept shorthands. �������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/abbrev/test.js�����������������������������000644 �000766 �000024 �00000002055 12455173731 026137� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var abbrev = require('./abbrev.js') var assert = require("assert") var util = require("util") console.log("TAP Version 13") var count = 0 function test (list, expect) { count++ var actual = abbrev(list) assert.deepEqual(actual, expect, "abbrev("+util.inspect(list)+") === " + util.inspect(expect) + "\n"+ "actual: "+util.inspect(actual)) actual = abbrev.apply(exports, list) assert.deepEqual(abbrev.apply(exports, list), expect, "abbrev("+list.map(JSON.stringify).join(",")+") === " + util.inspect(expect) + "\n"+ "actual: "+util.inspect(actual)) console.log('ok - ' + list.join(' ')) } test([ "ruby", "ruby", "rules", "rules", "rules" ], { rub: 'ruby' , ruby: 'ruby' , rul: 'rules' , rule: 'rules' , rules: 'rules' }) test(["fool", "foom", "pool", "pope"], { fool: 'fool' , foom: 'foom' , poo: 'pool' , pool: 'pool' , pop: 'pope' , pope: 'pope' }) test(["a", "ab", "abc", "abcd", "abcde", "acde"], { a: 'a' , ab: 'ab' , abc: 'abc' , abcd: 'abcd' , abcde: 'abcde' , ac: 'acde' , acd: 'acde' , acde: 'acde' }) console.log("0..%d", count) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/mkdirp��������������������������������000644 �000766 �000024 �00000000421 12455173731 025413� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../mkdirp/bin/cmd.js" "$@" ret=$? else node "$basedir/../mkdirp/bin/cmd.js" "$@" ret=$? 
fi exit $ret �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/mkdirp.cmd����������������������������000644 �000766 �000024 �00000000260 12455173731 026156� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\mkdirp\bin\cmd.js" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\mkdirp\bin\cmd.js" %* )������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/node-gyp������������������������������000644 �000766 �000024 �00000000437 12455173731 025656� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../node-gyp/bin/node-gyp.js" "$@" ret=$? else node "$basedir/../node-gyp/bin/node-gyp.js" "$@" ret=$? 
fi exit $ret ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/node-gyp.cmd��������������������������000644 �000766 �000024 �00000000276 12455173731 026421� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\node-gyp\bin\node-gyp.js" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\node-gyp\bin\node-gyp.js" %* )����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/nopt����������������������������������000644 �000766 �000024 �00000000417 12455173731 025112� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../nopt/bin/nopt.js" "$@" ret=$? else node "$basedir/../nopt/bin/nopt.js" "$@" ret=$? 
fi exit $ret �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/nopt.cmd������������������������������000644 �000766 �000024 �00000000256 12455173731 025655� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\nopt\bin\nopt.js" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\nopt\bin\nopt.js" %* )��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/opener��������������������������������000644 �000766 �000024 �00000000417 12455173731 025422� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../opener/opener.js" "$@" ret=$? else node "$basedir/../opener/opener.js" "$@" ret=$? 
fi exit $ret �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/opener.cmd����������������������������000644 �000766 �000024 �00000000256 12455173731 026165� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\opener\opener.js" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\opener\opener.js" %* )��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/rimraf��������������������������������000644 �000766 �000024 �00000000411 12455173731 025404� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../rimraf/bin.js" "$@" ret=$? else node "$basedir/../rimraf/bin.js" "$@" ret=$? 
fi exit $ret �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/rimraf.cmd����������������������������000644 �000766 �000024 �00000000250 12455173731 026147� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\rimraf\bin.js" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\rimraf\bin.js" %* )��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/semver��������������������������������000644 �000766 �000024 �00000000421 12455173731 025426� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../semver/bin/semver" "$@" ret=$? else node "$basedir/../semver/bin/semver" "$@" ret=$? 
fi exit $ret �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/semver.cmd����������������������������000644 �000766 �000024 �00000000260 12455173731 026171� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\semver\bin\semver" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\semver\bin\semver" %* )������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/which���������������������������������000644 �000766 �000024 �00000000415 12455173731 025232� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node" ]; then "$basedir/node" "$basedir/../which/bin/which" "$@" ret=$? else node "$basedir/../which/bin/which" "$@" ret=$? 
fi exit $ret ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/node_modules/.bin/which.cmd�����������������������������000644 �000766 �000024 �00000000254 12455173731 025775� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\..\which\bin\which" %* ) ELSE ( @SETLOCAL @SET PATHEXT=%PATHEXT:;.JS;=;% node "%~dp0\..\which\bin\which" %* )����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/�����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022304� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/�����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022306� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/�����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022310� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/�����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022312� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-coding-style.7�����������������������������000644 �000766 �000024 �00000012607 12455173731 025606� 
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-CODING\-STYLE" "7" "January 2015" "" "" .SH "NAME" \fBnpm-coding-style\fR \- npm's "funny" coding style .SH DESCRIPTION .P npm's coding style is a bit unconventional\. It is not different for difference's sake, but rather a carefully crafted style that is designed to reduce visual clutter and make bugs more apparent\. .P If you want to contribute to npm (which is very encouraged), you should make your code conform to npm's style\. .P Note: this concerns npm's code not the specific packages that you can download from the npm registry\. .SH Line Length .P Keep lines shorter than 80 characters\. It's better for lines to be too short than to be too long\. Break up long lists, objects, and other statements onto multiple lines\. .SH Indentation .P Two\-spaces\. Tabs are better, but they look like hell in web browsers (and on GitHub), and node uses 2 spaces, so that's that\. .P Configure your editor appropriately\. .SH Curly braces .P Curly braces belong on the same line as the thing that necessitates them\. .P Bad: .P .RS 2 .nf function () { .fi .RE .P Good: .P .RS 2 .nf function () { .fi .RE .P If a block needs to wrap to the next line, use a curly brace\. Don't use it if it doesn't\. .P Bad: .P .RS 2 .nf if (foo) { bar() } while (foo) bar() .fi .RE .P Good: .P .RS 2 .nf if (foo) bar() while (foo) { bar() } .fi .RE .SH Semicolons .P Don't use them except in four situations: .RS 0 .IP \(bu 2 \fBfor (;;)\fR loops\. They're actually required\. .IP \(bu 2 null loops like: \fBwhile (something) ;\fR (But you'd better have a good reason for doing that\.) .IP \(bu 2 \fBcase "foo": doSomething(); break\fR .IP \(bu 2 In front of a leading \fB(\fR or \fB[\fR at the start of the line\. This prevents the expression from being interpreted as a function call or property access, respectively\. .RE .P Some examples of good semicolon usage: .P .RS 2 .nf ;(x || y)\.doSomething() ;[a, b, c]\.forEach(doSomething) for (var i = 0; i < 10; i ++) { switch (state) { case "begin": start(); continue case "end": finish(); break default: throw new Error("unknown state") } end() } .fi .RE .P Note that starting lines with \fB\-\fR and \fB+\fR also should be prefixed with a semicolon, but this is much less common\. .SH Comma First .P If there is a list of things separated by commas, and it wraps across multiple lines, put the comma at the start of the next line, directly below the token that starts the list\. Put the final token in the list on a line by itself\. For example: .P .RS 2 .nf var magicWords = [ "abracadabra" , "gesundheit" , "ventrilo" ] , spells = { "fireball" : function () { setOnFire() } , "water" : function () { putOut() } } , a = 1 , b = "abc" , etc , somethingElse .fi .RE .SH Whitespace .P Put a single space in front of ( for anything other than a function call\. Also use a single space wherever it makes things more readable\. .P Don't leave trailing whitespace at the end of lines\. Don't indent empty lines\. Don't use more spaces than are helpful\. .SH Functions .P Use named functions\. They make stack traces a lot easier to read\. .SH Callbacks, Sync/async Style .P Use the asynchronous/non\-blocking versions of things as much as possible\. 
It might make more sense for npm to use the synchronous fs APIs, but this way, the fs and http and child process stuff all uses the same callback\-passing methodology\. .P The callback should always be the last argument in the list\. Its first argument is the Error or null\. .P Be very careful never to ever ever throw anything\. It's worse than useless\. Just send the error message back as the first argument to the callback\. .SH Errors .P Always create a new Error object with your message\. Don't just return a string message to the callback\. Stack traces are handy\. .SH Logging .P Logging is done using the npmlog \fIhttps://github\.com/npm/npmlog\fR utility\. .P Please clean up logs when they are no longer helpful\. In particular, logging the same object over and over again is not helpful\. Logs should report what's happening so that it's easier to track down where a fault occurs\. .P Use appropriate log levels\. See npm help 7 \fBnpm\-config\fR and search for "loglevel"\. .SH Case, naming, etc\. .P Use \fBlowerCamelCase\fR for multiword identifiers when they refer to objects, functions, methods, properties, or anything not specified in this section\. .P Use \fBUpperCamelCase\fR for class names (things that you'd pass to "new")\. .P Use \fBall\-lower\-hyphen\-css\-case\fR for multiword filenames and config keys\. .P Use named functions\. They make stack traces easier to follow\. .P Use \fBCAPS_SNAKE_CASE\fR for constants, things that should never change and are rarely used\. .P Use a single uppercase letter for function names where the function would normally be anonymous, but needs to call itself recursively\. It makes it clear that it's a "throwaway" function\. .SH null, undefined, false, 0 .P Boolean variables and functions should always be either \fBtrue\fR or \fBfalse\fR\|\. Don't set it to 0 unless it's supposed to be a number\. .P When something is intentionally missing or removed, set it to \fBnull\fR\|\. .P Don't set things to \fBundefined\fR\|\. Reserve that value to mean "not yet set to anything\." .P Boolean objects are verboten\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help npm .RE �������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-config.7�����������������������������������000644 �000766 �000024 �00000057670 12455173731 024463� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-CONFIG" "7" "January 2015" "" "" .SH "NAME" \fBnpm-config\fR \- More than you probably want to know about npm configuration .SH DESCRIPTION .P npm gets its configuration values from 6 sources, in this priority: .SS Command Line Flags .P Putting \fB\-\-foo bar\fR on the command line sets the \fBfoo\fR configuration parameter to \fB"bar"\fR\|\. A \fB\-\-\fR argument tells the cli parser to stop reading flags\. A \fB\-\-flag\fR parameter that is at the \fIend\fR of the command will be given the value of \fBtrue\fR\|\. .SS Environment Variables .P Any environment variables that start with \fBnpm_config_\fR will be interpreted as a configuration parameter\. 
For example, putting \fBnpm_config_foo=bar\fR in your environment will set the \fBfoo\fR configuration parameter to \fBbar\fR\|\. Any environment configurations that are not given a value will be given the value of \fBtrue\fR\|\. Config values are case\-insensitive, so \fBNPM_CONFIG_FOO=bar\fR will work the same\. .SS npmrc Files .P The four relevant files are: .RS 0 .IP \(bu 2 per\-project config file (/path/to/my/project/\.npmrc) .IP \(bu 2 per\-user config file (~/\.npmrc) .IP \(bu 2 global config file ($PREFIX/npmrc) .IP \(bu 2 npm builtin config file (/path/to/npm/npmrc) .RE .P See npm help 5 npmrc for more details\. .SS Default Configs .P A set of configuration parameters that are internal to npm, and are defaults if nothing else is specified\. .SH Shorthands and Other CLI Niceties .P The following shorthands are parsed on the command\-line: .RS 0 .IP \(bu 2 \fB\-v\fR: \fB\-\-version\fR .IP \(bu 2 \fB\-h\fR, \fB\-?\fR, \fB\-\-help\fR, \fB\-H\fR: \fB\-\-usage\fR .IP \(bu 2 \fB\-s\fR, \fB\-\-silent\fR: \fB\-\-loglevel silent\fR .IP \(bu 2 \fB\-q\fR, \fB\-\-quiet\fR: \fB\-\-loglevel warn\fR .IP \(bu 2 \fB\-d\fR: \fB\-\-loglevel info\fR .IP \(bu 2 \fB\-dd\fR, \fB\-\-verbose\fR: \fB\-\-loglevel verbose\fR .IP \(bu 2 \fB\-ddd\fR: \fB\-\-loglevel silly\fR .IP \(bu 2 \fB\-g\fR: \fB\-\-global\fR .IP \(bu 2 \fB\-C\fR: \fB\-\-prefix\fR .IP \(bu 2 \fB\-l\fR: \fB\-\-long\fR .IP \(bu 2 \fB\-m\fR: \fB\-\-message\fR .IP \(bu 2 \fB\-p\fR, \fB\-\-porcelain\fR: \fB\-\-parseable\fR .IP \(bu 2 \fB\-reg\fR: \fB\-\-registry\fR .IP \(bu 2 \fB\-v\fR: \fB\-\-version\fR .IP \(bu 2 \fB\-f\fR: \fB\-\-force\fR .IP \(bu 2 \fB\-desc\fR: \fB\-\-description\fR .IP \(bu 2 \fB\-S\fR: \fB\-\-save\fR .IP \(bu 2 \fB\-D\fR: \fB\-\-save\-dev\fR .IP \(bu 2 \fB\-O\fR: \fB\-\-save\-optional\fR .IP \(bu 2 \fB\-B\fR: \fB\-\-save\-bundle\fR .IP \(bu 2 \fB\-E\fR: \fB\-\-save\-exact\fR .IP \(bu 2 \fB\-y\fR: \fB\-\-yes\fR .IP \(bu 2 \fB\-n\fR: \fB\-\-yes false\fR .IP \(bu 2 \fBll\fR and \fBla\fR commands: \fBls \-\-long\fR .RE .P If the specified configuration param resolves unambiguously to a known configuration parameter, then it is expanded to that configuration parameter\. For example: .P .RS 2 .nf npm ls \-\-par # same as: npm ls \-\-parseable .fi .RE .P If multiple single\-character shorthands are strung together, and the resulting combination is unambiguously not some other configuration param, then it is expanded to its various component pieces\. For example: .P .RS 2 .nf npm ls \-gpld # same as: npm ls \-\-global \-\-parseable \-\-long \-\-loglevel info .fi .RE .SH Per\-Package Config Settings .P When running scripts (see npm help 7 \fBnpm\-scripts\fR) the package\.json "config" keys are overwritten in the environment if there is a config param of \fB<name>[@<version>]:<key>\fR\|\. For example, if the package\.json has this: .P .RS 2 .nf { "name" : "foo" , "config" : { "port" : "8080" } , "scripts" : { "start" : "node server\.js" } } .fi .RE .P and the server\.js is this: .P .RS 2 .nf http\.createServer(\.\.\.)\.listen(process\.env\.npm_package_config_port) .fi .RE .P then the user could change the behavior by doing: .P .RS 2 .nf npm config set foo:port 80 .fi .RE .P See npm help 5 package\.json for more information\. .SH Config Settings .SS always\-auth .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Force npm to always require authentication when accessing the registry, even for \fBGET\fR requests\. 
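.P
A minimal illustration of turning this on, either from the command line or in an npmrc file:
.P
.RS 2
.nf
npm config set always\-auth true
# or, equivalently, as a line in an npmrc file:
always\-auth=true
.fi
.RE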
.SS bin\-links .RS 0 .IP \(bu 2 Default: \fBtrue\fR .IP \(bu 2 Type: Boolean .RE .P Tells npm to create symlinks (or \fB\|\.cmd\fR shims on Windows) for package executables\. .P Set to false to have it not do this\. This can be used to work around the fact that some file systems don't support symlinks, even on ostensibly Unix systems\. .SS browser .RS 0 .IP \(bu 2 Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR .IP \(bu 2 Type: String .RE .P The browser that is called by the \fBnpm docs\fR command to open websites\. .SS ca .RS 0 .IP \(bu 2 Default: The npm CA certificate .IP \(bu 2 Type: String, Array or null .RE .P The Certificate Authority signing certificate that is trusted for SSL connections to the registry\. Values should be in PEM format with newlines replaced by the string "\\n"\. For example: .P .RS 2 .nf ca="\-\-\-\-\-BEGIN CERTIFICATE\-\-\-\-\-\\nXXXX\\nXXXX\\n\-\-\-\-\-END CERTIFICATE\-\-\-\-\-" .fi .RE .P Set to \fBnull\fR to only allow "known" registrars, or to a specific CA cert to trust only that specific signing authority\. .P Multiple CAs can be trusted by specifying an array of certificates: .P .RS 2 .nf ca[]="\.\.\." ca[]="\.\.\." .fi .RE .P See also the \fBstrict\-ssl\fR config\. .SS cafile .RS 0 .IP \(bu 2 Default: \fBnull\fR .IP \(bu 2 Type: path .RE .P A path to a file containing one or multiple Certificate Authority signing certificates\. Similar to the \fBca\fR setting, but allows for multiple CA's, as well as for the CA information to be stored in a file on disk\. .SS cache .RS 0 .IP \(bu 2 Default: Windows: \fB%AppData%\\npm\-cache\fR, Posix: \fB~/\.npm\fR .IP \(bu 2 Type: path .RE .P The location of npm's cache directory\. See npm help \fBnpm\-cache\fR .SS cache\-lock\-stale .RS 0 .IP \(bu 2 Default: 60000 (1 minute) .IP \(bu 2 Type: Number .RE .P The number of ms before cache folder lockfiles are considered stale\. .SS cache\-lock\-retries .RS 0 .IP \(bu 2 Default: 10 .IP \(bu 2 Type: Number .RE .P Number of times to retry to acquire a lock on cache folder lockfiles\. .SS cache\-lock\-wait .RS 0 .IP \(bu 2 Default: 10000 (10 seconds) .IP \(bu 2 Type: Number .RE .P Number of ms to wait for cache lock files to expire\. .SS cache\-max .RS 0 .IP \(bu 2 Default: Infinity .IP \(bu 2 Type: Number .RE .P The maximum time (in seconds) to keep items in the registry cache before re\-checking against the registry\. .P Note that no purging is done unless the \fBnpm cache clean\fR command is explicitly used, and that only GET requests use the cache\. .SS cache\-min .RS 0 .IP \(bu 2 Default: 10 .IP \(bu 2 Type: Number .RE .P The minimum time (in seconds) to keep items in the registry cache before re\-checking against the registry\. .P Note that no purging is done unless the \fBnpm cache clean\fR command is explicitly used, and that only GET requests use the cache\. .SS cert .RS 0 .IP \(bu 2 Default: \fBnull\fR .IP \(bu 2 Type: String .RE .P A client certificate to pass when accessing the registry\. .SS color .RS 0 .IP \(bu 2 Default: true on Posix, false on Windows .IP \(bu 2 Type: Boolean or \fB"always"\fR .RE .P If false, never shows colors\. If \fB"always"\fR then always shows colors\. If true, then only prints color codes for tty file descriptors\. .SS depth .RS 0 .IP \(bu 2 Default: Infinity .IP \(bu 2 Type: Number .RE .P The depth to go when recursing directories for \fBnpm ls\fR and \fBnpm cache ls\fR\|\. 
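.P
For example, a quick way to list only top\-level packages:
.P
.RS 2
.nf
npm ls \-\-depth=0
.fi
.RE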
.SS description .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Show the description in \fBnpm search\fR .SS dev .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Install \fBdev\-dependencies\fR along with packages\. .P Note that \fBdev\-dependencies\fR are also installed if the \fBnpat\fR flag is set\. .SS editor .RS 0 .IP \(bu 2 Default: \fBEDITOR\fR environment variable if set, or \fB"vi"\fR on Posix, or \fB"notepad"\fR on Windows\. .IP \(bu 2 Type: path .RE .P The command to run for \fBnpm edit\fR or \fBnpm config edit\fR\|\. .SS engine\-strict .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P If set to true, then npm will stubbornly refuse to install (or even consider installing) any package that claims to not be compatible with the current Node\.js version\. .SS force .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Makes various commands more forceful\. .RS 0 .IP \(bu 2 lifecycle script failure does not block progress\. .IP \(bu 2 publishing clobbers previously published versions\. .IP \(bu 2 skips cache when requesting from the registry\. .IP \(bu 2 prevents checks against clobbering non\-npm files\. .RE .SS fetch\-retries .RS 0 .IP \(bu 2 Default: 2 .IP \(bu 2 Type: Number .RE .P The "retries" config for the \fBretry\fR module to use when fetching packages from the registry\. .SS fetch\-retry\-factor .RS 0 .IP \(bu 2 Default: 10 .IP \(bu 2 Type: Number .RE .P The "factor" config for the \fBretry\fR module to use when fetching packages\. .SS fetch\-retry\-mintimeout .RS 0 .IP \(bu 2 Default: 10000 (10 seconds) .IP \(bu 2 Type: Number .RE .P The "minTimeout" config for the \fBretry\fR module to use when fetching packages\. .SS fetch\-retry\-maxtimeout .RS 0 .IP \(bu 2 Default: 60000 (1 minute) .IP \(bu 2 Type: Number .RE .P The "maxTimeout" config for the \fBretry\fR module to use when fetching packages\. .SS git .RS 0 .IP \(bu 2 Default: \fB"git"\fR .IP \(bu 2 Type: String .RE .P The command to use for git commands\. If git is installed on the computer, but is not in the \fBPATH\fR, then set this to the full path to the git binary\. .SS git\-tag\-version .RS 0 .IP \(bu 2 Default: \fBtrue\fR .IP \(bu 2 Type: Boolean .RE .P Tag the commit when using the \fBnpm version\fR command\. .SS global .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Operates in "global" mode, so that packages are installed into the \fBprefix\fR folder instead of the current working directory\. See npm help 5 \fBnpm\-folders\fR for more on the differences in behavior\. .RS 0 .IP \(bu 2 packages are installed into the \fB{prefix}/lib/node_modules\fR folder, instead of the current working directory\. .IP \(bu 2 bin files are linked to \fB{prefix}/bin\fR .IP \(bu 2 man pages are linked to \fB{prefix}/share/man\fR .RE .SS globalconfig .RS 0 .IP \(bu 2 Default: {prefix}/etc/npmrc .IP \(bu 2 Type: path .RE .P The config file to read for global config options\. .SS group .RS 0 .IP \(bu 2 Default: GID of the current process .IP \(bu 2 Type: String or Number .RE .P The group to use when running package scripts in global mode as the root user\. .SS heading .RS 0 .IP \(bu 2 Default: \fB"npm"\fR .IP \(bu 2 Type: String .RE .P The string that starts all the debugging log output\. .SS https\-proxy .RS 0 .IP \(bu 2 Default: null .IP \(bu 2 Type: url .RE .P A proxy to use for outgoing https requests\. 
If the \fBHTTPS_PROXY\fR or \fBhttps_proxy\fR or \fBHTTP_PROXY\fR or \fBhttp_proxy\fR environment variables are set, proxy settings will be honored by the underlying \fBrequest\fR library\. .SS ignore\-scripts .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P If true, npm does not run scripts specified in package\.json files\. .SS init\-module .RS 0 .IP \(bu 2 Default: ~/\.npm\-init\.js .IP \(bu 2 Type: path .RE .P A module that will be loaded by the \fBnpm init\fR command\. See the documentation for the init\-package\-json \fIhttps://github\.com/isaacs/init\-package\-json\fR module for more information, or npm help init\. .SS init\-author\-name .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P The value \fBnpm init\fR should use by default for the package author's name\. .SS init\-author\-email .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P The value \fBnpm init\fR should use by default for the package author's email\. .SS init\-author\-url .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P The value \fBnpm init\fR should use by default for the package author's homepage\. .SS init\-license .RS 0 .IP \(bu 2 Default: "ISC" .IP \(bu 2 Type: String .RE .P The value \fBnpm init\fR should use by default for the package license\. .SS init\-version .RS 0 .IP \(bu 2 Default: "0\.0\.0" .IP \(bu 2 Type: semver .RE .P The value that \fBnpm init\fR should use by default for the package version number, if not already set in package\.json\. .SS json .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Whether or not to output JSON data, rather than the normal output\. .P This feature is currently experimental, and the output data structures for many commands is either not implemented in JSON yet, or subject to change\. Only the output from \fBnpm ls \-\-json\fR is currently valid\. .SS key .RS 0 .IP \(bu 2 Default: \fBnull\fR .IP \(bu 2 Type: String .RE .P A client key to pass when accessing the registry\. .SS link .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P If true, then local installs will link if there is a suitable globally installed package\. .P Note that this means that local installs can cause things to be installed into the global space at the same time\. The link is only done if one of the two conditions are met: .RS 0 .IP \(bu 2 The package is not already installed globally, or .IP \(bu 2 the globally installed version is identical to the version that is being installed locally\. .RE .SS local\-address .RS 0 .IP \(bu 2 Default: undefined .IP \(bu 2 Type: IP Address .RE .P The IP address of the local interface to use when making connections to the npm registry\. Must be IPv4 in versions of Node prior to 0\.12\. .SS loglevel .RS 0 .IP \(bu 2 Default: "warn" .IP \(bu 2 Type: String .IP \(bu 2 Values: "silent", "error", "warn", "http", "info", "verbose", "silly" .RE .P What level of logs to report\. On failure, \fIall\fR logs are written to \fBnpm\-debug\.log\fR in the current working directory\. .P Any logs of a higher level than the setting are shown\. The default is "warn", which shows warn and error output\. .SS logstream .RS 0 .IP \(bu 2 Default: process\.stderr .IP \(bu 2 Type: Stream .RE .P This is the stream that is passed to the npmlog \fIhttps://github\.com/npm/npmlog\fR module at run time\. .P It cannot be set from the command line, but if you are using npm programmatically, you may wish to send logs to somewhere other than stderr\. 
.P If the \fBcolor\fR config is set to true, then this stream will receive colored output if it is a TTY\. .SS long .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show extended information in \fBnpm ls\fR and \fBnpm search\fR\|\. .SS message .RS 0 .IP \(bu 2 Default: "%s" .IP \(bu 2 Type: String .RE .P Commit message which is used by \fBnpm version\fR when creating version commit\. .P Any "%s" in the message will be replaced with the version number\. .SS node\-version .RS 0 .IP \(bu 2 Default: process\.version .IP \(bu 2 Type: semver or false .RE .P The node version to use when checking a package's \fBengines\fR map\. .SS npat .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Run tests on installation\. .SS onload\-script .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: path .RE .P A node module to \fBrequire()\fR when npm loads\. Useful for programmatic usage\. .SS optional .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Attempt to install packages in the \fBoptionalDependencies\fR object\. Note that if these packages fail to install, the overall installation process is not aborted\. .SS parseable .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Output parseable results from commands that write to standard output\. .SS prefix .RS 0 .IP \(bu 2 Default: see npm help 5 folders .IP \(bu 2 Type: path .RE .P The location to install global items\. If set on the command line, then it forces non\-global commands to run in the specified folder\. .SS production .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Set to true to run in "production" mode\. .RS 0 .IP 1. 3 devDependencies are not installed at the topmost level when running local \fBnpm install\fR without any arguments\. .IP 2. 3 Set the NODE_ENV="production" for lifecycle scripts\. .RE .SS proprietary\-attribs .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Whether or not to include proprietary extended attributes in the tarballs created by npm\. .P Unless you are expecting to unpack package tarballs with something other than npm \-\- particularly a very outdated tar implementation \-\- leave this as true\. .SS proxy .RS 0 .IP \(bu 2 Default: null .IP \(bu 2 Type: url .RE .P A proxy to use for outgoing http requests\. If the \fBHTTP_PROXY\fR or \fBhttp_proxy\fR environment variables are set, proxy settings will be honored by the underlying \fBrequest\fR library\. .SS rebuild\-bundle .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Rebuild bundled dependencies after installation\. .SS registry .RS 0 .IP \(bu 2 Default: https://registry\.npmjs\.org/ .IP \(bu 2 Type: url .RE .P The base URL of the npm package registry\. .SS rollback .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Remove failed installs\. .SS save .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Save installed packages to a package\.json file as dependencies\. .P When used with the \fBnpm rm\fR command, it removes it from the \fBdependencies\fR object\. .P Only works if there is already a package\.json file present\. .SS save\-bundle .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P If a package would be saved at install time by the use of \fB\-\-save\fR, \fB\-\-save\-dev\fR, or \fB\-\-save\-optional\fR, then also put it in the \fBbundleDependencies\fR list\. .P When used with the \fBnpm rm\fR command, it removes it from the bundledDependencies list\. 
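.P
As a rough sketch (the package name is only a placeholder), installing something with both flags:
.P
.RS 2
.nf
npm install mypkg \-\-save \-\-save\-bundle
# shorthand: npm install mypkg \-S \-B
.fi
.RE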
.SS save\-dev .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Save installed packages to a package\.json file as \fBdevDependencies\fR\|\. .P When used with the \fBnpm rm\fR command, it removes it from the \fBdevDependencies\fR object\. .P Only works if there is already a package\.json file present\. .SS save\-exact .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Dependencies saved to package\.json using \fB\-\-save\fR, \fB\-\-save\-dev\fR or \fB\-\-save\-optional\fR will be configured with an exact version rather than using npm's default semver range operator\. .SS save\-optional .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Save installed packages to a package\.json file as optionalDependencies\. .P When used with the \fBnpm rm\fR command, it removes it from the \fBdevDependencies\fR object\. .P Only works if there is already a package\.json file present\. .SS save\-prefix .RS 0 .IP \(bu 2 Default: '^' .IP \(bu 2 Type: String .RE .P Configure how versions of packages installed to a package\.json file via \fB\-\-save\fR or \fB\-\-save\-dev\fR get prefixed\. .P For example if a package has version \fB1\.2\.3\fR, by default it's version is set to \fB^1\.2\.3\fR which allows minor upgrades for that package, but after \fBnpm config set save\-prefix='~'\fR it would be set to \fB~1\.2\.3\fR which only allows patch upgrades\. .SS scope .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P Associate an operation with a scope for a scoped registry\. Useful when logging in to a private registry for the first time: \fBnpm login \-\-scope=@organization \-\-registry=registry\.organization\.com\fR, which will cause \fB@organization\fR to be mapped to the registry for future installation of packages specified according to the pattern \fB@organization/package\fR\|\. .SS searchopts .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P Space\-separated options that are always passed to search\. .SS searchexclude .RS 0 .IP \(bu 2 Default: "" .IP \(bu 2 Type: String .RE .P Space\-separated options that limit the results from search\. .SS searchsort .RS 0 .IP \(bu 2 Default: "name" .IP \(bu 2 Type: String .IP \(bu 2 Values: "name", "\-name", "date", "\-date", "description", "\-description", "keywords", "\-keywords" .RE .P Indication of which field to sort search results by\. Prefix with a \fB\-\fR character to indicate reverse sort\. .SS shell .RS 0 .IP \(bu 2 Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows .IP \(bu 2 Type: path .RE .P The shell to run for the \fBnpm explore\fR command\. .SS shrinkwrap .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P If set to false, then ignore \fBnpm\-shrinkwrap\.json\fR files when installing\. .SS sign\-git\-tag .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P If set to true, then the \fBnpm version\fR command will tag the version using \fB\-s\fR to add a signature\. .P Note that git requires you to have set up GPG keys in your git configs for this to work properly\. .SS spin .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean or \fB"always"\fR .RE .P When set to \fBtrue\fR, npm will display an ascii spinner while it is doing things, if \fBprocess\.stderr\fR is a TTY\. .P Set to \fBfalse\fR to suppress the spinner, or set to \fBalways\fR to output the spinner even for non\-TTY outputs\. .SS strict\-ssl .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P Whether or not to do SSL key validation when making requests to the registry via https\. 
.P See also the \fBca\fR config\. .SS tag .RS 0 .IP \(bu 2 Default: latest .IP \(bu 2 Type: String .RE .P If you ask npm to install a package and don't tell it a specific version, then it will install the specified tag\. .P Also the tag that is added to the package@version specified by the \fBnpm tag\fR command, if no explicit tag is given\. .SS tmp .RS 0 .IP \(bu 2 Default: TMPDIR environment variable, or "/tmp" .IP \(bu 2 Type: path .RE .P Where to store temporary files and folders\. All temp files are deleted on success, but left behind on failure for forensic purposes\. .SS unicode .RS 0 .IP \(bu 2 Default: true .IP \(bu 2 Type: Boolean .RE .P When set to true, npm uses unicode characters in the tree output\. When false, it uses ascii characters to draw trees\. .SS unsafe\-perm .RS 0 .IP \(bu 2 Default: false if running as root, true otherwise .IP \(bu 2 Type: Boolean .RE .P Set to true to suppress the UID/GID switching when running package scripts\. If set explicitly to false, then installing as a non\-root user will fail\. .SS usage .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Set to show short usage output (like the \-H output) instead of complete help when doing npm help \fBnpm\-help\fR\|\. .SS user .RS 0 .IP \(bu 2 Default: "nobody" .IP \(bu 2 Type: String or Number .RE .P The UID to set to when running package scripts as root\. .SS userconfig .RS 0 .IP \(bu 2 Default: ~/\.npmrc .IP \(bu 2 Type: path .RE .P The location of user\-level configuration settings\. .SS umask .RS 0 .IP \(bu 2 Default: 022 .IP \(bu 2 Type: Octal numeric string .RE .P The "umask" value to use when setting the file creation mode on files and folders\. .P Folders and executables are given a mode which is \fB0777\fR masked against this value\. Other files are given a mode which is \fB0666\fR masked against this value\. Thus, the defaults are \fB0755\fR and \fB0644\fR respectively\. .SS user\-agent .RS 0 .IP \(bu 2 Default: node/{process\.version} {process\.platform} {process\.arch} .IP \(bu 2 Type: String .RE .P Sets a User\-Agent to the request header .SS version .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: boolean .RE .P If true, output the npm version and exit successfully\. .P Only relevant when specified explicitly on the command line\. .SS versions .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: boolean .RE .P If true, output the npm version as well as node's \fBprocess\.versions\fR map, and exit successfully\. .P Only relevant when specified explicitly on the command line\. .SS viewer .RS 0 .IP \(bu 2 Default: "man" on Posix, "browser" on Windows .IP \(bu 2 Type: path .RE .P The program to use to view help content\. .P Set to \fB"browser"\fR to view html help content in the default web browser\. 
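.P
As a closing illustration, any of the settings above can be inspected or changed with \fBnpm config\fR, for example:
.P
.RS 2
.nf
npm config get viewer
npm config set viewer browser
.fi
.RE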
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help npm .RE ������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-developers.7�������������������������������000644 �000766 �000024 �00000016661 12455173731 025361� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DEVELOPERS" "7" "January 2015" "" "" .SH "NAME" \fBnpm-developers\fR \- Developer Guide .SH DESCRIPTION .P So, you've decided to use npm to develop (and maybe publish/deploy) your project\. .P Fantastic! .P There are a few things that you need to do above the simple steps that your users will do to install your program\. .SH About These Documents .P These are man pages\. If you install npm, you should be able to then do \fBman npm\-thing\fR to get the documentation on a particular topic, or \fBnpm help thing\fR to see the same information\. .SH What is a \fBpackage\fR .P A package is: .RS 0 .IP \(bu 2 a) a folder containing a program described by a package\.json file .IP \(bu 2 b) a gzipped tarball containing (a) .IP \(bu 2 c) a url that resolves to (b) .IP \(bu 2 d) a \fB<name>@<version>\fR that is published on the registry with (c) .IP \(bu 2 e) a \fB<name>@<tag>\fR that points to (d) .IP \(bu 2 f) a \fB<name>\fR that has a "latest" tag satisfying (e) .IP \(bu 2 g) a \fBgit\fR url that, when cloned, results in (a)\. .RE .P Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b)\. .P Git urls can be of the form: .P .RS 2 .nf git://github\.com/user/project\.git#commit\-ish git+ssh://user@hostname:project\.git#commit\-ish git+http://user@hostname/project/blah\.git#commit\-ish git+https://user@hostname/project/blah\.git#commit\-ish .fi .RE .P The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as an argument to \fBgit checkout\fR\|\. The default is \fBmaster\fR\|\. .SH The package\.json File .P You need to have a \fBpackage\.json\fR file in the root of your project to do much of anything with npm\. That is basically the whole interface\. .P See npm help 5 \fBpackage\.json\fR for details about what goes in that file\. At the very least, you need: .RS 0 .IP \(bu 2 name: This should be a string that identifies your project\. Please do not use the name to specify that it runs on node, or is in JavaScript\. You can use the "engines" field to explicitly state the versions of node (or whatever else) that your program requires, and it's pretty well assumed that it's javascript\. It does not necessarily need to match your github repository name\. So, \fBnode\-foo\fR and \fBbar\-js\fR are bad names\. \fBfoo\fR or \fBbar\fR are better\. .IP \(bu 2 version: A semver\-compatible version\. .IP \(bu 2 engines: Specify the versions of node (or whatever else) that your program runs on\. The node API changes a lot, and there may be bugs or new functionality that you depend on\. Be explicit\. .IP \(bu 2 author: Take some credit\. 
.IP \(bu 2 scripts: If you have a special compilation or installation script, then you should put it in the \fBscripts\fR object\. You should definitely have at least a basic smoke\-test command as the "scripts\.test" field\. See npm help 7 scripts\. .IP \(bu 2 main: If you have a single module that serves as the entry point to your program (like what the "foo" package gives you at require("foo")), then you need to specify that in the "main" field\. .IP \(bu 2 directories: This is an object mapping names to folders\. The best ones to include are "lib" and "doc", but if you use "man" to specify a folder full of man pages, they'll get installed just like these ones\. .RE .P You can use \fBnpm init\fR in the root of your package in order to get you started with a pretty basic package\.json file\. See npm help \fBnpm\-init\fR for more info\. .SH Keeping files \fIout\fR of your package .P Use a \fB\|\.npmignore\fR file to keep stuff out of your package\. If there's no \fB\|\.npmignore\fR file, but there \fIis\fR a \fB\|\.gitignore\fR file, then npm will ignore the stuff matched by the \fB\|\.gitignore\fR file\. If you \fIwant\fR to include something that is excluded by your \fB\|\.gitignore\fR file, you can create an empty \fB\|\.npmignore\fR file to override it\. .P \fB\|\.npmignore\fR files follow the same pattern rules \fIhttp://git\-scm\.com/book/en/v2/Git\-Basics\-Recording\-Changes\-to\-the\-Repository#Ignoring\-Files\fR as \fB\|\.gitignore\fR files: .RS 0 .IP \(bu 2 Blank lines or lines starting with \fB#\fR are ignored\. .IP \(bu 2 Standard glob patterns work\. .IP \(bu 2 You can end patterns with a forward slash \fB/\fR to specify a directory\. .IP \(bu 2 You can negate a pattern by starting it with an exclamation point \fB!\fR\|\. .RE .P By default, the following paths and files are ignored, so there's no need to add them to \fB\|\.npmignore\fR explicitly: .RS 0 .IP \(bu 2 \fB\|\.*\.swp\fR .IP \(bu 2 \fB\|\._*\fR .IP \(bu 2 \fB\|\.DS_Store\fR .IP \(bu 2 \fB\|\.git\fR .IP \(bu 2 \fB\|\.hg\fR .IP \(bu 2 \fB\|\.lock\-wscript\fR .IP \(bu 2 \fB\|\.svn\fR .IP \(bu 2 \fB\|\.wafpickle\-*\fR .IP \(bu 2 \fBCVS\fR .IP \(bu 2 \fBnpm\-debug\.log\fR .RE .P Additionally, everything in \fBnode_modules\fR is ignored, except for bundled dependencies\. npm automatically handles this for you, so don't bother adding \fBnode_modules\fR to \fB\|\.npmignore\fR\|\. .P The following paths and files are never ignored, so adding them to \fB\|\.npmignore\fR is pointless: .RS 0 .IP \(bu 2 \fBpackage\.json\fR .IP \(bu 2 \fBREADME\.*\fR .RE .SH Link Packages .P \fBnpm link\fR is designed to install a development package and see the changes in real time without having to keep re\-installing it\. (You do need to either re\-link or \fBnpm rebuild \-g\fR to update compiled packages, of course\.) .P More info at npm help \fBnpm\-link\fR\|\. .SH Before Publishing: Make Sure Your Package Installs and Works .P \fBThis is important\.\fR .P If you can not install it locally, you'll have problems trying to publish it\. Or, worse yet, you'll be able to publish it, but you'll be publishing a broken or pointless package\. So don't do that\. .P In the root of your package, do this: .P .RS 2 .nf npm install \. \-g .fi .RE .P That'll show you that it's working\. If you'd rather just create a symlink package that points to your working directory, then do this: .P .RS 2 .nf npm link .fi .RE .P Use \fBnpm ls \-g\fR to see if it's there\. 
.P To test a local install, go into some other folder, and then do: .P .RS 2 .nf cd \.\./some\-other\-folder npm install \.\./my\-package .fi .RE .P to install it locally into the node_modules folder in that other place\. .P Then go into the node\-repl, and try using require("my\-thing") to bring in your module's main module\. .SH Create a User Account .P Create a user with the adduser command\. It works like this: .P .RS 2 .nf npm adduser .fi .RE .P and then follow the prompts\. .P This is documented better in npm help adduser\. .SH Publish your package .P This part's easy\. IN the root of your folder, do this: .P .RS 2 .nf npm publish .fi .RE .P You can give publish a url to a tarball, or a filename of a tarball, or a path to a folder\. .P Note that pretty much \fBeverything in that folder will be exposed\fR by default\. So, if you have secret stuff in there, use a \fB\|\.npmignore\fR file to list out the globs to ignore, or publish from a fresh checkout\. .SH Brag about it .P Send emails, write blogs, blab in IRC\. .P Tell the world how easy it is to install your program! .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help npm .IP \(bu 2 npm help init .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help publish .IP \(bu 2 npm help adduser .IP \(bu 2 npm help 7 registry .RE �������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-disputes.7���������������������������������000644 �000766 �000024 �00000011327 12455173731 025043� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DISPUTES" "7" "January 2015" "" "" .SH "NAME" \fBnpm-disputes\fR \- Handling Module Name Disputes .SH SYNOPSIS .RS 0 .IP 1. 3 Get the author email with \fBnpm owner ls <pkgname>\fR .IP 2. 3 Email the author, CC support@npmjs\.com .IP 3. 3 After a few weeks, if there's no resolution, we'll sort it out\. .RE .P Don't squat on package names\. Publish code or move out of the way\. .SH DESCRIPTION .P There sometimes arise cases where a user publishes a module, and then later, some other user wants to use that name\. Here are some common ways that happens (each of these is based on actual events\.) .RS 0 .IP 1. 3 Joe writes a JavaScript module \fBfoo\fR, which is not node\-specific\. Joe doesn't use node at all\. Bob wants to use \fBfoo\fR in node, so he wraps it in an npm module\. Some time later, Joe starts using node, and wants to take over management of his program\. .IP 2. 3 Bob writes an npm module \fBfoo\fR, and publishes it\. Perhaps much later, Joe finds a bug in \fBfoo\fR, and fixes it\. He sends a pull request to Bob, but Bob doesn't have the time to deal with it, because he has a new job and a new baby and is focused on his new erlang project, and kind of not involved with node any more\. Joe would like to publish a new \fBfoo\fR, but can't, because the name is taken\. .IP 3. 3 Bob writes a 10\-line flow\-control library, and calls it \fBfoo\fR, and publishes it to the npm registry\. Being a simple little thing, it never really has to be updated\. Joe works for Foo Inc, the makers of the critically acclaimed and widely\-marketed \fBfoo\fR JavaScript toolkit framework\. 
They publish it to npm as \fBfoojs\fR, but people are routinely confused when \fBnpm install foo\fR is some different thing\. .IP 4. 3 Bob writes a parser for the widely\-known \fBfoo\fR file format, because he needs it for work\. Then, he gets a new job, and never updates the prototype\. Later on, Joe writes a much more complete \fBfoo\fR parser, but can't publish, because Bob's \fBfoo\fR is in the way\. .RE .P The validity of Joe's claim in each situation can be debated\. However, Joe's appropriate course of action in each case is the same\. .RS 0 .IP 1. 3 \fBnpm owner ls foo\fR\|\. This will tell Joe the email address of the owner (Bob)\. .IP 2. 3 Joe emails Bob, explaining the situation \fBas respectfully as possible\fR, and what he would like to do with the module name\. He adds the npm support staff support@npmjs\.com to the CC list of the email\. Mention in the email that Bob can run \fBnpm owner add joe foo\fR to add Joe as an owner of the \fBfoo\fR package\. .IP 3. 3 After a reasonable amount of time, if Bob has not responded, or if Bob and Joe can't come to any sort of resolution, email support support@npmjs\.com and we'll sort it out\. ("Reasonable" is usually at least 4 weeks, but extra time is allowed around common holidays\.) .RE .SH REASONING .P In almost every case so far, the parties involved have been able to reach an amicable resolution without any major intervention\. Most people really do want to be reasonable, and are probably not even aware that they're in your way\. .P Module ecosystems are most vibrant and powerful when they are as self\-directed as possible\. If an admin one day deletes something you had worked on, then that is going to make most people quite upset, regardless of the justification\. When humans solve their problems by talking to other humans with respect, everyone has the chance to end up feeling good about the interaction\. .SH EXCEPTIONS .P Some things are not allowed, and will be removed without discussion if they are brought to the attention of the npm registry admins, including but not limited to: .RS 0 .IP 1. 3 Malware (that is, a package designed to exploit or harm the machine on which it is installed)\. .IP 2. 3 Violations of copyright or licenses (for example, cloning an MIT\-licensed program, and then removing or changing the copyright and license statement)\. .IP 3. 3 Illegal content\. .IP 4. 3 "Squatting" on a package name that you \fIplan\fR to use, but aren't actually using\. Sorry, I don't care how great the name is, or how perfect a fit it is for the thing that someday might happen\. If someone wants to use it today, and you're just taking up space with an empty tarball, you're going to be evicted\. .IP 5. 3 Putting empty packages in the registry\. Packages must have SOME functionality\. It can be silly, but it can't be \fInothing\fR\|\. (See also: squatting\.) .IP 6. 3 Doing weird things with the registry, like using it as your own personal application database or otherwise putting non\-packagey things into it\. .RE .P If you see bad behavior like this, please report it right away\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help owner .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-faq.7��������������������������������������000644 �000766 �000024 �00000036625 12455173731 023762� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-FAQ" "7" "January 2015" "" "" .SH "NAME" \fBnpm-faq\fR \- Frequently Asked Questions .SH Where can I find these docs in HTML? .P https://docs\.npmjs\.com/, or run: .P .RS 2 .nf npm config set viewer browser .fi .RE .P to open these documents in your default web browser rather than \fBman\fR\|\. .SH It didn't work\. .P That's not really a question\. .SH Why didn't it work? .P I don't know yet\. .P Read the error output, and if you can't figure out what it means, do what it says and post a bug with all the information it asks for\. .SH Where does npm put stuff? .P See npm help 5 \fBnpm\-folders\fR .P tl;dr: .RS 0 .IP \(bu 2 Use the \fBnpm root\fR command to see where modules go, and the \fBnpm bin\fR command to see where executables go .IP \(bu 2 Global installs are different from local installs\. If you install something with the \fB\-g\fR flag, then its executables go in \fBnpm bin \-g\fR and its modules go in \fBnpm root \-g\fR\|\. .RE .SH How do I install something on my computer in a central location? .P Install it globally by tacking \fB\-g\fR or \fB\-\-global\fR to the command\. (This is especially important for command line utilities that need to add their bins to the global system \fBPATH\fR\|\.) .SH I installed something globally, but I can't \fBrequire()\fR it .P Install it locally\. .P The global install location is a place for command\-line utilities to put their bins in the system \fBPATH\fR\|\. It's not for use with \fBrequire()\fR\|\. .P If you \fBrequire()\fR a module in your code, then that means it's a dependency, and a part of your program\. You need to install it locally in your program\. .SH Why can't npm just put everything in one place, like other package managers? .P Not every change is an improvement, but every improvement is a change\. This would be like asking git to do network IO for every commit\. It's not going to happen, because it's a terrible idea that causes more problems than it solves\. .P It is much harder to avoid dependency conflicts without nesting dependencies\. This is fundamental to the way that npm works, and has proven to be an extremely successful approach\. See npm help 5 \fBnpm\-folders\fR for more details\. .P If you want a package to be installed in one place, and have all your programs reference the same copy of it, then use the \fBnpm link\fR command\. That's what it's for\. Install it globally, then link it into each program that uses it\. .SH Whatever, I really want the old style 'everything global' style\. .P Write your own package manager\. You could probably even wrap up \fBnpm\fR in a shell script if you really wanted to\. 
.P npm will not help you do something that is known to be a bad idea\. .SH Should I check my \fBnode_modules\fR folder into git? .P Usually, no\. Allow npm to resolve dependencies for your packages\. .P For packages you \fBdeploy\fR, such as websites and apps, you should use npm shrinkwrap to lock down your full dependency tree: .P https://docs\.npmjs\.com/cli/shrinkwrap .P If you are paranoid about depending on the npm ecosystem, you should run a private npm mirror or a private cache\. .P If you want 100% confidence in being able to reproduce the specific bytes included in a deployment, you should use an additional mechanism that can verify contents rather than versions\. For example, Amazon machine images, DigitalOcean snapshots, Heroku slugs, or simple tarballs\. .SH Is it 'npm' or 'NPM' or 'Npm'? .P npm should never be capitalized unless it is being displayed in a location that is customarily all\-caps (such as the title of man pages\.) .SH If 'npm' is an acronym, why is it never capitalized? .P Contrary to the belief of many, "npm" is not in fact an abbreviation for "Node Package Manager"\. It is a recursive bacronymic abbreviation for "npm is not an acronym"\. (If it was "ninaa", then it would be an acronym, and thus incorrectly named\.) .P "NPM", however, \fIis\fR an acronym (more precisely, a capitonym) for the National Association of Pastoral Musicians\. You can learn more about them at http://npm\.org/\|\. .P In software, "NPM" is a Non\-Parametric Mapping utility written by Chris Rorden\. You can analyze pictures of brains with it\. Learn more about the (capitalized) NPM program at http://www\.cabiatl\.com/mricro/npm/\|\. .P The first seed that eventually grew into this flower was a bash utility named "pm", which was a shortened descendent of "pkgmakeinst", a bash function that was used to install various different things on different platforms, most often using Yahoo's \fByinst\fR\|\. If \fBnpm\fR was ever an acronym for anything, it was \fBnode pm\fR or maybe \fBnew pm\fR\|\. .P So, in all seriousness, the "npm" project is named after its command\-line utility, which was organically selected to be easily typed by a right\-handed programmer using a US QWERTY keyboard layout, ending with the right\-ring\-finger in a postition to type the \fB\-\fR key for flags and other command\-line arguments\. That command\-line utility is always lower\-case, though it starts most sentences it is a part of\. .SH How do I list installed packages? .P \fBnpm ls\fR .SH How do I search for packages? .P \fBnpm search\fR .P Arguments are greps\. \fBnpm search jsdom\fR shows jsdom packages\. .SH How do I update npm? .P .RS 2 .nf npm install npm \-g .fi .RE .P You can also update all outdated local packages by doing \fBnpm update\fR without any arguments, or global packages by doing \fBnpm update \-g\fR\|\. .P Occasionally, the version of npm will progress such that the current version cannot be properly installed with the version that you have installed already\. (Consider, if there is ever a bug in the \fBupdate\fR command\.) .P In those cases, you can do this: .P .RS 2 .nf curl https://www\.npmjs\.com/install\.sh | sh .fi .RE .SH What is a \fBpackage\fR? 
.P A package is: .RS 0 .IP \(bu 2 a) a folder containing a program described by a package\.json file .IP \(bu 2 b) a gzipped tarball containing (a) .IP \(bu 2 c) a url that resolves to (b) .IP \(bu 2 d) a \fB<name>@<version>\fR that is published on the registry with (c) .IP \(bu 2 e) a \fB<name>@<tag>\fR that points to (d) .IP \(bu 2 f) a \fB<name>\fR that has a "latest" tag satisfying (e) .IP \(bu 2 g) a \fBgit\fR url that, when cloned, results in (a)\. .RE .P Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b)\. .P Git urls can be of the form: .P .RS 2 .nf git://github\.com/user/project\.git#commit\-ish git+ssh://user@hostname:project\.git#commit\-ish git+http://user@hostname/project/blah\.git#commit\-ish git+https://user@hostname/project/blah\.git#commit\-ish .fi .RE .P The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as an argument to \fBgit checkout\fR\|\. The default is \fBmaster\fR\|\. .SH What is a \fBmodule\fR? .P A module is anything that can be loaded with \fBrequire()\fR in a Node\.js program\. The following things are all examples of things that can be loaded as modules: .RS 0 .IP \(bu 2 A folder with a \fBpackage\.json\fR file containing a \fBmain\fR field\. .IP \(bu 2 A folder with an \fBindex\.js\fR file in it\. .IP \(bu 2 A JavaScript file\. .RE .P Most npm packages are modules, because they are libraries that you load with \fBrequire\fR\|\. However, there's no requirement that an npm package be a module! Some only contain an executable command\-line interface, and don't provide a \fBmain\fR field for use in Node programs\. .P Almost all npm packages (at least, those that are Node programs) \fIcontain\fR many modules within them (because every file they load with \fBrequire()\fR is a module)\. .P In the context of a Node program, the \fBmodule\fR is also the thing that was loaded \fIfrom\fR a file\. For example, in the following program: .P .RS 2 .nf var req = require('request') .fi .RE .P we might say that "The variable \fBreq\fR refers to the \fBrequest\fR module"\. .SH So, why is it the "\fBnode_modules\fR" folder, but "\fBpackage\.json\fR" file? Why not \fBnode_packages\fR or \fBmodule\.json\fR? .P The \fBpackage\.json\fR file defines the package\. (See "What is a package?" above\.) .P The \fBnode_modules\fR folder is the place Node\.js looks for modules\. (See "What is a module?" above\.) .P For example, if you create a file at \fBnode_modules/foo\.js\fR and then had a program that did \fBvar f = require('foo\.js')\fR then it would load the module\. However, \fBfoo\.js\fR is not a "package" in this case, because it does not have a package\.json\. .P Alternatively, if you create a package which does not have an \fBindex\.js\fR or a \fB"main"\fR field in the \fBpackage\.json\fR file, then it is not a module\. Even if it's installed in \fBnode_modules\fR, it can't be an argument to \fBrequire()\fR\|\. .SH \fB"node_modules"\fR is the name of my deity's arch\-rival, and a Forbidden Word in my religion\. Can I configure npm to use a different folder? .P No\. This will never happen\. This question comes up sometimes, because it seems silly from the outside that npm couldn't just be configured to put stuff somewhere else, and then npm could load them from there\. It's an arbitrary spelling choice, right? What's the big deal? 
.P At the time of this writing, the string \fB\|'node_modules'\fR appears 151 times in 53 separate files in npm and node core (excluding tests and documentation)\. .P Some of these references are in node's built\-in module loader\. Since npm is not involved \fBat all\fR at run\-time, node itself would have to be configured to know where you've decided to stick stuff\. Complexity hurdle #1\. Since the Node module system is locked, this cannot be changed, and is enough to kill this request\. But I'll continue, in deference to your deity's delicate feelings regarding spelling\. .P Many of the others are in dependencies that npm uses, which are not necessarily tightly coupled to npm (in the sense that they do not read npm's configuration files, etc\.) Each of these would have to be configured to take the name of the \fBnode_modules\fR folder as a parameter\. Complexity hurdle #2\. .P Furthermore, npm has the ability to "bundle" dependencies by adding the dep names to the \fB"bundledDependencies"\fR list in package\.json, which causes the folder to be included in the package tarball\. What if the author of a module bundles its dependencies, and they use a different spelling for \fBnode_modules\fR? npm would have to rename the folder at publish time, and then be smart enough to unpack it using your locally configured name\. Complexity hurdle #3\. .P Furthermore, what happens when you \fIchange\fR this name? Fine, it's easy enough the first time, just rename the \fBnode_modules\fR folders to \fB\|\./blergyblerp/\fR or whatever name you choose\. But what about when you change it again? npm doesn't currently track any state about past configuration settings, so this would be rather difficult to do properly\. It would have to track every previous value for this config, and always accept any of them, or else yesterday's install may be broken tomorrow\. Complexity hurdle #4\. .P Never going to happen\. The folder is named \fBnode_modules\fR\|\. It is written indelibly in the Node Way, handed down from the ancient times of Node 0\.3\. .SH How do I install node with npm? .P You don't\. Try one of these node version managers: .P Unix: .RS 0 .IP \(bu 2 http://github\.com/isaacs/nave .IP \(bu 2 http://github\.com/visionmedia/n .IP \(bu 2 http://github\.com/creationix/nvm .RE .P Windows: .RS 0 .IP \(bu 2 http://github\.com/marcelklehr/nodist .IP \(bu 2 https://github\.com/hakobera/nvmw .IP \(bu 2 https://github\.com/nanjingboy/nvmw .RE .SH How can I use npm for development? .P See npm help 7 \fBnpm\-developers\fR and npm help 5 \fBpackage\.json\fR\|\. .P You'll most likely want to \fBnpm link\fR your development folder\. That's awesomely handy\. .P To set up your own private registry, check out npm help 7 \fBnpm\-registry\fR\|\. .SH Can I list a url as a dependency? .P Yes\. It should be a url to a gzipped tarball containing a single folder that has a package\.json in its root, or a git url\. (See "what is a package?" above\.) .SH How do I symlink to a dev folder so I don't have to keep re\-installing? .P See npm help \fBnpm\-link\fR .SH The package registry website\. What is that exactly? .P See npm help 7 \fBnpm\-registry\fR\|\. .SH I forgot my password, and can't publish\. How do I reset it? .P Go to https://npmjs\.com/forgot\|\. .SH I get ECONNREFUSED a lot\. What's up? .P Either the registry is down, or node's DNS isn't able to reach out\. .P To check if the registry is down, open up https://registry\.npmjs\.org/ in a web browser\. 
This will also tell you if you are just unable to access the internet for some reason\. .P If the registry IS down, let us know by emailing support@npmjs\.com or posting an issue at https://github\.com/npm/npm/issues\|\. If it's down for the world (and not just on your local network) then we're probably already being pinged about it\. .P You can also often get a faster response by visiting the #npm channel on Freenode IRC\. .SH Why no namespaces? .P npm has only one global namespace\. If you want to namespace your own packages, you may: simply use the \fB\-\fR character to separate the names\. npm is a mostly anarchic system\. There is not sufficient need to impose namespace rules on everyone\. .P As of 2\.0, npm supports scoped packages, which allow you to publish a group of related modules without worrying about name collisions\. .P Every npm user owns the scope associated with their username\. For example, the user named \fBnpm\fR owns the scope \fB@npm\fR\|\. Scoped packages are published inside a scope by naming them as if they were files under the scope directory, e\.g\., by setting \fBname\fR in \fBpackage\.json\fR to \fB@npm/npm\fR\|\. .P Scoped packages can coexist with public npm packages in a private npm registry\. At present (2014\-11\-04) scoped packages may NOT be published to the public npm registry\. .P Unscoped packages can only depend on other unscoped packages\. Scoped packages can depend on packages from their own scope, a different scope, or the public registry (unscoped)\. .P For the current documentation of scoped packages, see https://docs\.npmjs\.com/misc/scope .P References: .RS 0 .IP 1. 3 For the reasoning behind the "one global namespace", please see this discussion: https://github\.com/npm/npm/issues/798 (TL;DR: It doesn't actually make things better, and can make them worse\.) .IP 2. 3 For the pre\-implementation discussion of the scoped package feature, see this discussion: https://github\.com/npm/npm/issues/5239 .RE .SH Who does npm? .P npm was originally written by Isaac Z\. Schlueter, and many others have contributed to it, some of them quite substantially\. .P The npm open source project, The npm Registry, and the community website \fIhttps://www\.npmjs\.com\fR are maintained and operated by the good folks at npm, Inc\. \fIhttp://www\.npmjs\.com\fR .SH I have a question or request not addressed here\. Where should I put it? .P Post an issue on the github project: .RS 0 .IP \(bu 2 https://github\.com/npm/npm/issues .RE .SH Why does npm hate me? .P npm is not capable of hatred\. It loves everyone, especially you\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help npm .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 folders .RE �����������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-index.7������������������������������������000644 �000766 �000024 �00000012365 12455173731 024315� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-INDEX" "7" "January 2015" "" "" .SH "NAME" \fBnpm-index\fR \- Index of all npm documentation .SS npm help README .P a JavaScript package manager .SH Command Line Documentation .P Using npm on the command line .SS npm help npm .P node package manager .SS npm help adduser .P Add a registry user account .SS npm help bin .P Display npm bin folder .SS npm help bugs .P Bugs for a package in a web browser maybe .SS npm help build .P Build a package .SS npm help bundle .P REMOVED .SS npm help cache .P Manipulates packages cache .SS npm help completion .P Tab Completion for npm .SS npm help config .P Manage the npm configuration files .SS npm help dedupe .P Reduce duplication .SS npm help deprecate .P Deprecate a version of a package .SS npm help docs .P Docs for a package in a web browser maybe .SS npm help edit .P Edit an installed package .SS npm help explore .P Browse an installed package .SS npm help help\-search .P Search npm help documentation .SS npm help help .P Get help on npm .SS npm help init .P Interactively create a package\.json file .SS npm help install .P Install a package .SS npm help link .P Symlink a package folder .SS npm help ls .P List installed packages .SS npm help outdated .P Check for outdated packages .SS npm help owner .P Manage package owners .SS npm help pack .P Create a tarball from a package .SS npm help prefix .P Display prefix .SS npm help prune .P Remove extraneous packages .SS npm help publish .P Publish a package .SS npm help rebuild .P Rebuild a package .SS npm help repo .P Open package repository page in the browser .SS npm help restart .P Restart a package .SS npm help rm .P Remove a package .SS npm help root .P Display npm root .SS npm help run\-script .P Run arbitrary package scripts .SS npm help search .P Search for packages .SS npm help shrinkwrap .P Lock down dependency versions .SS npm help star .P Mark your favorite packages .SS npm help stars .P View packages marked as favorites .SS npm help start .P Start a package .SS npm help stop .P Stop a package .SS npm help tag .P Tag a published version .SS npm help test .P Test a package .SS npm help uninstall .P Remove a package .SS npm help unpublish .P Remove a package from the registry .SS npm help update .P Update a package .SS npm help version .P Bump a package version .SS npm help view .P View registry info .SS npm help whoami .P Display npm username .SH API Documentation .P Using npm in your Node programs .SS npm apihelp npm .P node package manager .SS npm apihelp bin .P Display npm bin folder .SS npm apihelp bugs .P Bugs for a package in a web browser maybe .SS npm apihelp cache .P manage the npm cache programmatically .SS npm 
apihelp commands .P npm commands .SS npm apihelp config .P Manage the npm configuration files .SS npm apihelp deprecate .P Deprecate a version of a package .SS npm apihelp docs .P Docs for a package in a web browser maybe .SS npm apihelp edit .P Edit an installed package .SS npm apihelp explore .P Browse an installed package .SS npm apihelp help\-search .P Search the help pages .SS npm apihelp init .P Interactively create a package\.json file .SS npm apihelp install .P install a package programmatically .SS npm apihelp link .P Symlink a package folder .SS npm apihelp load .P Load config settings .SS npm apihelp ls .P List installed packages .SS npm apihelp outdated .P Check for outdated packages .SS npm apihelp owner .P Manage package owners .SS npm apihelp pack .P Create a tarball from a package .SS npm apihelp prefix .P Display prefix .SS npm apihelp prune .P Remove extraneous packages .SS npm apihelp publish .P Publish a package .SS npm apihelp rebuild .P Rebuild a package .SS npm apihelp repo .P Open package repository page in the browser .SS npm apihelp restart .P Restart a package .SS npm apihelp root .P Display npm root .SS npm apihelp run\-script .P Run arbitrary package scripts .SS npm apihelp search .P Search for packages .SS npm apihelp shrinkwrap .P programmatically generate package shrinkwrap file .SS npm apihelp start .P Start a package .SS npm apihelp stop .P Stop a package .SS npm apihelp tag .P Tag a published version .SS npm apihelp test .P Test a package .SS npm apihelp uninstall .P uninstall a package programmatically .SS npm apihelp unpublish .P Remove a package from the registry .SS npm apihelp update .P Update a package .SS npm apihelp version .P Bump a package version .SS npm apihelp view .P View registry info .SS npm apihelp whoami .P Display npm username .SH Files .P File system structures npm uses .SS npm help 5 folders .P Folder Structures Used by npm .SS npm help 5 npmrc .P The npm config files .SS npm help 5 package\.json .P Specifics of npm's package\.json handling .SH Misc .P Various other bits and bobs .SS npm help 7 coding\-style .P npm's "funny" coding style .SS npm help 7 config .P More than you probably want to know about npm configuration .SS npm help 7 developers .P Developer Guide .SS npm help 7 disputes .P Handling Module Name Disputes .SS npm help 7 faq .P Frequently Asked Questions .SS npm help 7 index .P Index of all npm documentation .SS npm help 7 registry .P The JavaScript Package Registry .SS npm help 7 scope .P Scoped packages .SS npm help 7 scripts .P How npm handles the "scripts" field .SS npm help 7 removing\-npm .P Cleaning the Slate .SS npm help 7 semver .P The semantic versioner for npm ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-registry.7���������������������������������000644 �000766 �000024 �00000005301 12455173731 025046� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-REGISTRY" "7" "January 2015" "" "" .SH "NAME" \fBnpm-registry\fR \- The 
JavaScript Package Registry .SH DESCRIPTION .P To resolve packages by name and version, npm talks to a registry website that implements the CommonJS Package Registry specification for reading package info\. .P Additionally, npm's package registry implementation supports several write APIs as well, to allow for publishing packages and managing user account information\. .P The official public npm registry is at http://registry\.npmjs\.org/\|\. It is powered by a CouchDB database, of which there is a public mirror at http://skimdb\.npmjs\.com/registry\|\. The code for the couchapp is available at http://github\.com/npm/npm\-registry\-couchapp\|\. .P The registry URL used is determined by the scope of the package (see npm help 7 \fBnpm\-scope\fR)\. If no scope is specified, the default registry is used, which is supplied by the \fBregistry\fR config parameter\. See npm help \fBnpm\-config\fR, npm help 5 \fBnpmrc\fR, and npm help 7 \fBnpm\-config\fR for more on managing npm's configuration\. .SH Can I run my own private registry? .P Yes! .P The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs\. .P If you set up continuous replication from the official CouchDB, and then set your internal CouchDB as the registry config, then you'll be able to read any published packages, in addition to your private ones, and by default will only publish internally\. If you then want to publish a package for the whole world to see, you can simply override the \fB\-\-registry\fR config for that command\. .SH I don't want my package published in the official registry\. It's private\. .P Set \fB"private": true\fR in your package\.json to prevent it from being published at all, or \fB"publishConfig":{"registry":"http://my\-internal\-registry\.local"}\fR to force it to be published only to your internal registry\. .P See npm help 5 \fBpackage\.json\fR for more info on what goes in the package\.json file\. .SH Will you replicate from my registry into the public one? .P No\. If you want things to be public, then publish them into the public registry using npm\. What little security there is would be for nought otherwise\. .SH Do I have to use couchdb to build a registry that npm can talk to? .P No, but it's way easier\. Basically, yes, you do, or you have to effectively implement the entire CouchDB API anyway\. .SH Is there a website or something to see package docs and such? 
.P Yes, head over to https://npmjs\.com/ .SH SEE ALSO .RS 0 .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help 7 disputes .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-scope.7������������������������������������000644 �000766 �000024 �00000006427 12455173731 024321� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-SCOPE" "7" "January 2015" "" "" .SH "NAME" \fBnpm-scope\fR \- Scoped packages .SH DESCRIPTION .P All npm packages have a name\. Some package names also have a scope\. A scope follows the usual rules for package names (url\-safe characters, no leading dots or underscores)\. When used in package names, preceded by an @\-symbol and followed by a slash, e\.g\. .P .RS 2 .nf @somescope/somepackagename .fi .RE .P Scopes are a way of grouping related packages together, and also affect a few things about the way npm treats the package\. .P \fBAs of 2014\-09\-03, scoped packages are not supported by the public npm registry\fR\|\. However, the npm client is backwards\-compatible with un\-scoped registries, so it can be used to work with scoped and un\-scoped registries at the same time\. .SH Installing scoped packages .P Scoped packages are installed to a sub\-folder of the regular installation folder, e\.g\. if your other packages are installed in \fBnode_modules/packagename\fR, scoped modules will be in \fBnode_modules/@myorg/packagename\fR\|\. The scope folder (\fB@myorg\fR) is simply the name of the scope preceded by an @\-symbol, and can contain any number of scoped packages\. .P A scoped package is installed by referencing it by name, preceded by an @\-symbol, in \fBnpm install\fR: .P .RS 2 .nf npm install @myorg/mypackage .fi .RE .P Or in \fBpackage\.json\fR: .P .RS 2 .nf "dependencies": { "@myorg/mypackage": "^1\.3\.0" } .fi .RE .P Note that if the @\-symbol is omitted in either case npm will instead attempt to install from GitHub; see npm help \fBnpm\-install\fR\|\. .SH Requiring scoped packages .P Because scoped packages are installed into a scope folder, you have to include the name of the scope when requiring them in your code, e\.g\. .P .RS 2 .nf require('@myorg/mypackage') .fi .RE .P There is nothing special about the way Node treats scope folders, this is just specifying to require the module \fBmypackage\fR in the folder called \fB@myorg\fR\|\. .SH Publishing scoped packages .P Scoped packages can be published to any registry that supports them\. \fIAs of 2014\-09\-03, the public npm registry does not support scoped packages\fR, so attempting to publish a scoped package to the registry will fail unless you have associated that scope with a different registry, see below\. .SH Associating a scope with a registry .P Scopes can be associated with a separate registry\. 
This allows you to seamlessly use a mix of packages from the public npm registry and one or more private registries, such as npm Enterprise\. .P You can associate a scope with a registry at login, e\.g\. .P .RS 2 .nf npm login \-\-registry=http://reg\.example\.com \-\-scope=@myco .fi .RE .P Scopes have a many\-to\-one relationship with registries: one registry can host multiple scopes, but a scope only ever points to one registry\. .P You can also associate a scope with a registry using \fBnpm config\fR: .P .RS 2 .nf npm config set @myco:registry http://reg\.example\.com .fi .RE .P Once a scope is associated with a registry, any \fBnpm install\fR for a package with that scope will request packages from that registry instead\. Any \fBnpm publish\fR for a package name that contains the scope will be published to that registry instead\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help install .IP \(bu 2 npm help publish .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/npm-scripts.7����������������������������������000644 �000766 �000024 �00000022415 12455173731 024672� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-SCRIPTS" "7" "January 2015" "" "" .SH "NAME" \fBnpm-scripts\fR \- How npm handles the "scripts" field .SH DESCRIPTION .P npm supports the "scripts" property of the package\.json script, for the following scripts: .RS 0 .IP \(bu 2 prepublish: Run BEFORE the package is published\. (Also run on local \fBnpm install\fR without any arguments\.) .IP \(bu 2 publish, postpublish: Run AFTER the package is published\. .IP \(bu 2 preinstall: Run BEFORE the package is installed .IP \(bu 2 install, postinstall: Run AFTER the package is installed\. .IP \(bu 2 preuninstall, uninstall: Run BEFORE the package is uninstalled\. .IP \(bu 2 postuninstall: Run AFTER the package is uninstalled\. .IP \(bu 2 pretest, test, posttest: Run by the \fBnpm test\fR command\. .IP \(bu 2 prestop, stop, poststop: Run by the \fBnpm stop\fR command\. .IP \(bu 2 prestart, start, poststart: Run by the \fBnpm start\fR command\. .IP \(bu 2 prerestart, restart, postrestart: Run by the \fBnpm restart\fR command\. Note: \fBnpm restart\fR will run the stop and start scripts if no \fBrestart\fR script is provided\. .RE .P Additionally, arbitrary scripts can be executed by running \fBnpm run\-script <pkg> <stage>\fR\|\. \fIPre\fR and \fIpost\fR commands with matching names will be run for those as well (e\.g\. \fBpremyscript\fR, \fBmyscript\fR, \fBpostmyscript\fR)\. .SH NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN .P \fBtl;dr\fR Don't use \fBinstall\fR\|\. Use a \fB\|\.gyp\fR file for compilation, and \fBprepublish\fR for anything else\. .P You should almost never have to explicitly set a \fBpreinstall\fR or \fBinstall\fR script\. If you are doing this, please consider if there is another option\. .P The only valid use of \fBinstall\fR or \fBpreinstall\fR scripts is for compilation which must be done on the target architecture\. 
In early versions of node, this was often done using the \fBnode\-waf\fR scripts, or a standalone \fBMakefile\fR, and early versions of npm required that it be explicitly set in package\.json\. This was not portable, and harder to do properly\. .P In the current version of node, the standard way to do this is using a \fB\|\.gyp\fR file\. If you have a file with a \fB\|\.gyp\fR extension in the root of your package, then npm will run the appropriate \fBnode\-gyp\fR commands automatically at install time\. This is the only officially supported method for compiling binary addons, and does not require that you add anything to your package\.json file\. .P If you have to do other things before your package is used, in a way that is not dependent on the operating system or architecture of the target system, then use a \fBprepublish\fR script instead\. This includes tasks such as: .RS 0 .IP \(bu 2 Compile CoffeeScript source code into JavaScript\. .IP \(bu 2 Create minified versions of JavaScript source code\. .IP \(bu 2 Fetching remote resources that your package will use\. .RE .P The advantage of doing these things at \fBprepublish\fR time instead of \fBpreinstall\fR or \fBinstall\fR time is that they can be done once, in a single place, and thus greatly reduce complexity and variability\. Additionally, this means that: .RS 0 .IP \(bu 2 You can depend on \fBcoffee\-script\fR as a \fBdevDependency\fR, and thus your users don't need to have it installed\. .IP \(bu 2 You don't need to include the minifiers in your package, reducing the size for your users\. .IP \(bu 2 You don't need to rely on your users having \fBcurl\fR or \fBwget\fR or other system tools on the target machines\. .RE .SH DEFAULT VALUES .P npm will default some script values based on package contents\. .RS 0 .IP \(bu 2 \fB"start": "node server\.js"\fR: If there is a \fBserver\.js\fR file in the root of your package, then npm will default the \fBstart\fR command to \fBnode server\.js\fR\|\. .IP \(bu 2 \fB"preinstall": "node\-waf clean || true; node\-waf configure build"\fR: If there is a \fBwscript\fR file in the root of your package, npm will default the \fBpreinstall\fR command to compile using node\-waf\. .RE .SH USER .P If npm was invoked with root privileges, then it will change the uid to the user account or uid specified by the \fBuser\fR config, which defaults to \fBnobody\fR\|\. Set the \fBunsafe\-perm\fR flag to run scripts with root privileges\. .SH ENVIRONMENT .P Package scripts run in an environment where many pieces of information are made available regarding the setup of npm and the current state of the process\. .SS path .P If you depend on modules that define executable scripts, like test suites, then those executables will be added to the \fBPATH\fR for executing the scripts\. So, if your package\.json has this: .P .RS 2 .nf { "name" : "foo" , "dependencies" : { "bar" : "0\.1\.x" } , "scripts": { "start" : "bar \./test" } } .fi .RE .P then you could run \fBnpm start\fR to execute the \fBbar\fR script, which is exported into the \fBnode_modules/\.bin\fR directory on \fBnpm install\fR\|\. .SS package\.json vars .P The package\.json fields are tacked onto the \fBnpm_package_\fR prefix\. 
So, for instance, if you had \fB{"name":"foo", "version":"1\.2\.5"}\fR in your package\.json file, then your package scripts would have the \fBnpm_package_name\fR environment variable set to "foo", and the \fBnpm_package_version\fR set to "1\.2\.5" .SS configuration .P Configuration parameters are put in the environment with the \fBnpm_config_\fR prefix\. For instance, you can view the effective \fBroot\fR config by checking the \fBnpm_config_root\fR environment variable\. .SS Special: package\.json "config" object .P The package\.json "config" keys are overwritten in the environment if there is a config param of \fB<name>[@<version>]:<key>\fR\|\. For example, if the package\.json has this: .P .RS 2 .nf { "name" : "foo" , "config" : { "port" : "8080" } , "scripts" : { "start" : "node server\.js" } } .fi .RE .P and the server\.js is this: .P .RS 2 .nf http\.createServer(\.\.\.)\.listen(process\.env\.npm_package_config_port) .fi .RE .P then the user could change the behavior by doing: .P .RS 2 .nf npm config set foo:port 80 .fi .RE .SS current lifecycle event .P Lastly, the \fBnpm_lifecycle_event\fR environment variable is set to whichever stage of the cycle is being executed\. So, you could have a single script used for different parts of the process which switches based on what's currently happening\. .P Objects are flattened following this format, so if you had \fB{"scripts":{"install":"foo\.js"}}\fR in your package\.json, then you'd see this in the script: .P .RS 2 .nf process\.env\.npm_package_scripts_install === "foo\.js" .fi .RE .SH EXAMPLES .P For example, if your package\.json contains this: .P .RS 2 .nf { "scripts" : { "install" : "scripts/install\.js" , "postinstall" : "scripts/install\.js" , "uninstall" : "scripts/uninstall\.js" } } .fi .RE .P then the \fBscripts/install\.js\fR will be called for the install, post\-install, stages of the lifecycle, and the \fBscripts/uninstall\.js\fR would be called when the package is uninstalled\. Since \fBscripts/install\.js\fR is running for three different phases, it would be wise in this case to look at the \fBnpm_lifecycle_event\fR environment variable\. .P If you want to run a make command, you can do so\. This works just fine: .P .RS 2 .nf { "scripts" : { "preinstall" : "\./configure" , "install" : "make && make install" , "test" : "make test" } } .fi .RE .SH EXITING .P Scripts are run by passing the line as a script argument to \fBsh\fR\|\. .P If the script exits with a code other than 0, then this will abort the process\. .P Note that these script files don't have to be nodejs or even javascript programs\. They just have to be some kind of executable file\. .SH HOOK SCRIPTS .P If you want to run a specific script at a specific lifecycle event for ALL packages, then you can use a hook script\. .P Place an executable file at \fBnode_modules/\.hooks/{eventname}\fR, and it'll get run for all packages when they are going through that point in the package lifecycle for any packages installed in that root\. .P Hook scripts are run exactly the same way as package\.json scripts\. That is, they are in a separate child process, with the env described above\. .SH BEST PRACTICES .RS 0 .IP \(bu 2 Don't exit with a non\-zero error code unless you \fIreally\fR mean it\. Except for uninstall scripts, this will cause the npm action to fail, and potentially be rolled back\. If the failure is minor or only will prevent some optional features, then it's better to just print a warning and exit successfully\. 
.IP \(bu 2 Try not to use scripts to do what npm can do for you\. Read through npm help 5 \fBpackage\.json\fR to see all the things that you can specify and enable by simply describing your package appropriately\. In general, this will lead to a more robust and consistent state\. .IP \(bu 2 Inspect the env to determine where to put things\. For instance, if the \fBnpm_config_binroot\fR environ is set to \fB/home/user/bin\fR, then don't try to install executables into \fB/usr/local/bin\fR\|\. The user probably set it up that way for a reason\. .IP \(bu 2 Don't prefix your script commands with "sudo"\. If root permissions are required for some reason, then it'll fail with that error, and the user will sudo the npm command in question\. .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help run\-script .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help install .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/removing-npm.7���������������������������������000644 �000766 �000024 �00000003451 12455173731 025030� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-REMOVAL" "1" "January 2015" "" "" .SH "NAME" \fBnpm-removal\fR \- Cleaning the Slate .SH SYNOPSIS .P So sad to see you go\. .P .RS 2 .nf sudo npm uninstall npm \-g .fi .RE .P Or, if that fails, get the npm source code, and do: .P .RS 2 .nf sudo make uninstall .fi .RE .SH More Severe Uninstalling .P Usually, the above instructions are sufficient\. That will remove npm, but leave behind anything you've installed\. .P If that doesn't work, or if you require more drastic measures, continue reading\. .P Note that this is only necessary for globally\-installed packages\. Local installs are completely contained within a project's \fBnode_modules\fR folder\. Delete that folder, and everything is gone (unless a package's install script is particularly ill\-behaved)\. .P This assumes that you installed node and npm in the default place\. If you configured node with a different \fB\-\-prefix\fR, or installed npm with a different prefix setting, then adjust the paths accordingly, replacing \fB/usr/local\fR with your install prefix\. .P To remove everything npm\-related manually: .P .RS 2 .nf rm \-rf /usr/local/{lib/node{,/\.npm,_modules},bin,share/man}/npm* .fi .RE .P If you installed things \fIwith\fR npm, then your best bet is to uninstall them with npm first, and then install them again once you have a proper install\. This can help find any symlinks that are lying around: .P .RS 2 .nf ls \-laF /usr/local/{lib/node{,/\.npm},bin,share/man} | grep npm .fi .RE .P Prior to version 0\.3, npm used shim files for executables and node modules\. To track those down, you can do the following: .P .RS 2 .nf find /usr/local/{lib/node,bin} \-exec grep \-l npm \\{\\} \\; ; .fi .RE .P (This is also in the README file\.) 
.SH SEE ALSO .RS 0 .IP \(bu 2 README .IP \(bu 2 npm help rm .IP \(bu 2 npm help prune .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man7/semver.7���������������������������������������000644 �000766 �000024 �00000032713 12455173731 023716� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "SEMVER" "7" "January 2015" "" "" .SH "NAME" \fBsemver\fR \- The semantic versioner for npm .SH Usage .P .RS 2 .nf $ npm install semver semver\.valid('1\.2\.3') // '1\.2\.3' semver\.valid('a\.b\.c') // null semver\.clean(' =v1\.2\.3 ') // '1\.2\.3' semver\.satisfies('1\.2\.3', '1\.x || >=2\.5\.0 || 5\.0\.0 \- 7\.2\.3') // true semver\.gt('1\.2\.3', '9\.8\.7') // false semver\.lt('1\.2\.3', '9\.8\.7') // true .fi .RE .P As a command\-line utility: .P .RS 2 .nf $ semver \-h Usage: semver <version> [<version> [\.\.\.]] [\-r <range> | \-i <inc> | \-\-preid <identifier> | \-l | \-rv] Test if version(s) satisfy the supplied range(s), and sort them\. Multiple versions or ranges may be supplied, unless increment option is specified\. In that case, only a single version may be used, and it is incremented by the specified level Program exits successfully if any valid version satisfies all supplied ranges, and prints all satisfying versions\. If no versions are valid, or ranges are not satisfied, then exits failure\. Versions are printed in ascending order, so supplying multiple versions to the utility will just sort them\. .fi .RE .SH Versions .P A "version" is described by the \fBv2\.0\.0\fR specification found at http://semver\.org/\|\. .P A leading \fB"="\fR or \fB"v"\fR character is stripped off and ignored\. .SH Ranges .P A \fBversion range\fR is a set of \fBcomparators\fR which specify versions that satisfy the range\. .P A \fBcomparator\fR is composed of an \fBoperator\fR and a \fBversion\fR\|\. The set of primitive \fBoperators\fR is: .RS 0 .IP \(bu 2 \fB<\fR Less than .IP \(bu 2 \fB<=\fR Less than or equal to .IP \(bu 2 \fB>\fR Greater than .IP \(bu 2 \fB>=\fR Greater than or equal to .IP \(bu 2 \fB=\fR Equal\. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included\. .RE .P For example, the comparator \fB>=1\.2\.7\fR would match the versions \fB1\.2\.7\fR, \fB1\.2\.8\fR, \fB2\.5\.3\fR, and \fB1\.3\.9\fR, but not the versions \fB1\.2\.6\fR or \fB1\.1\.0\fR\|\. .P Comparators can be joined by whitespace to form a \fBcomparator set\fR, which is satisfied by the \fBintersection\fR of all of the comparators it includes\. .P A range is composed of one or more comparator sets, joined by \fB||\fR\|\. A version matches a range if and only if every comparator in at least one of the \fB||\fR\-separated comparator sets is satisfied by the version\. .P For example, the range \fB>=1\.2\.7 <1\.3\.0\fR would match the versions \fB1\.2\.7\fR, \fB1\.2\.8\fR, and \fB1\.2\.99\fR, but not the versions \fB1\.2\.6\fR, \fB1\.3\.0\fR, or \fB1\.1\.0\fR\|\. 
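.P
The same checks can be made programmatically with the \fBsatisfies()\fR call shown under Usage above; the results below simply restate the \fB>=1\.2\.7 <1\.3\.0\fR example:
.P
.RS 2
.nf
semver\.satisfies('1\.2\.8', '>=1\.2\.7 <1\.3\.0') // true
semver\.satisfies('1\.3\.0', '>=1\.2\.7 <1\.3\.0') // false
.fi
.RE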
.P The range \fB1\.2\.7 || >=1\.2\.9 <2\.0\.0\fR would match the versions \fB1\.2\.7\fR, \fB1\.2\.9\fR, and \fB1\.4\.6\fR, but not the versions \fB1\.2\.8\fR or \fB2\.0\.0\fR\|\. .SS Prerelease Tags .P If a version has a prerelease tag (for example, \fB1\.2\.3\-alpha\.3\fR) then it will only be allowed to satisfy comparator sets if at least one comparator with the same \fB[major, minor, patch]\fR tuple also has a prerelease tag\. .P For example, the range \fB>1\.2\.3\-alpha\.3\fR would be allowed to match the version \fB1\.2\.3\-alpha\.7\fR, but it would \fInot\fR be satisfied by \fB3\.4\.5\-alpha\.9\fR, even though \fB3\.4\.5\-alpha\.9\fR is technically "greater than" \fB1\.2\.3\-alpha\.3\fR according to the SemVer sort rules\. The version range only accepts prerelease tags on the \fB1\.2\.3\fR version\. The version \fB3\.4\.5\fR \fIwould\fR satisfy the range, because it does not have a prerelease flag, and \fB3\.4\.5\fR is greater than \fB1\.2\.3\-alpha\.7\fR\|\. .P The purpose for this behavior is twofold\. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption\. Therefore, by default, they are excluded from range matching semantics\. .P Second, a user who has opted into using a prerelease version has clearly indicated the intent to use \fIthat specific\fR set of alpha/beta/rc versions\. By including a prerelease tag in the range, the user is indicating that they are aware of the risk\. However, it is still not appropriate to assume that they have opted into taking a similar risk on the \fInext\fR set of prerelease versions\. .SS Prerelease Identifiers .P The method \fB\|\.inc\fR takes an additional \fBidentifier\fR string argument that will append the value of the string as a prerelease identifier: .P .RS 2 .nf > semver\.inc('1\.2\.3', 'pre', 'beta') \|'1\.2\.4\-beta\.0' .fi .RE .P command\-line example: .P .RS 2 .nf $ semver 1\.2\.3 \-i prerelease \-\-preid beta 1\.2\.4\-beta\.0 .fi .RE .P Which then can be used to increment further: .P .RS 2 .nf $ semver 1\.2\.4\-beta\.0 \-i prerelease 1\.2\.4\-beta\.1 .fi .RE .SS Advanced Range Syntax .P Advanced range syntax desugars to primitive comparators in deterministic ways\. .P Advanced ranges may be combined in the same way as primitive comparators using white space or \fB||\fR\|\. .SS Hyphen Ranges \fBX\.Y\.Z \- A\.B\.C\fR .P Specifies an inclusive set\. .RS 0 .IP \(bu 2 \fB1\.2\.3 \- 2\.3\.4\fR := \fB>=1\.2\.3 <=2\.3\.4\fR .RE .P If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes\. .RS 0 .IP \(bu 2 \fB1\.2 \- 2\.3\.4\fR := \fB>=1\.2\.0 <=2\.3\.4\fR .RE .P If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts\. .RS 0 .IP \(bu 2 \fB1\.2\.3 \- 2\.3\fR := \fB>=1\.2\.3 <2\.4\.0\fR .IP \(bu 2 \fB1\.2\.3 \- 2\fR := \fB>=1\.2\.3 <3\.0\.0\fR .RE .SS X\-Ranges \fB1\.2\.x\fR \fB1\.X\fR \fB1\.2\.*\fR \fB*\fR .P Any of \fBX\fR, \fBx\fR, or \fB*\fR may be used to "stand in" for one of the numeric values in the \fB[major, minor, patch]\fR tuple\. 
.RS 0 .IP \(bu 2 \fB*\fR := \fB>=0\.0\.0\fR (Any version satisfies) .IP \(bu 2 \fB1\.x\fR := \fB>=1\.0\.0 <2\.0\.0\fR (Matching major version) .IP \(bu 2 \fB1\.2\.x\fR := \fB>=1\.2\.0 <1\.3\.0\fR (Matching major and minor versions) .RE .P A partial version range is treated as an X\-Range, so the special character is in fact optional\. .RS 0 .IP \(bu 2 \fB""\fR (empty string) := \fB*\fR := \fB>=0\.0\.0\fR .IP \(bu 2 \fB1\fR := \fB1\.x\.x\fR := \fB>=1\.0\.0 <2\.0\.0\fR .IP \(bu 2 \fB1\.2\fR := \fB1\.2\.x\fR := \fB>=1\.2\.0 <1\.3\.0\fR .RE .SS Tilde Ranges \fB~1\.2\.3\fR \fB~1\.2\fR \fB~1\fR .P Allows patch\-level changes if a minor version is specified on the comparator\. Allows minor\-level changes if not\. .RS 0 .IP \(bu 2 \fB~1\.2\.3\fR := \fB>=1\.2\.3 <1\.(2+1)\.0\fR := \fB>=1\.2\.3 <1\.3\.0\fR .IP \(bu 2 \fB~1\.2\fR := \fB>=1\.2\.0 <1\.(2+1)\.0\fR := \fB>=1\.2\.0 <1\.3\.0\fR (Same as \fB1\.2\.x\fR) .IP \(bu 2 \fB~1\fR := \fB>=1\.0\.0 <(1+1)\.0\.0\fR := \fB>=1\.0\.0 <2\.0\.0\fR (Same as \fB1\.x\fR) .IP \(bu 2 \fB~0\.2\.3\fR := \fB>=0\.2\.3 <0\.(2+1)\.0\fR := \fB>=0\.2\.3 <0\.3\.0\fR .IP \(bu 2 \fB~0\.2\fR := \fB>=0\.2\.0 <0\.(2+1)\.0\fR := \fB>=0\.2\.0 <0\.3\.0\fR (Same as \fB0\.2\.x\fR) .IP \(bu 2 \fB~0\fR := \fB>=0\.0\.0 <(0+1)\.0\.0\fR := \fB>=0\.0\.0 <1\.0\.0\fR (Same as \fB0\.x\fR) .IP \(bu 2 \fB~1\.2\.3\-beta\.2\fR := \fB>=1\.2\.3\-beta\.2 <1\.3\.0\fR Note that prereleases in the \fB1\.2\.3\fR version will be allowed, if they are greater than or equal to \fBbeta\.2\fR\|\. So, \fB1\.2\.3\-beta\.4\fR would be allowed, but \fB1\.2\.4\-beta\.2\fR would not, because it is a prerelease of a different \fB[major, minor, patch]\fR tuple\. .RE .SS Caret Ranges \fB^1\.2\.3\fR \fB^0\.2\.5\fR \fB^0\.0\.4\fR .P Allows changes that do not modify the left\-most non\-zero digit in the \fB[major, minor, patch]\fR tuple\. In other words, this allows patch and minor updates for versions \fB1\.0\.0\fR and above, patch updates for versions \fB0\.X >=0\.1\.0\fR, and \fIno\fR updates for versions \fB0\.0\.X\fR\|\. .P Many authors treat a \fB0\.x\fR version as if the \fBx\fR were the major "breaking\-change" indicator\. .P Caret ranges are ideal when an author may make breaking changes between \fB0\.2\.4\fR and \fB0\.3\.0\fR releases, which is a common practice\. However, it presumes that there will \fInot\fR be breaking changes between \fB0\.2\.4\fR and \fB0\.2\.5\fR\|\. It allows for changes that are presumed to be additive (but non\-breaking), according to commonly observed practices\. .RS 0 .IP \(bu 2 \fB^1\.2\.3\fR := \fB>=1\.2\.3 <2\.0\.0\fR .IP \(bu 2 \fB^0\.2\.3\fR := \fB>=0\.2\.3 <0\.3\.0\fR .IP \(bu 2 \fB^0\.0\.3\fR := \fB>=0\.0\.3 <0\.0\.4\fR .IP \(bu 2 \fB^1\.2\.3\-beta\.2\fR := \fB>=1\.2\.3\-beta\.2 <2\.0\.0\fR Note that prereleases in the \fB1\.2\.3\fR version will be allowed, if they are greater than or equal to \fBbeta\.2\fR\|\. So, \fB1\.2\.3\-beta\.4\fR would be allowed, but \fB1\.2\.4\-beta\.2\fR would not, because it is a prerelease of a different \fB[major, minor, patch]\fR tuple\. .IP \(bu 2 \fB^0\.0\.3\-beta\fR := \fB>=0\.0\.3\-beta <0\.0\.4\fR Note that prereleases in the \fB0\.0\.3\fR version \fIonly\fR will be allowed, if they are greater than or equal to \fBbeta\fR\|\. So, \fB0\.0\.3\-pr\.2\fR would be allowed\. .RE .P When parsing caret ranges, a missing \fBpatch\fR value desugars to the number \fB0\fR, but will allow flexibility within that value, even if the major and minor versions are both \fB0\fR\|\. 
.RS 0 .IP \(bu 2 \fB^1\.2\.x\fR := \fB>=1\.2\.0 <2\.0\.0\fR .IP \(bu 2 \fB^0\.0\.x\fR := \fB>=0\.0\.0 <0\.1\.0\fR .IP \(bu 2 \fB^0\.0\fR := \fB>=0\.0\.0 <0\.1\.0\fR .RE .P A missing \fBminor\fR and \fBpatch\fR values will desugar to zero, but also allow flexibility within those values, even if the major version is zero\. .RS 0 .IP \(bu 2 \fB^1\.x\fR := \fB>=1\.0\.0 <2\.0\.0\fR .IP \(bu 2 \fB^0\.x\fR := \fB>=0\.0\.0 <1\.0\.0\fR .RE .SH Functions .P All methods and classes take a final \fBloose\fR boolean argument that, if true, will be more forgiving about not\-quite\-valid semver strings\. The resulting output will always be 100% strict, of course\. .P Strict\-mode Comparators and Ranges will be strict about the SemVer strings that they parse\. .RS 0 .IP \(bu 2 \fBvalid(v)\fR: Return the parsed version, or null if it's not valid\. .IP \(bu 2 \fBinc(v, release)\fR: Return the version incremented by the release type (\fBmajor\fR, \fBpremajor\fR, \fBminor\fR, \fBpreminor\fR, \fBpatch\fR, \fBprepatch\fR, or \fBprerelease\fR), or null if it's not valid .RS 0 .IP \(bu 2 \fBpremajor\fR in one call will bump the version up to the next major version and down to a prerelease of that major version\. \fBpreminor\fR, and \fBprepatch\fR work the same way\. .IP \(bu 2 If called from a non\-prerelease version, the \fBprerelease\fR will work the same as \fBprepatch\fR\|\. It increments the patch version, then makes a prerelease\. If the input version is already a prerelease it simply increments it\. .RE .RE .SS Comparison .RS 0 .IP \(bu 2 \fBgt(v1, v2)\fR: \fBv1 > v2\fR .IP \(bu 2 \fBgte(v1, v2)\fR: \fBv1 >= v2\fR .IP \(bu 2 \fBlt(v1, v2)\fR: \fBv1 < v2\fR .IP \(bu 2 \fBlte(v1, v2)\fR: \fBv1 <= v2\fR .IP \(bu 2 \fBeq(v1, v2)\fR: \fBv1 == v2\fR This is true if they're logically equivalent, even if they're not the exact same string\. You already know how to compare strings\. .IP \(bu 2 \fBneq(v1, v2)\fR: \fBv1 != v2\fR The opposite of \fBeq\fR\|\. .IP \(bu 2 \fBcmp(v1, comparator, v2)\fR: Pass in a comparison string, and it'll call the corresponding function above\. \fB"==="\fR and \fB"!=="\fR do simple string comparison, but are included for completeness\. Throws if an invalid comparison string is provided\. .IP \(bu 2 \fBcompare(v1, v2)\fR: Return \fB0\fR if \fBv1 == v2\fR, or \fB1\fR if \fBv1\fR is greater, or \fB\-1\fR if \fBv2\fR is greater\. Sorts in ascending order if passed to \fBArray\.sort()\fR\|\. .IP \(bu 2 \fBrcompare(v1, v2)\fR: The reverse of compare\. Sorts an array of versions in descending order when passed to \fBArray\.sort()\fR\|\. .IP \(bu 2 \fBdiff(v1, v2)\fR: Returns difference between two versions by the release type (\fBmajor\fR, \fBpremajor\fR, \fBminor\fR, \fBpreminor\fR, \fBpatch\fR, \fBprepatch\fR, or \fBprerelease\fR), or null if the versions are the same\. .RE .SS Ranges .RS 0 .IP \(bu 2 \fBvalidRange(range)\fR: Return the valid range or null if it's not valid .IP \(bu 2 \fBsatisfies(version, range)\fR: Return true if the version satisfies the range\. .IP \(bu 2 \fBmaxSatisfying(versions, range)\fR: Return the highest version in the list that satisfies the range, or \fBnull\fR if none of them do\. .IP \(bu 2 \fBgtr(version, range)\fR: Return \fBtrue\fR if version is greater than all the versions possible in the range\. .IP \(bu 2 \fBltr(version, range)\fR: Return \fBtrue\fR if version is less than all the versions possible in the range\. 
.IP \(bu 2 \fBoutside(version, range, hilo)\fR: Return true if the version is outside the bounds of the range in either the high or low direction\. The \fBhilo\fR argument must be either the string \fB\|'>'\fR or \fB\|'<'\fR\|\. (This is the function called by \fBgtr\fR and \fBltr\fR\|\.) .RE .P Note that, since ranges may be non\-contiguous, a version might not be greater than a range, less than a range, \fIor\fR satisfy a range! For example, the range \fB1\.2 <1\.2\.9 || >2\.0\.0\fR would have a hole from \fB1\.2\.9\fR until \fB2\.0\.0\fR, so the version \fB1\.2\.10\fR would not be greater than the range (because \fB2\.0\.1\fR satisfies, which is higher), nor less than the range (since \fB1\.2\.8\fR satisfies, which is lower), and it also does not satisfy the range\. .P If you want to know if a version satisfies or does not satisfy a range, use the \fBsatisfies(version, range)\fR function\. �����������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/npm-folders.5����������������������������������000644 �000766 �000024 �00000020624 12455173731 024635� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-FOLDERS" "5" "January 2015" "" "" .SH "NAME" \fBnpm-folders\fR \- Folder Structures Used by npm .SH DESCRIPTION .P npm puts various things on your computer\. That's its job\. .P This document will tell you what it puts where\. .SS tl;dr .RS 0 .IP \(bu 2 Local install (default): puts stuff in \fB\|\./node_modules\fR of the current package root\. .IP \(bu 2 Global install (with \fB\-g\fR): puts stuff in /usr/local or wherever node is installed\. .IP \(bu 2 Install it \fBlocally\fR if you're going to \fBrequire()\fR it\. .IP \(bu 2 Install it \fBglobally\fR if you're going to run it on the command line\. .IP \(bu 2 If you need both, then install it in both places, or use \fBnpm link\fR\|\. .RE .SS prefix Configuration .P The \fBprefix\fR config defaults to the location where node is installed\. On most systems, this is \fB/usr/local\fR, and most of the time is the same as node's \fBprocess\.installPrefix\fR\|\. .P On windows, this is the exact location of the node\.exe binary\. On Unix systems, it's one level up, since node is typically installed at \fB{prefix}/bin/node\fR rather than \fB{prefix}/node\.exe\fR\|\. .P When the \fBglobal\fR flag is set, npm installs things into this prefix\. When it is not set, it uses the root of the current package, or the current working directory if not in a package already\. .SS Node Modules .P Packages are dropped into the \fBnode_modules\fR folder under the \fBprefix\fR\|\. When installing locally, this means that you can \fBrequire("packagename")\fR to load its main module, or \fBrequire("packagename/lib/path/to/sub/module")\fR to load other modules\. .P Global installs on Unix systems go to \fB{prefix}/lib/node_modules\fR\|\. Global installs on Windows go to \fB{prefix}/node_modules\fR (that is, no \fBlib\fR folder\.) .P Scoped packages are installed the same way, except they are grouped together in a sub\-folder of the relevant \fBnode_modules\fR folder with the name of that scope prefix by the @ symbol, e\.g\. 
\fBnpm install @myorg/package\fR would place the package in \fB{prefix}/node_modules/@myorg/package\fR\|\. See npm help 7 \fBscopes\fR for more details\. .P If you wish to \fBrequire()\fR a package, then install it locally\. .SS Executables .P When in global mode, executables are linked into \fB{prefix}/bin\fR on Unix, or directly into \fB{prefix}\fR on Windows\. .P When in local mode, executables are linked into \fB\|\./node_modules/\.bin\fR so that they can be made available to scripts run through npm\. (For example, so that a test runner will be in the path when you run \fBnpm test\fR\|\.) .SS Man Pages .P When in global mode, man pages are linked into \fB{prefix}/share/man\fR\|\. .P When in local mode, man pages are not installed\. .P Man pages are not installed on Windows systems\. .SS Cache .P See npm help \fBnpm\-cache\fR\|\. Cache files are stored in \fB~/\.npm\fR on Posix, or \fB~/npm\-cache\fR on Windows\. .P This is controlled by the \fBcache\fR configuration param\. .SS Temp Files .P Temporary files are stored by default in the folder specified by the \fBtmp\fR config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or \fB/tmp\fR on Unix and \fBc:\\windows\\temp\fR on Windows\. .P Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit\. .SH More Information .P When installing locally, npm first tries to find an appropriate \fBprefix\fR folder\. This is so that \fBnpm install foo@1\.2\.3\fR will install to the sensible root of your package, even if you happen to have \fBcd\fRed into some other folder\. .P Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a \fBpackage\.json\fR file, or a \fBnode_modules\fR folder\. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands\. (This behavior is inspired by and similar to git's \.git\-folder seeking logic when running git commands in a working dir\.) .P If no package root is found, then the current folder is used\. .P When you run \fBnpm install foo@1\.2\.3\fR, then the package is loaded into the cache, and then unpacked into \fB\|\./node_modules/foo\fR\|\. Then, any of foo's dependencies are similarly unpacked into \fB\|\./node_modules/foo/node_modules/\.\.\.\fR\|\. .P Any bin files are symlinked to \fB\|\./node_modules/\.bin/\fR, so that they may be found by npm scripts when necessary\. .SS Global Installation .P If the \fBglobal\fR configuration is set to true, then npm will install packages "globally"\. .P For global installation, packages are installed roughly the same way, but using the folders described above\. .SS Cycles, Conflicts, and Folder Parsimony .P Cycles are handled using the property of node's module system that it walks up the directories looking for \fBnode_modules\fR folders\. So, at every stage, if a package is already installed in an ancestor \fBnode_modules\fR folder, then it is not installed at the current location\. .P Consider the case above, where \fBfoo \-> bar \-> baz\fR\|\. Imagine if, in addition to that, baz depended on bar, so you'd have: \fBfoo \-> bar \-> baz \-> bar \-> baz \.\.\.\fR\|\. However, since the folder structure is: \fBfoo/node_modules/bar/node_modules/baz\fR, there's no need to put another copy of bar into \fB\|\.\.\./baz/node_modules\fR, since when it calls require("bar"), it will get the copy that is installed in \fBfoo/node_modules/bar\fR\|\. 
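.P
To make that concrete (the layout and paths here are hypothetical), a file inside \fBfoo/node_modules/bar/node_modules/baz\fR can check which copy of bar it actually gets:
.P
.RS 2
.nf
// Node walks up the tree looking for node_modules folders, so the nearest
// installed copy of "bar" wins \-\- the one under foo/node_modules\.
var bar = require('bar')
console\.log(require\.resolve('bar'))
// prints something like /path/to/foo/node_modules/bar/index\.js
.fi
.RE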
.P This shortcut is only used if the exact same version would be installed in multiple nested \fBnode_modules\fR folders\. It is still possible to have \fBa/node_modules/b/node_modules/a\fR if the two "a" packages are different versions\. However, without repeating the exact same package multiple times, an infinite regress will always be prevented\. .P Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder\. .SS Example .P Consider this dependency graph: .P .RS 2 .nf foo +\-\- blerg@1\.2\.5 +\-\- bar@1\.2\.3 | +\-\- blerg@1\.x (latest=1\.3\.7) | +\-\- baz@2\.x | | `\-\- quux@3\.x | | `\-\- bar@1\.2\.3 (cycle) | `\-\- asdf@* `\-\- baz@1\.2\.3 `\-\- quux@3\.x `\-\- bar .fi .RE .P In this case, we might expect a folder structure like this: .P .RS 2 .nf foo +\-\- node_modules +\-\- blerg (1\.2\.5) <\-\-\-[A] +\-\- bar (1\.2\.3) <\-\-\-[B] | `\-\- node_modules | +\-\- baz (2\.0\.2) <\-\-\-[C] | | `\-\- node_modules | | `\-\- quux (3\.2\.0) | `\-\- asdf (2\.3\.4) `\-\- baz (1\.2\.3) <\-\-\-[D] `\-\- node_modules `\-\- quux (3\.2\.0) <\-\-\-[E] .fi .RE .P Since foo depends directly on \fBbar@1\.2\.3\fR and \fBbaz@1\.2\.3\fR, those are installed in foo's \fBnode_modules\fR folder\. .P Even though the latest copy of blerg is 1\.3\.7, foo has a specific dependency on version 1\.2\.5\. So, that gets installed at [A]\. Since the parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fR, it does not install another copy under [B]\. .P Bar [B] also has dependencies on baz and asdf, so those are installed in bar's \fBnode_modules\fR folder\. Because it depends on \fBbaz@2\.x\fR, it cannot re\-use the \fBbaz@1\.2\.3\fR installed in the parent \fBnode_modules\fR folder [D], and must install its own copy [C]\. .P Underneath bar, the \fBbaz \-> quux \-> bar\fR dependency creates a cycle\. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder\. .P Underneath \fBfoo \-> baz\fR [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B]\. .P For a graphical breakdown of what is installed where, use \fBnpm ls\fR\|\. .SS Publishing .P Upon publishing, npm will look in the \fBnode_modules\fR folder\. If any of the items there are not in the \fBbundledDependencies\fR array, then they will not be included in the package tarball\. .P This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re\-publish those items that cannot be found elsewhere\. See npm help 5 \fBpackage\.json\fR for more information\. 
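.P
As a rough illustration (this script is not part of npm), you can list which installed packages would be left out of the tarball because they are not named in \fBbundledDependencies\fR:
.P
.RS 2
.nf
var fs = require('fs')
var pkg = JSON\.parse(fs\.readFileSync('\./package\.json', 'utf8'))
var bundled = pkg\.bundledDependencies || pkg\.bundleDependencies || []

fs\.readdirSync('\./node_modules')
  \.filter(function (name) { return name[0] !== '\.' })         // skip \.bin and friends
  \.filter(function (name) { return bundled\.indexOf(name) === \-1 })
  \.forEach(function (name) { console\.log('not bundled:', name) })
.fi
.RE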
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help install .IP \(bu 2 npm help pack .IP \(bu 2 npm help cache .IP \(bu 2 npm help config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help publish .RE ������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/npm-global.5�����������������������������������000644 �000766 �000024 �00000020624 12455173731 024437� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-FOLDERS" "5" "January 2015" "" "" .SH "NAME" \fBnpm-folders\fR \- Folder Structures Used by npm .SH DESCRIPTION .P npm puts various things on your computer\. That's its job\. .P This document will tell you what it puts where\. .SS tl;dr .RS 0 .IP \(bu 2 Local install (default): puts stuff in \fB\|\./node_modules\fR of the current package root\. .IP \(bu 2 Global install (with \fB\-g\fR): puts stuff in /usr/local or wherever node is installed\. .IP \(bu 2 Install it \fBlocally\fR if you're going to \fBrequire()\fR it\. .IP \(bu 2 Install it \fBglobally\fR if you're going to run it on the command line\. .IP \(bu 2 If you need both, then install it in both places, or use \fBnpm link\fR\|\. .RE .SS prefix Configuration .P The \fBprefix\fR config defaults to the location where node is installed\. On most systems, this is \fB/usr/local\fR, and most of the time is the same as node's \fBprocess\.installPrefix\fR\|\. .P On windows, this is the exact location of the node\.exe binary\. On Unix systems, it's one level up, since node is typically installed at \fB{prefix}/bin/node\fR rather than \fB{prefix}/node\.exe\fR\|\. .P When the \fBglobal\fR flag is set, npm installs things into this prefix\. When it is not set, it uses the root of the current package, or the current working directory if not in a package already\. .SS Node Modules .P Packages are dropped into the \fBnode_modules\fR folder under the \fBprefix\fR\|\. When installing locally, this means that you can \fBrequire("packagename")\fR to load its main module, or \fBrequire("packagename/lib/path/to/sub/module")\fR to load other modules\. .P Global installs on Unix systems go to \fB{prefix}/lib/node_modules\fR\|\. Global installs on Windows go to \fB{prefix}/node_modules\fR (that is, no \fBlib\fR folder\.) .P Scoped packages are installed the same way, except they are grouped together in a sub\-folder of the relevant \fBnode_modules\fR folder with the name of that scope prefix by the @ symbol, e\.g\. \fBnpm install @myorg/package\fR would place the package in \fB{prefix}/node_modules/@myorg/package\fR\|\. See npm help 7 \fBscopes\fR for more details\. .P If you wish to \fBrequire()\fR a package, then install it locally\. .SS Executables .P When in global mode, executables are linked into \fB{prefix}/bin\fR on Unix, or directly into \fB{prefix}\fR on Windows\. .P When in local mode, executables are linked into \fB\|\./node_modules/\.bin\fR so that they can be made available to scripts run through npm\. (For example, so that a test runner will be in the path when you run \fBnpm test\fR\|\.) 
.SS Man Pages .P When in global mode, man pages are linked into \fB{prefix}/share/man\fR\|\. .P When in local mode, man pages are not installed\. .P Man pages are not installed on Windows systems\. .SS Cache .P See npm help \fBnpm\-cache\fR\|\. Cache files are stored in \fB~/\.npm\fR on Posix, or \fB~/npm\-cache\fR on Windows\. .P This is controlled by the \fBcache\fR configuration param\. .SS Temp Files .P Temporary files are stored by default in the folder specified by the \fBtmp\fR config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or \fB/tmp\fR on Unix and \fBc:\\windows\\temp\fR on Windows\. .P Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit\. .SH More Information .P When installing locally, npm first tries to find an appropriate \fBprefix\fR folder\. This is so that \fBnpm install foo@1\.2\.3\fR will install to the sensible root of your package, even if you happen to have \fBcd\fRed into some other folder\. .P Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a \fBpackage\.json\fR file, or a \fBnode_modules\fR folder\. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands\. (This behavior is inspired by and similar to git's \.git\-folder seeking logic when running git commands in a working dir\.) .P If no package root is found, then the current folder is used\. .P When you run \fBnpm install foo@1\.2\.3\fR, then the package is loaded into the cache, and then unpacked into \fB\|\./node_modules/foo\fR\|\. Then, any of foo's dependencies are similarly unpacked into \fB\|\./node_modules/foo/node_modules/\.\.\.\fR\|\. .P Any bin files are symlinked to \fB\|\./node_modules/\.bin/\fR, so that they may be found by npm scripts when necessary\. .SS Global Installation .P If the \fBglobal\fR configuration is set to true, then npm will install packages "globally"\. .P For global installation, packages are installed roughly the same way, but using the folders described above\. .SS Cycles, Conflicts, and Folder Parsimony .P Cycles are handled using the property of node's module system that it walks up the directories looking for \fBnode_modules\fR folders\. So, at every stage, if a package is already installed in an ancestor \fBnode_modules\fR folder, then it is not installed at the current location\. .P Consider the case above, where \fBfoo \-> bar \-> baz\fR\|\. Imagine if, in addition to that, baz depended on bar, so you'd have: \fBfoo \-> bar \-> baz \-> bar \-> baz \.\.\.\fR\|\. However, since the folder structure is: \fBfoo/node_modules/bar/node_modules/baz\fR, there's no need to put another copy of bar into \fB\|\.\.\./baz/node_modules\fR, since when it calls require("bar"), it will get the copy that is installed in \fBfoo/node_modules/bar\fR\|\. .P This shortcut is only used if the exact same version would be installed in multiple nested \fBnode_modules\fR folders\. It is still possible to have \fBa/node_modules/b/node_modules/a\fR if the two "a" packages are different versions\. However, without repeating the exact same package multiple times, an infinite regress will always be prevented\. .P Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder\. 
.SS Example .P Consider this dependency graph: .P .RS 2 .nf foo +\-\- blerg@1\.2\.5 +\-\- bar@1\.2\.3 | +\-\- blerg@1\.x (latest=1\.3\.7) | +\-\- baz@2\.x | | `\-\- quux@3\.x | | `\-\- bar@1\.2\.3 (cycle) | `\-\- asdf@* `\-\- baz@1\.2\.3 `\-\- quux@3\.x `\-\- bar .fi .RE .P In this case, we might expect a folder structure like this: .P .RS 2 .nf foo +\-\- node_modules +\-\- blerg (1\.2\.5) <\-\-\-[A] +\-\- bar (1\.2\.3) <\-\-\-[B] | `\-\- node_modules | +\-\- baz (2\.0\.2) <\-\-\-[C] | | `\-\- node_modules | | `\-\- quux (3\.2\.0) | `\-\- asdf (2\.3\.4) `\-\- baz (1\.2\.3) <\-\-\-[D] `\-\- node_modules `\-\- quux (3\.2\.0) <\-\-\-[E] .fi .RE .P Since foo depends directly on \fBbar@1\.2\.3\fR and \fBbaz@1\.2\.3\fR, those are installed in foo's \fBnode_modules\fR folder\. .P Even though the latest copy of blerg is 1\.3\.7, foo has a specific dependency on version 1\.2\.5\. So, that gets installed at [A]\. Since the parent installation of blerg satisfies bar's dependency on \fBblerg@1\.x\fR, it does not install another copy under [B]\. .P Bar [B] also has dependencies on baz and asdf, so those are installed in bar's \fBnode_modules\fR folder\. Because it depends on \fBbaz@2\.x\fR, it cannot re\-use the \fBbaz@1\.2\.3\fR installed in the parent \fBnode_modules\fR folder [D], and must install its own copy [C]\. .P Underneath bar, the \fBbaz \-> quux \-> bar\fR dependency creates a cycle\. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder\. .P Underneath \fBfoo \-> baz\fR [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B]\. .P For a graphical breakdown of what is installed where, use \fBnpm ls\fR\|\. .SS Publishing .P Upon publishing, npm will look in the \fBnode_modules\fR folder\. If any of the items there are not in the \fBbundledDependencies\fR array, then they will not be included in the package tarball\. .P This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re\-publish those items that cannot be found elsewhere\. See npm help 5 \fBpackage\.json\fR for more information\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help install .IP \(bu 2 npm help pack .IP \(bu 2 npm help cache .IP \(bu 2 npm help config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help publish .RE ������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/npm-json.5�������������������������������������000644 �000766 �000024 �00000054100 12455173731 024144� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "PACKAGE\.JSON" "5" "January 2015" "" "" .SH "NAME" \fBpackage.json\fR \- Specifics of npm's package\.json handling .SH DESCRIPTION .P This document is all you need to know about what's required in your package\.json file\. It must be actual JSON, not just a JavaScript object literal\. .P A lot of the behavior described in this document is affected by the config settings described in npm help 7 \fBnpm\-config\fR\|\. 
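.P
A quick way to check that a file really is strict JSON is to parse it with \fBJSON\.parse\fR (a sketch; adjust the path to your package root):
.P
.RS 2
.nf
var fs = require('fs')

// Throws a SyntaxError if package\.json is only a JavaScript object
// literal (unquoted keys, trailing commas, comments) rather than JSON\.
JSON\.parse(fs\.readFileSync('\./package\.json', 'utf8'))
.fi
.RE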
.SH name .P The \fImost\fR important things in your package\.json are the name and version fields\. Those are actually required, and your package won't install without them\. The name and version together form an identifier that is assumed to be completely unique\. Changes to the package should come along with changes to the version\. .P The name is what your thing is called\. Some tips: .RS 0 .IP \(bu 2 Don't put "js" or "node" in the name\. It's assumed that it's js, since you're writing a package\.json file, and you can specify the engine using the "engines" field\. (See below\.) .IP \(bu 2 The name ends up being part of a URL, an argument on the command line, and a folder name\. Any name with non\-url\-safe characters will be rejected\. Also, it can't start with a dot or an underscore\. .IP \(bu 2 The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive\. .IP \(bu 2 You may want to check the npm registry to see if there's something by that name already, before you get too attached to it\. http://registry\.npmjs\.org/ .RE .P A name can be optionally prefixed by a scope, e\.g\. \fB@myorg/mypackage\fR\|\. See npm help 7 \fBnpm\-scope\fR for more detail\. .SH version .P The \fImost\fR important things in your package\.json are the name and version fields\. Those are actually required, and your package won't install without them\. The name and version together form an identifier that is assumed to be completely unique\. Changes to the package should come along with changes to the version\. .P Version must be parseable by node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled with npm as a dependency\. (\fBnpm install semver\fR to use it yourself\.) .P More on version numbers and ranges at npm help 7 semver\. .SH description .P Put a description in it\. It's a string\. This helps people discover your package, as it's listed in \fBnpm search\fR\|\. .SH keywords .P Put keywords in it\. It's an array of strings\. This helps people discover your package as it's listed in \fBnpm search\fR\|\. .SH homepage .P The url to the project homepage\. .P \fBNOTE\fR: This is \fInot\fR the same as "url"\. If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you\. .P Literally\. Spit\. I'm so not kidding\. .SH bugs .P The url to your project's issue tracker and / or the email address to which issues should be reported\. These are helpful for people who encounter issues with your package\. .P It should look like this: .P .RS 2 .nf { "url" : "http://github\.com/owner/project/issues" , "email" : "project@hostname\.com" } .fi .RE .P You can specify either one or both values\. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object\. .P If a url is provided, it will be used by the \fBnpm bugs\fR command\. .SH license .P You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it\. .P The simplest way, assuming you're using a common license such as BSD\-3\-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this: .P .RS 2 .nf { "license" : "BSD\-3\-Clause" } .fi .RE .P You can check the full list of SPDX license IDs \fIhttps://spdx\.org/licenses/\fR\|\. Ideally you should pick one that is OSI \fIhttp://opensource\.org/licenses/alphabetical\fR approved\. 
.P It's also a good idea to include a LICENSE file at the top level in your package\. .SH people fields: author, contributors .P The "author" is one person\. "contributors" is an array of people\. A "person" is an object with a "name" field and optionally "url" and "email", like this: .P .RS 2 .nf { "name" : "Barney Rubble" , "email" : "b@rubble\.com" , "url" : "http://barnyrubble\.tumblr\.com/" } .fi .RE .P Or you can shorten that all into a single string, and npm will parse it for you: .P .RS 2 .nf "Barney Rubble <b@rubble\.com> (http://barnyrubble\.tumblr\.com/) .fi .RE .P Both email and url are optional either way\. .P npm also sets a top\-level "maintainers" field with your npm user info\. .SH files .P The "files" field is an array of files to include in your project\. If you name a folder in the array, then it will also include the files inside that folder\. (Unless they would be ignored by another rule\.) .P You can also provide a "\.npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array\. The "\.npmignore" file works just like a "\.gitignore"\. .SH main .P The main field is a module ID that is the primary entry point to your program\. That is, if your package is named \fBfoo\fR, and a user installs it, and then does \fBrequire("foo")\fR, then your main module's exports object will be returned\. .P This should be a module ID relative to the root of your package folder\. .P For most modules, it makes the most sense to have a main script and often not much else\. .SH bin .P A lot of packages have one or more executable files that they'd like to install into the PATH\. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable\.) .P To use this, supply a \fBbin\fR field in your package\.json which is a map of command name to local file name\. On install, npm will symlink that file into \fBprefix/bin\fR for global installs, or \fB\|\./node_modules/\.bin/\fR for local installs\. .P For example, npm has this: .P .RS 2 .nf { "bin" : { "npm" : "\./cli\.js" } } .fi .RE .P So, when you install npm, it'll create a symlink from the \fBcli\.js\fR script to \fB/usr/local/bin/npm\fR\|\. .P If you have a single executable, and its name should be the name of the package, then you can just supply it as a string\. For example: .P .RS 2 .nf { "name": "my\-program" , "version": "1\.2\.5" , "bin": "\./path/to/program" } .fi .RE .P would be the same as this: .P .RS 2 .nf { "name": "my\-program" , "version": "1\.2\.5" , "bin" : { "my\-program" : "\./path/to/program" } } .fi .RE .SH man .P Specify either a single file or an array of filenames to put in place for the \fBman\fR program to find\. .P If only a single file is provided, then it's installed such that it is the result from \fBman <pkgname>\fR, regardless of its actual filename\. For example: .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : "\./man/doc\.1" } .fi .RE .P would link the \fB\|\./man/doc\.1\fR file in such that it is the target for \fBman foo\fR .P If the filename doesn't start with the package name, then it's prefixed\. So, this: .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : [ "\./man/foo\.1", "\./man/bar\.1" ] } .fi .RE .P will create files to do \fBman foo\fR and \fBman foo\-bar\fR\|\. 
.P Man files must end with a number, and optionally a \fB\|\.gz\fR suffix if they are compressed\. The number dictates which man section the file is installed into\. .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : [ "\./man/foo\.1", "\./man/foo\.2" ] } .fi .RE .P will create entries for \fBman foo\fR and \fBman 2 foo\fR .SH directories .P The CommonJS Packages \fIhttp://wiki\.commonjs\.org/wiki/Packages/1\.0\fR spec details a few ways that you can indicate the structure of your package using a \fBdirectories\fR object\. If you look at npm's package\.json \fIhttp://registry\.npmjs\.org/npm/latest\fR, you'll see that it has directories for doc, lib, and man\. .P In the future, this information may be used in other creative ways\. .SS directories\.lib .P Tell people where the bulk of your library is\. Nothing special is done with the lib folder in any way, but it's useful meta info\. .SS directories\.bin .P If you specify a \fBbin\fR directory, then all the files in that folder will be added as children of the \fBbin\fR path\. .P If you have a \fBbin\fR path already, then this has no effect\. .SS directories\.man .P A folder that is full of man pages\. Sugar to generate a "man" array by walking the folder\. .SS directories\.doc .P Put markdown files in here\. Eventually, these will be displayed nicely, maybe, someday\. .SS directories\.example .P Put example scripts in here\. Someday, it might be exposed in some clever way\. .SH repository .P Specify the place where your code lives\. This is helpful for people who want to contribute\. If the git repo is on GitHub, then the \fBnpm docs\fR command will be able to find you\. .P Do it like this: .P .RS 2 .nf "repository" : { "type" : "git" , "url" : "http://github\.com/npm/npm\.git" } "repository" : { "type" : "svn" , "url" : "http://v8\.googlecode\.com/svn/trunk/" } .fi .RE .P The URL should be a publicly available (perhaps read\-only) url that can be handed directly to a VCS program without any modification\. It should not be a url to an html project page that you put in your browser\. It's for computers\. .SH scripts .P The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package\. The key is the lifecycle event, and the value is the command to run at that point\. .P See npm help 7 \fBnpm\-scripts\fR to find out more about writing package scripts\. .SH config .P A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades\. For instance, if a package had the following: .P .RS 2 .nf { "name" : "foo" , "config" : { "port" : "8080" } } .fi .RE .P and then had a "start" command that then referenced the \fBnpm_package_config_port\fR environment variable, then the user could override that by doing \fBnpm config set foo:port 8001\fR\|\. .P See npm help 7 \fBnpm\-config\fR and npm help 7 \fBnpm\-scripts\fR for more on package configs\. .SH dependencies .P Dependencies are specified in a simple object that maps a package name to a version range\. The version range is a string which has one or more space\-separated descriptors\. Dependencies can also be identified with a tarball or git URL\. .P \fBPlease do not put test harnesses or transpilers in your \fBdependencies\fR object\.\fR See \fBdevDependencies\fR, below\. .P See npm help 7 semver for more details about specifying version ranges\. 
.RS 0 .IP \(bu 2 \fBversion\fR Must match \fBversion\fR exactly .IP \(bu 2 \fB>version\fR Must be greater than \fBversion\fR .IP \(bu 2 \fB>=version\fR etc .IP \(bu 2 \fB<version\fR .IP \(bu 2 \fB<=version\fR .IP \(bu 2 \fB~version\fR "Approximately equivalent to version" See npm help 7 semver .IP \(bu 2 \fB^version\fR "Compatible with version" See npm help 7 semver .IP \(bu 2 \fB1\.2\.x\fR 1\.2\.0, 1\.2\.1, etc\., but not 1\.3\.0 .IP \(bu 2 \fBhttp://\.\.\.\fR See 'URLs as Dependencies' below .IP \(bu 2 \fB*\fR Matches any version .IP \(bu 2 \fB""\fR (just an empty string) Same as \fB*\fR .IP \(bu 2 \fBversion1 \- version2\fR Same as \fB>=version1 <=version2\fR\|\. .IP \(bu 2 \fBrange1 || range2\fR Passes if either range1 or range2 are satisfied\. .IP \(bu 2 \fBgit\.\.\.\fR See 'Git URLs as Dependencies' below .IP \(bu 2 \fBuser/repo\fR See 'GitHub URLs' below .IP \(bu 2 \fBtag\fR A specific version tagged and published as \fBtag\fR See npm help \fBnpm\-tag\fR .IP \(bu 2 \fBpath/path/path\fR See Local Paths below .RE .P For example, these are all valid: .P .RS 2 .nf { "dependencies" : { "foo" : "1\.0\.0 \- 2\.9999\.9999" , "bar" : ">=1\.0\.2 <2\.1\.2" , "baz" : ">1\.0\.2 <=2\.3\.4" , "boo" : "2\.0\.1" , "qux" : "<1\.0\.0 || >=2\.3\.1 <2\.4\.5 || >=2\.5\.2 <3\.0\.0" , "asd" : "http://asdf\.com/asdf\.tar\.gz" , "til" : "~1\.2" , "elf" : "~1\.2\.3" , "two" : "2\.x" , "thr" : "3\.3\.x" , "lat" : "latest" , "dyl" : "file:\.\./dyl" } } .fi .RE .SS URLs as Dependencies .P You may specify a tarball URL in place of a version range\. .P This tarball will be downloaded and installed locally to your package at install time\. .SS Git URLs as Dependencies .P Git urls can be of the form: .P .RS 2 .nf git://github\.com/user/project\.git#commit\-ish git+ssh://user@hostname:project\.git#commit\-ish git+ssh://user@hostname/project\.git#commit\-ish git+http://user@hostname/project/blah\.git#commit\-ish git+https://user@hostname/project/blah\.git#commit\-ish .fi .RE .P The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as an argument to \fBgit checkout\fR\|\. The default is \fBmaster\fR\|\. .SH GitHub URLs .P As of version 1\.1\.65, you can refer to GitHub urls as just "foo": "user/foo\-project"\. Just as with git URLs, a \fBcommit\-ish\fR suffix can be included\. For example: .P .RS 2 .nf { "name": "foo", "version": "0\.0\.0", "dependencies": { "express": "visionmedia/express", "mocha": "visionmedia/mocha#4727d357ea" } } .fi .RE .SH Local Paths .P As of version 2\.0\.0 you can provide a path to a local directory that contains a package\. Local paths can be saved using \fBnpm install \-\-save\fR, using any of these forms: .P .RS 2 .nf \|\.\./foo/bar ~/foo/bar \|\./foo/bar /foo/bar .fi .RE .P in which case they will be normalized to a relative path and added to your \fBpackage\.json\fR\|\. For example: .P .RS 2 .nf { "name": "baz", "dependencies": { "bar": "file:\.\./foo/bar" } } .fi .RE .P This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry\. .SH devDependencies .P If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use\. .P In this case, it's best to map these additional items in a \fBdevDependencies\fR object\. 
.P These things will be installed when doing \fBnpm link\fR or \fBnpm install\fR from the root of a package, and can be managed like any other npm configuration param\. See npm help 7 \fBnpm\-config\fR for more on the topic\. .P For build steps that are not platform\-specific, such as compiling CoffeeScript or other languages to JavaScript, use the \fBprepublish\fR script to do this, and make the required package a devDependency\. .P For example: .P .RS 2 .nf { "name": "ethopia\-waza", "description": "a delightfully fruity coffee varietal", "version": "1\.2\.3", "devDependencies": { "coffee\-script": "~1\.6\.3" }, "scripts": { "prepublish": "coffee \-o lib/ \-c src/waza\.coffee" }, "main": "lib/waza\.js" } .fi .RE .P The \fBprepublish\fR script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves\. In dev mode (ie, locally running \fBnpm install\fR), it'll run this script as well, so that you can test it easily\. .SH peerDependencies .P In some cases, you want to express the compatibility of your package with an host tool or library, while not necessarily doing a \fBrequire\fR of this host\. This is usually referred to as a \fIplugin\fR\|\. Notably, your module may be exposing a specific interface, expected and specified by the host documentation\. .P For example: .P .RS 2 .nf { "name": "tea\-latte", "version": "1\.3\.5" "peerDependencies": { "tea": "2\.x" } } .fi .RE .P This ensures your package \fBtea\-latte\fR can be installed \fIalong\fR with the second major version of the host package \fBtea\fR only\. The host package is automatically installed if needed\. \fBnpm install tea\-latte\fR could possibly yield the following dependency graph: .P .RS 2 .nf ├── tea\-latte@1\.3\.5 └── tea@2\.2\.0 .fi .RE .P Trying to install another plugin with a conflicting requirement will cause an error\. For this reason, make sure your plugin requirement is as broad as possible, and not to lock it down to specific patch versions\. .P Assuming the host complies with semver \fIhttp://semver\.org/\fR, only changes in the host package's major version will break your plugin\. Thus, if you've worked with every 1\.x version of the host package, use \fB"^1\.0"\fR or \fB"1\.x"\fR to express this\. If you depend on features introduced in 1\.5\.2, use \fB">= 1\.5\.2 < 2"\fR\|\. .SH bundledDependencies .P Array of package names that will be bundled when publishing the package\. .P If this is spelled \fB"bundleDependencies"\fR, then that is also honorable\. .SH optionalDependencies .P If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the \fBoptionalDependencies\fR object\. This is a map of package name to version or url, just like the \fBdependencies\fR object\. The difference is that build failures do not cause installation to fail\. .P It is still your program's responsibility to handle the lack of the dependency\. For example, something like this: .P .RS 2 .nf try { var foo = require('foo') var fooVersion = require('foo/package\.json')\.version } catch (er) { foo = null } if ( notGoodFooVersion(fooVersion) ) { foo = null } // \.\. then later in your program \.\. if (foo) { foo\.doFooThings() } .fi .RE .P Entries in \fBoptionalDependencies\fR will override entries of the same name in \fBdependencies\fR, so it's usually best to only put in one place\. 
.SH engines .P You can specify the version of node that your stuff works on: .P .RS 2 .nf { "engines" : { "node" : ">=0\.10\.3 <0\.12" } } .fi .RE .P And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do\. .P If you specify an "engines" field, then npm will require that "node" be somewhere on that list\. If "engines" is omitted, then npm will just assume that it works on node\. .P You can also use the "engines" field to specify which versions of npm are capable of properly installing your program\. For example: .P .RS 2 .nf { "engines" : { "npm" : "~1\.0\.20" } } .fi .RE .P Note that, unless the user has set the \fBengine\-strict\fR config flag, this field is advisory only\. .SH engineStrict .P If you are sure that your module will \fIdefinitely not\fR run properly on versions of Node/npm other than those specified in the \fBengines\fR object, then you can set \fB"engineStrict": true\fR in your package\.json file\. This will override the user's \fBengine\-strict\fR config setting\. .P Please do not do this unless you are really very very sure\. If your engines object is something overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node\. Consider this choice carefully\. If people abuse it, it will be removed in a future version of npm\. .SH os .P You can specify which operating systems your module will run on: .P .RS 2 .nf "os" : [ "darwin", "linux" ] .fi .RE .P You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!': .P .RS 2 .nf "os" : [ "!win32" ] .fi .RE .P The host operating system is determined by \fBprocess\.platform\fR .P It is allowed to both blacklist, and whitelist, although there isn't any good reason to do this\. .SH cpu .P If your code only runs on certain cpu architectures, you can specify which ones\. .P .RS 2 .nf "cpu" : [ "x64", "ia32" ] .fi .RE .P Like the \fBos\fR option, you can also blacklist architectures: .P .RS 2 .nf "cpu" : [ "!arm", "!mips" ] .fi .RE .P The host architecture is determined by \fBprocess\.arch\fR .SH preferGlobal .P If your package is primarily a command\-line application that should be installed globally, then set this value to \fBtrue\fR to provide a warning if it is installed locally\. .P It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected\. .SH private .P If you set \fB"private": true\fR in your package\.json, then npm will refuse to publish it\. .P This is a way to prevent accidental publication of private repositories\. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the \fBpublishConfig\fR dictionary described below to override the \fBregistry\fR config param at publish\-time\. .SH publishConfig .P This is a set of config values that will be used at publish\-time\. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default\. .P Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing\. .P See npm help 7 \fBnpm\-config\fR to see the list of config options that can be overridden\. .SH DEFAULT VALUES .P npm will default some values based on package contents\. 
.RS 0 .IP \(bu 2 \fB"scripts": {"start": "node server\.js"}\fR If there is a \fBserver\.js\fR file in the root of your package, then npm will default the \fBstart\fR command to \fBnode server\.js\fR\|\. .IP \(bu 2 \fB"scripts":{"preinstall": "node\-gyp rebuild"}\fR If there is a \fBbinding\.gyp\fR file in the root of your package, npm will default the \fBpreinstall\fR command to compile using node\-gyp\. .IP \(bu 2 \fB"contributors": [\.\.\.]\fR If there is an \fBAUTHORS\fR file in the root of your package, npm will treat each line as a \fBName <email> (url)\fR format, where email and url are optional\. Lines which start with a \fB#\fR or are blank, will be ignored\. .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 semver .IP \(bu 2 npm help init .IP \(bu 2 npm help version .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help help .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help install .IP \(bu 2 npm help publish .IP \(bu 2 npm help rm .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/npmrc.5����������������������������������������000644 �000766 �000024 �00000005005 12455173731 023522� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPMRC" "5" "January 2015" "" "" .SH "NAME" \fBnpmrc\fR \- The npm config files .SH DESCRIPTION .P npm gets its config settings from the command line, environment variables, and \fBnpmrc\fR files\. .P The \fBnpm config\fR command can be used to update and edit the contents of the user and global npmrc files\. .P For a list of available configuration options, see npm help 7 config\. .SH FILES .P The four relevant files are: .RS 0 .IP \(bu 2 per\-project config file (/path/to/my/project/\.npmrc) .IP \(bu 2 per\-user config file (~/\.npmrc) .IP \(bu 2 global config file ($PREFIX/npmrc) .IP \(bu 2 npm builtin config file (/path/to/npm/npmrc) .RE .P All npm config files are an ini\-formatted list of \fBkey = value\fR parameters\. Environment variables can be replaced using \fB${VARIABLE_NAME}\fR\|\. For example: .P .RS 2 .nf prefix = ${HOME}/\.npm\-packages .fi .RE .P Each of these files is loaded, and config options are resolved in priority order\. For example, a setting in the userconfig file would override the setting in the globalconfig file\. .P Array values are specified by adding "[]" after the key name\. For example: .P .RS 2 .nf key[] = "first value" key[] = "second value" .fi .RE .SS Per\-project config file .P When working locally in a project, a \fB\|\.npmrc\fR file in the root of the project (ie, a sibling of \fBnode_modules\fR and \fBpackage\.json\fR) will set config values specific to this project\. .P Note that this only applies to the root of the project that you're running npm in\. It has no effect when your module is published\. 
For example, you can't publish a module that forces itself to install globally, or in a different location\. .SS Per\-user config file .P \fB$HOME/\.npmrc\fR (or the \fBuserconfig\fR param, if set in the environment or on the command line) .SS Global config file .P \fB$PREFIX/etc/npmrc\fR (or the \fBglobalconfig\fR param, if set above): This file is an ini\-file formatted list of \fBkey = value\fR parameters\. Environment variables can be replaced as above\. .SS Built\-in config file .P \fBpath/to/npm/itself/npmrc\fR .P This is an unchangeable "builtin" configuration file that npm keeps consistent across updates\. Set fields in here using the \fB\|\./configure\fR script that comes with npm\. This is primarily for distribution maintainers to override default configs in a standard and consistent manner\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help npm .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man5/package.json.5���������������������������������000644 �000766 �000024 �00000054100 12455173731 024746� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "PACKAGE\.JSON" "5" "January 2015" "" "" .SH "NAME" \fBpackage.json\fR \- Specifics of npm's package\.json handling .SH DESCRIPTION .P This document is all you need to know about what's required in your package\.json file\. It must be actual JSON, not just a JavaScript object literal\. .P A lot of the behavior described in this document is affected by the config settings described in npm help 7 \fBnpm\-config\fR\|\. .SH name .P The \fImost\fR important things in your package\.json are the name and version fields\. Those are actually required, and your package won't install without them\. The name and version together form an identifier that is assumed to be completely unique\. Changes to the package should come along with changes to the version\. .P The name is what your thing is called\. Some tips: .RS 0 .IP \(bu 2 Don't put "js" or "node" in the name\. It's assumed that it's js, since you're writing a package\.json file, and you can specify the engine using the "engines" field\. (See below\.) .IP \(bu 2 The name ends up being part of a URL, an argument on the command line, and a folder name\. Any name with non\-url\-safe characters will be rejected\. Also, it can't start with a dot or an underscore\. .IP \(bu 2 The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive\. .IP \(bu 2 You may want to check the npm registry to see if there's something by that name already, before you get too attached to it\. 
http://registry\.npmjs\.org/ .RE .P A name can be optionally prefixed by a scope, e\.g\. \fB@myorg/mypackage\fR\|\. See npm help 7 \fBnpm\-scope\fR for more detail\. .SH version .P The \fImost\fR important things in your package\.json are the name and version fields\. Those are actually required, and your package won't install without them\. The name and version together form an identifier that is assumed to be completely unique\. Changes to the package should come along with changes to the version\. .P Version must be parseable by node\-semver \fIhttps://github\.com/isaacs/node\-semver\fR, which is bundled with npm as a dependency\. (\fBnpm install semver\fR to use it yourself\.) .P More on version numbers and ranges at npm help 7 semver\. .SH description .P Put a description in it\. It's a string\. This helps people discover your package, as it's listed in \fBnpm search\fR\|\. .SH keywords .P Put keywords in it\. It's an array of strings\. This helps people discover your package as it's listed in \fBnpm search\fR\|\. .SH homepage .P The url to the project homepage\. .P \fBNOTE\fR: This is \fInot\fR the same as "url"\. If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you\. .P Literally\. Spit\. I'm so not kidding\. .SH bugs .P The url to your project's issue tracker and / or the email address to which issues should be reported\. These are helpful for people who encounter issues with your package\. .P It should look like this: .P .RS 2 .nf { "url" : "http://github\.com/owner/project/issues" , "email" : "project@hostname\.com" } .fi .RE .P You can specify either one or both values\. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object\. .P If a url is provided, it will be used by the \fBnpm bugs\fR command\. .SH license .P You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it\. .P The simplest way, assuming you're using a common license such as BSD\-3\-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this: .P .RS 2 .nf { "license" : "BSD\-3\-Clause" } .fi .RE .P You can check the full list of SPDX license IDs \fIhttps://spdx\.org/licenses/\fR\|\. Ideally you should pick one that is OSI \fIhttp://opensource\.org/licenses/alphabetical\fR approved\. .P It's also a good idea to include a LICENSE file at the top level in your package\. .SH people fields: author, contributors .P The "author" is one person\. "contributors" is an array of people\. A "person" is an object with a "name" field and optionally "url" and "email", like this: .P .RS 2 .nf { "name" : "Barney Rubble" , "email" : "b@rubble\.com" , "url" : "http://barnyrubble\.tumblr\.com/" } .fi .RE .P Or you can shorten that all into a single string, and npm will parse it for you: .P .RS 2 .nf "Barney Rubble <b@rubble\.com> (http://barnyrubble\.tumblr\.com/) .fi .RE .P Both email and url are optional either way\. .P npm also sets a top\-level "maintainers" field with your npm user info\. .SH files .P The "files" field is an array of files to include in your project\. If you name a folder in the array, then it will also include the files inside that folder\. (Unless they would be ignored by another rule\.) 
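.P
As a brief illustration only (these file and folder names are placeholders, not taken from any real package), a \fBfiles\fR array might look like this:
.P
.RS 2
.nf
{ "files" : [ "lib"
            , "index\.js" ]
}
.fi
.RE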
.P You can also provide a "\.npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array\. The "\.npmignore" file works just like a "\.gitignore"\. .SH main .P The main field is a module ID that is the primary entry point to your program\. That is, if your package is named \fBfoo\fR, and a user installs it, and then does \fBrequire("foo")\fR, then your main module's exports object will be returned\. .P This should be a module ID relative to the root of your package folder\. .P For most modules, it makes the most sense to have a main script and often not much else\. .SH bin .P A lot of packages have one or more executable files that they'd like to install into the PATH\. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable\.) .P To use this, supply a \fBbin\fR field in your package\.json which is a map of command name to local file name\. On install, npm will symlink that file into \fBprefix/bin\fR for global installs, or \fB\|\./node_modules/\.bin/\fR for local installs\. .P For example, npm has this: .P .RS 2 .nf { "bin" : { "npm" : "\./cli\.js" } } .fi .RE .P So, when you install npm, it'll create a symlink from the \fBcli\.js\fR script to \fB/usr/local/bin/npm\fR\|\. .P If you have a single executable, and its name should be the name of the package, then you can just supply it as a string\. For example: .P .RS 2 .nf { "name": "my\-program" , "version": "1\.2\.5" , "bin": "\./path/to/program" } .fi .RE .P would be the same as this: .P .RS 2 .nf { "name": "my\-program" , "version": "1\.2\.5" , "bin" : { "my\-program" : "\./path/to/program" } } .fi .RE .SH man .P Specify either a single file or an array of filenames to put in place for the \fBman\fR program to find\. .P If only a single file is provided, then it's installed such that it is the result from \fBman <pkgname>\fR, regardless of its actual filename\. For example: .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : "\./man/doc\.1" } .fi .RE .P would link the \fB\|\./man/doc\.1\fR file in such that it is the target for \fBman foo\fR .P If the filename doesn't start with the package name, then it's prefixed\. So, this: .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : [ "\./man/foo\.1", "\./man/bar\.1" ] } .fi .RE .P will create files to do \fBman foo\fR and \fBman foo\-bar\fR\|\. .P Man files must end with a number, and optionally a \fB\|\.gz\fR suffix if they are compressed\. The number dictates which man section the file is installed into\. .P .RS 2 .nf { "name" : "foo" , "version" : "1\.2\.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo\.js" , "man" : [ "\./man/foo\.1", "\./man/foo\.2" ] } .fi .RE .P will create entries for \fBman foo\fR and \fBman 2 foo\fR .SH directories .P The CommonJS Packages \fIhttp://wiki\.commonjs\.org/wiki/Packages/1\.0\fR spec details a few ways that you can indicate the structure of your package using a \fBdirectories\fR object\. If you look at npm's package\.json \fIhttp://registry\.npmjs\.org/npm/latest\fR, you'll see that it has directories for doc, lib, and man\. .P In the future, this information may be used in other creative ways\. .SS directories\.lib .P Tell people where the bulk of your library is\. 
Nothing special is done with the lib folder in any way, but it's useful meta info\. .SS directories\.bin .P If you specify a \fBbin\fR directory, then all the files in that folder will be added as children of the \fBbin\fR path\. .P If you have a \fBbin\fR path already, then this has no effect\. .SS directories\.man .P A folder that is full of man pages\. Sugar to generate a "man" array by walking the folder\. .SS directories\.doc .P Put markdown files in here\. Eventually, these will be displayed nicely, maybe, someday\. .SS directories\.example .P Put example scripts in here\. Someday, it might be exposed in some clever way\. .SH repository .P Specify the place where your code lives\. This is helpful for people who want to contribute\. If the git repo is on GitHub, then the \fBnpm docs\fR command will be able to find you\. .P Do it like this: .P .RS 2 .nf "repository" : { "type" : "git" , "url" : "http://github\.com/npm/npm\.git" } "repository" : { "type" : "svn" , "url" : "http://v8\.googlecode\.com/svn/trunk/" } .fi .RE .P The URL should be a publicly available (perhaps read\-only) url that can be handed directly to a VCS program without any modification\. It should not be a url to an html project page that you put in your browser\. It's for computers\. .SH scripts .P The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package\. The key is the lifecycle event, and the value is the command to run at that point\. .P See npm help 7 \fBnpm\-scripts\fR to find out more about writing package scripts\. .SH config .P A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades\. For instance, if a package had the following: .P .RS 2 .nf { "name" : "foo" , "config" : { "port" : "8080" } } .fi .RE .P and then had a "start" command that then referenced the \fBnpm_package_config_port\fR environment variable, then the user could override that by doing \fBnpm config set foo:port 8001\fR\|\. .P See npm help 7 \fBnpm\-config\fR and npm help 7 \fBnpm\-scripts\fR for more on package configs\. .SH dependencies .P Dependencies are specified in a simple object that maps a package name to a version range\. The version range is a string which has one or more space\-separated descriptors\. Dependencies can also be identified with a tarball or git URL\. .P \fBPlease do not put test harnesses or transpilers in your \fBdependencies\fR object\.\fR See \fBdevDependencies\fR, below\. .P See npm help 7 semver for more details about specifying version ranges\. .RS 0 .IP \(bu 2 \fBversion\fR Must match \fBversion\fR exactly .IP \(bu 2 \fB>version\fR Must be greater than \fBversion\fR .IP \(bu 2 \fB>=version\fR etc .IP \(bu 2 \fB<version\fR .IP \(bu 2 \fB<=version\fR .IP \(bu 2 \fB~version\fR "Approximately equivalent to version" See npm help 7 semver .IP \(bu 2 \fB^version\fR "Compatible with version" See npm help 7 semver .IP \(bu 2 \fB1\.2\.x\fR 1\.2\.0, 1\.2\.1, etc\., but not 1\.3\.0 .IP \(bu 2 \fBhttp://\.\.\.\fR See 'URLs as Dependencies' below .IP \(bu 2 \fB*\fR Matches any version .IP \(bu 2 \fB""\fR (just an empty string) Same as \fB*\fR .IP \(bu 2 \fBversion1 \- version2\fR Same as \fB>=version1 <=version2\fR\|\. .IP \(bu 2 \fBrange1 || range2\fR Passes if either range1 or range2 are satisfied\. 
.IP \(bu 2 \fBgit\.\.\.\fR See 'Git URLs as Dependencies' below .IP \(bu 2 \fBuser/repo\fR See 'GitHub URLs' below .IP \(bu 2 \fBtag\fR A specific version tagged and published as \fBtag\fR See npm help \fBnpm\-tag\fR .IP \(bu 2 \fBpath/path/path\fR See Local Paths below .RE .P For example, these are all valid: .P .RS 2 .nf { "dependencies" : { "foo" : "1\.0\.0 \- 2\.9999\.9999" , "bar" : ">=1\.0\.2 <2\.1\.2" , "baz" : ">1\.0\.2 <=2\.3\.4" , "boo" : "2\.0\.1" , "qux" : "<1\.0\.0 || >=2\.3\.1 <2\.4\.5 || >=2\.5\.2 <3\.0\.0" , "asd" : "http://asdf\.com/asdf\.tar\.gz" , "til" : "~1\.2" , "elf" : "~1\.2\.3" , "two" : "2\.x" , "thr" : "3\.3\.x" , "lat" : "latest" , "dyl" : "file:\.\./dyl" } } .fi .RE .SS URLs as Dependencies .P You may specify a tarball URL in place of a version range\. .P This tarball will be downloaded and installed locally to your package at install time\. .SS Git URLs as Dependencies .P Git urls can be of the form: .P .RS 2 .nf git://github\.com/user/project\.git#commit\-ish git+ssh://user@hostname:project\.git#commit\-ish git+ssh://user@hostname/project\.git#commit\-ish git+http://user@hostname/project/blah\.git#commit\-ish git+https://user@hostname/project/blah\.git#commit\-ish .fi .RE .P The \fBcommit\-ish\fR can be any tag, sha, or branch which can be supplied as an argument to \fBgit checkout\fR\|\. The default is \fBmaster\fR\|\. .SH GitHub URLs .P As of version 1\.1\.65, you can refer to GitHub urls as just "foo": "user/foo\-project"\. Just as with git URLs, a \fBcommit\-ish\fR suffix can be included\. For example: .P .RS 2 .nf { "name": "foo", "version": "0\.0\.0", "dependencies": { "express": "visionmedia/express", "mocha": "visionmedia/mocha#4727d357ea" } } .fi .RE .SH Local Paths .P As of version 2\.0\.0 you can provide a path to a local directory that contains a package\. Local paths can be saved using \fBnpm install \-\-save\fR, using any of these forms: .P .RS 2 .nf \|\.\./foo/bar ~/foo/bar \|\./foo/bar /foo/bar .fi .RE .P in which case they will be normalized to a relative path and added to your \fBpackage\.json\fR\|\. For example: .P .RS 2 .nf { "name": "baz", "dependencies": { "bar": "file:\.\./foo/bar" } } .fi .RE .P This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry\. .SH devDependencies .P If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use\. .P In this case, it's best to map these additional items in a \fBdevDependencies\fR object\. .P These things will be installed when doing \fBnpm link\fR or \fBnpm install\fR from the root of a package, and can be managed like any other npm configuration param\. See npm help 7 \fBnpm\-config\fR for more on the topic\. .P For build steps that are not platform\-specific, such as compiling CoffeeScript or other languages to JavaScript, use the \fBprepublish\fR script to do this, and make the required package a devDependency\. 
.P For example: .P .RS 2 .nf { "name": "ethopia\-waza", "description": "a delightfully fruity coffee varietal", "version": "1\.2\.3", "devDependencies": { "coffee\-script": "~1\.6\.3" }, "scripts": { "prepublish": "coffee \-o lib/ \-c src/waza\.coffee" }, "main": "lib/waza\.js" } .fi .RE .P The \fBprepublish\fR script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves\. In dev mode (ie, locally running \fBnpm install\fR), it'll run this script as well, so that you can test it easily\. .SH peerDependencies .P In some cases, you want to express the compatibility of your package with an host tool or library, while not necessarily doing a \fBrequire\fR of this host\. This is usually referred to as a \fIplugin\fR\|\. Notably, your module may be exposing a specific interface, expected and specified by the host documentation\. .P For example: .P .RS 2 .nf { "name": "tea\-latte", "version": "1\.3\.5" "peerDependencies": { "tea": "2\.x" } } .fi .RE .P This ensures your package \fBtea\-latte\fR can be installed \fIalong\fR with the second major version of the host package \fBtea\fR only\. The host package is automatically installed if needed\. \fBnpm install tea\-latte\fR could possibly yield the following dependency graph: .P .RS 2 .nf ├── tea\-latte@1\.3\.5 └── tea@2\.2\.0 .fi .RE .P Trying to install another plugin with a conflicting requirement will cause an error\. For this reason, make sure your plugin requirement is as broad as possible, and not to lock it down to specific patch versions\. .P Assuming the host complies with semver \fIhttp://semver\.org/\fR, only changes in the host package's major version will break your plugin\. Thus, if you've worked with every 1\.x version of the host package, use \fB"^1\.0"\fR or \fB"1\.x"\fR to express this\. If you depend on features introduced in 1\.5\.2, use \fB">= 1\.5\.2 < 2"\fR\|\. .SH bundledDependencies .P Array of package names that will be bundled when publishing the package\. .P If this is spelled \fB"bundleDependencies"\fR, then that is also honorable\. .SH optionalDependencies .P If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the \fBoptionalDependencies\fR object\. This is a map of package name to version or url, just like the \fBdependencies\fR object\. The difference is that build failures do not cause installation to fail\. .P It is still your program's responsibility to handle the lack of the dependency\. For example, something like this: .P .RS 2 .nf try { var foo = require('foo') var fooVersion = require('foo/package\.json')\.version } catch (er) { foo = null } if ( notGoodFooVersion(fooVersion) ) { foo = null } // \.\. then later in your program \.\. if (foo) { foo\.doFooThings() } .fi .RE .P Entries in \fBoptionalDependencies\fR will override entries of the same name in \fBdependencies\fR, so it's usually best to only put in one place\. .SH engines .P You can specify the version of node that your stuff works on: .P .RS 2 .nf { "engines" : { "node" : ">=0\.10\.3 <0\.12" } } .fi .RE .P And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do\. .P If you specify an "engines" field, then npm will require that "node" be somewhere on that list\. If "engines" is omitted, then npm will just assume that it works on node\. 
.P You can also use the "engines" field to specify which versions of npm are capable of properly installing your program\. For example: .P .RS 2 .nf { "engines" : { "npm" : "~1\.0\.20" } } .fi .RE .P Note that, unless the user has set the \fBengine\-strict\fR config flag, this field is advisory only\. .SH engineStrict .P If you are sure that your module will \fIdefinitely not\fR run properly on versions of Node/npm other than those specified in the \fBengines\fR object, then you can set \fB"engineStrict": true\fR in your package\.json file\. This will override the user's \fBengine\-strict\fR config setting\. .P Please do not do this unless you are really very very sure\. If your engines object is something overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node\. Consider this choice carefully\. If people abuse it, it will be removed in a future version of npm\. .SH os .P You can specify which operating systems your module will run on: .P .RS 2 .nf "os" : [ "darwin", "linux" ] .fi .RE .P You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!': .P .RS 2 .nf "os" : [ "!win32" ] .fi .RE .P The host operating system is determined by \fBprocess\.platform\fR .P It is allowed to both blacklist, and whitelist, although there isn't any good reason to do this\. .SH cpu .P If your code only runs on certain cpu architectures, you can specify which ones\. .P .RS 2 .nf "cpu" : [ "x64", "ia32" ] .fi .RE .P Like the \fBos\fR option, you can also blacklist architectures: .P .RS 2 .nf "cpu" : [ "!arm", "!mips" ] .fi .RE .P The host architecture is determined by \fBprocess\.arch\fR .SH preferGlobal .P If your package is primarily a command\-line application that should be installed globally, then set this value to \fBtrue\fR to provide a warning if it is installed locally\. .P It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected\. .SH private .P If you set \fB"private": true\fR in your package\.json, then npm will refuse to publish it\. .P This is a way to prevent accidental publication of private repositories\. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the \fBpublishConfig\fR dictionary described below to override the \fBregistry\fR config param at publish\-time\. .SH publishConfig .P This is a set of config values that will be used at publish\-time\. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default\. .P Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing\. .P See npm help 7 \fBnpm\-config\fR to see the list of config options that can be overridden\. .SH DEFAULT VALUES .P npm will default some values based on package contents\. .RS 0 .IP \(bu 2 \fB"scripts": {"start": "node server\.js"}\fR If there is a \fBserver\.js\fR file in the root of your package, then npm will default the \fBstart\fR command to \fBnode server\.js\fR\|\. .IP \(bu 2 \fB"scripts":{"preinstall": "node\-gyp rebuild"}\fR If there is a \fBbinding\.gyp\fR file in the root of your package, npm will default the \fBpreinstall\fR command to compile using node\-gyp\. 
.IP \(bu 2 \fB"contributors": [\.\.\.]\fR If there is an \fBAUTHORS\fR file in the root of your package, npm will treat each line as a \fBName <email> (url)\fR format, where email and url are optional\. Lines which start with a \fB#\fR or are blank, will be ignored\. .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 semver .IP \(bu 2 npm help init .IP \(bu 2 npm help version .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help help .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help install .IP \(bu 2 npm help publish .IP \(bu 2 npm help rm .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-bin.3��������������������������������������000644 �000766 �000024 �00000000521 12455173731 023735� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BIN" "3" "January 2015" "" "" .SH "NAME" \fBnpm-bin\fR \- Display npm bin folder .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.bin(args, cb) .fi .RE .SH DESCRIPTION .P Print the folder where npm will install executables\. .P This function should not be used programmatically\. Instead, just refer to the \fBnpm\.bin\fR property\. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-bugs.3�������������������������������������000644 �000766 �000024 �00000001175 12455173731 024133� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BUGS" "3" "January 2015" "" "" .SH "NAME" \fBnpm-bugs\fR \- Bugs for a package in a web browser maybe .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.bugs(package, callback) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the \fB\-\-browser\fR config param\. .P Like other commands, the first parameter is an array\. This command only uses the first element, which is expected to be a package name with an optional version number\. .P This command will launch a browser, so this command may not be the most friendly for programmatic use\. 
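.P
A minimal sketch of programmatic use (the target package "npm" is used here only as an example name):
.P
.RS 2
.nf
var npm = require("npm")
npm\.load({}, function (er) {
  if (er) return console\.error(er)
  // tries to open the bug tracker URL for the named package in a browser
  npm\.commands\.bugs(["npm"], function (er) {
    if (er) console\.error(er)
  })
})
.fi
.RE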
���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-cache.3������������������������������������000644 �000766 �000024 �00000002000 12455173731 024222� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-CACHE" "3" "January 2015" "" "" .SH "NAME" \fBnpm-cache\fR \- manage the npm cache programmatically .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.cache([args], callback) // helpers npm\.commands\.cache\.clean([args], callback) npm\.commands\.cache\.add([args], callback) npm\.commands\.cache\.read(name, version, forceBypass, callback) .fi .RE .SH DESCRIPTION .P This acts much the same ways as the npm help cache command line functionality\. .P The callback is called with the package\.json data of the thing that is eventually added to or read from the cache\. .P The top level \fBnpm\.commands\.cache(\.\.\.)\fR functionality is a public interface, and like all commands on the \fBnpm\.commands\fR object, it will match the command line behavior exactly\. .P However, the cache folder structure and the cache helper functions are considered \fBinternal\fR API surface, and as such, may change in future releases of npm, potentially without warning or significant version incrementation\. .P Use at your own risk\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-commands.3���������������������������������000644 �000766 �000024 �00000001307 12455173731 024771� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-COMMANDS" "3" "January 2015" "" "" .SH "NAME" \fBnpm-commands\fR \- npm commands .SH SYNOPSIS .P .RS 2 .nf npm\.commands[<command>](args, callback) .fi .RE .SH DESCRIPTION .P npm comes with a full set of commands, and each of the commands takes a similar set of arguments\. .P In general, all commands on the command object take an \fBarray\fR of positional argument \fBstrings\fR\|\. The last argument to any function is a callback\. Some commands are special and take other optional arguments\. .P All commands have their own man page\. See \fBman npm\-<command>\fR for command\-line usage, or \fBman 3 npm\-<command>\fR for programmatic usage\. 
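.P
For instance, a rough sketch of the general calling pattern, using bracket access so the command name can be held in a variable (the "ls" command is chosen arbitrarily here):
.P
.RS 2
.nf
var npm = require("npm")
npm\.load({}, function (er) {
  if (er) return console\.error(er)
  var command = "ls"
  // every command takes an array of string arguments and a callback
  npm\.commands[command]([], function (er, data) {
    if (er) return console\.error(er)
    console\.log(data)
  })
})
.fi
.RE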
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 index .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-config.3�����������������������������������000644 �000766 �000024 �00000002470 12455173731 024437� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-CONFIG" "3" "January 2015" "" "" .SH "NAME" \fBnpm-config\fR \- Manage the npm configuration files .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.config(args, callback) var val = npm\.config\.get(key) npm\.config\.set(key, val) .fi .RE .SH DESCRIPTION .P This function acts much the same way as the command\-line version\. The first element in the array tells config what to do\. Possible values are: .RS 0 .IP \(bu 2 \fBset\fR Sets a config parameter\. The second element in \fBargs\fR is interpreted as the key, and the third element is interpreted as the value\. .IP \(bu 2 \fBget\fR Gets the value of a config parameter\. The second element in \fBargs\fR is the key to get the value of\. .IP \(bu 2 \fBdelete\fR (\fBrm\fR or \fBdel\fR) Deletes a parameter from the config\. The second element in \fBargs\fR is the key to delete\. .IP \(bu 2 \fBlist\fR (\fBls\fR) Show all configs that aren't secret\. No parameters necessary\. .IP \(bu 2 \fBedit\fR: Opens the config file in the default editor\. This command isn't very useful programmatically, but it is made available\. .RE .P To programmatically access npm configuration settings, or set them for the duration of a program, use the \fBnpm\.config\.set\fR and \fBnpm\.config\.get\fR functions instead\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm apihelp npm .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-deprecate.3��������������������������������000644 �000766 �000024 �00000001753 12455173731 025131� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DEPRECATE" "3" "January 2015" "" "" .SH "NAME" \fBnpm-deprecate\fR \- Deprecate a version of a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.deprecate(args, callback) .fi .RE .SH DESCRIPTION .P This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it\. .P The 'args' parameter must have exactly two elements: .RS 0 .IP \(bu 2 \fBpackage[@version]\fR The \fBversion\fR portion is optional, and may be either a range, or a specific version, or a tag\. .IP \(bu 2 \fBmessage\fR The warning message that will be printed whenever a user attempts to install the package\. 
.RE .P Note that you must be the package owner to deprecate something\. See the \fBowner\fR and \fBadduser\fR help topics\. .P To un\-deprecate a package, specify an empty string (\fB""\fR) for the \fBmessage\fR argument\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm apihelp publish .IP \(bu 2 npm apihelp unpublish .IP \(bu 2 npm help 7 registry .RE ���������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-docs.3�������������������������������������000644 �000766 �000024 �00000001177 12455173731 024125� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DOCS" "3" "January 2015" "" "" .SH "NAME" \fBnpm-docs\fR \- Docs for a package in a web browser maybe .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.docs(package, callback) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the \fB\-\-browser\fR config param\. .P Like other commands, the first parameter is an array\. This command only uses the first element, which is expected to be a package name with an optional version number\. .P This command will launch a browser, so this command may not be the most friendly for programmatic use\. �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-edit.3�������������������������������������000644 �000766 �000024 �00000001573 12455173731 024122� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-EDIT" "3" "January 2015" "" "" .SH "NAME" \fBnpm-edit\fR \- Edit an installed package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.edit(package, callback) .fi .RE .SH DESCRIPTION .P Opens the package folder in the default editor (or whatever you've configured as the npm \fBeditor\fR config \-\- see \fBnpm help config\fR\|\.) .P After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages\. .P For instance, you can do \fBnpm install connect\fR to install connect into your package, and then \fBnpm\.commands\.edit(["connect"], callback)\fR to make a few changes to your locally installed copy\. .P The first parameter is a string array with a single element, the package to open\. The package can optionally have a version number attached\. .P Since this command opens an editor in a new process, be careful about where and how this is used\. 
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-explore.3
.TH "NPM\-EXPLORE" "3" "January 2015" "" "" .SH "NAME" \fBnpm-explore\fR \- Browse an installed package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.explore(args, callback) .fi .RE .SH DESCRIPTION .P Spawn a subshell in the directory of the installed package specified\. .P If a command is specified, then it is run in the subshell, which then immediately terminates\. .P Note that the package is \fInot\fR automatically rebuilt afterwards, so be sure to use \fBnpm rebuild <pkg>\fR if you make any changes\. .P The first element in the 'args' parameter must be a package name\. After that is the optional command, which can be any number of strings\. All of the strings will be combined into one, space\-delimited command\.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-help-search.3
.TH "NPM\-HELP\-SEARCH" "3" "January 2015" "" "" .SH "NAME" \fBnpm-help-search\fR \- Search the help pages .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.helpSearch(args, [silent,] callback) .fi .RE .SH DESCRIPTION .P This command is rarely useful, but it exists in the rare case that it is\. .P This command takes an array of search terms and returns the help pages that match in order of best match\. .P If there is only one match, then npm displays that help section\. If there are multiple results, the results are printed to the screen formatted and the array of results is returned\. Each result is an object with these properties: .RS 0 .IP \(bu 2 hits: A map of args to number of hits on that arg\. For example, {"npm": 3} .IP \(bu 2 found: Total number of unique args that matched\. .IP \(bu 2 totalHits: Total number of hits\. .IP \(bu 2 lines: An array of all matching lines (and some adjacent lines)\. .IP \(bu 2 file: Name of the file that matched .RE .P The silent parameter is not currently used, but it may be in the future\.
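.P
For example, a sketch of programmatic use (it assumes npm has already been loaded with \fBnpm\.load\fR, and that the matching results are passed to the callback as described above):
.P
.RS 2
.nf
npm\.commands\.helpSearch(["install"], true, function (er, results) {
  if (er) return console\.error(er)
  // each result has hits, found, totalHits, lines and file properties
  results\.forEach(function (r) {
    console\.log(r\.file, r\.totalHits)
  })
})
.fi
.RE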
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-init.3
.TH "NPM\-INIT" "3" "January 2015" "" "" .SH "NAME" \fBnpm-init\fR \- Interactively create a package\.json file .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.init(args, callback) .fi .RE .SH DESCRIPTION .P This will ask you a bunch of questions, and then write a package\.json for you\. .P It attempts to make reasonable guesses about what you want things to be set to, and then writes a package\.json file with the options you've selected\. .P If you already have a package\.json file, it'll read that first, and default to the options in there\. .P It is strictly additive, so it does not delete options from your package\.json without a really good reason to do so\. .P Since this function expects to be run on the command\-line, it doesn't work very well programmatically\. The best option is to roll your own, and since JavaScript makes it stupid simple to output formatted JSON, that is the preferred method\. If you're sure you want to handle command\-line prompting, then go ahead and use this programmatically\. .SH SEE ALSO .P npm help 5 package\.json
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-install.3
.TH "NPM\-INSTALL" "3" "January 2015" "" "" .SH "NAME" \fBnpm-install\fR \- install a package programmatically .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.install([where,] packages, callback) .fi .RE .SH DESCRIPTION .P This acts much the same way as installing on the command\-line\. .P The 'where' parameter is optional and only used internally, and it specifies where the packages should be installed to\. .P The 'packages' parameter is an array of strings\. Each element in the array is the name of a package to be installed\. .P Finally, 'callback' is a function that will be called when all packages have been installed or when an error has been encountered\.
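.P
A short sketch (the package name is only an example, and npm is assumed to have been loaded with \fBnpm\.load\fR already):
.P
.RS 2
.nf
// install the named packages into the current project
npm\.commands\.install(["semver"], function (er) {
  if (er) return console\.error(er)
  console\.log("install finished")
})
.fi
.RE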
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-link.3
.TH "NPM\-LINK" "3" "January 2015" "" "" .SH "NAME" \fBnpm-link\fR \- Symlink a package folder .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.link(callback) npm\.commands\.link(packages, callback) .fi .RE .SH DESCRIPTION .P Package linking is a two\-step process\. .P Without parameters, link will create a globally\-installed symbolic link from \fBprefix/package\-name\fR to the current folder\. .P With parameters, link will create a symlink from the local \fBnode_modules\fR folder to the global symlink\. .P When creating tarballs for \fBnpm publish\fR, the linked packages are "snapshotted" to their current state by resolving the symbolic links\. .P This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild\. .P For example: .P .RS 2 .nf npm\.commands\.link(cb)           // creates global link from the cwd
                                // (say redis package)
npm\.commands\.link('redis', cb)  // link\-install the package
.fi .RE .P Now, any changes to the redis package will be reflected in the package in the current working directory\.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-load.3
.TH "NPM\-LOAD" "3" "January 2015" "" "" .SH "NAME" \fBnpm-load\fR \- Load config settings .SH SYNOPSIS .P .RS 2 .nf npm\.load(conf, cb) .fi .RE .SH DESCRIPTION .P npm\.load() must be called before any other function call\. Both parameters are optional, but the second is recommended\. .P The first parameter is an object containing command\-line config params, and the second parameter is a callback that will be called when npm is loaded and ready to serve\. .P The first parameter should follow a similar structure as the package\.json config object\.
.P For example, to emulate the \-\-dev flag, pass an object that looks like this: .P .RS 2 .nf { "dev": true } .fi .RE .P For a list of all the available command\-line configs, see \fBnpm help config\fR ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-ls.3���������������������������������������000644 �000766 �000024 �00000003367 12455173731 023616� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-LS" "3" "January 2015" "" "" .SH "NAME" \fBnpm-ls\fR \- List installed packages .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.ls(args, [silent,] callback) .fi .RE .SH DESCRIPTION .P This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree\-structure\. It will also return that data using the callback\. .P This command does not take any arguments, but args must be defined\. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments, though you may set config flags like with any other command, such as \fBglobal\fR to list global packages\. .P It will print out extraneous, missing, and invalid packages\. .P If the silent parameter is set to true, nothing will be output to the screen, but the data will still be returned\. .P Callback is provided an error if one occurred, the full data about which packages are installed and which dependencies they will receive, and a "lite" data object which just shows which versions are installed where\. Note that the full data object is a circular structure, so care must be taken if it is serialized to JSON\. .SH CONFIGURATION .SS long .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show extended information\. .SS parseable .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show parseable output instead of tree view\. .SS global .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P List packages in the global install prefix instead of in the current project\. .P Note, if parseable is set or long isn't set, then duplicates will be trimmed\. This means that if a submodule has the same dependency as a parent module, then the dependency will only be output once\. 
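.P
For example, a sketch that suppresses the printed tree and only uses the returned data (npm is assumed to be loaded already):
.P
.RS 2
.nf
// silent = true: nothing is printed, but the data is still returned
npm\.commands\.ls([], true, function (er, data, lite) {
  if (er) return console\.error(er)
  // "data" is the full (circular) structure; "lite" is the summary object
  console\.log(lite)
})
.fi
.RE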
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-outdated.3���������������������������������000644 �000766 �000024 �00000000567 12455173731 025010� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-OUTDATED" "3" "January 2015" "" "" .SH "NAME" \fBnpm-outdated\fR \- Check for outdated packages .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.outdated([packages,] callback) .fi .RE .SH DESCRIPTION .P This command will check the registry to see if the specified packages are currently outdated\. .P If the 'packages' parameter is left out, npm will check all packages\. �����������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-owner.3������������������������������������000644 �000766 �000024 �00000002141 12455173731 024317� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-OWNER" "3" "January 2015" "" "" .SH "NAME" \fBnpm-owner\fR \- Manage package owners .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.owner(args, callback) .fi .RE .SH DESCRIPTION .P The first element of the 'args' parameter defines what to do, and the subsequent elements depend on the action\. Possible values for the action are (order of parameters are given in parenthesis): .RS 0 .IP \(bu 2 ls (package): List all the users who have access to modify a package and push new versions\. Handy when you need to know who to bug for help\. .IP \(bu 2 add (user, package): Add a new user as a maintainer of a package\. This user is enabled to modify metadata, publish new versions, and add other owners\. .IP \(bu 2 rm (user, package): Remove a user from the package owner list\. This immediately revokes their privileges\. .RE .P Note that there is only one level of access\. Either you can modify a package, or you can't\. Future versions may contain more fine\-grained access levels, but that is not implemented at this time\. 
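.P
For example, a sketch of the "ls" action (the package name is a placeholder, and npm is assumed to be loaded already):
.P
.RS 2
.nf
// the first element selects the action, the rest are its parameters
npm\.commands\.owner(["ls", "some\-package"], function (er) {
  if (er) console\.error(er)
})
.fi
.RE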
.SH SEE ALSO .RS 0 .IP \(bu 2 npm apihelp publish .IP \(bu 2 npm help 7 registry .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-pack.3�������������������������������������000644 �000766 �000024 �00000001241 12455173731 024103� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-PACK" "3" "January 2015" "" "" .SH "NAME" \fBnpm-pack\fR \- Create a tarball from a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.pack([packages,] callback) .fi .RE .SH DESCRIPTION .P For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as \fB<name>\-<version>\.tgz\fR, and then write the filenames out to stdout\. .P If the same package is specified multiple times, then the file will be overwritten the second time\. .P If no arguments are supplied, then npm packs the current package folder\. ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-prefix.3�����������������������������������000644 �000766 �000024 �00000000573 12455173731 024471� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-PREFIX" "3" "January 2015" "" "" .SH "NAME" \fBnpm-prefix\fR \- Display prefix .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.prefix(args, callback) .fi .RE .SH DESCRIPTION .P Print the prefix to standard out\. .P \|'args' is never used and callback is never called with data\. \|'args' must be present or things will break\. 
.P This function is not useful programmatically\.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-prune.3
.TH "NPM\-PRUNE" "3" "January 2015" "" "" .SH "NAME" \fBnpm-prune\fR \- Remove extraneous packages .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.prune([packages,] callback) .fi .RE .SH DESCRIPTION .P This command removes "extraneous" packages\. .P The first parameter is optional, and it specifies packages to be removed\. .P If no packages are specified, then all packages will be checked\. .P Extraneous packages are packages that are not listed on the parent package's dependencies list\.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-publish.3
.TH "NPM\-PUBLISH" "3" "January 2015" "" "" .SH "NAME" \fBnpm-publish\fR \- Publish a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.publish([packages,] callback) .fi .RE .SH DESCRIPTION .P Publishes a package to the registry so that it can be installed by name\. Possible values in the 'packages' array are: .RS 0 .IP \(bu 2 \fB<folder>\fR: A folder containing a package\.json file .IP \(bu 2 \fB<tarball>\fR: A url or file path to a gzipped tar archive containing a single folder with a package\.json file inside\. .RE .P If the package array is empty, npm will try to publish something in the current working directory\. .P This command will fail if one of the packages specified already exists in the registry\. It overwrites the existing package when the "force" config is set\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help adduser .IP \(bu 2 npm apihelp owner .RE
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-rebuild.3
.TH "NPM\-REBUILD" "3" "January 2015" "" "" .SH "NAME" \fBnpm-rebuild\fR \- Rebuild a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.rebuild([packages,] callback) .fi .RE .SH DESCRIPTION .P This command runs the \fBnpm build\fR command on each of the matched packages\.
This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary\. If no 'packages' parameter is specified, every package will be rebuilt\. .SH CONFIGURATION .P See \fBnpm help build\fR
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-repo.3
.TH "NPM\-REPO" "3" "January 2015" "" "" .SH "NAME" \fBnpm-repo\fR \- Open package repository page in the browser .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.repo(package, callback) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the \fB\-\-browser\fR config param\. .P Like other commands, the first parameter is an array\. This command only uses the first element, which is expected to be a package name with an optional version number\. .P This command will launch a browser, so this command may not be the most friendly for programmatic use\.
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-restart.3
.TH "NPM\-RESTART" "3" "January 2015" "" "" .SH "NAME" \fBnpm-restart\fR \- Restart a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.restart(packages, callback) .fi .RE .SH DESCRIPTION .P This restarts a package (or multiple packages)\. .P This runs a package's "stop", "restart", and "start" scripts, and associated pre\- and post\- scripts, in the order given below: .RS 0 .IP 1. 3 prerestart .IP 2. 3 prestop .IP 3. 3 stop .IP 4. 3 poststop .IP 5. 3 restart .IP 6. 3 prestart .IP 7. 3 start .IP 8. 3 poststart .IP 9. 3 postrestart .RE .P If no version is specified, then it restarts the "active" version\. .P npm can restart multiple packages\. Just specify multiple packages in the \fBpackages\fR parameter\. .SH NOTE .P Note that the "restart" script is run \fBin addition to\fR the "stop" and "start" scripts, not instead of them\. .P This is the behavior as of \fBnpm\fR major version 2\.
A change in this behavior will be accompanied by an increase in the major version number\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm apihelp start .IP \(bu 2 npm apihelp stop .RE iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-root.3000644 000766 000024 00000000626 12455173731 024156 0ustar00iojsstaff000000 000000 .TH "NPM\-ROOT" "3" "January 2015" "" "" .SH "NAME" \fBnpm-root\fR \- Display npm root .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.root(args, callback) .fi .RE .SH DESCRIPTION .P Print the effective \fBnode_modules\fR folder to standard out\. .P \|'args' is never used and callback is never called with data\. \|'args' must be present or things will break\. .P This function is not useful programmatically\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-run-script.3000644 000766 000024 00000001633 12455173731 025300 0ustar00iojsstaff000000 000000 .TH "NPM\-RUN\-SCRIPT" "3" "January 2015" "" "" .SH "NAME" \fBnpm-run-script\fR \- Run arbitrary package scripts .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.run\-script(args, callback) .fi .RE .SH DESCRIPTION .P This runs an arbitrary command from a package's "scripts" object\. .P It is used by the test, start, restart, and stop commands, but can be called directly, as well\. .P The 'args' parameter is an array of strings\. Behavior depends on the number of elements\. If there is only one element, npm assumes that the element represents a command to be run on the local repository\. If there is more than one element, then the first is assumed to be the package and the second is assumed to be the command to run\. All other elements are ignored\.
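.P
For example, a minimal sketch (an illustration, not from the original documentation), assuming the package in the current prefix defines a "test" script and that the command is reached through its hyphenated name on \fBnpm\.commands\fR:
.P
.RS 2
.nf
var npm = require("npm")

npm\.load(function (er, npm) {
  if (er) return console\.error(er)
  // one element: run the "test" script of the package in the current prefix
  npm\.commands["run\-script"](["test"], function (er) {
    if (er) return console\.error(er)
    console\.log("script finished")
  })
})
.fi
.RE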
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm apihelp test .IP \(bu 2 npm apihelp start .IP \(bu 2 npm apihelp restart .IP \(bu 2 npm apihelp stop .RE iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-search.3000644 000766 000024 00000002545 12455173731 024442 0ustar00iojsstaff000000 000000 .TH "NPM\-SEARCH" "3" "January 2015" "" "" .SH "NAME" \fBnpm-search\fR \- Search for packages .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.search(searchTerms, [silent,] [staleness,] callback) .fi .RE .SH DESCRIPTION .P Search the registry for packages matching the search terms\. The available parameters are: .RS 0 .IP \(bu 2 searchTerms: Array of search terms\. These terms are case\-insensitive\. .IP \(bu 2 silent: If true, npm will not log anything to the console\. .IP \(bu 2 staleness: This is the threshold for stale packages\. "Fresh" packages are not refreshed from the registry\. This value is measured in seconds\. .IP \(bu 2 callback: Returns an object where each key is the name of a package, and the value is information about that package along with a 'words' property, which is a space\-delimited string of all of the interesting words in that package\. The only properties included are those that are searched, which generally include: .RS 0 .IP \(bu 2 name .IP \(bu 2 description .IP \(bu 2 maintainers .IP \(bu 2 url .IP \(bu 2 keywords .RE .RE .P A search on the registry excludes any result that does not match all of the search terms\. It also removes any items from the results that contain an excluded term (the "searchexclude" config)\. The search is case insensitive and doesn't try to read your mind (it doesn't do any verb tense matching or the like)\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-shrinkwrap.3000644 000766 000024 00000001320 12455173731 025353 0ustar00iojsstaff000000 000000 .TH "NPM\-SHRINKWRAP" "3" "January 2015" "" "" .SH "NAME" \fBnpm-shrinkwrap\fR \- programmatically generate package shrinkwrap file .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.shrinkwrap(args, [silent,] callback) .fi .RE .SH DESCRIPTION .P This acts much the same way as shrinkwrapping on the command\-line\. .P This command does not take any arguments, but 'args' must be defined\. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments\. .P If the 'silent' parameter is set to true, nothing will be output to the screen, but the shrinkwrap file will still be written\. .P Finally, 'callback' is a function that will be called when the shrinkwrap has been saved\.
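.P
A minimal usage sketch, assuming it is run from a package directory whose dependencies are already installed; the empty array merely satisfies the required but unused 'args' parameter:
.P
.RS 2
.nf
var npm = require("npm")

npm\.load(function (er, npm) {
  if (er) return console\.error(er)
  // 'args' must be present but is ignored; 'silent' set to true suppresses console output
  npm\.commands\.shrinkwrap([], true, function (er) {
    if (er) return console\.error(er)
    console\.log("shrinkwrap file saved")
  })
})
.fi
.RE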
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-start.3������������������������������������000644 �000766 �000024 �00000000531 12455173731 024323� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-START" "3" "January 2015" "" "" .SH "NAME" \fBnpm-start\fR \- Start a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.start(packages, callback) .fi .RE .SH DESCRIPTION .P This runs a package's "start" script, if one was provided\. .P npm can start multiple packages\. Just specify multiple packages in the \fBpackages\fR parameter\. �����������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-stop.3�������������������������������������000644 �000766 �000024 �00000000532 12455173731 024154� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-STOP" "3" "January 2015" "" "" .SH "NAME" \fBnpm-stop\fR \- Stop a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.stop(packages, callback) .fi .RE .SH DESCRIPTION .P This runs a package's "stop" script, if one was provided\. .P npm can run stop on multiple packages\. Just specify multiple packages in the \fBpackages\fR parameter\. ����������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-submodule.3��������������������������������000644 �000766 �000024 �00000002146 12455173731 025171� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.\" Generated with Ronnjs 0.3.8 .\" http://github.com/kapouer/ronnjs/ . .TH "NPM\-SUBMODULE" "3" "September 2014" "" "" . .SH "NAME" \fBnpm-submodule\fR \-\- Add a package as a git submodule . .SH "SYNOPSIS" . .nf npm\.commands\.submodule(packages, callback) . .fi . .SH "DESCRIPTION" For each package specified, npm will check if it has a git repository url in its package\.json description then add it as a git submodule at \fBnode_modules/<pkg name>\fR\|\. . .P This is a convenience only\. From then on, it\'s up to you to manage updates by using the appropriate git commands\. npm will stubbornly refuse to update, modify, or remove anything with a \fB\|\.git\fR subfolder in it\. . 
.P This command also does not install missing dependencies if the package does not include them in its git repository\. If \fBnpm ls\fR reports that things are missing, you can either install, link, or submodule them yourself, or you can do \fBnpm explore <pkgname> \-\- npm install\fR to install the dependencies into the submodule folder\. . .SH "SEE ALSO" . .IP "\(bu" 4 npm help json . .IP "\(bu" 4 git help submodule . .IP "" 0 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-tag.3000644 000766 000024 00000001573 12455173731 023750 0ustar00iojsstaff000000 000000 .TH "NPM\-TAG" "3" "January 2015" "" "" .SH "NAME" \fBnpm-tag\fR \- Tag a published version .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.tag(package@version, tag, callback) .fi .RE .SH DESCRIPTION .P Tags the specified version of the package with the specified tag, or the \fB\-\-tag\fR config if not specified\. .P The 'package@version' is an array of strings, but only the first two elements are currently used\. .P The first element must be in the form package@version, where package is the package name and version is the version number (much like installing a specific version)\. .P The second element is the name of the tag to tag this version with\. If this parameter is missing or falsey (empty), the default from the config will be used\. For more information about how to set this config, check \fBman 3 npm\-config\fR for programmatic usage or \fBman npm\-config\fR for cli usage\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-test.3000644 000766 000024 00000000661 12455173731 024151 0ustar00iojsstaff000000 000000 .TH "NPM\-TEST" "3" "January 2015" "" "" .SH "NAME" \fBnpm-test\fR \- Test a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.test(packages, callback) .fi .RE .SH DESCRIPTION .P This runs a package's "test" script, if one was provided\. .P To run tests as a condition of installation, set the \fBnpat\fR config to true\. .P npm can run tests on multiple packages\. Just specify multiple packages in the \fBpackages\fR parameter\.
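.P
A minimal usage sketch; the empty 'packages' array is assumed here to refer to the package in the current working directory, as with the command\-line client:
.P
.RS 2
.nf
var npm = require("npm")

npm\.load(function (er, npm) {
  if (er) return console\.error(er)
  // an empty array is assumed to target the package in the current directory
  npm\.commands\.test([], function (er) {
    if (er) return console\.error("tests failed:", er)
    console\.log("tests passed")
  })
})
.fi
.RE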
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-uninstall.3000644 000766 000024 00000001041 12455173731 025174 0ustar00iojsstaff000000 000000 .TH "NPM\-UNINSTALL" "3" "January 2015" "" "" .SH "NAME" \fBnpm-uninstall\fR \- uninstall a package programmatically .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.uninstall(packages, callback) .fi .RE .SH DESCRIPTION .P This acts much the same way as uninstalling on the command\-line\. .P The 'packages' parameter is an array of strings\. Each element in the array is the name of a package to be uninstalled\. .P Finally, 'callback' is a function that will be called when all packages have been uninstalled or when an error has been encountered\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-unpublish.3000644 000766 000024 00000001224 12455173731 025177 0ustar00iojsstaff000000 000000 .TH "NPM\-UNPUBLISH" "3" "January 2015" "" "" .SH "NAME" \fBnpm-unpublish\fR \- Remove a package from the registry .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.unpublish(package, callback) .fi .RE .SH DESCRIPTION .P This removes a package version from the registry, deleting its entry and removing the tarball\. .P The package parameter must be defined\. .P Only the first element in the package parameter is used\. If there is no first element, then npm assumes that the package at the current working directory is what is meant\. .P If no version is specified, or if all versions are removed, then the root package entry is removed from the registry entirely\.
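.P
A minimal usage sketch; "my\-pkg@1\.0\.0" is a hypothetical package and version that the logged\-in user is assumed to own:
.P
.RS 2
.nf
var npm = require("npm")

npm\.load(function (er, npm) {
  if (er) return console\.error(er)
  // "my\-pkg@1\.0\.0" is a hypothetical package@version
  npm\.commands\.unpublish(["my\-pkg@1\.0\.0"], function (er) {
    if (er) return console\.error(er)
    console\.log("version removed from the registry")
  })
})
.fi
.RE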
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-update.3000644 000766 000024 00000000735 12455173731 024456 0ustar00iojsstaff000000 000000 .TH "NPM\-UPDATE" "3" "January 2015" "" "" .SH "NAME" \fBnpm-update\fR \- Update a package .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.update(packages, callback) .fi .RE .SH DESCRIPTION .P Updates a package, upgrading it to the latest version\. It also installs any missing packages\. .P The 'packages' argument is an array of packages to update\. The 'callback' parameter will be called when done or when an error occurs\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-version.3000644 000766 000024 00000001201 12455173731 024650 0ustar00iojsstaff000000 000000 .TH "NPM\-VERSION" "3" "January 2015" "" "" .SH "NAME" \fBnpm-version\fR \- Bump a package version .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.version(newversion, callback) .fi .RE .SH DESCRIPTION .P Run this in a package directory to bump the version and write the new data back to the package\.json file\. .P If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean\. .P Like all other commands, this function takes a string array as its first parameter\. The difference, however, is this function will fail if it does not have exactly one element\. The only element should be a version number\.
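.P
A minimal usage sketch, assuming a clean package directory and that "1\.0\.3" is the hypothetical new version to record:
.P
.RS 2
.nf
var npm = require("npm")

npm\.load(function (er, npm) {
  if (er) return console\.error(er)
  // exactly one element: the new version to write to package\.json
  npm\.commands\.version(["1\.0\.3"], function (er) {
    if (er) return console\.error(er)
    console\.log("version bumped")
  })
})
.fi
.RE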
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-view.3000644 000766 000024 00000007002 12455173731 024140 0ustar00iojsstaff000000 000000 .TH "NPM\-VIEW" "3" "January 2015" "" "" .SH "NAME" \fBnpm-view\fR \- View registry info .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.view(args, [silent,] callback) .fi .RE .SH DESCRIPTION .P This command shows data about a package and prints it to the stream referenced by the \fBoutfd\fR config, which defaults to stdout\. .P The "args" parameter is an ordered list that closely resembles the command\-line usage\. The elements should be ordered such that the first element is the package and version (package@version)\. The version is optional\. After that, the rest of the parameters are fields with optional subfields ("field\.subfield") which can be used to get only the information desired from the registry\. .P The callback will be passed all of the data returned by the query\. .P For example, to get the package registry entry for the \fBconnect\fR package, you can do this: .P .RS 2 .nf npm\.commands\.view(["connect"], callback) .fi .RE .P If no version is specified, "latest" is assumed\. .P Field names can be specified after the package descriptor\. For example, to show the dependencies of the \fBronn\fR package at version 0\.3\.5, you could do the following: .P .RS 2 .nf npm\.commands\.view(["ronn@0\.3\.5", "dependencies"], callback) .fi .RE .P You can view child fields by separating them with a period\. To view the git repository URL for the latest version of npm, you could do this: .P .RS 2 .nf npm\.commands\.view(["npm", "repository\.url"], callback) .fi .RE .P For fields that are arrays, requesting a non\-numeric field will return all of the values from the objects in the list\. For example, to get all the contributor email addresses for the "express" project, you can do this: .P .RS 2 .nf npm\.commands\.view(["express", "contributors\.email"], callback) .fi .RE .P You may also use numeric indices in square brackets to specifically select an item in an array field\. To just get the email address of the first contributor in the list, you can do this: .P .RS 2 .nf npm\.commands\.view(["express", "contributors[0]\.email"], callback) .fi .RE .P Multiple fields may be specified, and will be printed one after another\. For example, to get all the contributor names and email addresses, you can do this: .P .RS 2 .nf npm\.commands\.view(["express", "contributors\.name", "contributors\.email"], callback) .fi .RE .P "Person" fields are shown as a string if they would be shown as an object\. So, for example, this will show the list of npm contributors in the shortened string format\. (See \fBnpm help json\fR for more on this\.)
.P .RS 2 .nf npm\.commands\.view(["npm", "contributors"], callback) .fi .RE .P If a version range is provided, then data will be printed for every matching version of the package\. This will show which version of jsdom was required by each matching version of yui3: .P .RS 2 .nf npm\.commands\.view(["yui3@'>0\.5\.4'", "dependencies\.jsdom"], callback) .fi .RE .SH OUTPUT .P If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command\. .P If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to\. .P If multiple fields are requested, then each of them is prefixed with the field name\. .P Console output can be disabled by setting the 'silent' parameter to true\. .SH RETURN VALUE .P The data returned will be an object in this format: .P .RS 2 .nf { <version>: { <field>: <value> , \.\.\. } , \.\.\. } .fi .RE .P corresponding to the list of fields selected\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm-whoami.3000644 000766 000024 00000000623 12455173731 024454 0ustar00iojsstaff000000 000000 .TH "NPM\-WHOAMI" "3" "January 2015" "" "" .SH "NAME" \fBnpm-whoami\fR \- Display npm username .SH SYNOPSIS .P .RS 2 .nf npm\.commands\.whoami(args, callback) .fi .RE .SH DESCRIPTION .P Print the \fBusername\fR config to standard output\. .P \|'args' is never used and callback is never called with data\. \|'args' must be present or things will break\. .P This function is not useful programmatically\. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man3/npm.3000644 000766 000024 00000007113 12455173731 023173 0ustar00iojsstaff000000 000000 .TH "NPM" "3" "January 2015" "" "" .SH "NAME" \fBnpm\fR \- node package manager .SH SYNOPSIS .P .RS 2 .nf var npm = require("npm") npm\.load([configObject, ]function (er, npm) { // use the npm object, now that it's loaded\. npm\.config\.set(key, val) val = npm\.config\.get(key) console\.log("prefix = %s", npm\.prefix) npm\.commands\.install(["package"], cb) }) .fi .RE .SH VERSION .P 2.1.18 .SH DESCRIPTION .P This is the API documentation for npm\.
To find documentation of the command line client, see npm help \fBnpm\fR\|\. .P Prior to using npm's commands, \fBnpm\.load()\fR must be called\. If you provide \fBconfigObject\fR as an object map of top\-level configs, they override the values stored in the various config locations\. In the npm command line client, this set of configs is parsed from the command line options\. Additional configuration params are loaded from two configuration files\. See npm help \fBnpm\-config\fR, npm help 7 \fBnpm\-config\fR, and npm help 5 \fBnpmrc\fR for more information\. .P After that, each of the functions are accessible in the commands object: \fBnpm\.commands\.<cmd>\fR\|\. See npm help 7 \fBnpm\-index\fR for a list of all possible commands\. .P All commands on the command object take an \fBarray\fR of positional argument \fBstrings\fR\|\. The last argument to any function is a callback\. Some commands take other optional arguments\. .P Configs cannot currently be set on a per function basis, as each call to npm\.config\.set will change the value for \fIall\fR npm commands in that process\. .P To find API documentation for a specific command, run the \fBnpm apihelp\fR command\. .SH METHODS AND PROPERTIES .RS 0 .IP \(bu 2 \fBnpm\.load(configs, cb)\fR Load the configuration params, and call the \fBcb\fR function once the globalconfig and userconfig files have been loaded as well, or on nextTick if they've already been loaded\. .IP \(bu 2 \fBnpm\.config\fR An object for accessing npm configuration parameters\. .RS 0 .IP \(bu 2 \fBnpm\.config\.get(key)\fR .IP \(bu 2 \fBnpm\.config\.set(key, val)\fR .IP \(bu 2 \fBnpm\.config\.del(key)\fR .RE .IP \(bu 2 \fBnpm\.dir\fR or \fBnpm\.root\fR The \fBnode_modules\fR directory where npm will operate\. .IP \(bu 2 \fBnpm\.prefix\fR The prefix where npm is operating\. (Most often the current working directory\.) .IP \(bu 2 \fBnpm\.cache\fR The place where npm keeps JSON and tarballs it fetches from the registry (or uploads to the registry)\. .IP \(bu 2 \fBnpm\.tmp\fR npm's temporary working directory\. .IP \(bu 2 \fBnpm\.deref\fR Get the "real" name for a command that has either an alias or abbreviation\. .RE .SH MAGIC .P For each of the methods in the \fBnpm\.commands\fR object, a method is added to the npm object, which takes a set of positional string arguments rather than an array and a callback\. .P If the last argument is a callback, then it will use the supplied callback\. However, if no callback is provided, then it will print out the error or results\. .P For example, this would work in a node repl: .P .RS 2 .nf > npm = require("npm") > npm\.load() // wait a sec\.\.\. > npm\.install("dnode", "express") .fi .RE .P Note that that \fIwon't\fR work in a node program, since the \fBinstall\fR method will get called before the configuration load is completed\. .SH ABBREVS .P In order to support \fBnpm ins foo\fR instead of \fBnpm install foo\fR, the \fBnpm\.commands\fR object has a set of abbreviations as well as the full method names\. Use the \fBnpm\.deref\fR method to find the real name\. 
.P For example: .P .RS 2 .nf var cmd = npm\.deref("unp") // cmd === "unpublish" .fi .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-adduser.1����������������������������������000644 �000766 �000024 �00000004703 12455173731 024616� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-ADDUSER" "1" "January 2015" "" "" .SH "NAME" \fBnpm-adduser\fR \- Add a registry user account .SH SYNOPSIS .P .RS 2 .nf npm adduser [\-\-registry=url] [\-\-scope=@orgname] [\-\-always\-auth] .fi .RE .SH DESCRIPTION .P Create or verify a user named \fB<username>\fR in the specified registry, and save the credentials to the \fB\|\.npmrc\fR file\. If no registry is specified, the default registry will be used (see npm help 7 \fBnpm\-config\fR)\. .P The username, password, and email are read in from prompts\. .P To reset your password, go to https://www\.npmjs\.com/forgot .P To change your email address, go to https://www\.npmjs\.com/email\-edit .P You may use this command multiple times with the same user account to authorize on a new machine\. When authenticating on a new machine, the username, password and email address must all match with your existing record\. .P \fBnpm login\fR is an alias to \fBadduser\fR and behaves exactly the same way\. .SH CONFIGURATION .SS registry .P Default: http://registry\.npmjs\.org/ .P The base URL of the npm package registry\. If \fBscope\fR is also specified, this registry will only be used for packages with that scope\. See npm help 7 \fBnpm\-scope\fR\|\. .SS scope .P Default: none .P If specified, the user and login credentials given will be associated with the specified scope\. See npm help 7 \fBnpm\-scope\fR\|\. You can use both at the same time, e\.g\. .P .RS 2 .nf npm adduser \-\-registry=http://myregistry\.example\.com \-\-scope=@myco .fi .RE .P This will set a registry for the given scope and login or create a user for that registry at the same time\. .SS always\-auth .P Default: false .P If specified, save configuration indicating that all requests to the given registry should include authorization information\. Useful for private registries\. Can be used with \fB\-\-registry\fR and / or \fB\-\-scope\fR, e\.g\. .P .RS 2 .nf npm adduser \-\-registry=http://private\-registry\.example\.com \-\-always\-auth .fi .RE .P This will ensure that all requests to that registry (including for tarballs) include an authorization header\. See \fBalways\-auth\fR in npm help 7 \fBnpm\-config\fR for more details on always\-auth\. Registry\-specific configuration of \fBalways\-auth\fR takes precedence over any global configuration\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help owner .IP \(bu 2 npm help whoami .RE �������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-bin.1��������������������������������������000644 �000766 �000024 �00000000615 12455173731 023735� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BIN" "1" "January 2015" "" "" .SH "NAME" \fBnpm-bin\fR \- Display npm bin folder .SH SYNOPSIS .P .RS 2 .nf npm bin .fi .RE .SH DESCRIPTION .P Print the folder where npm will install executables\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help prefix .IP \(bu 2 npm help root .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE �������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-bugs.1�������������������������������������000644 �000766 �000024 �00000002157 12455173731 024130� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BUGS" "1" "January 2015" "" "" .SH "NAME" \fBnpm-bugs\fR \- Bugs for a package in a web browser maybe .SH SYNOPSIS .P .RS 2 .nf npm bugs <pkgname> npm bugs (with no args in a package dir) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the \fB\-\-browser\fR config param\. If no package name is provided, it will search for a \fBpackage\.json\fR in the current folder and use the \fBname\fR property\. .SH CONFIGURATION .SS browser .RS 0 .IP \(bu 2 Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR .IP \(bu 2 Type: String .RE .P The browser that is called by the \fBnpm bugs\fR command to open websites\. .SS registry .RS 0 .IP \(bu 2 Default: https://registry\.npmjs\.org/ .IP \(bu 2 Type: url .RE .P The base URL of the npm package registry\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help docs .IP \(bu 2 npm help view .IP \(bu 2 npm help publish .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 5 package\.json .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-build.1������������������������������������000644 �000766 �000024 �00000001045 12455173731 024262� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BUILD" "1" "January 2015" "" "" .SH "NAME" \fBnpm-build\fR \- Build a package .SH SYNOPSIS .P .RS 2 .nf npm build <package\-folder> .fi .RE .RS 0 .IP \(bu 2 \fB<package\-folder>\fR: A folder containing a \fBpackage\.json\fR file in its root\. .RE .SH DESCRIPTION .P This is the plumbing command called by \fBnpm link\fR and \fBnpm install\fR\|\. .P It should generally not be called directly\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help install .IP \(bu 2 npm help link .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help 5 package\.json .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-bundle.1�����������������������������������000644 �000766 �000024 �00000000634 12455173731 024437� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-BUNDLE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-bundle\fR \- REMOVED .SH DESCRIPTION .P The \fBnpm bundle\fR command has been removed in 1\.0, for the simple reason that it is no longer necessary, as the default behavior is now to install packages into the local space\. .P Just use \fBnpm install\fR now to do what \fBnpm bundle\fR used to do\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help install .RE iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-cache.1000644 000766 000024 00000004227 12455173731 024233 0ustar00iojsstaff000000 000000 .TH "NPM\-CACHE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-cache\fR \- Manipulates packages cache .SH SYNOPSIS .P .RS 2 .nf npm cache add <tarball file> npm cache add <folder> npm cache add <tarball url> npm cache add <name>@<version> npm cache ls [<path>] npm cache clean [<path>] .fi .RE .SH DESCRIPTION .P Used to add, list, or clear the npm cache folder\. .RS 0 .IP \(bu 2 add: Add the specified package to the local cache\. This command is primarily intended to be used internally by npm, but it can provide a way to add data to the local installation cache explicitly\. .IP \(bu 2 ls: Show the data in the cache\. Argument is a path to show in the cache folder\. Works a bit like the \fBfind\fR program, but limited by the \fBdepth\fR config\. .IP \(bu 2 clean: Delete data out of the cache folder\. If an argument is provided, then it specifies a subpath to delete\. If no argument is provided, then the entire cache is cleared\. .RE .SH DETAILS .P npm stores cache data in the directory specified in \fBnpm config get cache\fR\|\. For each package that is added to the cache, the following pieces of information are stored in \fB{cache}/{name}/{version}\fR: .RS 0 .IP \(bu 2 \|\.\.\./package/package\.json: The package\.json file, as npm sees it\. .IP \(bu 2 \|\.\.\./package\.tgz: The tarball for that version\. .RE .P Additionally, whenever a registry request is made, a \fB\|\.cache\.json\fR file is placed at the corresponding URI, to store the ETag and the requested data\. This is stored in \fB{cache}/{hostname}/{path}/\.cache\.json\fR\|\. .P Commands that make non\-essential registry requests (such as \fBsearch\fR and \fBview\fR, or the completion scripts) generally specify a minimum timeout\. If the \fB\|\.cache\.json\fR file is younger than the specified timeout, then they do not make an HTTP request to the registry\. .SH CONFIGURATION .SS cache .P Default: \fB~/\.npm\fR on Posix, or \fB%AppData%/npm\-cache\fR on Windows\. .P The root cache folder\.
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help install .IP \(bu 2 npm help publish .IP \(bu 2 npm help pack .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-completion.1�������������������������������000644 �000766 �000024 �00000001542 12455173731 025336� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-COMPLETION" "1" "January 2015" "" "" .SH "NAME" \fBnpm-completion\fR \- Tab Completion for npm .SH SYNOPSIS .P .RS 2 .nf \|\. <(npm completion) .fi .RE .SH DESCRIPTION .P Enables tab\-completion in all npm commands\. .P The synopsis above loads the completions into your current shell\. Adding it to your ~/\.bashrc or ~/\.zshrc will make the completions available everywhere\. .P You may of course also pipe the output of npm completion to a file such as \fB/usr/local/etc/bash_completion\.d/npm\fR if you have a system that will read that file for you\. .P When \fBCOMP_CWORD\fR, \fBCOMP_LINE\fR, and \fBCOMP_POINT\fR are defined in the environment, \fBnpm completion\fR acts in "plumbing mode", and outputs completions based on the arguments\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help npm .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-config.1�����������������������������������000644 �000766 �000024 �00000003050 12455173731 024426� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-CONFIG" "1" "January 2015" "" "" .SH "NAME" \fBnpm-config\fR \- Manage the npm configuration files .SH SYNOPSIS .P .RS 2 .nf npm config set <key> <value> [\-\-global] npm config get <key> npm config delete <key> npm config list npm config edit npm c [set|get|delete|list] npm get <key> npm set <key> <value> [\-\-global] .fi .RE .SH DESCRIPTION .P npm gets its config settings from the command line, environment variables, \fBnpmrc\fR files, and in some cases, the \fBpackage\.json\fR file\. .P See npm help 5 npmrc for more information about the npmrc files\. .P See npm help 7 \fBnpm\-config\fR for a more thorough discussion of the mechanisms involved\. .P The \fBnpm config\fR command can be used to update and edit the contents of the user and global npmrc files\. 
.SH Sub\-commands .P Config supports the following sub\-commands: .SS set .P .RS 2 .nf npm config set key value .fi .RE .P Sets the config key to the value\. .P If value is omitted, then it sets it to "true"\. .SS get .P .RS 2 .nf npm config get key .fi .RE .P Echo the config value to stdout\. .SS list .P .RS 2 .nf npm config list .fi .RE .P Show all the config settings\. .SS delete .P .RS 2 .nf npm config delete key .fi .RE .P Deletes the key from all configuration files\. .SS edit .P .RS 2 .nf npm config edit .fi .RE .P Opens the config file in an editor\. Use the \fB\-\-global\fR flag to edit the global config\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help npm .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-dedupe.1�����������������������������������000644 �000766 �000024 �00000003256 12455173731 024437� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DEDUPE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-dedupe\fR \- Reduce duplication .SH SYNOPSIS .P .RS 2 .nf npm dedupe [package names\.\.\.] npm ddp [package names\.\.\.] .fi .RE .SH DESCRIPTION .P Searches the local package tree and attempts to simplify the overall structure by moving dependencies further up the tree, where they can be more effectively shared by multiple dependent packages\. .P For example, consider this dependency graph: .P .RS 2 .nf a +\-\- b <\-\- depends on c@1\.0\.x | `\-\- c@1\.0\.3 `\-\- d <\-\- depends on c@~1\.0\.9 `\-\- c@1\.0\.10 .fi .RE .P In this case, npm help \fBnpm\-dedupe\fR will transform the tree to: .P .RS 2 .nf a +\-\- b +\-\- d `\-\- c@1\.0\.10 .fi .RE .P Because of the hierarchical nature of node's module lookup, b and d will both get their dependency met by the single c package at the root level of the tree\. .P If a suitable version exists at the target location in the tree already, then it will be left untouched, but the other duplicates will be deleted\. .P If no suitable version can be found, then a warning is printed, and nothing is done\. .P If any arguments are supplied, then they are filters, and only the named packages will be touched\. .P Note that this operation transforms the dependency tree, and may result in packages getting updated versions, perhaps from the npm registry\. .P This feature is experimental, and may change in future versions\. .P The \fB\-\-tag\fR argument will apply to all of the affected dependencies\. If a tag with the given name exists, the tagged version is preferred over newer versions\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help ls .IP \(bu 2 npm help update .IP \(bu 2 npm help install .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-deprecate.1��������������������������������000644 �000766 �000024 �00000001474 12455173731 025125� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DEPRECATE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-deprecate\fR \- Deprecate a version of a package .SH SYNOPSIS .P .RS 2 .nf npm deprecate <name>[@<version>] <message> .fi .RE .SH DESCRIPTION .P This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it\. .P It works on version ranges as well as specific versions, so you can do something like this: .P .RS 2 .nf npm deprecate my\-thing@"< 0\.2\.3" "critical bug fixed in v0\.2\.3" .fi .RE .P Note that you must be the package owner to deprecate something\. See the \fBowner\fR and \fBadduser\fR help topics\. .P To un\-deprecate a package, specify an empty string (\fB""\fR) for the \fBmessage\fR argument\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help publish .IP \(bu 2 npm help 7 registry .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-docs.1�������������������������������������000644 �000766 �000024 �00000002354 12455173731 024117� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-DOCS" "1" "January 2015" "" "" .SH "NAME" \fBnpm-docs\fR \- Docs for a package in a web browser maybe .SH SYNOPSIS .P .RS 2 .nf npm docs [<pkgname> [<pkgname> \.\.\.]] npm docs (with no args in a package dir) npm home [<pkgname> [<pkgname> \.\.\.]] npm home (with no args in a package dir) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the \fB\-\-browser\fR config param\. You can pass multiple package names at once\. If no package name is provided, it will search for a \fBpackage\.json\fR in the current folder and use the \fBname\fR property\. .SH CONFIGURATION .SS browser .RS 0 .IP \(bu 2 Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR .IP \(bu 2 Type: String .RE .P The browser that is called by the \fBnpm docs\fR command to open websites\. .SS registry .RS 0 .IP \(bu 2 Default: https://registry\.npmjs\.org/ .IP \(bu 2 Type: url .RE .P The base URL of the npm package registry\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help view .IP \(bu 2 npm help publish .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 5 package\.json .RE ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-edit.1�������������������������������������000644 �000766 �000024 �00000002041 12455173731 024105� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-EDIT" "1" "January 2015" "" "" .SH "NAME" \fBnpm-edit\fR \- Edit an installed package .SH SYNOPSIS .P .RS 2 .nf npm edit <name>[@<version>] .fi .RE .SH DESCRIPTION .P Opens the package folder in the default editor (or whatever you've configured as the npm \fBeditor\fR config \-\- see npm help 7 \fBnpm\-config\fR\|\.) .P After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages\. .P For instance, you can do \fBnpm install connect\fR to install connect into your package, and then \fBnpm edit connect\fR to make a few changes to your locally installed copy\. .SH CONFIGURATION .SS editor .RS 0 .IP \(bu 2 Default: \fBEDITOR\fR environment variable if set, or \fB"vi"\fR on Posix, or \fB"notepad"\fR on Windows\. .IP \(bu 2 Type: path .RE .P The command to run for \fBnpm edit\fR or \fBnpm config edit\fR\|\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help explore .IP \(bu 2 npm help install .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-explore.1����������������������������������000644 �000766 �000024 �00000002016 12455173731 024640� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-EXPLORE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-explore\fR \- Browse an installed package .SH SYNOPSIS .P .RS 2 .nf npm explore <name> [ \-\- <cmd>] .fi .RE .SH DESCRIPTION .P Spawn a subshell in the directory of the installed package specified\. .P If a command is specified, then it is run in the subshell, which then immediately terminates\. 
.P This is particularly handy in the case of git submodules in the \fBnode_modules\fR folder: .P .RS 2 .nf npm explore some\-dependency \-\- git pull origin master .fi .RE .P Note that the package is \fInot\fR automatically rebuilt afterwards, so be sure to use \fBnpm rebuild <pkg>\fR if you make any changes\. .SH CONFIGURATION .SS shell .RS 0 .IP \(bu 2 Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows .IP \(bu 2 Type: path .RE .P The shell to run for the \fBnpm explore\fR command\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help edit .IP \(bu 2 npm help rebuild .IP \(bu 2 npm help build .IP \(bu 2 npm help install .RE ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-help-search.1������������������������������000644 �000766 �000024 �00000001704 12455173731 025360� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-HELP\-SEARCH" "1" "January 2015" "" "" .SH "NAME" \fBnpm-help-search\fR \- Search npm help documentation .SH SYNOPSIS .P .RS 2 .nf npm help\-search some search terms .fi .RE .SH DESCRIPTION .P This command will search the npm markdown documentation files for the terms provided, and then list the results, sorted by relevance\. .P If only one result is found, then it will show that help topic\. .P If the argument to \fBnpm help\fR is not a known help topic, then it will call \fBhelp\-search\fR\|\. It is rarely if ever necessary to call this command directly\. .SH CONFIGURATION .SS long .RS 0 .IP \(bu 2 Type: Boolean .IP \(bu 2 Default false .RE .P If true, the "long" flag will cause help\-search to output context around where the terms were found in the documentation\. .P If false, then help\-search will just list out the help topics found\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help npm .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help help .RE ������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-help.1�������������������������������������000644 �000766 �000024 �00000002100 12455173731 024104� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-HELP" "1" "January 2015" "" "" .SH "NAME" \fBnpm-help\fR \- Get help on npm .SH SYNOPSIS .P .RS 2 .nf npm help <topic> npm help some search terms .fi .RE .SH DESCRIPTION .P If supplied a topic, then show the appropriate documentation page\. 
.P If the topic does not exist, or if multiple terms are provided, then run the \fBhelp\-search\fR command to find a match\. Note that, if \fBhelp\-search\fR finds a single subject, then it will run \fBhelp\fR on that topic, so unique matches are equivalent to specifying a topic name\. .SH CONFIGURATION .SS viewer .RS 0 .IP \(bu 2 Default: "man" on Posix, "browser" on Windows .IP \(bu 2 Type: path .RE .P The program to use to view help content\. .P Set to \fB"browser"\fR to view html help content in the default web browser\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help npm .IP \(bu 2 README .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help help\-search .IP \(bu 2 npm help 7 index .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-init.1�������������������������������������000644 �000766 �000024 �00000001676 12455173731 024140� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-INIT" "1" "January 2015" "" "" .SH "NAME" \fBnpm-init\fR \- Interactively create a package\.json file .SH SYNOPSIS .P .RS 2 .nf npm init [\-f|\-\-force|\-y|\-\-yes] .fi .RE .SH DESCRIPTION .P This will ask you a bunch of questions, and then write a package\.json for you\. .P It attempts to make reasonable guesses about what you want things to be set to, and then writes a package\.json file with the options you've selected\. .P If you already have a package\.json file, it'll read that first, and default to the options in there\. .P It is strictly additive, so it does not delete options from your package\.json without a really good reason to do so\. .P If you invoke it with \fB\-f\fR, \fB\-\-force\fR, \fB\-y\fR, or \fB\-\-yes\fR, it will use only defaults and not prompt you for any options\. 
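.P For example, either of the following accepts every default without prompting (the exact package\.json produced will depend on your directory contents and npm configuration): .P .RS 2 .nf npm init \-\-yes npm init \-f .fi .RE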
.SH SEE ALSO .RS 0 .IP \(bu 2 https://github\.com/isaacs/init\-package\-json .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help version .RE ������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-install.1����������������������������������000644 �000766 �000024 �00000024024 12455173731 024633� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-INSTALL" "1" "January 2015" "" "" .SH "NAME" \fBnpm-install\fR \- Install a package .SH SYNOPSIS .P .RS 2 .nf npm install (with no args in a package dir) npm install <tarball file> npm install <tarball url> npm install <folder> npm install [@<scope>/]<name> [\-\-save|\-\-save\-dev|\-\-save\-optional] [\-\-save\-exact] npm install [@<scope>/]<name>@<tag> npm install [@<scope>/]<name>@<version> npm install [@<scope>/]<name>@<version range> npm i (with any of the previous argument usage) .fi .RE .SH DESCRIPTION .P This command installs a package, and any packages that it depends on\. If the package has a shrinkwrap file, the installation of dependencies will be driven by that\. See npm help shrinkwrap\. .P A \fBpackage\fR is: .RS 0 .IP \(bu 2 a) a folder containing a program described by a package\.json file .IP \(bu 2 b) a gzipped tarball containing (a) .IP \(bu 2 c) a url that resolves to (b) .IP \(bu 2 d) a \fB<name>@<version>\fR that is published on the registry (see npm help 7 \fBnpm\-registry\fR) with (c) .IP \(bu 2 e) a \fB<name>@<tag>\fR that points to (d) .IP \(bu 2 f) a \fB<name>\fR that has a "latest" tag satisfying (e) .IP \(bu 2 g) a \fB<git remote url>\fR that resolves to (b) .RE .P Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b)\. .RS 0 .IP \(bu 2 \fBnpm install\fR (in package directory, no arguments): Install the dependencies in the local node_modules folder\. In global mode (ie, with \fB\-g\fR or \fB\-\-global\fR appended to the command), it installs the current package context (ie, the current working directory) as a global package\. By default, \fBnpm install\fR will install all modules listed as dependencies\. With the \fB\-\-production\fR flag, npm will not install modules listed in \fBdevDependencies\fR\|\. .IP \(bu 2 \fBnpm install <folder>\fR: Install a package that is sitting in a folder on the filesystem\. .IP \(bu 2 \fBnpm install <tarball file>\fR: Install a package that is sitting on the filesystem\. Note: if you just want to link a dev directory into your npm root, you can do this more easily by using \fBnpm link\fR\|\. Example: .P .RS 2 .nf npm install \./package\.tgz .fi .RE .IP \(bu 2 \fBnpm install <tarball url>\fR: Fetch the tarball url, and then install it\. In order to distinguish between this and other options, the argument must start with "http://" or "https://" Example: .P .RS 2 .nf npm install https://github\.com/indexzero/forever/tarball/v0\.5\.6 .fi .RE .IP \(bu 2 \fBnpm install [@<scope>/]<name> [\-\-save|\-\-save\-dev|\-\-save\-optional]\fR: Do a \fB<name>@<tag>\fR install, where \fB<tag>\fR is the "tag" config\. 
(See npm help 7 \fBnpm\-config\fR\|\.) In most cases, this will install the latest version of the module published on npm\. Example: .P .RS 2 .nf npm install sax .fi .RE \fBnpm install\fR takes 3 exclusive, optional flags which save or update the package version in your main package\.json: .RS 0 .IP \(bu 2 \fB\-\-save\fR: Package will appear in your \fBdependencies\fR\|\. .IP \(bu 2 \fB\-\-save\-dev\fR: Package will appear in your \fBdevDependencies\fR\|\. .IP \(bu 2 \fB\-\-save\-optional\fR: Package will appear in your \fBoptionalDependencies\fR\|\. When using any of the above options to save dependencies to your package\.json, there is an additional, optional flag: .IP \(bu 2 \fB\-\-save\-exact\fR: Saved dependencies will be configured with an exact version rather than using npm's default semver range operator\. \fB<scope>\fR is optional\. The package will be downloaded from the registry associated with the specified scope\. If no registry is associated with the given scope, the default registry is assumed\. See npm help 7 \fBnpm\-scope\fR\|\. Note: if you do not include the @\-symbol on your scope name, npm will interpret this as a GitHub repository instead; see below\. Scope names must also be followed by a slash\. Examples: .P .RS 2 .nf npm install sax \-\-save npm install githubname/reponame npm install @myorg/privatepackage npm install node\-tap \-\-save\-dev npm install dtrace\-provider \-\-save\-optional npm install readable\-stream \-\-save \-\-save\-exact .fi .RE .RE .RE .P \fBNote\fR: If there is a file or folder named \fB<name>\fR in the current working directory, then npm will try to install that, and only try to fetch the package by name if it is not valid\. .RS 0 .IP \(bu 2 \fBnpm install [@<scope>/]<name>@<tag>\fR: Install the version of the package that is referenced by the specified tag\. If the tag does not exist in the registry data for that package, then this will fail\. Example: .P .RS 2 .nf npm install sax@latest npm install @myorg/mypackage@latest .fi .RE .IP \(bu 2 \fBnpm install [@<scope>/]<name>@<version>\fR: Install the specified version of the package\. This will fail if the version has not been published to the registry\. Example: .P .RS 2 .nf npm install sax@0\.1\.1 npm install @myorg/privatepackage@1\.5\.0 .fi .RE .IP \(bu 2 \fBnpm install [@<scope>/]<name>@<version range>\fR: Install a version of the package matching the specified version range\. This will follow the same rules for resolving dependencies described in npm help 5 \fBpackage\.json\fR\|\. Note that most version ranges must be put in quotes so that your shell will treat them as a single argument\. Example: .P .RS 2 .nf npm install sax@">=0\.1\.0 <0\.2\.0" npm install @myorg/privatepackage@">=0\.1\.0 <0\.2\.0" .fi .RE .IP \(bu 2 \fBnpm install <githubname>/<githubrepo>\fR: Install the package at \fBhttps://github\.com/githubname/githubrepo\fR by attempting to clone it using \fBgit\fR\. Example: .P .RS 2 .nf npm install mygithubuser/myproject .fi .RE To reference a package in a git repo that is not on GitHub, see git remote urls below\. .IP \(bu 2 \fBnpm install <git remote url>\fR: Install a package by cloning a git remote url\. The format of the git url is: .P .RS 2 .nf <protocol>://[<user>@]<hostname><separator><path>[#<commit\-ish>] .fi .RE \fB<protocol>\fR is one of \fBgit\fR, \fBgit+ssh\fR, \fBgit+http\fR, or \fBgit+https\fR\|\. If no \fB<commit\-ish>\fR is specified, then \fBmaster\fR is used\.
Examples: .P .RS 2 .nf git+ssh://git@github\.com:npm/npm\.git#v1\.0\.27 git+https://isaacs@github\.com/npm/npm\.git git://github\.com/npm/npm\.git#v1\.0\.27 .fi .RE .RE .P You may combine multiple arguments, and even multiple types of arguments\. For example: .P .RS 2 .nf npm install sax@">=0\.1\.0 <0\.2\.0" bench supervisor .fi .RE .P The \fB\-\-tag\fR argument will apply to all of the specified install targets\. If a tag with the given name exists, the tagged version is preferred over newer versions\. .P The \fB\-\-force\fR argument will force npm to fetch remote resources even if a local copy exists on disk\. .P .RS 2 .nf npm install sax \-\-force .fi .RE .P The \fB\-\-global\fR argument will cause npm to install the package globally rather than locally\. See npm help 5 \fBnpm\-folders\fR\|\. .P The \fB\-\-link\fR argument will cause npm to link global installs into the local space in some cases\. .P The \fB\-\-no\-bin\-links\fR argument will prevent npm from creating symlinks for any binaries the package might contain\. .P The \fB\-\-no\-optional\fR argument will prevent optional dependencies from being installed\. .P The \fB\-\-no\-shrinkwrap\fR argument, which will ignore an available shrinkwrap file and use the package\.json instead\. .P The \fB\-\-nodedir=/path/to/node/source\fR argument will allow npm to find the node source code so that npm can compile native modules\. .P See npm help 7 \fBnpm\-config\fR\|\. Many of the configuration params have some effect on installation, since that's most of what npm does\. .SH ALGORITHM .P To install a package, npm uses the following algorithm: .P .RS 2 .nf install(where, what, family, ancestors) fetch what, unpack to <where>/node_modules/<what> for each dep in what\.dependencies resolve dep to precise version for each dep@version in what\.dependencies not in <where>/node_modules/<what>/node_modules/* and not in <family> add precise version deps to <family> install(<where>/node_modules/<what>, dep, family) .fi .RE .P For this \fBpackage{dep}\fR structure: \fBA{B,C}, B{C}, C{D}\fR, this algorithm produces: .P .RS 2 .nf A +\-\- B `\-\- C `\-\- D .fi .RE .P That is, the dependency from B to C is satisfied by the fact that A already caused C to be installed at a higher level\. .P See npm help 5 folders for a more detailed description of the specific folder structures that npm creates\. .SS Limitations of npm's Install Algorithm .P There are some very rare and pathological edge\-cases where a cycle can cause npm to try to install a never\-ending tree of packages\. Here is the simplest case: .P .RS 2 .nf A \-> B \-> A' \-> B' \-> A \-> B \-> A' \-> B' \-> A \-> \.\.\. .fi .RE .P where \fBA\fR is some version of a package, and \fBA'\fR is a different version of the same package\. Because \fBB\fR depends on a different version of \fBA\fR than the one that is already in the tree, it must install a separate copy\. The same is true of \fBA'\fR, which must install \fBB'\fR\|\. Because \fBB'\fR depends on the original version of \fBA\fR, which has been overridden, the cycle falls into infinite regress\. .P To avoid this situation, npm flat\-out refuses to install any \fBname@version\fR that is already present anywhere in the tree of package folder ancestors\. A more correct, but more complex, solution would be to symlink the existing version into the new location\. If this ever affects a real use\-case, it will be investigated\. 
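.P As a practical check on what the algorithm described above actually produced on disk, you can print the resulting tree after an install (the package name below is only a placeholder; see npm help ls for the flag used): .P .RS 2 .nf npm install some\-package npm ls \-\-depth=1 .fi .RE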
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help update .IP \(bu 2 npm help link .IP \(bu 2 npm help rebuild .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help build .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help tag .IP \(bu 2 npm help rm .IP \(bu 2 npm help shrinkwrap .RE ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-link.1�������������������������������������000644 �000766 �000024 �00000004746 12455173731 024133� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-LINK" "1" "January 2015" "" "" .SH "NAME" \fBnpm-link\fR \- Symlink a package folder .SH SYNOPSIS .P .RS 2 .nf npm link (in package folder) npm link [@<scope>/]<pkgname> npm ln (with any of the previous argument usage) .fi .RE .SH DESCRIPTION .P Package linking is a two\-step process\. .P First, \fBnpm link\fR in a package folder will create a globally\-installed symbolic link from \fBprefix/package\-name\fR to the current folder (see npm help 7 \fBnpm\-config\fR for the value of \fBprefix\fR)\. .P Next, in some other location, \fBnpm link package\-name\fR will create a symlink from the local \fBnode_modules\fR folder to the global symlink\. .P Note that \fBpackage\-name\fR is taken from \fBpackage\.json\fR, not from directory name\. .P The package name can be optionally prefixed with a scope\. See npm help 7 \fBnpm\-scope\fR\|\. The scope must be preceded by an @\-symbol and followed by a slash\. .P When creating tarballs for \fBnpm publish\fR, the linked packages are "snapshotted" to their current state by resolving the symbolic links\. .P This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild\. .P For example: .P .RS 2 .nf cd ~/projects/node\-redis # go into the package directory npm link # creates global link cd ~/projects/node\-bloggy # go into some other package directory\. npm link redis # link\-install the package .fi .RE .P Now, any changes to ~/projects/node\-redis will be reflected in ~/projects/node\-bloggy/node_modules/redis/ .P You may also shortcut the two steps in one\. For example, to do the above use\-case in a shorter way: .P .RS 2 .nf cd ~/projects/node\-bloggy # go into the dir of your main project npm link \.\./node\-redis # link the dir of your dependency .fi .RE .P The second line is the equivalent of doing: .P .RS 2 .nf (cd \.\./node\-redis; npm link) npm link redis .fi .RE .P That is, it first creates a global link, and then links the global installation target into your project's \fBnode_modules\fR folder\. 
.P If your linked package is scoped (see npm help 7 \fBnpm\-scope\fR) your link command must include that scope, e\.g\. .P .RS 2 .nf npm link @myorg/privatepackage .fi .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 developers .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help install .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE ��������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-ls.1���������������������������������������000644 �000766 �000024 �00000003741 12455173731 023606� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-LS" "1" "January 2015" "" "" .SH "NAME" \fBnpm-ls\fR \- List installed packages .SH SYNOPSIS .P .RS 2 .nf npm list [[@<scope>/]<pkg> \.\.\.] npm ls [[@<scope>/]<pkg> \.\.\.] npm la [[@<scope>/]<pkg> \.\.\.] npm ll [[@<scope>/]<pkg> \.\.\.] .fi .RE .SH DESCRIPTION .P This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree\-structure\. .P Positional arguments are \fBname@version\-range\fR identifiers, which will limit the results to only the paths to the packages named\. Note that nested packages will \fIalso\fR show the paths to the specified packages\. For example, running \fBnpm ls promzard\fR in npm's source tree will show: .P .RS 2 .nf npm@2.1.18 /path/to/npm └─┬ init\-package\-json@0\.0\.4 └── promzard@0\.1\.5 .fi .RE .P It will print out extraneous, missing, and invalid packages\. .P If a project specifies git urls for dependencies these are shown in parentheses after the name@version to make it easier for users to recognize potential forks of a project\. .P When run as \fBll\fR or \fBla\fR, it shows extended information by default\. .SH CONFIGURATION .SS json .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show information in JSON format\. .SS long .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show extended information\. .SS parseable .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show parseable output instead of tree view\. .SS global .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P List packages in the global install prefix instead of in the current project\. .SS depth .RS 0 .IP \(bu 2 Type: Int .RE .P Max display depth of the dependency tree\. 
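.P The configuration options above can be combined\. For example, to list only the top level of the global tree in machine\-readable form, or to dump the full local tree as JSON with extended information: .P .RS 2 .nf npm ls \-\-global \-\-parseable \-\-depth=0 npm ls \-\-json \-\-long .fi .RE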
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help install .IP \(bu 2 npm help link .IP \(bu 2 npm help prune .IP \(bu 2 npm help outdated .IP \(bu 2 npm help update .RE
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-outdated.1
.TH "NPM\-OUTDATED" "1" "January 2015" "" "" .SH "NAME" \fBnpm-outdated\fR \- Check for outdated packages .SH SYNOPSIS .P .RS 2 .nf npm outdated [<name> [<name> \.\.\.]] .fi .RE .SH DESCRIPTION .P This command will check the registry to see if any (or specific) installed packages are currently outdated\. .P The 'wanted' field in the output shows the latest version that satisfies the version range specified in package\.json, while the 'latest' field shows the very latest version of the package\. .SH CONFIGURATION .SS json .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show information in JSON format\. .SS long .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show extended information\. .SS parseable .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Show parseable output instead of tree view\. .SS global .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Check packages in the global install prefix instead of in the current project\. .SS depth .RS 0 .IP \(bu 2 Type: Int .RE .P Max depth for checking the dependency tree\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help update .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help 5 folders .RE
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-owner.1
.TH "NPM\-OWNER" "1" "January 2015" "" "" .SH "NAME" \fBnpm-owner\fR \- Manage package owners .SH SYNOPSIS .P .RS 2 .nf npm owner ls <package name> npm owner add <user> <package name> npm owner rm <user> <package name> .fi .RE .SH DESCRIPTION .P Manage ownership of published packages\. .RS 0 .IP \(bu 2 ls: List all the users who have access to modify a package and push new versions\. Handy when you need to know who to bug for help\. .IP \(bu 2 add: Add a new user as a maintainer of a package\. This user is enabled to modify metadata, publish new versions, and add other owners\. .IP \(bu 2 rm: Remove a user from the package owner list\. This immediately revokes their privileges\.
.RE .P Note that there is only one level of access\. Either you can modify a package, or you can't\. Future versions may contain more fine\-grained access levels, but that is not implemented at this time\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help publish .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help adduser .IP \(bu 2 npm help 7 disputes .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-pack.1�������������������������������������000644 �000766 �000024 �00000001471 12455173731 024104� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-PACK" "1" "January 2015" "" "" .SH "NAME" \fBnpm-pack\fR \- Create a tarball from a package .SH SYNOPSIS .P .RS 2 .nf npm pack [<pkg> [<pkg> \.\.\.]] .fi .RE .SH DESCRIPTION .P For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as \fB<name>\-<version>\.tgz\fR, and then write the filenames out to stdout\. .P If the same package is specified multiple times, then the file will be overwritten the second time\. .P If no arguments are supplied, then npm packs the current package folder\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help cache .IP \(bu 2 npm help publish .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-prefix.1�����������������������������������000644 �000766 �000024 �00000001155 12455173731 024462� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-PREFIX" "1" "January 2015" "" "" .SH "NAME" \fBnpm-prefix\fR \- Display prefix .SH SYNOPSIS .P .RS 2 .nf npm prefix [\-g] .fi .RE .SH DESCRIPTION .P Print the local prefix to standard out\. This is the closest parent directory to contain a package\.json file unless \fB\-g\fR is also specified\. .P If \fB\-g\fR is specified, this will be the value of the global prefix\. See npm help 7 \fBnpm\-config\fR for more detail\. 
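.P For example, the following print the local prefix and the global prefix respectively (the values depend on where the command is run and how npm was installed): .P .RS 2 .nf npm prefix npm prefix \-g .fi .RE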
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help root .IP \(bu 2 npm help bin .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-prune.1
.TH "NPM\-PRUNE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-prune\fR \- Remove extraneous packages .SH SYNOPSIS .P .RS 2 .nf npm prune [<name> [<name> \.\.\.]] npm prune [<name> [<name> \.\.\.]] [\-\-production] .fi .RE .SH DESCRIPTION .P This command removes "extraneous" packages\. If a package name is provided, then only packages matching one of the supplied names are removed\. .P Extraneous packages are packages that are not listed in the parent package's dependencies list\. .P If the \fB\-\-production\fR flag is specified, this command will also remove the packages specified in your \fBdevDependencies\fR\|\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help rm .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help ls .RE
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-publish.1
.TH "NPM\-PUBLISH" "1" "January 2015" "" "" .SH "NAME" \fBnpm-publish\fR \- Publish a package .SH SYNOPSIS .P .RS 2 .nf npm publish <tarball> [\-\-tag <tag>] npm publish <folder> [\-\-tag <tag>] .fi .RE .SH DESCRIPTION .P Publishes a package to the registry so that it can be installed by name\. See npm help 7 \fBnpm\-developers\fR for details on what's included in the published package, as well as details on how the package is built\. .P By default, npm will publish to the public registry\. This can be overridden by specifying a different default registry or by using a scope in the name (see npm help 7 \fBnpm\-scope\fR and npm help 5 \fBpackage\.json\fR)\. .RS 0 .IP \(bu 2 \fB<folder>\fR: A folder containing a package\.json file .IP \(bu 2 \fB<tarball>\fR: A url or file path to a gzipped tar archive containing a single folder with a package\.json file inside\.
.IP \(bu 2 \fB[\-\-tag <tag>]\fR Registers the published package with the given tag, such that \fBnpm install <name>@<tag>\fR will install this version\. By default, \fBnpm publish\fR updates and \fBnpm install\fR installs the \fBlatest\fR tag\. .RE .P Fails if the package name and version combination already exists in the specified registry\. .P Once a package is published with a given name and version, that specific name and version combination can never be used again, even if it is removed with npm help unpublish\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help adduser .IP \(bu 2 npm help owner .IP \(bu 2 npm help deprecate .IP \(bu 2 npm help tag .RE �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-README.1�����������������������������������000644 �000766 �000024 �00000017151 12455173731 024125� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM" "1" "January 2015" "" "" .SH "NAME" \fBnpm\fR \- a JavaScript package manager .P Build Status \fIhttps://img\.shields\.io/travis/npm/npm/master\.svg\fR \fIhttps://travis\-ci\.org/npm/npm\fR .SH SYNOPSIS .P This is just enough info to get you up and running\. .P Much more info available via \fBnpm help\fR once it's installed\. .SH IMPORTANT .P \fBYou need node v0\.8 or higher to run this program\.\fR .P To install an old \fBand unsupported\fR version of npm that works on node 0\.3 and prior, clone the git repo and dig through the old tags and branches\. .SH Super Easy Install .P npm comes with node \fIhttp://nodejs\.org/download/\fR now\. .SS Windows Computers .P Get the MSI \fIhttp://nodejs\.org/download/\fR\|\. npm is in it\. .SS Apple Macintosh Computers .P Get the pkg \fIhttp://nodejs\.org/download/\fR\|\. npm is in it\. .SS Other Sorts of Unices .P Run \fBmake install\fR\|\. npm will be installed with node\. .P If you want a more fancy pants install (a different version, customized paths, etc\.) then read on\. .SH Fancy Install (Unix) .P There's a pretty robust install script at https://www\.npmjs\.com/install\.sh\|\. You can download that and run it\. .P Here's an example using curl: .P .RS 2 .nf curl \-L https://npmjs\.com/install\.sh | sh .fi .RE .SS Slightly Fancier .P You can set any npm configuration params with that script: .P .RS 2 .nf npm_config_prefix=/some/path sh install\.sh .fi .RE .P Or, you can run it in uber\-debuggery mode: .P .RS 2 .nf npm_debug=1 sh install\.sh .fi .RE .SS Even Fancier .P Get the code with git\. Use \fBmake\fR to build the docs and do other stuff\. If you plan on hacking on npm, \fBmake link\fR is your friend\. 
.P If you've got the npm source code, you can also semi\-permanently set arbitrary config keys using the \fB\|\./configure \-\-key=val \.\.\.\fR, and then run npm commands by doing \fBnode cli\.js <cmd> <args>\fR\|\. (This is helpful for testing, or running stuff without actually installing npm itself\.) .SH Windows Install or Upgrade .P You can download a zip file from https://github\.com/npm/npm/releases, and unpack it in the same folder where node\.exe lives\. .P The latest version in a zip file is 1\.4\.12\. To upgrade to npm 2, follow the Windows upgrade instructions in the npm Troubleshooting Guide: .P https://github\.com/npm/npm/wiki/Troubleshooting#upgrading\-on\-windows .P If that's not fancy enough for you, then you can fetch the code with git, and mess with it directly\. .SH Installing on Cygwin .P No\. .SH Uninstalling .P So sad to see you go\. .P .RS 2 .nf sudo npm uninstall npm \-g .fi .RE .P Or, if that fails, .P .RS 2 .nf sudo make uninstall .fi .RE .SH More Severe Uninstalling .P Usually, the above instructions are sufficient\. That will remove npm, but leave behind anything you've installed\. .P If you would like to remove all the packages that you have installed, then you can use the \fBnpm ls\fR command to find them, and then \fBnpm rm\fR to remove them\. .P To remove cruft left behind by npm 0\.x, you can use the included \fBclean\-old\.sh\fR script file\. You can run it conveniently like this: .P .RS 2 .nf npm explore npm \-g \-\- sh scripts/clean\-old\.sh .fi .RE .P npm uses two configuration files, one for per\-user configs, and another for global (every\-user) configs\. You can view them by doing: .P .RS 2 .nf npm config get userconfig # defaults to ~/\.npmrc npm config get globalconfig # defaults to /usr/local/etc/npmrc .fi .RE .P Uninstalling npm does not remove configuration files by default\. You must remove them yourself manually if you want them gone\. Note that this means that future npm installs will not remember the settings that you have chosen\. .SH Using npm Programmatically .P If you would like to use npm programmatically, you can do that\. It's not very well documented, but it \fIis\fR rather simple\. .P Most of the time, unless you actually want to do all the things that npm does, you should try using one of npm's dependencies rather than using npm itself, if possible\. .P Eventually, npm will be just a thin cli wrapper around the modules that it depends on, but for now, there are some things that you must use npm itself to do\. .P .RS 2 .nf var npm = require("npm") npm\.load(myConfigObject, function (er) { if (er) return handlError(er) npm\.commands\.install(["some", "args"], function (er, data) { if (er) return commandFailed(er) // command succeeded, and data might have some info }) npm\.registry\.log\.on("log", function (message) { \.\.\.\. }) }) .fi .RE .P The \fBload\fR function takes an object hash of the command\-line configs\. The various \fBnpm\.commands\.<cmd>\fR functions take an \fBarray\fR of positional argument \fBstrings\fR\|\. The last argument to any \fBnpm\.commands\.<cmd>\fR function is a callback\. Some commands take other optional arguments\. Read the source\. .P You cannot set configs individually for any single npm function at this time\. Since \fBnpm\fR is a singleton, any call to \fBnpm\.config\.set\fR will change the value for \fIall\fR npm commands in that process\. .P See \fB\|\./bin/npm\-cli\.js\fR for an example of pulling config values off of the command line arguments using nopt\. 
You may also want to check out \fBnpm help config\fR to learn about all the options you can set there\. .SH More Docs .P Check out the docs \fIhttps://docs\.npmjs\.com/\fR, especially the faq \fIhttps://docs\.npmjs\.com/misc/faq\fR\|\. .P You can use the \fBnpm help\fR command to read any of them\. .P If you're a developer, and you want to use npm to publish your program, you should read this \fIhttps://docs\.npmjs\.com/misc/developers\fR .SH Legal Stuff .P "npm" and "The npm Registry" are owned by npm, Inc\. All rights reserved\. See the included LICENSE file for more details\. .P "Node\.js" and "node" are trademarks owned by Joyent, Inc\. .P Modules published on the npm registry are not officially endorsed by npm, Inc\. or the Node\.js project\. .P Data published to the npm registry is not part of npm itself, and is the sole property of the publisher\. While every effort is made to ensure accountability, there is absolutely no guarantee, warrantee, or assertion expressed or implied as to the quality, fitness for a specific purpose, or lack of malice in any given npm package\. .P If you have a complaint about a package in the public npm registry, and cannot resolve it with the package owner \fIhttps://docs\.npmjs\.com/misc/disputes\fR, please email support@npmjs\.com and explain the situation\. .P Any data published to The npm Registry (including user account information) may be removed or modified at the sole discretion of the npm server administrators\. .SS In plainer english .P npm is the property of npm, Inc\. .P If you publish something, it's yours, and you are solely accountable for it\. .P If other people publish something, it's theirs\. .P Users can publish Bad Stuff\. It will be removed promptly if reported\. But there is no vetting process for published modules, and you use them at your own risk\. Please inspect the source\. .P If you publish Bad Stuff, we may delete it from the registry, or even ban your account in extreme cases\. So don't do that\. .SH BUGS .P When you find issues, please report them: .RS 0 .IP \(bu 2 web: https://github\.com/npm/npm/issues .RE .P Be sure to include \fIall\fR of the output from the npm command that didn't work as expected\. The \fBnpm\-debug\.log\fR file is also helpful to provide\. .P You can also look for isaacs in #node\.js on irc://irc\.freenode\.net\. He will no doubt tell you to put the output in a gist or email\. 
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help npm .IP \(bu 2 npm help 7 faq .IP \(bu 2 npm help help .IP \(bu 2 npm help 7 index .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-rebuild.1����������������������������������000644 �000766 �000024 �00000001036 12455173731 024611� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-REBUILD" "1" "January 2015" "" "" .SH "NAME" \fBnpm-rebuild\fR \- Rebuild a package .SH SYNOPSIS .P .RS 2 .nf npm rebuild [<name> [<name> \.\.\.]] npm rb [<name> [<name> \.\.\.]] .fi .RE .RS 0 .IP \(bu 2 \fB<name>\fR: The package to rebuild .RE .SH DESCRIPTION .P This command runs the \fBnpm build\fR command on the matched folders\. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help build .IP \(bu 2 npm help install .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-repo.1�������������������������������������000644 �000766 �000024 �00000001462 12455173731 024133� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-REPO" "1" "January 2015" "" "" .SH "NAME" \fBnpm-repo\fR \- Open package repository page in the browser .SH SYNOPSIS .P .RS 2 .nf npm repo <pkgname> npm repo (with no args in a package dir) .fi .RE .SH DESCRIPTION .P This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the \fB\-\-browser\fR config param\. If no package name is provided, it will search for a \fBpackage\.json\fR in the current folder and use the \fBname\fR property\. .SH CONFIGURATION .SS browser .RS 0 .IP \(bu 2 Default: OS X: \fB"open"\fR, Windows: \fB"start"\fR, Others: \fB"xdg\-open"\fR .IP \(bu 2 Type: String .RE .P The browser that is called by the \fBnpm repo\fR command to open websites\. 
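.P For example, the following opens a package's repository page in whatever browser the \fBbrowser\fR config resolves to (the package name below is only a placeholder); the second form, run inside a package directory, uses the current package: .P .RS 2 .nf npm repo some\-package npm repo .fi .RE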
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help docs .IP \(bu 2 npm help config .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-restart.1����������������������������������000644 �000766 �000024 �00000001704 12455173731 024651� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-RESTART" "1" "January 2015" "" "" .SH "NAME" \fBnpm-restart\fR \- Restart a package .SH SYNOPSIS .P .RS 2 .nf npm restart [\-\- <args>] .fi .RE .SH DESCRIPTION .P This restarts a package\. .P This runs a package's "stop", "restart", and "start" scripts, and associated pre\- and post\- scripts, in the order given below: .RS 0 .IP 1. 3 prerestart .IP 2. 3 prestop .IP 3. 3 stop .IP 4. 3 poststop .IP 5. 3 restart .IP 6. 3 prestart .IP 7. 3 start .IP 8. 3 poststart .IP 9. 3 postrestart .RE .SH NOTE .P Note that the "restart" script is run \fBin addition to\fR the "stop" and "start" scripts, not instead of them\. .P This is the behavior as of \fBnpm\fR major version 2\. A change in this behavior will be accompanied by an increase in major version number .SH SEE ALSO .RS 0 .IP \(bu 2 npm help run\-script .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help test .IP \(bu 2 npm help start .IP \(bu 2 npm help stop .IP \(bu 2 npm apihelp restart .RE ������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-rm.1���������������������������������������000644 �000766 �000024 �00000000737 12455173731 023610� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-RM" "1" "January 2015" "" "" .SH "NAME" \fBnpm-rm\fR \- Remove a package .SH SYNOPSIS .P .RS 2 .nf npm rm <name> npm r <name> npm uninstall <name> npm un <name> .fi .RE .SH DESCRIPTION .P This uninstalls a package, completely removing everything npm installed on its behalf\. 
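.P For example, to remove a locally installed dependency, or a package installed in the global prefix (the package names below are placeholders): .P .RS 2 .nf npm rm some\-package npm uninstall \-g some\-other\-package .fi .RE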
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help prune .IP \(bu 2 npm help install .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE ���������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-root.1�������������������������������������000644 �000766 �000024 �00000000623 12455173731 024147� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-ROOT" "1" "January 2015" "" "" .SH "NAME" \fBnpm-root\fR \- Display npm root .SH SYNOPSIS .P .RS 2 .nf npm root .fi .RE .SH DESCRIPTION .P Print the effective \fBnode_modules\fR folder to standard out\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help prefix .IP \(bu 2 npm help bin .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE �������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-run-script.1�������������������������������000644 �000766 �000024 �00000002413 12455173731 025271� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-RUN\-SCRIPT" "1" "January 2015" "" "" .SH "NAME" \fBnpm-run-script\fR \- Run arbitrary package scripts .SH SYNOPSIS .P .RS 2 .nf npm run\-script [command] [\-\- <args>] npm run [command] [\-\- <args>] .fi .RE .SH DESCRIPTION .P This runs an arbitrary command from a package's \fB"scripts"\fR object\. If no package name is provided, it will search for a \fBpackage\.json\fR in the current folder and use its \fB"scripts"\fR object\. If no \fB"command"\fR is provided, it will list the available top level scripts\. .P It is used by the test, start, restart, and stop commands, but can be called directly, as well\. .P As of \fBnpm@2\.0\.0\fR \fIhttp://blog\.npmjs\.org/post/98131109725/npm\-2\-0\-0\fR, you can use custom arguments when executing scripts\. The special option \fB\-\-\fR is used by getopt \fIhttp://goo\.gl/KxMmtG\fR to delimit the end of the options\. npm will pass all the arguments after the \fB\-\-\fR directly to your script: .P .RS 2 .nf npm run test \-\- \-\-grep="pattern" .fi .RE .P The arguments will only be passed to the script specified after \fBnpm run\fR and not to any pre or post script\. 
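.P For instance, assuming your package\.json defines a script named \fBlint\fR (the script name and the \fB\-\-fix\fR argument here are hypothetical), the first command lists the available scripts and the second runs \fBlint\fR with the extra argument passed through to it: .P .RS 2 .nf npm run npm run lint \-\- \-\-fix .fi .RE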
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help test .IP \(bu 2 npm help start .IP \(bu 2 npm help restart .IP \(bu 2 npm help stop .RE �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-search.1�����������������������������������000644 �000766 �000024 �00000001771 12455173731 024436� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-SEARCH" "1" "January 2015" "" "" .SH "NAME" \fBnpm-search\fR \- Search for packages .SH SYNOPSIS .P .RS 2 .nf npm search [\-\-long] [search terms \.\.\.] npm s [search terms \.\.\.] npm se [search terms \.\.\.] .fi .RE .SH DESCRIPTION .P Search the registry for packages matching the search terms\. .P If a term starts with \fB/\fR, then it's interpreted as a regular expression\. A trailing \fB/\fR will be ignored in this case\. (Note that many regular expression characters must be escaped or quoted in most shells\.) .SH CONFIGURATION .SS long .RS 0 .IP \(bu 2 Default: false .IP \(bu 2 Type: Boolean .RE .P Display full package descriptions and other long text across multiple lines\. When disabled (default) search results are truncated to fit neatly on a single line\. Modules with extremely long names will fall on multiple lines\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help view .RE �������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-shrinkwrap.1�������������������������������000644 �000766 �000024 �00000014052 12455173731 025355� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-SHRINKWRAP" "1" "January 2015" "" "" .SH "NAME" \fBnpm-shrinkwrap\fR \- Lock down dependency versions .SH SYNOPSIS .P .RS 2 .nf npm shrinkwrap .fi .RE .SH DESCRIPTION .P This command locks down the versions of a package's dependencies so that you can control exactly which versions of each dependency will be used when your package is installed\. The "package\.json" file is still required if you want to use "npm install"\. .P By default, "npm install" recursively installs the target's dependencies (as specified in package\.json), choosing the latest available version that satisfies the dependency's semver pattern\. In some situations, particularly when shipping software where each change is tightly managed, it's desirable to fully specify each version of each dependency recursively so that subsequent builds and deploys do not inadvertently pick up newer versions of a dependency that satisfy the semver pattern\. 
Specifying specific semver patterns in each dependency's package\.json would facilitate this, but that's not always possible or desirable, as when another author owns the npm package\. It's also possible to check dependencies directly into source control, but that may be undesirable for other reasons\. .P As an example, consider package A: .P .RS 2 .nf { "name": "A", "version": "0\.1\.0", "dependencies": { "B": "<0\.1\.0" } } .fi .RE .P package B: .P .RS 2 .nf { "name": "B", "version": "0\.0\.1", "dependencies": { "C": "<0\.1\.0" } } .fi .RE .P and package C: .P .RS 2 .nf { "name": "C", "version": "0\.0\.1" } .fi .RE .P If these are the only versions of A, B, and C available in the registry, then a normal "npm install A" will install: .P .RS 2 .nf A@0\.1\.0 `\-\- B@0\.0\.1 `\-\- C@0\.0\.1 .fi .RE .P However, if B@0\.0\.2 is published, then a fresh "npm install A" will install: .P .RS 2 .nf A@0\.1\.0 `\-\- B@0\.0\.2 `\-\- C@0\.0\.1 .fi .RE .P assuming the new version did not modify B's dependencies\. Of course, the new version of B could include a new version of C and any number of new dependencies\. If such changes are undesirable, the author of A could specify a dependency on B@0\.0\.1\. However, if A's author and B's author are not the same person, there's no way for A's author to say that he or she does not want to pull in newly published versions of C when B hasn't changed at all\. .P In this case, A's author can run .P .RS 2 .nf npm shrinkwrap .fi .RE .P This generates npm\-shrinkwrap\.json, which will look something like this: .P .RS 2 .nf { "name": "A", "version": "0\.1\.0", "dependencies": { "B": { "version": "0\.0\.1", "dependencies": { "C": { "version": "0\.0\.1" } } } } } .fi .RE .P The shrinkwrap command has locked down the dependencies based on what's currently installed in node_modules\. When "npm install" installs a package with an npm\-shrinkwrap\.json file in the package root, the shrinkwrap file (rather than package\.json files) completely drives the installation of that package and all of its dependencies (recursively)\. So now the author publishes A@0\.1\.0, and subsequent installs of this package will use B@0\.0\.1 and C@0\.0\.1, regardless of the dependencies and versions listed in A's, B's, and C's package\.json files\. .SS Using shrinkwrapped packages .P Using a shrinkwrapped package is no different than using any other package: you can "npm install" it by hand, or add a dependency to your package\.json file and "npm install" it\. .SS Building shrinkwrapped packages .P To shrinkwrap an existing package: .RS 0 .IP 1. 3 Run "npm install" in the package root to install the current versions of all dependencies\. .IP 2. 3 Validate that the package works as expected with these versions\. .IP 3. 3 Run "npm shrinkwrap", add npm\-shrinkwrap\.json to git, and publish your package\. .RE .P To add or update a dependency in a shrinkwrapped package: .RS 0 .IP 1. 3 Run "npm install" in the package root to install the current versions of all dependencies\. .IP 2. 3 Add or update dependencies\. "npm install" each new or updated package individually and then update package\.json\. Note that they must be explicitly named in order to be installed: running \fBnpm install\fR with no arguments will merely reproduce the existing shrinkwrap\. .IP 3. 3 Validate that the package works as expected with the new dependencies\. .IP 4. 3 Run "npm shrinkwrap", commit the new npm\-shrinkwrap\.json, and publish your package\.
.RE .P You can use npm help outdated to view dependencies with newer versions available\. .SS Other Notes .P A shrinkwrap file must be consistent with the package's package\.json file\. "npm shrinkwrap" will fail if required dependencies are not already installed, since that would result in a shrinkwrap that wouldn't actually work\. Similarly, the command will fail if there are extraneous packages (not referenced by package\.json), since that would indicate that package\.json is not correct\. .P Since "npm shrinkwrap" is intended to lock down your dependencies for production use, \fBdevDependencies\fR will not be included unless you explicitly set the \fB\-\-dev\fR flag when you run \fBnpm shrinkwrap\fR\|\. If installed \fBdevDependencies\fR are excluded, then npm will print a warning\. If you want them to be installed with your module by default, please consider adding them to \fBdependencies\fR instead\. .P If shrinkwrapped package A depends on shrinkwrapped package B, B's shrinkwrap will not be used as part of the installation of A\. However, because A's shrinkwrap is constructed from a valid installation of B and recursively specifies all dependencies, the contents of B's shrinkwrap will implicitly be included in A's shrinkwrap\. .SS Caveats .P If you wish to lock down the specific bytes included in a package, for example to have 100% confidence in being able to reproduce a deployment or build, then you ought to check your dependencies into source control, or pursue some other mechanism that can verify contents rather than versions\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help install .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help ls .RE ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-star.1�������������������������������������000644 �000766 �000024 �00000001072 12455173731 024134� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-STAR" "1" "January 2015" "" "" .SH "NAME" \fBnpm-star\fR \- Mark your favorite packages .SH SYNOPSIS .P .RS 2 .nf npm star <pkgname> [<pkg>, \.\.\.] npm unstar <pkgname> [<pkg>, \.\.\.] .fi .RE .SH DESCRIPTION .P "Starring" a package means that you have some interest in it\. It's a vaguely positive way to show that you care\. .P "Unstarring" is the same thing, but in reverse\. .P It's a boolean thing\. Starring repeatedly has no additional effect\. 
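.P For example (the package name below is only a placeholder): .P .RS 2 .nf npm star some\-package npm unstar some\-package .fi .RE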
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help view .IP \(bu 2 npm help whoami .IP \(bu 2 npm help adduser .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-stars.1������������������������������������000644 �000766 �000024 �00000001051 12455173731 024314� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-STARS" "1" "January 2015" "" "" .SH "NAME" \fBnpm-stars\fR \- View packages marked as favorites .SH SYNOPSIS .P .RS 2 .nf npm stars npm stars [username] .fi .RE .SH DESCRIPTION .P If you have starred a lot of neat things and want to find them again quickly this command lets you do just that\. .P You may also want to see your friend's favorite packages, in this case you will most certainly enjoy this command\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help star .IP \(bu 2 npm help view .IP \(bu 2 npm help whoami .IP \(bu 2 npm help adduser .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-start.1������������������������������������000644 �000766 �000024 �00000000606 12455173731 024322� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-START" "1" "January 2015" "" "" .SH "NAME" \fBnpm-start\fR \- Start a package .SH SYNOPSIS .P .RS 2 .nf npm start [\-\- <args>] .fi .RE .SH DESCRIPTION .P This runs a package's "start" script, if one was provided\. 
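.P For example, a package\.json might declare a start script like this (the \fBserver\.js\fR entry point here is purely illustrative): .P .RS 2 .nf { "scripts": { "start": "node server\.js" } } .fi .RE .P With that in place, running \fBnpm start\fR runs \fBnode server\.js\fR\|\.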
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help run\-script .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help test .IP \(bu 2 npm help restart .IP \(bu 2 npm help stop .RE ��������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-stop.1�������������������������������������000644 �000766 �000024 �00000000602 12455173731 024146� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-STOP" "1" "January 2015" "" "" .SH "NAME" \fBnpm-stop\fR \- Stop a package .SH SYNOPSIS .P .RS 2 .nf npm stop [\-\- <args>] .fi .RE .SH DESCRIPTION .P This runs a package's "stop" script, if one was provided\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help run\-script .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help test .IP \(bu 2 npm help start .IP \(bu 2 npm help restart .RE ������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-submodule.1��������������������������������000644 �000766 �000024 �00000002123 12455173731 025160� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.\" Generated with Ronnjs 0.3.8 .\" http://github.com/kapouer/ronnjs/ . .TH "NPM\-SUBMODULE" "1" "September 2014" "" "" . .SH "NAME" \fBnpm-submodule\fR \-\- Add a package as a git submodule . .SH "SYNOPSIS" . .nf npm submodule <pkg> . .fi . .SH "DESCRIPTION" If the specified package has a git repository url in its package\.json description, then this command will add it as a git submodule at \fBnode_modules/<pkg name>\fR\|\. . .P This is a convenience only\. From then on, it\'s up to you to manage updates by using the appropriate git commands\. npm will stubbornly refuse to update, modify, or remove anything with a \fB\|\.git\fR subfolder in it\. . .P This command also does not install missing dependencies, if the package does not include them in its git repository\. If \fBnpm ls\fR reports that things are missing, you can either install, link, or submodule them yourself, or you can do \fBnpm explore <pkgname> \-\- npm install\fR to install the dependencies into the submodule folder\. . .SH "SEE ALSO" . .IP "\(bu" 4 npm help 5 package\.json . .IP "\(bu" 4 git help submodule . 
.IP "" 0 ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-tag.1��������������������������������������000644 �000766 �000024 �00000001640 12455173731 023737� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-TAG" "1" "January 2015" "" "" .SH "NAME" \fBnpm-tag\fR \- Tag a published version .SH SYNOPSIS .P .RS 2 .nf npm tag <name>@<version> [<tag>] .fi .RE .SH DESCRIPTION .P Tags the specified version of the package with the specified tag, or the \fB\-\-tag\fR config if not specified\. .P A tag can be used when installing packages as a reference to a version instead of using a specific version number: .P .RS 2 .nf npm install <name>@<tag> .fi .RE .P When installing dependencies, a preferred tagged version may be specified: .P .RS 2 .nf npm install \-\-tag <tag> .fi .RE .P This also applies to \fBnpm dedupe\fR\|\. .P Publishing a package always sets the "latest" tag to the published version\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help publish .IP \(bu 2 npm help install .IP \(bu 2 npm help dedupe .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE ������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-test.1�������������������������������������000644 �000766 �000024 �00000000760 12455173731 024145� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-TEST" "1" "January 2015" "" "" .SH "NAME" \fBnpm-test\fR \- Test a package .SH SYNOPSIS .P .RS 2 .nf npm test [\-\- <args>] npm tst [\-\- <args>] .fi .RE .SH DESCRIPTION .P This runs a package's "test" script, if one was provided\. .P To run tests as a condition of installation, set the \fBnpat\fR config to true\. 
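.P For example, given a package whose package\.json declares a test script (the \fBmocha\fR command is only illustrative): .P .RS 2 .nf { "scripts": { "test": "mocha" } } .fi .RE .P running \fBnpm test\fR executes \fBmocha\fR, and setting \fBnpm config set npat true\fR makes that test a condition of installation, as described above\.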
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help run\-script .IP \(bu 2 npm help 7 scripts .IP \(bu 2 npm help start .IP \(bu 2 npm help restart .IP \(bu 2 npm help stop .RE ����������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-uninstall.1��������������������������������000644 �000766 �000024 �00000002636 12455173731 025203� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-RM" "1" "January 2015" "" "" .SH "NAME" \fBnpm-rm\fR \- Remove a package .SH SYNOPSIS .P .RS 2 .nf npm uninstall [@<scope>/]<package> [\-\-save|\-\-save\-dev|\-\-save\-optional] npm rm (with any of the previous argument usage) .fi .RE .SH DESCRIPTION .P This uninstalls a package, completely removing everything npm installed on its behalf\. .P Example: .P .RS 2 .nf npm uninstall sax .fi .RE .P In global mode (ie, with \fB\-g\fR or \fB\-\-global\fR appended to the command), it uninstalls the current package context as a global package\. .P \fBnpm uninstall\fR takes 3 exclusive, optional flags which save or update the package version in your main package\.json: .RS 0 .IP \(bu 2 \fB\-\-save\fR: Package will be removed from your \fBdependencies\fR\|\. .IP \(bu 2 \fB\-\-save\-dev\fR: Package will be removed from your \fBdevDependencies\fR\|\. .IP \(bu 2 \fB\-\-save\-optional\fR: Package will be removed from your \fBoptionalDependencies\fR\|\. .RE .P Scope is optional and follows the usual rules for npm help 7 \fBnpm\-scope\fR\|\. .P Examples: .P .RS 2 .nf npm uninstall sax \-\-save npm uninstall @myorg/privatepackage \-\-save npm uninstall node\-tap \-\-save\-dev npm uninstall dtrace\-provider \-\-save\-optional .fi .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help prune .IP \(bu 2 npm help install .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .RE ��������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-unpublish.1��������������������������������000644 �000766 �000024 �00000002202 12455173731 025170� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-UNPUBLISH" "1" "January 2015" "" "" .SH "NAME" \fBnpm-unpublish\fR \- Remove a package from the registry .SH SYNOPSIS .P .RS 2 .nf npm unpublish [@<scope>/]<name>[@<version>] .fi .RE .SH WARNING .P \fBIt is generally considered bad behavior to remove versions of a library that others are depending on!\fR .P Consider using the \fBdeprecate\fR command instead, if your intent is to encourage users to upgrade\. .P There is plenty of room on the registry\. .SH DESCRIPTION .P This removes a package version from the registry, deleting its entry and removing the tarball\. .P If no version is specified, or if all versions are removed then the root package entry is removed from the registry entirely\. 
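.P For example, to remove only version 1\.0\.1 of a hypothetical package named \fBfoo\fR, leaving its other published versions intact: .P .RS 2 .nf npm unpublish foo@1\.0\.1 .fi .RE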
.P Even if a package version is unpublished, that specific name and version combination can never be reused\. In order to publish the package again, a new version number must be used\. .P The scope is optional and follows the usual rules for npm help 7 \fBnpm\-scope\fR\|\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help deprecate .IP \(bu 2 npm help publish .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help adduser .IP \(bu 2 npm help owner .RE ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-update.1�����������������������������������000644 �000766 �000024 �00000001305 12455173731 024444� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-UPDATE" "1" "January 2015" "" "" .SH "NAME" \fBnpm-update\fR \- Update a package .SH SYNOPSIS .P .RS 2 .nf npm update [\-g] [<name> [<name> \.\.\.]] .fi .RE .SH DESCRIPTION .P This command will update all the packages listed to the latest version (specified by the \fBtag\fR config)\. .P It will also install missing packages\. .P If the \fB\-g\fR flag is specified, this command will update globally installed packages\. .P If no package name is specified, all packages in the specified location (global or local) will be updated\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help install .IP \(bu 2 npm help outdated .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help 5 folders .IP \(bu 2 npm help ls .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-version.1����������������������������������000644 �000766 �000024 �00000003277 12455173731 024661� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-VERSION" "1" "January 2015" "" "" .SH "NAME" \fBnpm-version\fR \- Bump a package version .SH SYNOPSIS .P .RS 2 .nf npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease] .fi .RE .SH DESCRIPTION .P Run this in a package directory to bump the version and write the new data back to \fBpackage\.json\fR and, if present, \fBnpm\-shrinkwrap\.json\fR\|\. .P The \fBnewversion\fR argument should be a valid semver string, \fIor\fR a valid second argument to semver\.inc (one of "patch", "minor", "major", "prepatch", "preminor", "premajor", "prerelease")\. 
In the second case, the existing version will be incremented by 1 in the specified field\. .P If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean\. .P If supplied with \fB\-\-message\fR (shorthand: \fB\-m\fR) config option, npm will use it as a commit message when creating a version commit\. If the \fBmessage\fR config contains \fB%s\fR then that will be replaced with the resulting version number\. For example: .P .RS 2 .nf npm version patch \-m "Upgrade to %s for reasons" .fi .RE .P If the \fBsign\-git\-tag\fR config is set, then the tag will be signed using the \fB\-s\fR flag to git\. Note that you must have a default GPG key set up in your git config for this to work properly\. For example: .P .RS 2 .nf $ npm config set sign\-git\-tag true $ npm version patch You need a passphrase to unlock the secret key for user: "isaacs (http://blog\.izs\.me/) <i@izs\.me>" 2048\-bit RSA key, ID 6C481CF6, created 2010\-08\-31 Enter passphrase: .fi .RE .SH SEE ALSO .RS 0 .IP \(bu 2 npm help init .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help 7 semver .RE ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-view.1�������������������������������������000644 �000766 �000024 �00000006343 12455173731 024143� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������.TH "NPM\-VIEW" "1" "January 2015" "" "" .SH "NAME" \fBnpm-view\fR \- View registry info .SH SYNOPSIS .P .RS 2 .nf npm view [@<scope>/]<name>[@<version>] [<field>[\.<subfield>]\.\.\.] npm v [@<scope>/]<name>[@<version>] [<field>[\.<subfield>]\.\.\.] .fi .RE .SH DESCRIPTION .P This command shows data about a package and prints it to the stream referenced by the \fBoutfd\fR config, which defaults to stdout\. .P To show the package registry entry for the \fBconnect\fR package, you can do this: .P .RS 2 .nf npm view connect .fi .RE .P The default version is "latest" if unspecified\. .P Field names can be specified after the package descriptor\. For example, to show the dependencies of the \fBronn\fR package at version 0\.3\.5, you could do the following: .P .RS 2 .nf npm view ronn@0\.3\.5 dependencies .fi .RE .P You can view child field by separating them with a period\. To view the git repository URL for the latest version of npm, you could do this: .P .RS 2 .nf npm view npm repository\.url .fi .RE .P This makes it easy to view information about a dependency with a bit of shell scripting\. For example, to view all the data about the version of opts that ronn depends on, you can do this: .P .RS 2 .nf npm view opts@$(npm view ronn dependencies\.opts) .fi .RE .P For fields that are arrays, requesting a non\-numeric field will return all of the values from the objects in the list\. 
For example, to get all the contributor email addresses for the "express" project, you can do this: .P .RS 2 .nf npm view express contributors\.email .fi .RE .P You may also use numeric indices in square brackets to specifically select an item in an array field\. To just get the email address of the first contributor in the list, you can do this: .P .RS 2 .nf npm view express contributors[0]\.email .fi .RE .P Multiple fields may be specified, and will be printed one after another\. For example, to get all the contributor names and email addresses, you can do this: .P .RS 2 .nf npm view express contributors\.name contributors\.email .fi .RE .P "Person" fields are shown as a string if they would be shown as an object\. So, for example, this will show the list of npm contributors in the shortened string format\. (See npm help 5 \fBpackage\.json\fR for more on this\.) .P .RS 2 .nf npm view npm contributors .fi .RE .P If a version range is provided, then data will be printed for every matching version of the package\. This will show which version of jsdom was required by each matching version of yui3: .P .RS 2 .nf npm view yui3@'>0\.5\.4' dependencies\.jsdom .fi .RE .SH OUTPUT .P If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command\. If the field is an object, it will be output as a JavaScript object literal\. .P If the \-\-json flag is given, the outputted fields will be JSON\. .P If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to\. .P If multiple fields are requested, then each of them is prefixed with the field name\. .SH SEE ALSO .RS 0 .IP \(bu 2 npm help search .IP \(bu 2 npm help 7 registry .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help docs .RE iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm-whoami.1000644 000766 000024 00000000535 12455173731 024452 0ustar00iojsstaff000000 000000 .TH "NPM\-WHOAMI" "1" "January 2015" "" "" .SH "NAME" \fBnpm-whoami\fR \- Display npm username .SH SYNOPSIS .P .RS 2 .nf npm whoami .fi .RE .SH DESCRIPTION .P Print the \fBusername\fR config to standard output\.
.SH SEE ALSO .RS 0 .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help adduser .RE iojs-v1.0.2-darwin-x64/lib/node_modules/npm/man/man1/npm.1000644 000766 000024 00000013602 12455173731 023167 0ustar00iojsstaff000000 000000 .TH "NPM" "1" "January 2015" "" "" .SH "NAME" \fBnpm\fR \- node package manager .SH SYNOPSIS .P .RS 2 .nf npm <command> [args] .fi .RE .SH VERSION .P 2.1.18 .SH DESCRIPTION .P npm is the package manager for the Node JavaScript platform\. It puts modules in place so that node can find them, and manages dependency conflicts intelligently\. .P It is extremely configurable to support a wide variety of use cases\. Most commonly, it is used to publish, discover, install, and develop node programs\. .P Run \fBnpm help\fR to get a list of available commands\. .SH INTRODUCTION .P You probably got npm because you want to install stuff\. .P Use \fBnpm install blerg\fR to install the latest version of "blerg"\. Check out npm help \fBnpm\-install\fR for more info\. It can do a lot of stuff\. .P Use the \fBnpm search\fR command to show everything that's available\. Use \fBnpm ls\fR to show everything you've installed\. .SH DEPENDENCIES .P If a package references another package with a git URL, npm depends on a preinstalled git\. .P If one of the packages npm tries to install is a native node module and requires compiling C++ code, npm will use node\-gyp \fIhttps://github\.com/TooTallNate/node\-gyp\fR for that task\. For a Unix system, node\-gyp \fIhttps://github\.com/TooTallNate/node\-gyp\fR needs Python, make, and a build chain like GCC\. On Windows, Python and Microsoft Visual Studio C++ are needed\. Python 3 is not supported by node\-gyp \fIhttps://github\.com/TooTallNate/node\-gyp\fR\|\. For more information visit the node\-gyp repository \fIhttps://github\.com/TooTallNate/node\-gyp\fR and the node\-gyp Wiki \fIhttps://github\.com/TooTallNate/node\-gyp/wiki\fR\|\. .SH DIRECTORIES .P See npm help 5 \fBnpm\-folders\fR to learn about where npm puts stuff\. .P In particular, npm has two modes of operation: .RS 0 .IP \(bu 2 global mode: .br npm installs packages into the install prefix at \fBprefix/lib/node_modules\fR and bins are installed in \fBprefix/bin\fR\|\. .IP \(bu 2 local mode: .br npm installs packages into the current project directory, which defaults to the current working directory\. Packages are installed to \fB\|\./node_modules\fR, and bins are installed to \fB\|\./node_modules/\.bin\fR\|\. .RE .P Local mode is the default\. Use \fB\-\-global\fR or \fB\-g\fR on any command to operate in global mode instead\. .SH DEVELOPER USAGE .P If you're using npm to develop and publish your code, check out the following help topics: .RS 0 .IP \(bu 2 json: Make a package\.json file\. See npm help 5 \fBpackage\.json\fR\|\. .IP \(bu 2 link: For linking your current working code into Node's path, so that you don't have to reinstall every time you make a change\. Use \fBnpm link\fR to do this\.
.IP \(bu 2 install: It's a good idea to install things if you don't need the symbolic link\. Especially, installing other peoples code from the registry is done via \fBnpm install\fR .IP \(bu 2 adduser: Create an account or log in\. Credentials are stored in the user config file\. .IP \(bu 2 publish: Use the \fBnpm publish\fR command to upload your code to the registry\. .RE .SH CONFIGURATION .P npm is extremely configurable\. It reads its configuration options from 5 places\. .RS 0 .IP \(bu 2 Command line switches: .br Set a config with \fB\-\-key val\fR\|\. All keys take a value, even if they are booleans (the config parser doesn't know what the options are at the time of parsing\.) If no value is provided, then the option is set to boolean \fBtrue\fR\|\. .IP \(bu 2 Environment Variables: .br Set any config by prefixing the name in an environment variable with \fBnpm_config_\fR\|\. For example, \fBexport npm_config_key=val\fR\|\. .IP \(bu 2 User Configs: .br The file at $HOME/\.npmrc is an ini\-formatted list of configs\. If present, it is parsed\. If the \fBuserconfig\fR option is set in the cli or env, then that will be used instead\. .IP \(bu 2 Global Configs: .br The file found at \.\./etc/npmrc (from the node executable, by default this resolves to /usr/local/etc/npmrc) will be parsed if it is found\. If the \fBglobalconfig\fR option is set in the cli, env, or user config, then that file is parsed instead\. .IP \(bu 2 Defaults: .br npm's default configuration options are defined in lib/utils/config\-defs\.js\. These must not be changed\. .RE .P See npm help 7 \fBnpm\-config\fR for much much more information\. .SH CONTRIBUTIONS .P Patches welcome! .RS 0 .IP \(bu 2 code: Read through npm help 7 \fBnpm\-coding\-style\fR if you plan to submit code\. You don't have to agree with it, but you do have to follow it\. .IP \(bu 2 docs: If you find an error in the documentation, edit the appropriate markdown file in the "doc" folder\. (Don't worry about generating the man page\.) .RE .P Contributors are listed in npm's \fBpackage\.json\fR file\. You can view them easily by doing \fBnpm view npm contributors\fR\|\. .P If you would like to contribute, but don't know what to work on, check the issues list or ask on the mailing list\. .RS 0 .IP \(bu 2 http://github\.com/npm/npm/issues .IP \(bu 2 npm\-@googlegroups\.com .RE .SH BUGS .P When you find issues, please report them: .RS 0 .IP \(bu 2 web: http://github\.com/npm/npm/issues .IP \(bu 2 email: npm\-@googlegroups\.com .RE .P Be sure to include \fIall\fR of the output from the npm command that didn't work as expected\. The \fBnpm\-debug\.log\fR file is also helpful to provide\. .P You can also look for isaacs in #node\.js on irc://irc\.freenode\.net\. He will no doubt tell you to put the output in a gist or email\. .SH AUTHOR .P Isaac Z\. 
Schlueter \fIhttp://blog\.izs\.me/\fR :: isaacs \fIhttps://github\.com/isaacs/\fR :: @izs \fIhttp://twitter\.com/izs\fR :: i@izs\.me .SH SEE ALSO .RS 0 .IP \(bu 2 npm help help .IP \(bu 2 npm help 7 faq .IP \(bu 2 README .IP \(bu 2 npm help 5 package\.json .IP \(bu 2 npm help install .IP \(bu 2 npm help config .IP \(bu 2 npm help 7 config .IP \(bu 2 npm help 5 npmrc .IP \(bu 2 npm help 7 index .IP \(bu 2 npm apihelp npm .RE ������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/adduser.js������������������������������������������000644 �000766 �000024 �00000007514 12455173731 023444� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = adduser var log = require("npmlog") , npm = require("./npm.js") , read = require("read") , userValidate = require("npm-user-validate") , crypto try { crypto = process.binding("crypto") && require("crypto") } catch (ex) {} adduser.usage = "npm adduser\nThen enter stuff at the prompts" function adduser (args, cb) { npm.spinner.stop() if (!crypto) return cb(new Error( "You must compile node with ssl support to use the adduser feature")) var creds = npm.config.getCredentialsByURI(npm.config.get("registry")) var c = { u : creds.username || "" , p : creds.password || "" , e : creds.email || "" } , u = {} , fns = [readUsername, readPassword, readEmail, save] loop() function loop (er) { if (er) return cb(er) var fn = fns.shift() if (fn) return fn(c, u, loop) cb() } } function readUsername (c, u, cb) { var v = userValidate.username read({prompt: "Username: ", default: c.u || ""}, function (er, un) { if (er) { return cb(er.message === "cancelled" ? er.message : er) } // make sure it's valid. we have to do this here, because // couchdb will only ever say "bad password" with a 401 when // you try to PUT a _users record that the validate_doc_update // rejects for *any* reason. if (!un) { return readUsername(c, u, cb) } var error = v(un) if (error) { log.warn(error.message) return readUsername(c, u, cb) } c.changed = c.u !== un u.u = un cb(er) }) } function readPassword (c, u, cb) { var v = userValidate.pw var prompt if (c.p && !c.changed) { prompt = "Password: (or leave unchanged) " } else { prompt = "Password: " } read({prompt: prompt, silent: true}, function (er, pw) { if (er) { return cb(er.message === "cancelled" ? er.message : er) } if (!c.changed && pw === "") { // when the username was not changed, // empty response means "use the old value" pw = c.p } if (!pw) { return readPassword(c, u, cb) } var error = v(pw) if (error) { log.warn(error.message) return readPassword(c, u, cb) } c.changed = c.changed || c.p !== pw u.p = pw cb(er) }) } function readEmail (c, u, cb) { var v = userValidate.email var r = { prompt: "Email: (this IS public) ", default: c.e || "" } read(r, function (er, em) { if (er) { return cb(er.message === "cancelled" ? 
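// a cancelled prompt comes back as the string "cancelled"; pass the message through instead of the Error object, matching readUsername and readPassword above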
er.message : er) } if (!em) { return readEmail(c, u, cb) } var error = v(em) if (error) { log.warn(error.message) return readEmail(c, u, cb) } u.e = em cb(er) }) } function save (c, u, cb) { npm.spinner.start() // save existing configs, but yank off for this PUT var uri = npm.config.get("registry") var scope = npm.config.get("scope") // there may be a saved scope and no --registry (for login) if (scope) { if (scope.charAt(0) !== "@") scope = "@" + scope var scopedRegistry = npm.config.get(scope + ":registry") if (scopedRegistry) uri = scopedRegistry } var params = { auth : { username : u.u, password : u.p, email : u.e } } npm.registry.adduser(uri, params, function (er, doc) { npm.spinner.stop() if (er) return cb(er) // don't want this polluting the configuration npm.config.del("_token", "user") if (scope) npm.config.set(scope + ":registry", uri, "user") if (doc && doc.token) { npm.config.setCredentialsByURI(uri, { token : doc.token }) } else { npm.config.setCredentialsByURI(uri, { username : u.u, password : u.p, email : u.e, alwaysAuth : npm.config.get("always-auth") }) } log.info("adduser", "Authorized user %s", u.u) npm.config.save("user", cb) }) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/bin.js����������������������������������������������000644 �000766 �000024 �00000000752 12455173731 022562� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = bin var npm = require("./npm.js") bin.usage = "npm bin\nnpm bin -g\n(just prints the bin folder)" function bin (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false var b = npm.bin , PATH = (process.env.PATH || "").split(":") if (!silent) console.log(b) process.nextTick(cb.bind(this, null, b)) if (npm.config.get("global") && PATH.indexOf(b) === -1) { npm.config.get("logstream").write("(not in PATH env variable)\n") } } ����������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/bugs.js���������������������������������������������000644 �000766 �000024 �00000004026 12455173731 022750� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = bugs bugs.usage = "npm bugs <pkgname>" var npm = require("./npm.js") , log = require("npmlog") , opener = require("opener") , path = require("path") , readJson = require("read-package-json") , npa = require("npm-package-arg") , fs = require("fs") , mapToRegistry = require("./utils/map-to-registry.js") bugs.completion = function (opts, cb) { if (opts.conf.argv.remain.length > 2) return cb() mapToRegistry("-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { timeout : 60000, auth : auth }, function (er, list) { return cb(null, list || []) }) }) } function bugs (args, cb) { var n = args.length && npa(args[0]).name || "." 
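// a package name argument is resolved via npa(); otherwise "." means "use the package in the current working directory"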
fs.stat(n, function (er, s) { if (er) { if (er.code === "ENOENT") return callRegistry(n, cb) return cb(er) } if (!s.isDirectory()) return callRegistry(n, cb) readJson(path.resolve(n, "package.json"), function(er, d) { if (er) return cb(er) getUrlAndOpen(d, cb) }) }) } function getUrlAndOpen (d, cb) { var repo = d.repository || d.repositories , url if (d.bugs) { url = (typeof d.bugs === "string") ? d.bugs : d.bugs.url } else if (repo) { if (Array.isArray(repo)) repo = repo.shift() if (repo.hasOwnProperty("url")) repo = repo.url log.verbose("bugs", "repository", repo) if (repo && repo.match(/^(https?:\/\/|git(:\/\/|@))github.com/)) { url = repo.replace(/^git(@|:\/\/)/, "https://") .replace(/^https?:\/\/github.com:/, "https://github.com/") .replace(/\.git$/, "")+"/issues" } } if (!url) { url = "https://www.npmjs.org/package/" + d.name } log.silly("bugs", "url", url) opener(url, { command: npm.config.get("browser") }, cb) } function callRegistry (name, cb) { mapToRegistry(name, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri + "/latest", { auth : auth }, function (er, d) { if (er) return cb(er) getUrlAndOpen(d, cb) }) }) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/build.js��������������������������������������������000644 �000766 �000024 �00000017577 12455173731 023126� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm build command // everything about the installation after the creation of // the .npm/{name}/{version}/package folder. // linking the modules into the npm.root, // resolving dependencies, etc. // This runs AFTER install or link are completed. 
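// build(args, [global], [didPre], [didRB], cb) -- the trailing optional arguments are shuffled into place at the top of build() below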
var npm = require("./npm.js") , log = require("npmlog") , chain = require("slide").chain , fs = require("graceful-fs") , path = require("path") , lifecycle = require("./utils/lifecycle.js") , readJson = require("read-package-json") , link = require("./utils/link.js") , linkIfExists = link.ifExists , cmdShim = require("cmd-shim") , cmdShimIfExists = cmdShim.ifExists , asyncMap = require("slide").asyncMap , ini = require("ini") , writeFile = require("write-file-atomic") module.exports = build build.usage = "npm build <folder>\n(this is plumbing)" build._didBuild = {} build._noLC = {} function build (args, global, didPre, didRB, cb) { if (typeof cb !== "function") cb = didRB, didRB = false if (typeof cb !== "function") cb = didPre, didPre = false if (typeof cb !== "function") { cb = global, global = npm.config.get("global") } // it'd be nice to asyncMap these, but actually, doing them // in parallel generally munges up the output from node-waf var builder = build_(global, didPre, didRB) chain(args.map(function (arg) { return function (cb) { builder(arg, cb) }}), cb) } function build_ (global, didPre, didRB) { return function (folder, cb) { folder = path.resolve(folder) if (build._didBuild[folder]) log.info("build", "already built", folder) build._didBuild[folder] = true log.info("build", folder) readJson(path.resolve(folder, "package.json"), function (er, pkg) { if (er) return cb(er) chain ( [ !didPre && [lifecycle, pkg, "preinstall", folder] , [linkStuff, pkg, folder, global, didRB] , [writeBuiltinConf, pkg, folder] , didPre !== build._noLC && [lifecycle, pkg, "install", folder] , didPre !== build._noLC && [lifecycle, pkg, "postinstall", folder] , didPre !== build._noLC && npm.config.get("npat") && [lifecycle, pkg, "test", folder] ] , cb ) }) }} function writeBuiltinConf (pkg, folder, cb) { // the builtin config is "sticky". Any time npm installs // itself globally, it puts its builtin config file there var parent = path.dirname(folder) var dir = npm.globalDir if (pkg.name !== "npm" || !npm.config.get("global") || !npm.config.usingBuiltin || dir !== parent) { return cb() } var data = ini.stringify(npm.config.sources.builtin.data) writeFile(path.resolve(folder, "npmrc"), data, cb) } function linkStuff (pkg, folder, global, didRB, cb) { // allow to opt out of linking binaries. if (npm.config.get("bin-links") === false) return cb() // if it's global, and folder is in {prefix}/node_modules, // then bins are in {prefix}/bin // otherwise, then bins are in folder/../.bin var parent = pkg.name[0] === "@" ? 
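// scoped packages ("@scope/name") live one directory deeper under node_modules, so step up two directories instead of one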
path.dirname(path.dirname(folder)) : path.dirname(folder) , gnm = global && npm.globalDir , gtop = parent === gnm log.verbose("linkStuff", [global, gnm, gtop, parent]) log.info("linkStuff", pkg._id) shouldWarn(pkg, folder, global, function() { asyncMap( [linkBins, linkMans, !didRB && rebuildBundles] , function (fn, cb) { if (!fn) return cb() log.verbose(fn.name, pkg._id) fn(pkg, folder, parent, gtop, cb) }, cb) }) } function shouldWarn(pkg, folder, global, cb) { var parent = path.dirname(folder) , top = parent === npm.dir , cwd = npm.localPrefix readJson(path.resolve(cwd, "package.json"), function(er, topPkg) { if (er) return cb(er) var linkedPkg = path.basename(cwd) , currentPkg = path.basename(folder) // current searched package is the linked package on first call if (linkedPkg !== currentPkg) { if (!topPkg.dependencies) return cb() // don't generate a warning if it's listed in dependencies if (Object.keys(topPkg.dependencies).indexOf(currentPkg) === -1) { if (top && pkg.preferGlobal && !global) { log.warn("prefer global", pkg._id + " should be installed with -g") } } } cb() }) } function rebuildBundles (pkg, folder, parent, gtop, cb) { if (!npm.config.get("rebuild-bundle")) return cb() var deps = Object.keys(pkg.dependencies || {}) .concat(Object.keys(pkg.devDependencies || {})) , bundles = pkg.bundleDependencies || pkg.bundledDependencies || [] fs.readdir(path.resolve(folder, "node_modules"), function (er, files) { // error means no bundles if (er) return cb() log.verbose("rebuildBundles", files) // don't asyncMap these, because otherwise build script output // gets interleaved and is impossible to read chain(files.filter(function (file) { // rebuild if: // not a .folder, like .bin or .hooks return !file.match(/^[\._-]/) // not some old 0.x style bundle && file.indexOf("@") === -1 // either not a dep, or explicitly bundled && (deps.indexOf(file) === -1 || bundles.indexOf(file) !== -1) }).map(function (file) { file = path.resolve(folder, "node_modules", file) return function (cb) { if (build._didBuild[file]) return cb() log.verbose("rebuild bundle", file) // if file is not a package dir, then don't do it. fs.lstat(path.resolve(file, "package.json"), function (er) { if (er) return cb() build_(false)(file, cb) }) }}), cb) }) } function linkBins (pkg, folder, parent, gtop, cb) { if (!pkg.bin || !gtop && path.basename(parent) !== "node_modules") { return cb() } var binRoot = gtop ? npm.globalBin : path.resolve(parent, ".bin") log.verbose("link bins", [pkg.bin, binRoot, gtop]) asyncMap(Object.keys(pkg.bin), function (b, cb) { linkBin( path.resolve(folder, pkg.bin[b]) , path.resolve(binRoot, b) , gtop && folder , function (er) { if (er) return cb(er) // bins should always be executable. // XXX skip chmod on windows? var src = path.resolve(folder, pkg.bin[b]) fs.chmod(src, npm.modes.exec, function (er) { if (er && er.code === "ENOENT" && npm.config.get("ignore-scripts")) { return cb() } if (er || !gtop) return cb(er) var dest = path.resolve(binRoot, b) , out = npm.config.get("parseable") ? 
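// parseable output is "<dest>::<src>:BINFILE"; otherwise print a human-readable "<dest> -> <src>" arrow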
dest + "::" + src + ":BINFILE" : dest + " -> " + src console.log(out) cb() }) }) }, cb) } function linkBin (from, to, gently, cb) { if (process.platform !== "win32") { return linkIfExists(from, to, gently, cb) } else { return cmdShimIfExists(from, to, cb) } } function linkMans (pkg, folder, parent, gtop, cb) { if (!pkg.man || !gtop || process.platform === "win32") return cb() var manRoot = path.resolve(npm.config.get("prefix"), "share", "man") log.verbose("linkMans", "man files are", pkg.man, "in", manRoot) // make sure that the mans are unique. // otherwise, if there are dupes, it'll fail with EEXIST var set = pkg.man.reduce(function (acc, man) { acc[path.basename(man)] = man return acc }, {}) pkg.man = pkg.man.filter(function (man) { return set[path.basename(man)] === man }) asyncMap(pkg.man, function (man, cb) { if (typeof man !== "string") return cb() log.silly("linkMans", "preparing to link", man) var parseMan = man.match(/(.*\.([0-9]+)(\.gz)?)$/) if (!parseMan) { return cb(new Error( man+" is not a valid name for a man file. " + "Man files must end with a number, " + "and optionally a .gz suffix if they are compressed." )) } var stem = parseMan[1] var sxn = parseMan[2] var bn = path.basename(stem) var manDest = path.join(manRoot, "man" + sxn, bn) linkIfExists(man, manDest, gtop && folder, cb) }, cb) } ���������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022506� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache.js��������������������������������������������000644 �000766 �000024 �00000024410 12455173731 023052� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// XXX lib/utils/tar.js and this file need to be rewritten. // URL-to-cache folder mapping: // : -> ! // @ -> _ // http://registry.npmjs.org/foo/version -> cache/http!/... // /* fetching a URL: 1. Check for URL in inflight URLs. If present, add cb, and return. 2. Acquire lock at {cache}/{sha(url)}.lock retries = {cache-lock-retries, def=3} stale = {cache-lock-stale, def=30000} wait = {cache-lock-wait, def=100} 3. if lock can't be acquired, then fail 4. fetch url, clear lock, call cbs cache folders: 1. urls: http!/server.com/path/to/thing 2. c:\path\to\thing: file!/c!/path/to/thing 3. /path/to/thing: file!/path/to/thing 4. git@ private: git_github.com!npm/npm 5. git://public: git!/github.com/npm/npm 6. git+blah:// git-blah!/server.com/foo/bar adding a folder: 1. tar into tmp/random/package.tgz 2. untar into tmp/random/contents/package, stripping one dir piece 3. tar tmp/random/contents/package to cache/n/v/package.tgz 4. untar cache/n/v/package.tgz into cache/n/v/package 5. rm tmp/random Adding a url: 1. fetch to tmp/random/package.tgz 2. 
goto folder(2) adding a name@version: 1. registry.get(name/version) 2. if response isn't 304, add url(dist.tarball) adding a name@range: 1. registry.get(name) 2. Find a version that satisfies 3. add name@version adding a local tarball: 1. untar to tmp/random/{blah} 2. goto folder(2) adding a namespaced package: 1. lookup registry for @namespace 2. namespace_registry.get('name') 3. add url(namespace/latest.tarball) */ exports = module.exports = cache cache.unpack = unpack cache.clean = clean cache.read = read var npm = require("./npm.js") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , assert = require("assert") , rm = require("./utils/gently-rm.js") , readJson = require("read-package-json") , log = require("npmlog") , path = require("path") , asyncMap = require("slide").asyncMap , tar = require("./utils/tar.js") , fileCompletion = require("./utils/completion/file-completion.js") , deprCheck = require("./utils/depr-check.js") , addNamed = require("./cache/add-named.js") , addLocal = require("./cache/add-local.js") , addRemoteTarball = require("./cache/add-remote-tarball.js") , addRemoteGit = require("./cache/add-remote-git.js") , maybeGithub = require("./cache/maybe-github.js") , inflight = require("inflight") , realizePackageSpecifier = require("realize-package-specifier") , npa = require("npm-package-arg") , getStat = require("./cache/get-stat.js") , cachedPackageRoot = require("./cache/cached-package-root.js") , mapToRegistry = require("./utils/map-to-registry.js") cache.usage = "npm cache add <tarball file>" + "\nnpm cache add <folder>" + "\nnpm cache add <tarball url>" + "\nnpm cache add <git url>" + "\nnpm cache add <name>@<version>" + "\nnpm cache ls [<path>]" + "\nnpm cache clean [<pkg>[@<version>]]" cache.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv.length === 2) { return cb(null, ["add", "ls", "clean"]) } switch (argv[2]) { case "clean": case "ls": // cache and ls are easy, because the completion is // what ls_ returns anyway. // just get the partial words, minus the last path part var p = path.dirname(opts.partialWords.slice(3).join("/")) if (p === ".") p = "" return ls_(p, 2, cb) case "add": // Same semantics as install and publish. return npm.commands.install.completion(opts, cb) } } function cache (args, cb) { var cmd = args.shift() switch (cmd) { case "rm": case "clear": case "clean": return clean(args, cb) case "list": case "sl": case "ls": return ls(args, cb) case "add": return add(args, npm.prefix, cb) default: return cb("Usage: "+cache.usage) } } // if the pkg and ver are in the cache, then // just do a readJson and return. // if they're not, then fetch them from the registry. 
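// forceBypass defaults to true; combined with the --force config it skips the cached copy entirely and re-fetches via addNamed()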
function read (name, ver, forceBypass, cb) { assert(typeof name === "string", "must include name of module to install") assert(typeof cb === "function", "must include callback") if (forceBypass === undefined || forceBypass === null) forceBypass = true var root = cachedPackageRoot({name : name, version : ver}) function c (er, data) { log.silly("cache", "addNamed cb", name+"@"+ver) if (er) log.verbose("cache", "addNamed error for", name+"@"+ver, er) if (data) deprCheck(data) return cb(er, data) } if (forceBypass && npm.config.get("force")) { log.verbose("using force", "skipping cache") return addNamed(name, ver, null, c) } readJson(path.join(root, "package", "package.json"), function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (data) { if (!data.name) return cb(new Error("No name provided")) if (!data.version) return cb(new Error("No version provided")) } if (er) return addNamed(name, ver, null, c) else c(er, data) }) } function normalize (args) { var normalized = "" if (args.length > 0) { var a = npa(args[0]) if (a.name) normalized = a.name if (a.rawSpec) normalized = [normalized, a.rawSpec].join("/") if (args.length > 1) normalized = [normalized].concat(args.slice(1)).join("/") } if (normalized.substr(-1) === "/") { normalized = normalized.substr(0, normalized.length - 1) } normalized = path.normalize(normalized) log.silly("ls", "normalized", normalized) return normalized } // npm cache ls [<path>] function ls (args, cb) { var prefix = npm.config.get("cache") if (prefix.indexOf(process.env.HOME) === 0) { prefix = "~" + prefix.substr(process.env.HOME.length) } ls_(normalize(args), npm.config.get("depth"), function (er, files) { console.log(files.map(function (f) { return path.join(prefix, f) }).join("\n").trim()) cb(er, files) }) } // Calls cb with list of cached pkgs matching show. function ls_ (req, depth, cb) { return fileCompletion(npm.cache, req, depth, cb) } // npm cache clean [<path>] function clean (args, cb) { assert(typeof cb === "function", "must include callback") if (!args) args = [] var f = path.join(npm.cache, normalize(args)) if (f === npm.cache) { fs.readdir(npm.cache, function (er, files) { if (er) return cb() asyncMap( files.filter(function (f) { return npm.config.get("force") || f !== "-" }).map(function (f) { return path.join(npm.cache, f) }) , rm, cb ) }) } else { rm(f, cb) } } // npm cache add <tarball-url> // npm cache add <pkg> <ver> // npm cache add <tarball> // npm cache add <folder> cache.add = function (pkg, ver, where, scrub, cb) { assert(typeof pkg === "string", "must include name of package to install") assert(typeof cb === "function", "must include callback") if (scrub) { return clean([], function (er) { if (er) return cb(er) add([pkg, ver], where, cb) }) } return add([pkg, ver], where, cb) } var adding = 0 function add (args, where, cb) { // this is hot code. almost everything passes through here. // the args can be any of: // ["url"] // ["pkg", "version"] // ["pkg@version"] // ["pkg", "url"] // This is tricky, because urls can contain @ // Also, in some cases we get [name, null] rather // that just a single argument. 
var usage = "Usage:\n" + " npm cache add <tarball-url>\n" + " npm cache add <pkg>@<ver>\n" + " npm cache add <tarball>\n" + " npm cache add <folder>\n" , spec log.silly("cache add", "args", args) if (args[1] === undefined) args[1] = null // at this point the args length must ==2 if (args[1] !== null) { spec = args[0]+"@"+args[1] } else if (args.length === 2) { spec = args[0] } log.verbose("cache add", "spec", spec) if (!spec) return cb(usage) if (adding <= 0) { npm.spinner.start() } adding++ cb = afterAdd(cb) realizePackageSpecifier(spec, where, function (err, p) { if (err) return cb(err) log.silly("cache add", "parsed spec", p) switch (p.type) { case "local": case "directory": addLocal(p, null, cb) break case "remote": // get auth, if possible mapToRegistry(spec, npm.config, function (err, uri, auth) { if (err) return cb(err) addRemoteTarball(p.spec, {name : p.name}, null, auth, cb) }) break case "git": addRemoteGit(p.spec, false, cb) break case "github": maybeGithub(p.spec, cb) break default: if (p.name) return addNamed(p.name, p.spec, null, cb) cb(new Error("couldn't figure out how to install " + spec)) } }) } function unpack (pkg, ver, unpackTarget, dMode, fMode, uid, gid, cb) { if (typeof cb !== "function") cb = gid, gid = null if (typeof cb !== "function") cb = uid, uid = null if (typeof cb !== "function") cb = fMode, fMode = null if (typeof cb !== "function") cb = dMode, dMode = null read(pkg, ver, false, function (er) { if (er) { log.error("unpack", "Could not read data for %s", pkg + "@" + ver) return cb(er) } npm.commands.unbuild([unpackTarget], true, function (er) { if (er) return cb(er) tar.unpack( path.join(cachedPackageRoot({name : pkg, version : ver}), "package.tgz") , unpackTarget , dMode, fMode , uid, gid , cb ) }) }) } function afterAdd (cb) { return function (er, data) { adding-- if (adding <= 0) npm.spinner.stop() if (er || !data || !data.name || !data.version) return cb(er, data) log.silly("cache", "afterAdd", data.name+"@"+data.version) // Save the resolved, shasum, etc. into the data so that the next // time we load from this cached data, we have all the same info. 
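// inflight() dedupes concurrent writes to the same package.json path; callers that lose the race just log and return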
var pj = path.join(cachedPackageRoot(data), "package", "package.json") var done = inflight(pj, cb) if (!done) return log.verbose("afterAdd", pj, "already in flight; not writing") log.verbose("afterAdd", pj, "not in flight; writing") getStat(function (er, cs) { if (er) return done(er) writeFileAtomic(pj, JSON.stringify(data), {chown : cs}, function (er) { if (!er) log.verbose("afterAdd", pj, "written") return done(er, data) }) }) }} ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/completion.js���������������������������������������000644 �000766 �000024 �00000016536 12455173731 024172� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = completion completion.usage = "npm completion >> ~/.bashrc\n" + "npm completion >> ~/.zshrc\n" + "source <(npm completion)" var npm = require("./npm.js") , npmconf = require("./config/core.js") , configDefs = npmconf.defs , configTypes = configDefs.types , shorthands = configDefs.shorthands , nopt = require("nopt") , configNames = Object.keys(configTypes).filter(function (e) { return e.charAt(0) !== "_" }) , shorthandNames = Object.keys(shorthands) , allConfs = configNames.concat(shorthandNames) , once = require("once") completion.completion = function (opts, cb) { if (opts.w > 3) return cb() var fs = require("graceful-fs") , path = require("path") , bashExists = null , zshExists = null fs.stat(path.resolve(process.env.HOME, ".bashrc"), function (er) { bashExists = !er next() }) fs.stat(path.resolve(process.env.HOME, ".zshrc"), function (er) { zshExists = !er next() }) function next () { if (zshExists === null || bashExists === null) return var out = [] if (zshExists) out.push("~/.zshrc") if (bashExists) out.push("~/.bashrc") if (opts.w === 2) out = out.map(function (m) { return [">>", m] }) cb(null, out) } } function completion (args, cb) { if (process.platform === "win32") { var e = new Error("npm completion not supported on windows") e.code = "ENOTSUP" e.errno = require("constants").ENOTSUP return cb(e) } // if the COMP_* isn't in the env, then just dump the script. if (process.env.COMP_CWORD === undefined ||process.env.COMP_LINE === undefined ||process.env.COMP_POINT === undefined ) return dumpScript(cb) console.error(process.env.COMP_CWORD) console.error(process.env.COMP_LINE) console.error(process.env.COMP_POINT) //console.log("abracadabrasauce\nabracad cat monger") //if (Math.random() * 3 < 1) console.log("man\\ bear\\ pig") //else if (Math.random() * 3 < 1) // console.log("porkchop\\ sandwiches\nporkman") //else console.log("encephylophagy") // get the partial line and partial word, // if the point isn't at the end. // ie, tabbing at: npm foo b|ar var w = +process.env.COMP_CWORD , words = args.map(unescape) , word = words[w] , line = process.env.COMP_LINE , point = +process.env.COMP_POINT , partialLine = line.substr(0, point) , partialWords = words.slice(0, w) // figure out where in that last word the point is. 
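// shrink i until the first i characters of the word match the tail of the partial line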
var partialWord = args[w] , i = partialWord.length while (partialWord.substr(0, i) !== partialLine.substr(-1*i) && i > 0) { i -- } partialWord = unescape(partialWord.substr(0, i)) partialWords.push(partialWord) var opts = { words : words , w : w , word : word , line : line , lineLength : line.length , point : point , partialLine : partialLine , partialWords : partialWords , partialWord : partialWord , raw: args } cb = wrapCb(cb, opts) console.error(opts) if (partialWords.slice(0, -1).indexOf("--") === -1) { if (word.charAt(0) === "-") return configCompl(opts, cb) if (words[w - 1] && words[w - 1].charAt(0) === "-" && !isFlag(words[w - 1])) { // awaiting a value for a non-bool config. // don't even try to do this for now console.error("configValueCompl") return configValueCompl(opts, cb) } } // try to find the npm command. // it's the first thing after all the configs. // take a little shortcut and use npm's arg parsing logic. // don't have to worry about the last arg being implicitly // boolean'ed, since the last block will catch that. var parsed = opts.conf = nopt(configTypes, shorthands, partialWords.slice(0, -1), 0) // check if there's a command already. console.error(parsed) var cmd = parsed.argv.remain[1] if (!cmd) return cmdCompl(opts, cb) Object.keys(parsed).forEach(function (k) { npm.config.set(k, parsed[k]) }) // at this point, if words[1] is some kind of npm command, // then complete on it. // otherwise, do nothing cmd = npm.commands[cmd] if (cmd && cmd.completion) return cmd.completion(opts, cb) // nothing to do. cb() } function dumpScript (cb) { var fs = require("graceful-fs") , path = require("path") , p = path.resolve(__dirname, "utils/completion.sh") // The Darwin patch below results in callbacks first for the write and then // for the error handler, so make sure we only call our callback once. cb = once(cb) fs.readFile(p, "utf8", function (er, d) { if (er) return cb(er) d = d.replace(/^\#\!.*?\n/, "") process.stdout.write(d, function () { cb() }) process.stdout.on("error", function (er) { // Darwin is a real dick sometimes. // // This is necessary because the "source" or "." program in // bash on OS X closes its file argument before reading // from it, meaning that you get exactly 1 write, which will // work most of the time, and will always raise an EPIPE. // // Really, one should not be tossing away EPIPE errors, or any // errors, so casually. But, without this, `. <(npm completion)` // can never ever work on OS X. if (er.errno === "EPIPE") er = null cb(er) }) }) } function unescape (w) { if (w.charAt(0) === "\"") return w.replace(/^"|"$/g, "") else return w.replace(/\\ /g, " ") } function escape (w) { if (!w.match(/\s+/)) return w return "\"" + w + "\"" } // The command should respond with an array. Loop over that, // wrapping quotes around any that have spaces, and writing // them to stdout. Use console.log, not the outfd config. // If any of the items are arrays, then join them with a space. // Ie, returning ["a", "b c", ["d", "e"]] would allow it to expand // to: "a", "b c", or "d" "e" function wrapCb (cb, opts) { return function (er, compls) { if (!Array.isArray(compls)) compls = compls ? 
[compls] : [] compls = compls.map(function (c) { if (Array.isArray(c)) c = c.map(escape).join(" ") else c = escape(c) return c }) if (opts.partialWord) compls = compls.filter(function (c) { return c.indexOf(opts.partialWord) === 0 }) console.error([er && er.stack, compls, opts.partialWord]) if (er || compls.length === 0) return cb(er) console.log(compls.join("\n")) cb() }} // the current word has a dash. Return the config names, // with the same number of dashes as the current word has. function configCompl (opts, cb) { var word = opts.word , split = word.match(/^(-+)((?:no-)*)(.*)$/) , dashes = split[1] , no = split[2] , flags = configNames.filter(isFlag) console.error(flags) return cb(null, allConfs.map(function (c) { return dashes + c }).concat(flags.map(function (f) { return dashes + (no || "no-") + f }))) } // expand with the valid values of various config values. // not yet implemented. function configValueCompl (opts, cb) { console.error("configValue", opts) return cb(null, []) } // check if the thing is a flag or not. function isFlag (word) { // shorthands never take args. var split = word.match(/^(-*)((?:no-)+)?(.*)$/) , no = split[2] , conf = split[3] return no || configTypes[conf] === Boolean || shorthands[conf] } // complete against the npm commands function cmdCompl (opts, cb) { return cb(null, npm.fullList) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/���������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022710� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config.js�������������������������������������������000644 �000766 �000024 �00000017207 12455173731 023262� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = config config.usage = "npm config set <key> <value>" + "\nnpm config get [<key>]" + "\nnpm config delete <key>" + "\nnpm config list" + "\nnpm config edit" + "\nnpm set <key> <value>" + "\nnpm get [<key>]" var log = require("npmlog") , npm = require("./npm.js") , npmconf = require("./config/core.js") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , types = npmconf.defs.types , ini = require("ini") , editor = require("editor") , os = require("os") config.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv[1] !== "config") argv.unshift("config") if (argv.length === 2) { var cmds = ["get", "set", "delete", "ls", "rm", "edit"] if (opts.partialWord !== "l") cmds.push("list") return cb(null, cmds) } var action = argv[2] switch (action) { case "set": // todo: complete with valid values, if possible. 
if (argv.length > 3) return cb(null, []) // fallthrough case "get": case "delete": case "rm": return cb(null, Object.keys(types)) case "edit": case "list": case "ls": return cb(null, []) default: return cb(null, []) } } // npm config set key value // npm config get key // npm config list function config (args, cb) { var action = args.shift() switch (action) { case "set": return set(args[0], args[1], cb) case "get": return get(args[0], cb) case "delete": case "rm": case "del": return del(args[0], cb) case "list": case "ls": return list(cb) case "edit": return edit(cb) default: return unknown(action, cb) } } function edit (cb) { var e = npm.config.get("editor") , which = npm.config.get("global") ? "global" : "user" , f = npm.config.get(which + "config") if (!e) return cb(new Error("No EDITOR config or environ set.")) npm.config.save(which, function (er) { if (er) return cb(er) fs.readFile(f, "utf8", function (er, data) { if (er) data = "" data = [ ";;;;" , "; npm "+(npm.config.get("global") ? "globalconfig" : "userconfig")+" file" , "; this is a simple ini-formatted file" , "; lines that start with semi-colons are comments." , "; read `npm help config` for help on the various options" , ";;;;" , "" , data ].concat( [ ";;;;" , "; all options with default values" , ";;;;" ] ) .concat(Object.keys(npmconf.defaults).reduce(function (arr, key) { var obj = {}; obj[key] = npmconf.defaults[key] if (key === "logstream") return arr return arr.concat( ini.stringify(obj) .replace(/\n$/m, "") .replace(/^/g, "; ") .replace(/\n/g, "\n; ") .split("\n")) }, [])) .concat([""]) .join(os.EOL) writeFileAtomic ( f , data , function (er) { if (er) return cb(er) editor(f, { editor: e }, cb) } ) }) }) } function del (key, cb) { if (!key) return cb(new Error("no key provided")) var where = npm.config.get("global") ? "global" : "user" npm.config.del(key, where) npm.config.save(where, cb) } function set (key, val, cb) { if (key === undefined) { return unknown("", cb) } if (val === undefined) { if (key.indexOf("=") !== -1) { var k = key.split("=") key = k.shift() val = k.join("=") } else { val = "" } } key = key.trim() val = val.trim() log.info("config", "set %j %j", key, val) var where = npm.config.get("global") ? "global" : "user" npm.config.set(key, val, where) npm.config.save(where, cb) } function get (key, cb) { if (!key) return list(cb) if (key.charAt(0) === "_") { return cb(new Error("---sekretz---")) } console.log(npm.config.get(key)) cb() } function sort (a, b) { return a > b ? 
1 : -1 } function public (k) { return !(k.charAt(0) === "_" || types[k] !== types[k]) } function getKeys (data) { return Object.keys(data).filter(public).sort(sort) } function list (cb) { var msg = "" , long = npm.config.get("long") var cli = npm.config.sources.cli.data , cliKeys = getKeys(cli) if (cliKeys.length) { msg += "; cli configs\n" cliKeys.forEach(function (k) { if (cli[k] && typeof cli[k] === "object") return if (k === "argv") return msg += k + " = " + JSON.stringify(cli[k]) + "\n" }) msg += "\n" } // env configs var env = npm.config.sources.env.data , envKeys = getKeys(env) if (envKeys.length) { msg += "; environment configs\n" envKeys.forEach(function (k) { if (env[k] !== npm.config.get(k)) { if (!long) return msg += "; " + k + " = " + JSON.stringify(env[k]) + " (overridden)\n" } else msg += k + " = " + JSON.stringify(env[k]) + "\n" }) msg += "\n" } // user config file var uconf = npm.config.sources.user.data , uconfKeys = getKeys(uconf) if (uconfKeys.length) { msg += "; userconfig " + npm.config.get("userconfig") + "\n" uconfKeys.forEach(function (k) { var val = (k.charAt(0) === "_") ? "---sekretz---" : JSON.stringify(uconf[k]) if (uconf[k] !== npm.config.get(k)) { if (!long) return msg += "; " + k + " = " + val + " (overridden)\n" } else msg += k + " = " + val + "\n" }) msg += "\n" } // global config file var gconf = npm.config.sources.global.data , gconfKeys = getKeys(gconf) if (gconfKeys.length) { msg += "; globalconfig " + npm.config.get("globalconfig") + "\n" gconfKeys.forEach(function (k) { var val = (k.charAt(0) === "_") ? "---sekretz---" : JSON.stringify(gconf[k]) if (gconf[k] !== npm.config.get(k)) { if (!long) return msg += "; " + k + " = " + val + " (overridden)\n" } else msg += k + " = " + val + "\n" }) msg += "\n" } // builtin config file var builtin = npm.config.sources.builtin || {} if (builtin && builtin.data) { var bconf = builtin.data , bpath = builtin.path , bconfKeys = getKeys(bconf) if (bconfKeys.length) { msg += "; builtin config " + bpath + "\n" bconfKeys.forEach(function (k) { var val = (k.charAt(0) === "_") ? 
"---sekretz---" : JSON.stringify(bconf[k]) if (bconf[k] !== npm.config.get(k)) { if (!long) return msg += "; " + k + " = " + val + " (overridden)\n" } else msg += k + " = " + val + "\n" }) msg += "\n" } } // only show defaults if --long if (!long) { msg += "; node bin location = " + process.execPath + "\n" + "; cwd = " + process.cwd() + "\n" + "; HOME = " + process.env.HOME + "\n" + "; 'npm config ls -l' to show all defaults.\n" console.log(msg) return cb() } var defaults = npmconf.defaults , defKeys = getKeys(defaults) msg += "; default values\n" defKeys.forEach(function (k) { if (defaults[k] && typeof defaults[k] === "object") return var val = JSON.stringify(defaults[k]) if (defaults[k] !== npm.config.get(k)) { msg += "; " + k + " = " + val + " (overridden)\n" } else msg += k + " = " + val + "\n" }) msg += "\n" console.log(msg) return cb() } function unknown (action, cb) { cb("Usage:\n" + config.usage) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/dedupe.js�������������������������������������������000644 �000766 �000024 �00000024537 12455173731 023267� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// traverse the node_modules/package.json tree // looking for duplicates. If any duplicates are found, // then move them up to the highest level necessary // in order to make them no longer duplicated. // // This is kind of ugly, and really highlights the need for // much better "put pkg X at folder Y" abstraction. Oh well, // whatever. Perfect enemy of the good, and all that. var fs = require("fs") var asyncMap = require("slide").asyncMap var path = require("path") var readJson = require("read-package-json") var semver = require("semver") var rm = require("./utils/gently-rm.js") var log = require("npmlog") var npm = require("./npm.js") var mapToRegistry = require("./utils/map-to-registry.js") module.exports = dedupe dedupe.usage = "npm dedupe [pkg pkg...]" function dedupe (args, silent, cb) { if (typeof silent === "function") cb = silent, silent = false var dryrun = false if (npm.command.match(/^find/)) dryrun = true return dedupe_(npm.prefix, args, {}, dryrun, silent, cb) } function dedupe_ (dir, filter, unavoidable, dryrun, silent, cb) { readInstalled(path.resolve(dir), {}, null, function (er, data, counter) { if (er) { return cb(er) } if (!data) { return cb() } // find out which things are dupes var dupes = Object.keys(counter || {}).filter(function (k) { if (filter.length && -1 === filter.indexOf(k)) return false return counter[k] > 1 && !unavoidable[k] }).reduce(function (s, k) { s[k] = [] return s }, {}) // any that are unavoidable need to remain as they are. don't even // try to touch them or figure it out. Maybe some day, we can do // something a bit more clever here, but for now, just skip over it, // and all its children. 
;(function U (obj) { if (unavoidable[obj.name]) { obj.unavoidable = true } if (obj.parent && obj.parent.unavoidable) { obj.unavoidable = true } Object.keys(obj.children).forEach(function (k) { U(obj.children[k]) }) })(data) // then collect them up and figure out who needs them ;(function C (obj) { if (dupes[obj.name] && !obj.unavoidable) { dupes[obj.name].push(obj) obj.duplicate = true } obj.dependents = whoDepends(obj) Object.keys(obj.children).forEach(function (k) { C(obj.children[k]) }) })(data) if (dryrun) { var k = Object.keys(dupes) if (!k.length) return cb() return npm.commands.ls(k, silent, cb) } var summary = Object.keys(dupes).map(function (n) { return [n, dupes[n].filter(function (d) { return d && d.parent && !d.parent.duplicate && !d.unavoidable }).map(function M (d) { return [d.path, d.version, d.dependents.map(function (k) { return [k.path, k.version, k.dependencies[d.name] || ""] })] })] }).map(function (item) { var set = item[1] var ranges = set.map(function (i) { return i[2].map(function (d) { return d[2] }) }).reduce(function (l, r) { return l.concat(r) }, []).map(function (v, i, set) { if (set.indexOf(v) !== i) return false return v }).filter(function (v) { return v !== false }) var locs = set.map(function (i) { return i[0] }) var versions = set.map(function (i) { return i[1] }).filter(function (v, i, set) { return set.indexOf(v) === i }) var has = set.map(function (i) { return [i[0], i[1]] }).reduce(function (set, kv) { set[kv[0]] = kv[1] return set }, {}) var loc = locs.length ? locs.reduce(function (a, b) { // a=/path/to/node_modules/foo/node_modules/bar // b=/path/to/node_modules/elk/node_modules/bar // ==/path/to/node_modules/bar var nmReg = new RegExp("\\" + path.sep + "node_modules\\" + path.sep) a = a.split(nmReg) b = b.split(nmReg) var name = a.pop() b.pop() // find the longest chain that both A and B share. // then push the name back on it, and join by /node_modules/ for (var i = 0, al = a.length, bl = b.length; i < al && i < bl && a[i] === b[i]; i++); return a.slice(0, i).concat(name).join(path.sep + "node_modules" + path.sep) }) : undefined return [item[0], { item: item , ranges: ranges , locs: locs , loc: loc , has: has , versions: versions }] }).filter(function (i) { return i[1].loc }) findVersions(npm, summary, function (er, set) { if (er) return cb(er) if (!set.length) return cb() installAndRetest(set, filter, dir, unavoidable, silent, cb) }) }) } function installAndRetest (set, filter, dir, unavoidable, silent, cb) { //return cb(null, set) var remove = [] asyncMap(set, function (item, cb) { // [name, has, loc, locMatch, regMatch, others] var name = item[0] var has = item[1] var where = item[2] var locMatch = item[3] var regMatch = item[4] var others = item[5] // nothing to be done here. oh well. just a conflict. 
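// (Aside, illustrative only: when a registry match is found, the duplicate is
// reinstalled one node_modules level up. The hoist target is computed by
// dropping the last node_modules segment from the duplicate's folder, as the
// code below does -- POSIX-style path shown in the example:)
function exampleHoistTarget (where) {
  var path = require("path")
  var nmReg = new RegExp("\\" + path.sep + "node_modules\\" + path.sep)
  var parts = where.split(nmReg)
  parts.pop()
  return parts.join(path.sep + "node_modules" + path.sep)
}
// exampleHoistTarget("/app/node_modules/foo/node_modules/bar") === "/app/node_modules/foo"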
if (!locMatch && !regMatch) { log.warn("unavoidable conflict", item[0], item[1]) log.warn("unavoidable conflict", "Not de-duplicating") unavoidable[item[0]] = true return cb() } // nothing to do except to clean up the extraneous deps if (locMatch && has[where] === locMatch) { remove.push.apply(remove, others) return cb() } if (regMatch) { var what = name + "@" + regMatch // where is /path/to/node_modules/foo/node_modules/bar // for package "bar", but we need it to be just // /path/to/node_modules/foo var nmReg = new RegExp("\\" + path.sep + "node_modules\\" + path.sep) where = where.split(nmReg) where.pop() where = where.join(path.sep + "node_modules" + path.sep) remove.push.apply(remove, others) return npm.commands.install(where, what, cb) } // hrm? return cb(new Error("danger zone\n" + name + " " + regMatch + " " + locMatch)) }, function (er) { if (er) return cb(er) asyncMap(remove, rm, function (er) { if (er) return cb(er) remove.forEach(function (r) { log.info("rm", r) }) dedupe_(dir, filter, unavoidable, false, silent, cb) }) }) } function findVersions (npm, summary, cb) { // now, for each item in the summary, try to find the maximum version // that will satisfy all the ranges. next step is to install it at // the specified location. asyncMap(summary, function (item, cb) { var name = item[0] var data = item[1] var loc = data.loc var locs = data.locs.filter(function (l) { return l !== loc }) // not actually a dupe, or perhaps all the other copies were // children of a dupe, so this'll maybe be picked up later. if (locs.length === 0) { return cb(null, []) } // { <folder>: <version> } var has = data.has // the versions that we already have. // if one of these is ok, then prefer to use that. // otherwise, try fetching from the registry. var versions = data.versions var ranges = data.ranges mapToRegistry(name, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, next) }) function next (er, data) { var regVersions = er ? [] : Object.keys(data.versions) var locMatch = bestMatch(versions, ranges) var tag = npm.config.get("tag") var distTag = data["dist-tags"] && data["dist-tags"][tag] var regMatch if (distTag && data.versions[distTag] && matches(distTag, ranges)) { regMatch = distTag } else { regMatch = bestMatch(regVersions, ranges) } cb(null, [[name, has, loc, locMatch, regMatch, locs]]) } }, cb) } function matches (version, ranges) { return !ranges.some(function (r) { return !semver.satisfies(version, r, true) }) } function bestMatch (versions, ranges) { return versions.filter(function (v) { return matches(v, ranges) }).sort(semver.compareLoose).pop() } function readInstalled (dir, counter, parent, cb) { var pkg, children, realpath fs.realpath(dir, function (er, rp) { realpath = rp next() }) readJson(path.resolve(dir, "package.json"), function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) return cb() // not a package, probably. counter[data.name] = counter[data.name] || 0 counter[data.name]++ pkg = { _id: data._id , name: data.name , version: data.version , dependencies: data.dependencies || {} , optionalDependencies: data.optionalDependencies || {} , devDependencies: data.devDependencies || {} , bundledDependencies: data.bundledDependencies || [] , path: dir , realPath: dir , children: {} , parent: parent , family: Object.create(parent ? 
parent.family : null) , unavoidable: false , duplicate: false } if (parent) { parent.children[data.name] = pkg parent.family[data.name] = pkg } next() }) fs.readdir(path.resolve(dir, "node_modules"), function (er, c) { children = c || [] // error is ok, just means no children. children = children.filter(function (p) { return !p.match(/^[\._-]/) }) next() }) function next () { if (!children || !pkg || !realpath) return // ignore devDependencies. Just leave them where they are. children = children.filter(function (c) { return !pkg.devDependencies.hasOwnProperty(c) }) pkg.realPath = realpath if (pkg.realPath !== pkg.path) children = [] var d = path.resolve(dir, "node_modules") asyncMap(children, function (child, cb) { readInstalled(path.resolve(d, child), counter, pkg, cb) }, function (er) { cb(er, pkg, counter) }) } } function whoDepends (pkg) { var start = pkg.parent || pkg return whoDepends_(pkg, [], start) } function whoDepends_ (pkg, who, test) { if (test !== pkg && test.dependencies[pkg.name] && test.family[pkg.name] === pkg) { who.push(test) } Object.keys(test.children).forEach(function (n) { whoDepends_(pkg, who, test.children[n]) }) return who } �����������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/deprecate.js����������������������������������������000644 �000766 �000024 �00000002427 12455173731 023747� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var npm = require("./npm.js") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") module.exports = deprecate deprecate.usage = "npm deprecate <pkg>[@<version>] <message>" deprecate.completion = function (opts, cb) { // first, get a list of remote packages this user owns. // once we have a user account, then don't complete anything. if (opts.conf.argv.remain.length > 2) return cb() // get the list of packages by user var path = "/-/by-user/" mapToRegistry(path, npm.config, function (er, uri, c) { if (er) return cb(er) if (!(c && c.username)) return cb() var params = { timeout : 60000, auth : c } npm.registry.get(uri + c.username, params, function (er, list) { if (er) return cb() console.error(list) return cb(null, list[c.username]) }) }) } function deprecate (args, cb) { var pkg = args[0] , msg = args[1] if (msg === undefined) return cb("Usage: " + deprecate.usage) // fetch the data and make sure it exists. 
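// (Aside, not part of the original file: npm-package-arg splits the argument
// into a package name and a version-or-range spec, which become the registry
// deprecate parameters built just below. Field names follow npm-package-arg;
// the argument shown is hypothetical.)
function exampleDeprecateParams (arg, message) {
  var npa = require("npm-package-arg")
  var parsed = npa(arg)   // e.g. "foo@<1.2.3" -> { name: "foo", spec: "<1.2.3", ... }
  return { version: parsed.spec, message: message }
}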
var p = npa(pkg) mapToRegistry(p.name, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { version : p.spec, message : msg, auth : auth } npm.registry.deprecate(uri, params, cb) }) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/docs.js���������������������������������������������000644 �000766 �000024 �00000003564 12455173731 022746� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = docs docs.usage = "npm docs <pkgname>" docs.usage += "\n" docs.usage += "npm docs ." var npm = require("./npm.js") , opener = require("opener") , path = require("path") , log = require("npmlog") , mapToRegistry = require("./utils/map-to-registry.js") docs.completion = function (opts, cb) { mapToRegistry("/-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { timeout : 60000, auth : auth }, function (er, list) { return cb(null, list || []) }) }) } function url (json) { return json.homepage ? json.homepage : "https://npmjs.org/package/" + json.name } function docs (args, cb) { args = args || [] var pending = args.length if (!pending) return getDoc(".", cb) args.forEach(function(proj) { getDoc(proj, function(err) { if (err) { return cb(err) } --pending || cb() }) }) } function getDoc (project, cb) { project = project || "." var package = path.resolve(npm.localPrefix, "package.json") if (project === "." 
|| project === "./") { var json try { json = require(package) if (!json.name) throw new Error('package.json does not have a valid "name" property') project = json.name } catch (e) { log.error(e.message) return cb(docs.usage) } return opener(url(json), { command: npm.config.get("browser") }, cb) } mapToRegistry(project, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri + "/latest", { timeout : 3600, auth : auth }, next) }) function next (er, json) { var github = "https://github.com/" + project + "#readme" if (er) { if (project.split("/").length !== 2) return cb(er) return opener(github, { command: npm.config.get("browser") }, cb) } return opener(url(json), { command: npm.config.get("browser") }, cb) } } ��������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/edit.js���������������������������������������������000644 �000766 �000024 �00000001545 12455173731 022740� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm edit <pkg>[@<version>] // open the package folder in the $EDITOR module.exports = edit edit.usage = "npm edit <pkg>" edit.completion = require("./utils/completion/installed-shallow.js") var npm = require("./npm.js") , path = require("path") , fs = require("graceful-fs") , editor = require("editor") function edit (args, cb) { var p = args[0] if (args.length !== 1 || !p) return cb(edit.usage) var e = npm.config.get("editor") if (!e) return cb(new Error( "No editor set. Set the 'editor' config, or $EDITOR environ.")) p = p.split("/") .join("/node_modules/") .replace(/(\/node_modules)+/, "/node_modules") var f = path.resolve(npm.dir, p) fs.lstat(f, function (er) { if (er) return cb(er) editor(f, { editor: e }, function (er) { if (er) return cb(er) npm.commands.rebuild(args, cb) }) }) } �����������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/explore.js������������������������������������������000644 �000766 �000024 �00000002102 12455173731 023457� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm explore <pkg>[@<version>] // open a subshell to the package folder. 
module.exports = explore explore.usage = "npm explore <pkg> [ -- <cmd>]" explore.completion = require("./utils/completion/installed-shallow.js") var npm = require("./npm.js") , spawn = require("./utils/spawn") , path = require("path") , fs = require("graceful-fs") function explore (args, cb) { if (args.length < 1 || !args[0]) return cb(explore.usage) var p = args.shift() args = args.join(" ").trim() if (args) args = ["-c", args] else args = [] var cwd = path.resolve(npm.dir, p) var sh = npm.config.get("shell") fs.stat(cwd, function (er, s) { if (er || !s.isDirectory()) return cb(new Error( "It doesn't look like "+p+" is installed.")) if (!args.length) console.log( "\nExploring "+cwd+"\n"+ "Type 'exit' or ^D when finished\n") npm.spinner.stop() var shell = spawn(sh, args, { cwd: cwd, stdio: "inherit" }) shell.on("close", function (er) { // only fail if non-interactive. if (!args.length) return cb() cb(er) }) }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/faq.js����������������������������������������������000644 �000766 �000024 �00000000210 12455173731 022546� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = faq faq.usage = "npm faq" var npm = require("./npm.js") function faq (args, cb) { npm.commands.help(["faq"], cb) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/get.js����������������������������������������������000644 �000766 �000024 �00000000353 12455173731 022566� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = get get.usage = "npm get <key> <value> (See `npm config`)" var npm = require("./npm.js") get.completion = npm.commands.config.completion function get (args, cb) { npm.commands.config(["get"].concat(args), cb) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/help-search.js��������������������������������������000644 �000766 �000024 �00000013260 
12455173731 024203� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = helpSearch var fs = require("graceful-fs") , path = require("path") , asyncMap = require("slide").asyncMap , npm = require("./npm.js") , glob = require("glob") , color = require("ansicolors") helpSearch.usage = "npm help-search <text>" function helpSearch (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (!args.length) return cb(helpSearch.usage) var docPath = path.resolve(__dirname, "..", "doc") return glob(docPath + "/*/*.md", function (er, files) { if (er) return cb(er) readFiles(files, function (er, data) { if (er) return cb(er) searchFiles(args, data, function (er, results) { if (er) return cb(er) formatResults(args, results, cb) }) }) }) } function readFiles (files, cb) { var res = {} asyncMap(files, function (file, cb) { fs.readFile(file, 'utf8', function (er, data) { res[file] = data return cb(er) }) }, function (er) { return cb(er, res) }) } function searchFiles (args, files, cb) { var results = [] Object.keys(files).forEach(function (file) { var data = files[file] // skip if no matches at all var match for (var a = 0, l = args.length; a < l && !match; a++) { match = data.toLowerCase().indexOf(args[a].toLowerCase()) !== -1 } if (!match) return var lines = data.split(/\n+/) // if a line has a search term, then skip it and the next line. // if the next line has a search term, then skip all 3 // otherwise, set the line to null. then remove the nulls. l = lines.length for (var i = 0; i < l; i ++) { var line = lines[i] , nextLine = lines[i + 1] , ll match = false if (nextLine) { for (a = 0, ll = args.length; a < ll && !match; a ++) { match = nextLine.toLowerCase() .indexOf(args[a].toLowerCase()) !== -1 } if (match) { // skip over the next line, and the line after it. i += 2 continue } } match = false for (a = 0, ll = args.length; a < ll && !match; a ++) { match = line.toLowerCase().indexOf(args[a].toLowerCase()) !== -1 } if (match) { // skip over the next line i ++ continue } lines[i] = null } // now squish any string of nulls into a single null lines = lines.reduce(function (l, r) { if (!(r === null && l[l.length-1] === null)) l.push(r) return l }, []) if (lines[lines.length - 1] === null) lines.pop() if (lines[0] === null) lines.shift() // now see how many args were found at all. var found = {} , totalHits = 0 lines.forEach(function (line) { args.forEach(function (arg) { var hit = (line || "").toLowerCase() .split(arg.toLowerCase()).length - 1 if (hit > 0) { found[arg] = (found[arg] || 0) + hit totalHits += hit } }) }) var cmd = "npm help " if (path.basename(path.dirname(file)) === "api") { cmd = "npm apihelp " } cmd += path.basename(file, ".md").replace(/^npm-/, "") results.push({ file: file , cmd: cmd , lines: lines , found: Object.keys(found) , hits: found , totalHits: totalHits }) }) // if only one result, then just show that help section. 
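// (Aside, not part of the original file: hits per search term are counted by
// splitting each line on the term, as in the loop above. A standalone sketch:)
function exampleCountHits (line, term) {
  return (line || "").toLowerCase().split(term.toLowerCase()).length - 1
}
// exampleCountHits("npm install installs packages", "install") === 2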
if (results.length === 1) { return npm.commands.help([results[0].file.replace(/\.md$/, "")], cb) } if (results.length === 0) { console.log("No results for " + args.map(JSON.stringify).join(" ")) return cb() } // sort results by number of results found, then by number of hits // then by number of matching lines results = results.sort(function (a, b) { return a.found.length > b.found.length ? -1 : a.found.length < b.found.length ? 1 : a.totalHits > b.totalHits ? -1 : a.totalHits < b.totalHits ? 1 : a.lines.length > b.lines.length ? -1 : a.lines.length < b.lines.length ? 1 : 0 }) cb(null, results) } function formatResults (args, results, cb) { if (!results) return cb(null) var cols = Math.min(process.stdout.columns || Infinity, 80) + 1 var out = results.map(function (res) { var out = res.cmd , r = Object.keys(res.hits).map(function (k) { return k + ":" + res.hits[k] }).sort(function (a, b) { return a > b ? 1 : -1 }).join(" ") out += ((new Array(Math.max(1, cols - out.length - r.length))) .join(" ")) + r if (!npm.config.get("long")) return out out = "\n\n" + out + "\n" + (new Array(cols)).join("—") + "\n" + res.lines.map(function (line, i) { if (line === null || i > 3) return "" for (var out = line, a = 0, l = args.length; a < l; a ++) { var finder = out.toLowerCase().split(args[a].toLowerCase()) , newOut = "" , p = 0 finder.forEach(function (f) { newOut += out.substr(p, f.length) var hilit = out.substr(p + f.length, args[a].length) if (npm.color) hilit = color.bgBlack(color.red(hilit)) newOut += hilit p += f.length + args[a].length }) } return newOut }).join("\n").trim() return out }).join("\n") if (results.length && !npm.config.get("long")) { out = "Top hits for "+(args.map(JSON.stringify).join(" ")) + "\n" + (new Array(cols)).join("—") + "\n" + out + "\n" + (new Array(cols)).join("—") + "\n" + "(run with -l or --long to see more context)" } console.log(out.trim()) cb(null, results) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/help.js���������������������������������������������000644 �000766 �000024 �00000013403 12455173731 022737� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = help help.completion = function (opts, cb) { if (opts.conf.argv.remain.length > 2) return cb(null, []) getSections(cb) } var path = require("path") , spawn = require("./utils/spawn") , npm = require("./npm.js") , log = require("npmlog") , opener = require("opener") , glob = require("glob") function help (args, cb) { npm.spinner.stop() var argv = npm.config.get("argv").cooked var argnum = 0 if (args.length === 2 && ~~args[0]) { argnum = ~~args.shift() } // npm help foo bar baz: search topics if (args.length > 1 && args[0]) { return npm.commands["help-search"](args, argnum, cb) } var section = npm.deref(args[0]) || args[0] // npm help <noargs>: show basic usage if (!section) { var valid = argv[0] === "help" ? 
0 : 1 return npmUsage(valid, cb) } // npm <cmd> -h: show command usage if ( npm.config.get("usage") && npm.commands[section] && npm.commands[section].usage ) { npm.config.set("loglevel", "silent") log.level = "silent" console.log(npm.commands[section].usage) return cb() } // npm apihelp <section>: Prefer section 3 over section 1 var apihelp = argv.length && -1 !== argv[0].indexOf("api") var pref = apihelp ? [3, 1, 5, 7] : [1, 3, 5, 7] if (argnum) pref = [ argnum ].concat(pref.filter(function (n) { return n !== argnum })) // npm help <section>: Try to find the path var manroot = path.resolve(__dirname, "..", "man") // legacy if (section === "global") section = "folders" else if (section === "json") section = "package.json" // find either /section.n or /npm-section.n var f = "+(npm-" + section + "|" + section + ").[0-9]" return glob(manroot + "/*/" + f, function (er, mans) { if (er) return cb(er) if (!mans.length) return npm.commands["help-search"](args, cb) viewMan(pickMan(mans, pref), cb) }) } function pickMan (mans, pref_) { var nre = /([0-9]+)$/ var pref = {} pref_.forEach(function (sect, i) { pref[sect] = i }) mans = mans.sort(function (a, b) { var an = a.match(nre)[1] var bn = b.match(nre)[1] return an === bn ? (a > b ? -1 : 1) : pref[an] < pref[bn] ? -1 : 1 }) return mans[0] } function viewMan (man, cb) { var nre = /([0-9]+)$/ var num = man.match(nre)[1] var section = path.basename(man, "." + num) // at this point, we know that the specified man page exists var manpath = path.join(__dirname, "..", "man") , env = {} Object.keys(process.env).forEach(function (i) { env[i] = process.env[i] }) env.MANPATH = manpath var viewer = npm.config.get("viewer") var conf switch (viewer) { case "woman": var a = ["-e", "(woman-find-file \"" + man + "\")"] conf = { env: env, stdio: "inherit" } var woman = spawn("emacsclient", a, conf) woman.on("close", cb) break case "browser": opener(htmlMan(man), { command: npm.config.get("browser") }, cb) break default: conf = { env: env, stdio: "inherit" } var manProcess = spawn("man", [num, section], conf) manProcess.on("close", cb) break } } function htmlMan (man) { var sect = +man.match(/([0-9]+)$/)[1] var f = path.basename(man).replace(/([0-9]+)$/, "html") switch (sect) { case 1: sect = "cli" break case 3: sect = "api" break case 5: sect = "files" break case 7: sect = "misc" break default: throw new Error("invalid man section: " + sect) } return path.resolve(__dirname, "..", "html", "doc", sect, f) } function npmUsage (valid, cb) { npm.config.set("loglevel", "silent") log.level = "silent" console.log( [ "\nUsage: npm <command>" , "" , "where <command> is one of:" , npm.config.get("long") ? 
usages() : " " + wrap(Object.keys(npm.commands)) , "" , "npm <cmd> -h quick help on <cmd>" , "npm -l display full usage info" , "npm faq commonly asked questions" , "npm help <term> search for help on <term>" , "npm help npm involved overview" , "" , "Specify configs in the ini-formatted file:" , " " + npm.config.get("userconfig") , "or on the command line via: npm <command> --key value" , "Config info can be viewed via: npm help config" , "" , "npm@" + npm.version + " " + path.dirname(__dirname) ].join("\n")) cb(valid) } function usages () { // return a string of <cmd>: <usage> var maxLen = 0 return Object.keys(npm.commands).filter(function (c) { return c === npm.deref(c) }).reduce(function (set, c) { set.push([c, npm.commands[c].usage || ""]) maxLen = Math.max(maxLen, c.length) return set }, []).map(function (item) { var c = item[0] , usage = item[1] return "\n " + c + (new Array(maxLen - c.length + 2).join(" ")) + (usage.split("\n") .join("\n" + (new Array(maxLen + 6).join(" ")))) }).join("\n") } function wrap (arr) { var out = [""] , l = 0 , line line = process.stdout.columns if (!line) line = 60 else line = Math.min(60, Math.max(line - 16, 24)) arr.sort(function (a,b) { return a<b?-1:1 }) .forEach(function (c) { if (out[l].length + c.length + 2 < line) { out[l] += ", "+c } else { out[l++] += "," out[l] = c } }) return out.join("\n ").substr(2) } function getSections (cb) { var g = path.resolve(__dirname, "../man/man[0-9]/*.[0-9]") glob(g, function (er, files) { if (er) return cb(er) cb(null, Object.keys(files.reduce(function (acc, file) { file = path.basename(file).replace(/\.[0-9]+$/, "") file = file.replace(/^npm-/, "") acc[file] = true return acc }, { help: true }))) }) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/init.js���������������������������������������������000644 �000766 �000024 �00000002225 12455173731 022752� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // initialize a package.json file module.exports = init var log = require("npmlog") , npm = require("./npm.js") , initJson = require("init-package-json") init.usage = "npm init [--force/-f]" function init (args, cb) { var dir = process.cwd() log.pause() npm.spinner.stop() var initFile = npm.config.get("init-module") if (!initJson.yes(npm.config)) { console.log( ["This utility will walk you through creating a package.json file." ,"It only covers the most common items, and tries to guess sane defaults." ,"" ,"See `npm help json` for definitive documentation on these fields" ,"and exactly what they do." ,"" ,"Use `npm install <pkg> --save` afterwards to install a package and" ,"save it as a dependency in the package.json file." ,"" ,"Press ^C at any time to quit." 
].join("\n")) } initJson(dir, initFile, npm.config, function (er, data) { log.resume() log.silly("package data", data) if (er && er.message === "canceled") { log.warn("init", "canceled") return cb(null, data) } log.info("init", "written successfully") cb(er, data) }) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/install.js������������������������������������������000644 �000766 �000024 �00000107546 12455173731 023471� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm install <pkg> <pkg> <pkg> // // See doc/install.md for more description // Managing contexts... // there's a lot of state associated with an "install" operation, including // packages that are already installed, parent packages, current shrinkwrap, and // so on. We maintain this state in a "context" object that gets passed around. // every time we dive into a deeper node_modules folder, the "family" list that // gets passed along uses the previous "family" list as its __proto__. Any // "resolved precise dependency" things that aren't already on this object get // added, and then that's passed to the next generation of installation. module.exports = install install.usage = "npm install" + "\nnpm install <pkg>" + "\nnpm install <pkg>@<tag>" + "\nnpm install <pkg>@<version>" + "\nnpm install <pkg>@<version range>" + "\nnpm install <folder>" + "\nnpm install <tarball file>" + "\nnpm install <tarball url>" + "\nnpm install <git:// url>" + "\nnpm install <github username>/<github project>" + "\n\nCan specify one or more: npm install ./foo.tgz bar@stable /some/folder" + "\nIf no argument is supplied and ./npm-shrinkwrap.json is " + "\npresent, installs dependencies specified in the shrinkwrap." + "\nOtherwise, installs dependencies from ./package.json." install.completion = function (opts, cb) { // install can complete to a folder with a package.json, or any package. // if it has a slash, then it's gotta be a folder // if it starts with https?://, then just give up, because it's a url // for now, not yet implemented. 
mapToRegistry("-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) var options = { auth : auth } npm.registry.get(uri, options, function (er, pkgs) { if (er) return cb() if (!opts.partialWord) return cb(null, pkgs) var name = npa(opts.partialWord).name pkgs = pkgs.filter(function (p) { return p.indexOf(name) === 0 }) if (pkgs.length !== 1 && opts.partialWord === name) { return cb(null, pkgs) } mapToRegistry(pkgs[0], npm.config, function (er, uri) { if (er) return cb(er) npm.registry.get(uri, options, function (er, d) { if (er) return cb() return cb(null, Object.keys(d["dist-tags"] || {}) .concat(Object.keys(d.versions || {})) .map(function (t) { return pkgs[0] + "@" + t })) }) }) }) }) } var npm = require("./npm.js") , semver = require("semver") , readJson = require("read-package-json") , readInstalled = require("read-installed") , log = require("npmlog") , path = require("path") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , cache = require("./cache.js") , asyncMap = require("slide").asyncMap , chain = require("slide").chain , url = require("url") , mkdir = require("mkdirp") , lifecycle = require("./utils/lifecycle.js") , archy = require("archy") , npmInstallChecks = require("npm-install-checks") , sortedObject = require("sorted-object") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") , inflight = require("inflight") , locker = require("./utils/locker.js") , lock = locker.lock , unlock = locker.unlock function install (args, cb_) { var hasArguments = !!args.length function cb (er, installed) { if (er) return cb_(er) findPeerInvalid(where, function (er, problem) { if (er) return cb_(er) if (problem) { var peerInvalidError = new Error("The package " + problem.name + " does not satisfy its siblings' peerDependencies requirements!") peerInvalidError.code = "EPEERINVALID" peerInvalidError.packageName = problem.name peerInvalidError.peersDepending = problem.peersDepending return cb(peerInvalidError) } var tree = treeify(installed || []) , pretty = prettify(tree, installed).trim() if (pretty) console.log(pretty) save(where, installed, tree, pretty, hasArguments, cb_) }) } // the /path/to/node_modules/.. var where = path.resolve(npm.dir, "..") // internal api: install(where, what, cb) if (arguments.length === 3) { where = args args = [].concat(cb_) // pass in [] to do default dep-install cb_ = arguments[2] log.verbose("install", "where, what", [where, args]) } if (!npm.config.get("global")) { args = args.filter(function (a) { return path.resolve(a) !== where }) } mkdir(where, function (er) { if (er) return cb(er) // install dependencies locally by default, // or install current folder globally if (!args.length) { var opt = { dev: npm.config.get("dev") || !npm.config.get("production") } if (npm.config.get("global")) args = ["."] else return readDependencies(null, where, opt, function (er, data) { if (er) { log.error("install", "Couldn't read dependencies") return cb(er) } var deps = Object.keys(data.dependencies || {}) log.verbose("install", "where, deps", [where, deps]) // FIXME: Install peerDependencies as direct dependencies, but only at // the top level. Should only last until peerDependencies are nerfed to // no longer implicitly install themselves. 
var peers = [] Object.keys(data.peerDependencies || {}).forEach(function (dep) { if (!data.dependencies[dep]) { log.verbose( "install", "peerDependency", dep, "wasn't going to be installed; adding" ) peers.push(dep) } }) log.verbose("install", "where, peers", [where, peers]) var context = { family: {} , ancestors: {} , explicit: false , parent: data , root: true , wrap: null } if (data.name === path.basename(where) && path.basename(path.dirname(where)) === "node_modules") { // Only include in ancestry if it can actually be required. // Otherwise, it does not count. context.family[data.name] = context.ancestors[data.name] = data.version } installManyTop(deps.map(function (dep) { var target = data.dependencies[dep] return dep + "@" + target }).concat(peers.map(function (dep) { var target = data.peerDependencies[dep] return dep + "@" + target })), where, context, function(er, results) { if (er || npm.config.get("production")) return cb(er, results) lifecycle(data, "prepublish", where, function(er) { return cb(er, results) }) }) }) } // initial "family" is the name:version of the root, if it's got // a package.json file. var jsonFile = path.resolve(where, "package.json") readJson(jsonFile, log.warn, function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) data = null var context = { family: {} , ancestors: {} , explicit: true , parent: data , root: true , wrap: null } if (data && data.name === path.basename(where) && path.basename(path.dirname(where)) === "node_modules") { context.family[data.name] = context.ancestors[data.name] = data.version } var fn = npm.config.get("global") ? installMany : installManyTop fn(args, where, context, cb) }) }) } function findPeerInvalid (where, cb) { readInstalled(where, { log: log.warn, dev: true }, function (er, data) { if (er) return cb(er) cb(null, findPeerInvalid_(data.dependencies, [])) }) } function findPeerInvalid_ (packageMap, fpiList) { if (fpiList.indexOf(packageMap) !== -1) return undefined fpiList.push(packageMap) for (var packageName in packageMap) { var pkg = packageMap[packageName] if (pkg.peerInvalid) { var peersDepending = {} for (var peerName in packageMap) { var peer = packageMap[peerName] if (peer.peerDependencies && peer.peerDependencies[packageName]) { peersDepending[peer.name + "@" + peer.version] = peer.peerDependencies[packageName] } } return { name: pkg.name, peersDepending: peersDepending } } if (pkg.dependencies) { var invalid = findPeerInvalid_(pkg.dependencies, fpiList) if (invalid) return invalid } } return null } // reads dependencies for the package at "where". There are several cases, // depending on our current state and the package's configuration: // // 1. If "context" is specified, then we examine the context to see if there's a // shrinkwrap there. In that case, dependencies are read from the shrinkwrap. // 2. Otherwise, if an npm-shrinkwrap.json file is present, dependencies are // read from there. // 3. Otherwise, dependencies come from package.json. // // Regardless of which case we fall into, "cb" is invoked with a first argument // describing the full package (as though readJson had been used) but with // "dependencies" read as described above. The second argument to "cb" is the // shrinkwrap to use in processing this package's dependencies, which may be // "wrap" (in case 1) or a new shrinkwrap (in case 2). function readDependencies (context, where, opts, cb) { var wrap = context ? 
context.wrap : null readJson( path.resolve(where, "package.json") , log.warn , function (er, data) { if (er && er.code === "ENOENT") er.code = "ENOPACKAGEJSON" if (er) return cb(er) if (opts && opts.dev) { if (!data.dependencies) data.dependencies = {} Object.keys(data.devDependencies || {}).forEach(function (k) { if (data.dependencies[k]) { log.warn("package.json", "Dependency '%s' exists in both dependencies " + "and devDependencies, using '%s@%s' from dependencies", k, k, data.dependencies[k]) } else { data.dependencies[k] = data.devDependencies[k] } }) } if (!npm.config.get("optional") && data.optionalDependencies) { Object.keys(data.optionalDependencies).forEach(function (d) { delete data.dependencies[d] }) } // User has opted out of shrinkwraps entirely if (npm.config.get("shrinkwrap") === false) return cb(null, data, null) if (wrap) { log.verbose("readDependencies: using existing wrap", [where, wrap]) var rv = {} Object.keys(data).forEach(function (key) { rv[key] = data[key] }) rv.dependencies = {} Object.keys(wrap).forEach(function (key) { log.verbose("from wrap", [key, wrap[key]]) rv.dependencies[key] = readWrap(wrap[key]) }) log.verbose("readDependencies returned deps", rv.dependencies) return cb(null, rv, wrap) } var wrapfile = path.resolve(where, "npm-shrinkwrap.json") fs.readFile(wrapfile, "utf8", function (er, wrapjson) { if (er) return cb(null, data, null) log.verbose("readDependencies", "npm-shrinkwrap.json is overriding dependencies") var newwrap try { newwrap = JSON.parse(wrapjson) } catch (ex) { return cb(ex) } log.info("shrinkwrap", "file %j", wrapfile) var rv = {} Object.keys(data).forEach(function (key) { rv[key] = data[key] }) rv.dependencies = {} Object.keys(newwrap.dependencies || {}).forEach(function (key) { rv.dependencies[key] = readWrap(newwrap.dependencies[key]) }) // fold in devDependencies if not already present, at top level if (opts && opts.dev) { Object.keys(data.devDependencies || {}).forEach(function (k) { rv.dependencies[k] = rv.dependencies[k] || data.devDependencies[k] }) } log.verbose("readDependencies returned deps", rv.dependencies) return cb(null, rv, newwrap.dependencies) }) }) } function readWrap (w) { return (w.resolved) ? w.resolved : (w.from && url.parse(w.from).protocol) ? w.from : w.version } // if the -S|--save option is specified, then write installed packages // as dependencies to a package.json file. // This is experimental. function save (where, installed, tree, pretty, hasArguments, cb) { if (!hasArguments || !npm.config.get("save") && !npm.config.get("save-dev") && !npm.config.get("save-optional") || npm.config.get("global")) { return cb(null, installed, tree, pretty) } var saveBundle = npm.config.get("save-bundle") var savePrefix = npm.config.get("save-prefix") // each item in the tree is a top-level thing that should be saved // to the package.json file. // The relevant tree shape is { <folder>: {what:<pkg>} } var saveTarget = path.resolve(where, "package.json") asyncMap(Object.keys(tree), function (k, cb) { // if "what" was a url, then save that instead. var t = tree[k] , u = url.parse(t.from) , a = npa(t.what) , w = [a.name, a.spec] fs.stat(t.from, function (er){ if (!er) { w[1] = "file:" + t.from } else if (u && u.protocol) { w[1] = t.from } cb(null, [w]) }) } , function (er, arr) { var things = arr.reduce(function (set, k) { var rangeDescriptor = semver.valid(k[1], true) && semver.gte(k[1], "0.1.0", true) && !npm.config.get("save-exact") ? 
savePrefix : "" set[k[0]] = rangeDescriptor + k[1] return set }, {}) // don't use readJson, because we don't want to do all the other // tricky npm-specific stuff that's in there. fs.readFile(saveTarget, function (er, data) { // ignore errors here, just don't save it. try { data = JSON.parse(data.toString("utf8")) } catch (ex) { er = ex } if (er) { return cb(null, installed, tree, pretty) } var deps = npm.config.get("save-optional") ? "optionalDependencies" : npm.config.get("save-dev") ? "devDependencies" : "dependencies" if (saveBundle) { var bundle = data.bundleDependencies || data.bundledDependencies delete data.bundledDependencies if (!Array.isArray(bundle)) bundle = [] data.bundleDependencies = bundle.sort() } log.verbose("saving", things) data[deps] = data[deps] || {} Object.keys(things).forEach(function (t) { data[deps][t] = things[t] if (saveBundle) { var i = bundle.indexOf(t) if (i === -1) bundle.push(t) data.bundleDependencies = bundle.sort() } }) data[deps] = sortedObject(data[deps]) data = JSON.stringify(data, null, 2) + "\n" writeFileAtomic(saveTarget, data, function (er) { cb(er, installed, tree, pretty) }) }) }) } // Outputting *all* the installed modules is a bit confusing, // because the length of the path does not make it clear // that the submodules are not immediately require()able. // TODO: Show the complete tree, ls-style, but only if --long is provided function prettify (tree, installed) { function red (set, kv) { set[kv[0]] = kv[1] return set } if (npm.config.get("json")) { tree = Object.keys(tree).map(function (p) { if (!tree[p]) return null var what = npa(tree[p].what) , name = what.name , version = what.spec , o = { name: name, version: version, from: tree[p].from } o.dependencies = tree[p].children.map(function P (dep) { var what = npa(dep.what) , name = what.name , version = what.spec , o = { version: version, from: dep.from } o.dependencies = dep.children.map(P).reduce(red, {}) return [name, o] }).reduce(red, {}) return o }) return JSON.stringify(tree, null, 2) } if (npm.config.get("parseable")) return parseable(installed) return Object.keys(tree).map(function (p) { return archy({ label: tree[p].what + " " + p , nodes: (tree[p].children || []).map(function P (c) { if (npm.config.get("long")) { return { label: c.what, nodes: c.children.map(P) } } var g = c.children.map(function (g) { return g.what }).join(", ") if (g) g = " (" + g + ")" return c.what + g }) }, "", { unicode: npm.config.get("unicode") }) }).join("\n") } function parseable (installed) { var long = npm.config.get("long") , cwd = process.cwd() return installed.map(function (item) { return path.resolve(cwd, item[1]) + ( long ? ":" + item[0] : "" ) }).join("\n") } function treeify (installed) { // each item is [what, where, parent, parentDir] // If no parent, then report it. // otherwise, tack it into the parent's children list. // If the parent isn't a top-level then ignore it. 
var whatWhere = installed.reduce(function (l, r) { var parentDir = r[3] , parent = r[2] , where = r[1] , what = r[0] , from = r[4] l[where] = { parentDir: parentDir , parent: parent , children: [] , where: where , what: what , from: from } return l }, {}) // log.warn("install", whatWhere, "whatWhere") return Object.keys(whatWhere).reduce(function (l, r) { var ww = whatWhere[r] //log.warn("r, ww", [r, ww]) if (!ww.parent) { l[r] = ww } else { var p = whatWhere[ww.parentDir] if (p) p.children.push(ww) else l[r] = ww } return l }, {}) } // just like installMany, but also add the existing packages in // where/node_modules to the family object. function installManyTop (what, where, context, cb_) { function cb (er, d) { if (context.explicit || er) return cb_(er, d) // since this wasn't an explicit install, let's build the top // folder, so that `npm install` also runs the lifecycle scripts. npm.commands.build([where], false, true, function (er) { return cb_(er, d) }) } if (context.explicit) return next() readJson(path.join(where, "package.json"), log.warn, function (er, data) { if (er) return next(er) lifecycle(data, "preinstall", where, next) }) function next (er) { if (er) return cb(er) installManyTop_(what, where, context, cb) } } function installManyTop_ (what, where, context, cb) { var nm = path.resolve(where, "node_modules") fs.readdir(nm, function (er, pkgs) { if (er) return installMany(what, where, context, cb) var scopes = [], unscoped = [] pkgs.filter(function (p) { return !p.match(/^[\._-]/) }).forEach(function (p) { // @names deserve deeper investigation if (p[0] === "@") { scopes.push(p) } else { unscoped.push(p) } }) maybeScoped(scopes, nm, function (er, scoped) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) // recombine unscoped with @scope/package packages asyncMap(unscoped.concat(scoped).map(function (p) { return path.resolve(nm, p, "package.json") }), function (jsonfile, cb) { readJson(jsonfile, log.warn, function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) return cb(null, []) cb(null, [[data.name, data.version]]) }) }, function (er, packages) { // if there's nothing in node_modules, then don't freak out. if (er) packages = [] // add all the existing packages to the family list. // however, do not add to the ancestors list. packages.forEach(function (p) { context.family[p[0]] = p[1] }) installMany(what, where, context, cb) }) }) }) } function maybeScoped (scopes, where, cb) { // find packages in scopes asyncMap(scopes, function (scope, cb) { fs.readdir(path.resolve(where, scope), function (er, scoped) { if (er) return cb(er) var paths = scoped.map(function (p) { return path.join(scope, p) }) cb(null, paths) }) }, cb) } function installMany (what, where, context, cb) { // readDependencies takes care of figuring out whether the list of // dependencies we'll iterate below comes from an existing shrinkwrap from a // parent level, a new shrinkwrap at this level, or package.json at this // level, as well as which shrinkwrap (if any) our dependencies should use. var opt = { dev: npm.config.get("dev") } readDependencies(context, where, opt, function (er, data, wrap) { if (er) data = {} var parent = data var d = data.dependencies || {} // if we're explicitly installing "what" into "where", then the shrinkwrap // for "where" doesn't apply. This would be the case if someone were adding // a new package to a shrinkwrapped package. 
(data.dependencies will not be // used here except to indicate what packages are already present, so // there's no harm in using that.) if (context.explicit) wrap = null // what is a list of things. // resolve each one. asyncMap( what , targetResolver(where, context, d) , function (er, targets) { if (er) return cb(er) // each target will be a data object corresponding // to a package, folder, or whatever that is in the cache now. var newPrev = Object.create(context.family) , newAnc = Object.create(context.ancestors) if (!context.root) { newAnc[data.name] = data.version } targets.forEach(function (t) { newPrev[t.name] = t.version }) log.silly("install resolved", targets) targets.filter(function (t) { return t }).forEach(function (t) { log.info("install", "%s into %s", t._id, where) }) asyncMap(targets, function (target, cb) { log.info("installOne", target._id) var wrapData = wrap ? wrap[target.name] : null var newWrap = wrapData && wrapData.dependencies ? wrap[target.name].dependencies || {} : null var newContext = { family: newPrev , ancestors: newAnc , parent: parent , explicit: false , wrap: newWrap } installOne(target, where, newContext, cb) }, cb) }) }) } function targetResolver (where, context, deps) { var alreadyInstalledManually = [] , resolveLeft = 0 , nm = path.resolve(where, "node_modules") , parent = context.parent , wrap = context.wrap if (!context.explicit) readdir(nm) function readdir(name) { resolveLeft++ fs.readdir(name, function (er, inst) { if (er) return resolveLeft-- // don't even mess with non-package looking things inst = inst.filter(function (p) { if (!p.match(/^[@\._-]/)) return true // scoped packages readdir(path.join(name, p)) }) asyncMap(inst, function (pkg, cb) { readJson(path.resolve(name, pkg, "package.json"), log.warn, function (er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) // error means it's not a package, most likely. if (er) return cb(null, []) // if it's a bundled dep, then assume that anything there is valid. // otherwise, make sure that it's a semver match with what we want. var bd = parent.bundleDependencies if (bd && bd.indexOf(d.name) !== -1 || semver.satisfies(d.version, deps[d.name] || "*", true) || deps[d.name] === d._resolved) { return cb(null, d.name) } // see if the package had been previously linked fs.lstat(path.resolve(nm, pkg), function(err, s) { if (err) return cb(null, []) if (s.isSymbolicLink()) { return cb(null, d.name) } // something is there, but it's not satisfactory. Clobber it. return cb(null, []) }) }) }, function (er, inst) { // this is the list of things that are valid and should be ignored. alreadyInstalledManually = alreadyInstalledManually.concat(inst) resolveLeft-- }) }) } var to = 0 return function resolver (what, cb) { if (resolveLeft) return setTimeout(function () { resolver(what, cb) }, to++) // now we know what's been installed here manually, // or tampered with in some way that npm doesn't want to overwrite. if (alreadyInstalledManually.indexOf(npa(what).name) !== -1) { log.verbose("already installed", "skipping %s %s", what, where) return cb(null, []) } // check for a version installed higher in the tree. // If installing from a shrinkwrap, it must match exactly. if (context.family[what]) { if (wrap && wrap[what].version === context.family[what]) { log.verbose("shrinkwrap", "use existing", what) return cb(null, []) } } // if it's identical to its parent, then it's probably someone // doing `npm install foo` inside of the foo project. Print // a warning, and skip it. 
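// Illustrative note (hypothetical scenario, not in the original source):
// running `npm install foo` from inside the foo package itself would match
// this guard, log the warning below, and skip the self-install; passing
// --force bypasses the check.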
if (parent && parent.name === what && !npm.config.get("force")) { log.warn("install", "Refusing to install %s as a dependency of itself" , what) return cb(null, []) } if (wrap) { var name = npa(what).name if (wrap[name]) { var wrapTarget = readWrap(wrap[name]) what = name + "@" + wrapTarget } else { log.verbose("shrinkwrap", "skipping %s (not in shrinkwrap)", what) } } else if (deps[what]) { what = what + "@" + deps[what] } // This is where we actually fetch the package, if it's not already // in the cache. // If it's a git repo, then we want to install it, even if the parent // already has a matching copy. // If it's not a git repo, and the parent already has that pkg, then // we can skip installing it again. var pkgroot = path.resolve(npm.prefix, (parent && parent._from) || "") cache.add(what, null, pkgroot, false, function (er, data) { if (er && parent && parent.optionalDependencies && parent.optionalDependencies.hasOwnProperty(npa(what).name)) { log.warn("optional dep failed, continuing", what) log.verbose("optional dep failed, continuing", [what, er]) return cb(null, []) } var isGit = npa(what).type === "git" if (!er && data && !context.explicit && context.family[data.name] === data.version && !npm.config.get("force") && !isGit) { log.info("already installed", data.name + "@" + data.version) return cb(null, []) } if (data && !data._from) data._from = what if (er && parent && parent.name) er.parent = parent.name return cb(er, data || []) }) } } // we've already decided to install this. if anything's in the way, // then uninstall it first. function installOne (target, where, context, cb) { // the --link flag makes this a "link" command if it's at the // the top level. if (where === npm.prefix && npm.config.get("link") && !npm.config.get("global")) { return localLink(target, where, context, cb) } installOne_(target, where, context, function (er, installedWhat) { // check if this one is optional to its parent. if (er && context.parent && context.parent.optionalDependencies && context.parent.optionalDependencies.hasOwnProperty(target.name)) { log.warn("optional dep failed, continuing", target._id) log.verbose("optional dep failed, continuing", [target._id, er]) er = null } cb(er, installedWhat) }) } function localLink (target, where, context, cb) { log.verbose("localLink", target._id) var jsonFile = path.resolve( npm.globalDir, target.name , "package.json" ) , parent = context.parent readJson(jsonFile, log.warn, function (er, data) { function thenLink () { npm.commands.link([target.name], function (er, d) { log.silly("localLink", "back from link", [er, d]) cb(er, [resultList(target, where, parent && parent._id)]) }) } if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er || data._id === target._id) { if (er) { install( path.resolve(npm.globalDir, "..") , target._id , function (er) { if (er) return cb(er, []) thenLink() }) } else thenLink() } else { log.verbose("localLink", "install locally (no link)", target._id) installOne_(target, where, context, cb) } }) } function resultList (target, where, parentId) { var nm = path.resolve(where, "node_modules") , targetFolder = path.resolve(nm, target.name) , prettyWhere = where if (!npm.config.get("global")) { prettyWhere = path.relative(process.cwd(), where) } if (prettyWhere === ".") prettyWhere = null if (!npm.config.get("global")) { // print out the folder relative to where we are right now. 
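// Illustrative example (hypothetical paths): with a cwd of "/home/me/app",
// "/home/me/app/node_modules/foo" is reported as "node_modules/foo".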
targetFolder = path.relative(process.cwd(), targetFolder) } return [ target._id , targetFolder , prettyWhere && parentId , parentId && prettyWhere , target._from ] } var installed = Object.create(null) function installOne_ (target, where, context, cb_) { var nm = path.resolve(where, "node_modules") , targetFolder = path.resolve(nm, target.name) , prettyWhere = path.relative(process.cwd(), where) , parent = context.parent if (prettyWhere === ".") prettyWhere = null cb_ = inflight(target.name + ":" + where, cb_) if (!cb_) return log.verbose( "installOne", "of", target.name, "to", where, "already in flight; waiting" ) else log.verbose( "installOne", "of", target.name, "to", where, "not in flight; installing" ) function cb(er, data) { unlock(nm, target.name, function () { cb_(er, data) }) } lock(nm, target.name, function (er) { if (er) return cb(er) if (targetFolder in installed) { log.error("install", "trying to install", target.version, "to", targetFolder) log.error("install", "but already installed versions", installed[targetFolder]) installed[targetFolder].push(target.version) } else { installed[targetFolder] = [target.version] } var force = npm.config.get("force") , nodeVersion = npm.config.get("node-version") , strict = npm.config.get("engine-strict") , c = npmInstallChecks chain( [ [c.checkEngine, target, npm.version, nodeVersion, force, strict] , [c.checkPlatform, target, force] , [c.checkCycle, target, context.ancestors] , [c.checkGit, targetFolder] , [write, target, targetFolder, context] ] , function (er, d) { if (er) return cb(er) d.push(resultList(target, where, parent && parent._id)) cb(er, d) } ) }) } function write (target, targetFolder, context, cb_) { var up = npm.config.get("unsafe-perm") , user = up ? null : npm.config.get("user") , group = up ? null : npm.config.get("group") , family = context.family function cb (er, data) { // cache.unpack returns the data object, and all we care about // is the list of installed packages from that last thing. if (!er) return cb_(er, data) if (npm.config.get("rollback") === false) return cb_(er) npm.rollbacks.push(targetFolder) cb_(er, data) } var bundled = [] log.silly("install write", "writing", target.name, target.version, "to", targetFolder) chain( [ [ cache.unpack, target.name, target.version, targetFolder , null, null, user, group ] , [ fs, "writeFile" , path.resolve(targetFolder, "package.json") , JSON.stringify(target, null, 2) + "\n" ] , [ lifecycle, target, "preinstall", targetFolder ] , function (cb) { if (!target.bundleDependencies) return cb() var bd = path.resolve(targetFolder, "node_modules") fs.readdir(bd, function (er, b) { // nothing bundled, maybe if (er) return cb() bundled = b || [] cb() }) } ] // nest the chain so that we can throw away the results returned // up until this point, since we really don't care about it. , function X (er) { if (er) return cb(er) // before continuing to installing dependencies, check for a shrinkwrap. var opt = { dev: npm.config.get("dev") } readDependencies(context, targetFolder, opt, function (er, data, wrap) { var deps = prepareForInstallMany(data, "dependencies", bundled, wrap, family) var depsTargetFolder = targetFolder var depsContext = { family: family , ancestors: context.ancestors , parent: target , explicit: false , wrap: wrap } var actions = [ [ installManyAndBuild, deps, depsTargetFolder, depsContext ] ] // FIXME: This is an accident waiting to happen! // // 1. 
If multiple children at the same level of the tree share a // peerDependency that's not in the parent's dependencies, because // the peerDeps don't get added to the family, they will keep // getting reinstalled (worked around by inflighting installOne). // 2. The installer can't safely build at the parent level because // that's already being done by the parent's installAndBuild. This // runs the risk of the peerDependency never getting built. // // The fix: Don't install peerDependencies; require them to be // included as explicit dependencies / devDependencies, and warn // or error when they're missing. See #5080 for more arguments in // favor of killing implicit peerDependency installs with fire. var peerDeps = prepareForInstallMany(data, "peerDependencies", bundled, wrap, family) var pdTargetFolder = path.resolve(targetFolder, "..", "..") var pdContext = context if (peerDeps.length > 0) { actions.push( [ installMany, peerDeps, pdTargetFolder, pdContext ] ) } chain(actions, cb) }) }) } function installManyAndBuild (deps, targetFolder, context, cb) { installMany(deps, targetFolder, context, function (er, d) { log.verbose("about to build", targetFolder) if (er) return cb(er) npm.commands.build( [targetFolder] , npm.config.get("global") , true , function (er) { return cb(er, d) }) }) } function prepareForInstallMany (packageData, depsKey, bundled, wrap, family) { var deps = Object.keys(packageData[depsKey] || {}) // don't install bundleDependencies, unless they're missing. if (packageData.bundleDependencies) { deps = deps.filter(function (d) { return packageData.bundleDependencies.indexOf(d) === -1 || bundled.indexOf(d) === -1 }) } return deps.filter(function (d) { // prefer to not install things that are satisfied by // something in the "family" list, unless we're installing // from a shrinkwrap. 
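// Illustrative sketch (hypothetical values, not in the original source):
// if family["foo"] is "1.2.3" and this package asks for "foo": "^1.0.0",
// the semver.satisfies() check below returns true, so foo is filtered out
// and the copy already installed higher in the tree is reused.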
if (wrap) return wrap if (semver.validRange(family[d], true)) return !semver.satisfies(family[d], packageData[depsKey][d], true) return true }).map(function (d) { var v = packageData[depsKey][d] var t = d + "@" + v log.silly("prepareForInstallMany", "adding", t, "from", packageData.name, depsKey) return t }) } ����������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/link.js���������������������������������������������000644 �000766 �000024 �00000012240 12455173731 022742� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// link with no args: symlink the folder to the global location // link with package arg: symlink the global to the local var npm = require("./npm.js") , symlink = require("./utils/link.js") , fs = require("graceful-fs") , log = require("npmlog") , asyncMap = require("slide").asyncMap , chain = require("slide").chain , path = require("path") , rm = require("./utils/gently-rm.js") , build = require("./build.js") , npa = require("npm-package-arg") module.exports = link link.usage = "npm link (in package dir)" + "\nnpm link <pkg> (link global into local)" link.completion = function (opts, cb) { var dir = npm.globalDir fs.readdir(dir, function (er, files) { cb(er, files.filter(function (f) { return !f.match(/^[\._-]/) })) }) } function link (args, cb) { if (process.platform === "win32") { var semver = require("semver") if (!semver.satisfies(process.version, ">=0.7.9")) { var msg = "npm link not supported on windows prior to node 0.7.9" , e = new Error(msg) e.code = "ENOTSUP" e.errno = require("constants").ENOTSUP return cb(e) } } if (npm.config.get("global")) { return cb(new Error("link should never be --global.\n" +"Please re-run this command with --local")) } if (args.length === 1 && args[0] === ".") args = [] if (args.length) return linkInstall(args, cb) linkPkg(npm.prefix, cb) } function linkInstall (pkgs, cb) { asyncMap(pkgs, function (pkg, cb) { var t = path.resolve(npm.globalDir, "..") , pp = path.resolve(npm.globalDir, pkg) , rp = null , target = path.resolve(npm.dir, pkg) function n (er, data) { if (er) return cb(er, data) // install returns [ [folder, pkgId], ... ] // but we definitely installed just one thing. 
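// Illustrative only (hypothetical entry): data might look like
//   [ [ "foo@1.0.0", "/usr/local/lib/node_modules/foo", null, null, "foo" ] ]
// and the filter below keeps just the entries with no parent (falsy d[3]),
// i.e. the one package that was installed at the top level.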
var d = data.filter(function (d) { return !d[3] }) var what = npa(d[0][0]) pp = d[0][1] pkg = what.name target = path.resolve(npm.dir, pkg) next() } // if it's a folder, a random not-installed thing, or not a scoped package, // then link or install it first if (pkg[0] !== "@" && (pkg.indexOf("/") !== -1 || pkg.indexOf("\\") !== -1)) { return fs.lstat(path.resolve(pkg), function (er, st) { if (er || !st.isDirectory()) { npm.commands.install(t, pkg, n) } else { rp = path.resolve(pkg) linkPkg(rp, n) } }) } fs.lstat(pp, function (er, st) { if (er) { rp = pp return npm.commands.install(t, pkg, n) } else if (!st.isSymbolicLink()) { rp = pp next() } else { return fs.realpath(pp, function (er, real) { if (er) log.warn("invalid symbolic link", pkg) else rp = real next() }) } }) function next () { chain ( [ [npm.commands, "unbuild", [target]] , [function (cb) { log.verbose("link", "symlinking %s to %s", pp, target) cb() }] , [symlink, pp, target] // do run lifecycle scripts - full build here. , rp && [build, [target]] , [ resultPrinter, pkg, pp, target, rp ] ] , cb ) } }, cb) } function linkPkg (folder, cb_) { var me = folder || npm.prefix , readJson = require("read-package-json") log.verbose("linkPkg", folder) readJson(path.resolve(me, "package.json"), function (er, d) { function cb (er) { return cb_(er, [[d && d._id, target, null, null]]) } if (er) return cb(er) if (!d.name) { er = new Error("Package must have a name field to be linked") return cb(er) } var target = path.resolve(npm.globalDir, d.name) rm(target, function (er) { if (er) return cb(er) symlink(me, target, function (er) { if (er) return cb(er) log.verbose("link", "build target", target) // also install missing dependencies. npm.commands.install(me, [], function (er) { if (er) return cb(er) // build the global stuff. Don't run *any* scripts, because // install command already will have done that. build([target], true, build._noLC, true, function (er) { if (er) return cb(er) resultPrinter(path.basename(me), me, target, cb) }) }) }) }) }) } function resultPrinter (pkg, src, dest, rp, cb) { if (typeof cb !== "function") cb = rp, rp = null var where = dest rp = (rp || "").trim() src = (src || "").trim() // XXX If --json is set, then look up the data from the package.json if (npm.config.get("parseable")) { return parseableOutput(dest, rp || src, cb) } if (rp === src) rp = null console.log(where + " -> " + src + (rp ? " -> " + rp: "")) cb() } function parseableOutput (dest, rp, cb) { // XXX this should match ls --parseable and install --parseable // look up the data from package.json, format it the same way. // // link is always effectively "long", since it doesn't help much to // *just* print the target folder. // However, we don't actually ever read the version number, so // the second field is always blank. 
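// Illustrative output (hypothetical paths):
//   /home/me/app/node_modules/foo::/usr/local/lib/node_modules/foo
// i.e. <link destination>::<resolved source>, with the unread version field
// between the two colons left blank as described above.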
console.log(dest + "::" + rp) cb() } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/ls.js�����������������������������������������������000644 �000766 �000024 �00000023542 12455173731 022432� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // show the installed versions of packages // // --parseable creates output like this: // <fullpath>:<name@ver>:<realpath>:<flags> // Flags are a :-separated list of zero or more indicators module.exports = exports = ls var npm = require("./npm.js") , readInstalled = require("read-installed") , log = require("npmlog") , path = require("path") , archy = require("archy") , semver = require("semver") , url = require("url") , color = require("ansicolors") , npa = require("npm-package-arg") ls.usage = "npm ls" ls.completion = require("./utils/completion/installed-deep.js") function ls (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false var dir = path.resolve(npm.dir, "..") // npm ls 'foo@~1.3' bar 'baz@<2' if (!args) args = [] else args = args.map(function (a) { var p = npa(a) , name = p.name , ver = semver.validRange(p.rawSpec) || "" return [ name, ver ] }) var depth = npm.config.get("depth") var opt = { depth: depth, log: log.warn, dev: true } readInstalled(dir, opt, function (er, data) { pruneNestedExtraneous(data) var bfs = bfsify(data, args) , lite = getLite(bfs) if (er || silent) return cb(er, data, lite) var long = npm.config.get("long") , json = npm.config.get("json") , out if (json) { var seen = [] var d = long ? bfs : lite // the raw data can be circular out = JSON.stringify(d, function (k, o) { if (typeof o === "object") { if (-1 !== seen.indexOf(o)) return "[Circular]" seen.push(o) } return o }, 2) } else if (npm.config.get("parseable")) { out = makeParseable(bfs, long, dir) } else if (data) { out = makeArchy(bfs, long, dir) } console.log(out) if (args.length && !data._found) process.exitCode = 1 // if any errors were found, then complain and exit status 1 if (lite.problems && lite.problems.length) { er = lite.problems.join("\n") } cb(er, data, lite) }) } function pruneNestedExtraneous (data, visited) { visited = visited || [] visited.push(data) for (var i in data.dependencies) { if (data.dependencies[i].extraneous) { data.dependencies[i].dependencies = {} } else if (visited.indexOf(data.dependencies[i]) === -1) { pruneNestedExtraneous(data.dependencies[i], visited) } } } function alphasort (a, b) { a = a.toLowerCase() b = b.toLowerCase() return a > b ? 1 : a < b ? 
-1 : 0 } function getLite (data, noname) { var lite = {} , maxDepth = npm.config.get("depth") if (!noname && data.name) lite.name = data.name if (data.version) lite.version = data.version if (data.extraneous) { lite.extraneous = true lite.problems = lite.problems || [] lite.problems.push( "extraneous: " + data.name + "@" + data.version + " " + (data.path || "") ) } if (data._from) lite.from = data._from if (data._resolved) lite.resolved = data._resolved if (data.invalid) { lite.invalid = true lite.problems = lite.problems || [] lite.problems.push( "invalid: " + data.name + "@" + data.version + " " + (data.path || "") ) } if (data.peerInvalid) { lite.peerInvalid = true lite.problems = lite.problems || [] lite.problems.push( "peer invalid: " + data.name + "@" + data.version + " " + (data.path || "") ) } if (data.dependencies) { var deps = Object.keys(data.dependencies) if (deps.length) lite.dependencies = deps.map(function (d) { var dep = data.dependencies[d] if (typeof dep === "string") { lite.problems = lite.problems || [] var p if (data.depth > maxDepth) { p = "max depth reached: " } else { p = "missing: " } p += d + "@" + dep + ", required by " + data.name + "@" + data.version lite.problems.push(p) return [d, { required: dep, missing: true }] } return [d, getLite(dep, true)] }).reduce(function (deps, d) { if (d[1].problems) { lite.problems = lite.problems || [] lite.problems.push.apply(lite.problems, d[1].problems) } deps[d[0]] = d[1] return deps }, {}) } return lite } function bfsify (root, args, current, queue, seen) { // walk over the data, and turn it from this: // +-- a // | `-- b // | `-- a (truncated) // `--b (truncated) // into this: // +-- a // `-- b // which looks nicer args = args || [] current = current || root queue = queue || [] seen = seen || [root] var deps = current.dependencies = current.dependencies || {} Object.keys(deps).forEach(function (d) { var dep = deps[d] if (typeof dep !== "object") return if (seen.indexOf(dep) !== -1) { if (npm.config.get("parseable") || !npm.config.get("long")) { delete deps[d] return } else { dep = deps[d] = Object.create(dep) dep.dependencies = {} } } queue.push(dep) seen.push(dep) }) if (!queue.length) { // if there were args, then only show the paths to found nodes. return filterFound(root, args) } return bfsify(root, args, queue.shift(), queue, seen) } function filterFound (root, args) { if (!args.length) return root var deps = root.dependencies if (deps) Object.keys(deps).forEach(function (d) { var dep = filterFound(deps[d], args) // see if this one itself matches var found = false for (var i = 0; !found && i < args.length; i ++) { if (d === args[i][0]) { found = semver.satisfies(dep.version, args[i][1], true) } } // included explicitly if (found) dep._found = true // included because a child was included if (dep._found && !root._found) root._found = 1 // not included if (!dep._found) delete deps[d] }) if (!root._found) root._found = false return root } function makeArchy (data, long, dir) { var out = makeArchy_(data, long, dir, 0) return archy(out, "", { unicode: npm.config.get("unicode") }) } function makeArchy_ (data, long, dir, depth, parent, d) { if (typeof data === "string") { if (depth -1 <= npm.config.get("depth")) { // just missing var unmet = "UNMET DEPENDENCY" if (npm.color) { unmet = color.bgBlack(color.red(unmet)) } data = unmet + " " + d + "@" + data } else { data = d+"@"+ data } return data } var out = {} // the top level is a bit special. 
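// Illustrative example (hypothetical project): for the root package the label
// typically ends up as something like "myapp@1.0.0 /home/me/myapp", while
// nested dependencies get "name@version" plus any invalid/extraneous markers
// added below.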
out.label = data._id || "" if (data._found === true && data._id) { if (npm.color) { out.label = color.bgBlack(color.yellow(out.label.trim())) + " " } else { out.label = out.label.trim() + " " } } if (data.link) out.label += " -> " + data.link if (data.invalid) { if (data.realName !== data.name) out.label += " ("+data.realName+")" var invalid = "invalid" if (npm.color) invalid = color.bgBlack(color.red(invalid)) out.label += " " + invalid } if (data.peerInvalid) { var peerInvalid = "peer invalid" if (npm.color) peerInvalid = color.bgBlack(color.red(peerInvalid)) out.label += " " + peerInvalid } if (data.extraneous && data.path !== dir) { var extraneous = "extraneous" if (npm.color) extraneous = color.bgBlack(color.green(extraneous)) out.label += " " + extraneous } // add giturl to name@version if (data._resolved) { if (npa(data._resolved).type === "git") out.label += " (" + data._resolved + ")" } if (long) { if (dir === data.path) out.label += "\n" + dir out.label += "\n" + getExtras(data, dir) } else if (dir === data.path) { if (out.label) out.label += " " out.label += dir } // now all the children. out.nodes = [] if (depth <= npm.config.get("depth")) { out.nodes = Object.keys(data.dependencies || {}) .sort(alphasort).map(function (d) { return makeArchy_(data.dependencies[d], long, dir, depth + 1, data, d) }) } if (out.nodes.length === 0 && data.path === dir) { out.nodes = ["(empty)"] } return out } function getExtras (data) { var extras = [] if (data.description) extras.push(data.description) if (data.repository) extras.push(data.repository.url) if (data.homepage) extras.push(data.homepage) if (data._from) { var from = data._from if (from.indexOf(data.name + "@") === 0) { from = from.substr(data.name.length + 1) } var u = url.parse(from) if (u.protocol) extras.push(from) } return extras.join("\n") } function makeParseable (data, long, dir, depth, parent, d) { depth = depth || 0 return [ makeParseable_(data, long, dir, depth, parent, d) ] .concat(Object.keys(data.dependencies || {}) .sort(alphasort).map(function (d) { return makeParseable(data.dependencies[d], long, dir, depth + 1, data, d) })) .filter(function (x) { return x }) .join("\n") } function makeParseable_ (data, long, dir, depth, parent, d) { if (data.hasOwnProperty("_found") && data._found !== true) return "" if (typeof data === "string") { if (data.depth < npm.config.get("depth")) { data = npm.config.get("long") ? path.resolve(parent.path, "node_modules", d) + ":"+d+"@"+JSON.stringify(data)+":INVALID:MISSING" : "" } else { data = path.resolve(data.path || "", "node_modules", d || "") + (npm.config.get("long") ? ":" + d + "@" + JSON.stringify(data) + ":" // no realpath resolved + ":MAXDEPTH" : "") } return data } if (!npm.config.get("long")) return data.path return data.path + ":" + (data._id || "") + ":" + (data.realPath !== data.path ? data.realPath : "") + (data.extraneous ? ":EXTRANEOUS" : "") + (data.invalid ? ":INVALID" : "") + (data.peerInvalid ? 
":PEERINVALID" : "") } ��������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/npm.js����������������������������������������������000644 �000766 �000024 �00000030526 12455173731 022606� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������;(function(){ // windows: running "npm blah" in this folder will invoke WSH, not node. if (typeof WScript !== "undefined") { WScript.echo("npm does not work when run\n" +"with the Windows Scripting Host\n\n" +"'cd' to a different directory,\n" +"or type 'npm.cmd <args>',\n" +"or type 'node npm <args>'.") WScript.quit(1) return } // monkey-patch support for 0.6 child processes require('child-process-close') var EventEmitter = require("events").EventEmitter , npm = module.exports = new EventEmitter() , npmconf = require("./config/core.js") , log = require("npmlog") , fs = require("graceful-fs") , path = require("path") , abbrev = require("abbrev") , which = require("which") , CachingRegClient = require("./cache/caching-client.js") , charSpin = require("char-spinner") npm.config = { loaded: false, get: function() { throw new Error('npm.load() required') }, set: function() { throw new Error('npm.load() required') } } npm.commands = {} npm.rollbacks = [] try { // startup, ok to do this synchronously var j = JSON.parse(fs.readFileSync( path.join(__dirname, "../package.json"))+"") npm.version = j.version } catch (ex) { try { log.info("error reading version", ex) } catch (er) {} npm.version = ex } var commandCache = {} // short names for common things , aliases = { "rm" : "uninstall" , "r" : "uninstall" , "un" : "uninstall" , "unlink" : "uninstall" , "remove" : "uninstall" , "rb" : "rebuild" , "list" : "ls" , "la" : "ls" , "ll" : "ls" , "ln" : "link" , "i" : "install" , "isntall" : "install" , "up" : "update" , "c" : "config" , "info" : "view" , "show" : "view" , "find" : "search" , "s" : "search" , "se" : "search" , "author" : "owner" , "home" : "docs" , "issues": "bugs" , "unstar": "star" // same function , "apihelp" : "help" , "login": "adduser" , "add-user": "adduser" , "tst": "test" , "t": "test" , "find-dupes": "dedupe" , "ddp": "dedupe" , "v": "view" , "verison": "version" } , aliasNames = Object.keys(aliases) // these are filenames in . 
, cmdList = [ "install" , "uninstall" , "cache" , "config" , "set" , "get" , "update" , "outdated" , "prune" , "pack" , "dedupe" , "rebuild" , "link" , "publish" , "star" , "stars" , "tag" , "adduser" , "unpublish" , "owner" , "deprecate" , "shrinkwrap" , "help" , "help-search" , "ls" , "search" , "view" , "init" , "version" , "edit" , "explore" , "docs" , "repo" , "bugs" , "faq" , "root" , "prefix" , "bin" , "whoami" , "test" , "stop" , "start" , "restart" , "run-script" , "completion" ] , plumbing = [ "build" , "unbuild" , "xmas" , "substack" , "visnup" ] , littleGuys = [ "isntall" ] , fullList = cmdList.concat(aliasNames).filter(function (c) { return plumbing.indexOf(c) === -1 }) , abbrevs = abbrev(fullList) // we have our reasons fullList = npm.fullList = fullList.filter(function (c) { return littleGuys.indexOf(c) === -1 }) npm.spinner = { int: null , started: false , start: function () { if (npm.spinner.int) return var c = npm.config.get("spin") if (!c) return var stream = npm.config.get("logstream") var opt = { tty: c !== "always", stream: stream } opt.cleanup = !npm.spinner.started npm.spinner.int = charSpin(opt) npm.spinner.started = true } , stop: function () { clearInterval(npm.spinner.int) npm.spinner.int = null } } Object.keys(abbrevs).concat(plumbing).forEach(function addCommand (c) { Object.defineProperty(npm.commands, c, { get : function () { if (!loaded) throw new Error( "Call npm.load(config, cb) before using this command.\n"+ "See the README.md or cli.js for example usage.") var a = npm.deref(c) if (c === "la" || c === "ll") { npm.config.set("long", true) } npm.command = c if (commandCache[a]) return commandCache[a] var cmd = require(__dirname+"/"+a+".js") commandCache[a] = function () { var args = Array.prototype.slice.call(arguments, 0) if (typeof args[args.length - 1] !== "function") { args.push(defaultCb) } if (args.length === 1) args.unshift([]) npm.registry.version = npm.version if (!npm.registry.refer) { npm.registry.refer = [a].concat(args[0]).map(function (arg) { // exclude anything that might be a URL, path, or private module // Those things will always have a slash in them somewhere if (arg && arg.match && arg.match(/\/|\\/)) { return "[REDACTED]" } else { return arg } }).filter(function (arg) { return arg && arg.match }).join(" ") } cmd.apply(npm, args) } Object.keys(cmd).forEach(function (k) { commandCache[a][k] = cmd[k] }) return commandCache[a] }, enumerable: fullList.indexOf(c) !== -1 }) // make css-case commands callable via camelCase as well if (c.match(/\-([a-z])/)) { addCommand(c.replace(/\-([a-z])/g, function (a, b) { return b.toUpperCase() })) } }) function defaultCb (er, data) { if (er) console.error(er.stack || er.message) else console.log(data) } npm.deref = function (c) { if (!c) return "" if (c.match(/[A-Z]/)) c = c.replace(/([A-Z])/g, function (m) { return "-" + m.toLowerCase() }) if (plumbing.indexOf(c) !== -1) return c var a = abbrevs[c] if (aliases[a]) a = aliases[a] return a } var loaded = false , loading = false , loadErr = null , loadListeners = [] function loadCb (er) { loadListeners.forEach(function (cb) { process.nextTick(cb.bind(npm, er, npm)) }) loadListeners.length = 0 } npm.load = function (cli, cb_) { if (!cb_ && typeof cli === "function") cb_ = cli , cli = {} if (!cb_) cb_ = function () {} if (!cli) cli = {} loadListeners.push(cb_) if (loaded || loadErr) return cb(loadErr) if (loading) return loading = true var onload = true function cb (er) { if (loadErr) return loadErr = er if (er) return cb_(er) if (npm.config.get("force")) { 
log.warn("using --force", "I sure hope you know what you are doing.") } npm.config.loaded = true loaded = true loadCb(loadErr = er) if (onload = onload && npm.config.get("onload-script")) { require(onload) onload = false } } log.pause() load(npm, cli, cb) } function load (npm, cli, cb) { which(process.argv[0], function (er, node) { if (!er && node.toUpperCase() !== process.execPath.toUpperCase()) { log.verbose("node symlink", node) process.execPath = node process.installPrefix = path.resolve(node, "..", "..") } // look up configs //console.error("about to look up configs") var builtin = path.resolve(__dirname, "..", "npmrc") npmconf.load(cli, builtin, function (er, config) { if (er === config) er = null npm.config = config if (er) return cb(er) // if the "project" config is not a filename, and we're // not in global mode, then that means that it collided // with either the default or effective userland config if (!config.get("global") && config.sources.project && config.sources.project.type !== "ini") { log.verbose("config" , "Skipping project config: %s. " + "(matches userconfig)" , config.localPrefix + "/.npmrc") } // Include npm-version and node-version in user-agent var ua = config.get("user-agent") || "" ua = ua.replace(/\{node-version\}/gi, process.version) ua = ua.replace(/\{npm-version\}/gi, npm.version) ua = ua.replace(/\{platform\}/gi, process.platform) ua = ua.replace(/\{arch\}/gi, process.arch) config.set("user-agent", ua) var color = config.get("color") log.level = config.get("loglevel") log.heading = config.get("heading") || "npm" log.stream = config.get("logstream") switch (color) { case "always": log.enableColor() npm.color = true break case false: log.disableColor() npm.color = false break default: var tty = require("tty") if (process.stdout.isTTY) npm.color = true else if (!tty.isatty) npm.color = true else if (tty.isatty(1)) npm.color = true else npm.color = false break } log.resume() // at this point the configs are all set. // go ahead and spin up the registry client. npm.registry = new CachingRegClient(npm.config) var umask = npm.config.get("umask") npm.modes = { exec: 0777 & (~umask) , file: 0666 & (~umask) , umask: umask } var gp = Object.getOwnPropertyDescriptor(config, "globalPrefix") Object.defineProperty(npm, "globalPrefix", gp) var lp = Object.getOwnPropertyDescriptor(config, "localPrefix") Object.defineProperty(npm, "localPrefix", lp) return cb(null, npm) }) }) } Object.defineProperty(npm, "prefix", { get : function () { return npm.config.get("global") ? npm.globalPrefix : npm.localPrefix } , set : function (r) { var k = npm.config.get("global") ? "globalPrefix" : "localPrefix" return npm[k] = r } , enumerable : true }) Object.defineProperty(npm, "bin", { get : function () { if (npm.config.get("global")) return npm.globalBin return path.resolve(npm.root, ".bin") } , enumerable : true }) Object.defineProperty(npm, "globalBin", { get : function () { var b = npm.globalPrefix if (process.platform !== "win32") b = path.resolve(b, "bin") return b } }) Object.defineProperty(npm, "dir", { get : function () { if (npm.config.get("global")) return npm.globalDir return path.resolve(npm.prefix, "node_modules") } , enumerable : true }) Object.defineProperty(npm, "globalDir", { get : function () { return (process.platform !== "win32") ? 
path.resolve(npm.globalPrefix, "lib", "node_modules") : path.resolve(npm.globalPrefix, "node_modules") } , enumerable : true }) Object.defineProperty(npm, "root", { get : function () { return npm.dir } }) Object.defineProperty(npm, "cache", { get : function () { return npm.config.get("cache") } , set : function (r) { return npm.config.set("cache", r) } , enumerable : true }) var tmpFolder var rand = require("crypto").randomBytes(4).toString("hex") Object.defineProperty(npm, "tmp", { get : function () { if (!tmpFolder) tmpFolder = "npm-" + process.pid + "-" + rand return path.resolve(npm.config.get("tmp"), tmpFolder) } , enumerable : true }) // the better to repl you with Object.getOwnPropertyNames(npm.commands).forEach(function (n) { if (npm.hasOwnProperty(n) || n === "config") return Object.defineProperty(npm, n, { get: function () { return function () { var args = Array.prototype.slice.call(arguments, 0) , cb = defaultCb if (args.length === 1 && Array.isArray(args[0])) { args = args[0] } if (typeof args[args.length - 1] === "function") { cb = args.pop() } npm.commands[n](args, cb) } }, enumerable: false, configurable: true }) }) if (require.main === module) { require("../bin/npm-cli.js") } })() ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/outdated.js�����������������������������������������000644 �000766 �000024 �00000021607 12455173731 023625� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������/* npm outdated [pkg] Does the following: 1. check for a new version of pkg If no packages are specified, then run for all installed packages. 
--parseable creates output like this: <fullpath>:<name@wanted>:<name@installed>:<name@latest> */ module.exports = outdated outdated.usage = "npm outdated [<pkg> [<pkg> ...]]" outdated.completion = require("./utils/completion/installed-deep.js") var path = require("path") , readJson = require("read-package-json") , cache = require("./cache.js") , asyncMap = require("slide").asyncMap , npm = require("./npm.js") , url = require("url") , color = require("ansicolors") , styles = require("ansistyles") , table = require("text-table") , semver = require("semver") , os = require("os") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") , readInstalled = require("read-installed") function outdated (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false var dir = path.resolve(npm.dir, "..") outdated_(args, dir, {}, 0, function (er, list) { if (er || silent || list.length === 0) return cb(er, list) if (npm.config.get("json")) { console.log(makeJSON(list)) } else if (npm.config.get("parseable")) { console.log(makeParseable(list)) } else { var outList = list.map(makePretty) var outTable = [[ "Package" , "Current" , "Wanted" , "Latest" , "Location" ]].concat(outList) if (npm.color) { outTable[0] = outTable[0].map(function(heading) { return styles.underline(heading) }) } var tableOpts = { align: ["l", "r", "r", "r", "l"] , stringLength: function(s) { return ansiTrim(s).length } } console.log(table(outTable, tableOpts)) } cb(null, list) }) } // [[ dir, dep, has, want, latest ]] function makePretty (p) { var dep = p[1] , dir = path.resolve(p[0], "node_modules", dep) , has = p[2] , want = p[3] , latest = p[4] if (!npm.config.get("global")) { dir = path.relative(process.cwd(), dir) } var columns = [ dep , has || "MISSING" , want , latest , dirToPrettyLocation(dir) ] if (npm.color) { columns[0] = color[has === want ? "yellow" : "red"](columns[0]) // dep columns[2] = color.green(columns[2]) // want columns[3] = color.magenta(columns[3]) // latest columns[4] = color.brightBlack(columns[4]) // dir } return columns } function ansiTrim (str) { var r = new RegExp("\x1b(?:\\[(?:\\d+[ABCDEFGJKSTm]|\\d+;\\d+[Hfm]|" + "\\d+;\\d+;\\d+m|6n|s|u|\\?25[lh])|\\w)", "g") return str.replace(r, "") } function dirToPrettyLocation (dir) { return dir.replace(/^node_modules[/\\]/, "") .replace(/[[/\\]node_modules[/\\]/g, " > ") } function makeParseable (list) { return list.map(function (p) { var dep = p[1] , dir = path.resolve(p[0], "node_modules", dep) , has = p[2] , want = p[3] , latest = p[4] return [ dir , dep + "@" + want , (has ? 
(dep + "@" + has) : "MISSING") , dep + "@" + latest ].join(":") }).join(os.EOL) } function makeJSON (list) { var out = {} list.forEach(function (p) { var dir = path.resolve(p[0], "node_modules", p[1]) if (!npm.config.get("global")) { dir = path.relative(process.cwd(), dir) } out[p[1]] = { current: p[2] , wanted: p[3] , latest: p[4] , location: dir } }) return JSON.stringify(out, null, 2) } function outdated_ (args, dir, parentHas, depth, cb) { // get the deps from package.json, or {<dir/node_modules/*>:"*"} // asyncMap over deps: // shouldHave = cache.add(dep, req).version // if has === shouldHave then // return outdated(args, dir/node_modules/dep, parentHas + has) // else if dep in args or args is empty // return [dir, dep, has, shouldHave] if (depth > npm.config.get("depth")) { return cb(null, []) } var deps = null readJson(path.resolve(dir, "package.json"), function (er, d) { d = d || {} if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) deps = (er) ? true : (d.dependencies || {}) if (npm.config.get("save-dev")) { deps = d.devDependencies || {} return next() } if (npm.config.get("save")) { // remove optional dependencies from dependencies during --save. Object.keys(d.optionalDependencies || {}).forEach(function (k) { delete deps[k] }) return next() } if (npm.config.get("save-optional")) { deps = d.optionalDependencies || {} return next() } var doUpdate = npm.config.get("dev") || (!npm.config.get("production") && !Object.keys(parentHas).length && !npm.config.get("global")) if (!er && d && doUpdate) { Object.keys(d.devDependencies || {}).forEach(function (k) { if (!(k in parentHas)) { deps[k] = d.devDependencies[k] } }) } return next() }) var has = null readInstalled(path.resolve(dir), { dev : true }, function (er, data) { if (er) { has = Object.create(parentHas) return next() } var pkgs = Object.keys(data.dependencies) pkgs = pkgs.filter(function (p) { return !p.match(/^[\._-]/) }) asyncMap(pkgs, function (pkg, cb) { var jsonFile = path.resolve(dir, "node_modules", pkg, "package.json") readJson(jsonFile, function (er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (d && d.name && d.private) delete deps[d.name] cb(null, er ? [] : [[d.name, d.version, d._from]]) }) }, function (er, pvs) { if (er) return cb(er) has = Object.create(parentHas) pvs.forEach(function (pv) { has[pv[0]] = { version: pv[1], from: pv[2] } }) next() }) }) function next () { if (!has || !deps) return if (deps === true) { deps = Object.keys(has).reduce(function (l, r) { l[r] = "latest" return l }, {}) } // now get what we should have, based on the dep. // if has[dep] !== shouldHave[dep], then cb with the data // otherwise dive into the folder asyncMap(Object.keys(deps), function (dep, cb) { shouldUpdate(args, dir, dep, has, deps[dep], depth, cb) }, cb) } } function shouldUpdate (args, dir, dep, has, req, depth, cb) { // look up the most recent version. // if that's what we already have, or if it's not on the args list, // then dive into it. Otherwise, cb() with the data. 
// { version: , from: } var curr = has[dep] function skip (er) { // show user that no viable version can be found if (er) return cb(er) outdated_( args , path.resolve(dir, "node_modules", dep) , has , depth + 1 , cb ) } function doIt (wanted, latest) { cb(null, [[ dir, dep, curr && curr.version, wanted, latest, req ]]) } if (args.length && args.indexOf(dep) === -1) { return skip() } if (npa(req).type === "git") return doIt("git", "git") // search for the latest package mapToRegistry(dep, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, updateDeps) }) function updateDeps (er, d) { if (er) return cb() if (!d || !d["dist-tags"] || !d.versions) return cb() var l = d.versions[d["dist-tags"].latest] if (!l) return cb() var r = req if (d["dist-tags"][req]) r = d["dist-tags"][req] if (semver.validRange(r, true)) { // some kind of semver range. // see if it's in the doc. var vers = Object.keys(d.versions) var v = semver.maxSatisfying(vers, r, true) if (v) { return onCacheAdd(null, d.versions[v]) } } // We didn't find the version in the doc. See if cache can find it. cache.add(dep, req, null, false, onCacheAdd) function onCacheAdd(er, d) { // if this fails, then it means we can't update this thing. // it's probably a thing that isn't published. if (er) { if (er.code && er.code === "ETARGET") { // no viable version found return skip(er) } return skip() } // check that the url origin hasn't changed (#1727) and that // there is no newer version available var dFromUrl = d._from && url.parse(d._from).protocol var cFromUrl = curr && curr.from && url.parse(curr.from).protocol if (!curr || dFromUrl && cFromUrl && d._from !== curr.from || d.version !== curr.version || d.version !== l.version) doIt(d.version, l.version) else skip() } } } �������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/owner.js��������������������������������������������000644 �000766 �000024 �00000017113 12455173731 023143� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = owner owner.usage = "npm owner add <username> <pkg>" + "\nnpm owner rm <username> <pkg>" + "\nnpm owner ls <pkg>" var npm = require("./npm.js") , log = require("npmlog") , readJson = require("read-package-json") , mapToRegistry = require("./utils/map-to-registry.js") owner.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv.length > 4) return cb() if (argv.length <= 2) { var subs = ["add", "rm"] if (opts.partialWord === "l") subs.push("ls") else subs.push("ls", "list") return cb(null, subs) } npm.commands.whoami([], true, function (er, username) { if (er) return cb() var un = encodeURIComponent(username) var byUser, theUser switch (argv[2]) { case "ls": if (argv.length > 3) return cb() return mapToRegistry("-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, cb) }) case "rm": if (argv.length > 3) { theUser = encodeURIComponent(argv[3]) byUser = "-/by-user/" + theUser + "|" + un return mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) console.error(uri) npm.registry.get(uri, 
{ auth : auth }, function (er, d) { if (er) return cb(er) // return the intersection return cb(null, d[theUser].filter(function (p) { // kludge for server adminery. return un === "isaacs" || d[un].indexOf(p) === -1 })) }) }) } // else fallthrough case "add": if (argv.length > 3) { theUser = encodeURIComponent(argv[3]) byUser = "-/by-user/" + theUser + "|" + un return mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) console.error(uri) npm.registry.get(uri, { auth : auth }, function (er, d) { console.error(uri, er || d) // return mine that they're not already on. if (er) return cb(er) var mine = d[un] || [] , theirs = d[theUser] || [] return cb(null, mine.filter(function (p) { return theirs.indexOf(p) === -1 })) }) }) } // just list all users who aren't me. return mapToRegistry("-/users", npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, list) { if (er) return cb() return cb(null, Object.keys(list).filter(function (n) { return n !== un })) }) }) default: return cb() } }) } function owner (args, cb) { var action = args.shift() switch (action) { case "ls": case "list": return ls(args[0], cb) case "add": return add(args[0], args[1], cb) case "rm": case "remove": return rm(args[0], args[1], cb) default: return unknown(action, cb) } } function ls (pkg, cb) { if (!pkg) return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(owner.usage) ls(pkg, cb) }) mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, data) { var msg = "" if (er) { log.error("owner ls", "Couldn't get owner data", pkg) return cb(er) } var owners = data.maintainers if (!owners || !owners.length) msg = "admin party!" else msg = owners.map(function (o) { return o.name + " <" + o.email + ">" }).join("\n") console.log(msg) cb(er, owners) }) }) } function add (user, pkg, cb) { if (!user) return cb(owner.usage) if (!pkg) return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(new Error(owner.usage)) add(user, pkg, cb) }) log.verbose("owner add", "%s to %s", user, pkg) mutate(pkg, user, function (u, owners) { if (!owners) owners = [] for (var i = 0, l = owners.length; i < l; i ++) { var o = owners[i] if (o.name === u.name) { log.info( "owner add" , "Already a package owner: " + o.name + " <" + o.email + ">") return false } } owners.push(u) return owners }, cb) } function rm (user, pkg, cb) { if (!pkg) return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(new Error(owner.usage)) rm(user, pkg, cb) }) log.verbose("owner rm", "%s from %s", user, pkg) mutate(pkg, user, function (u, owners) { var found = false , m = owners.filter(function (o) { var match = (o.name === user) found = found || match return !match }) if (!found) { log.info("owner rm", "Not a package owner: " + user) return false } if (!m.length) return new Error( "Cannot remove all owners of a package. 
Add someone else first.") return m }, cb) } function mutate (pkg, user, mutation, cb) { if (user) { var byUser = "-/user/org.couchdb.user:" + user mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, mutate_) }) } else { mutate_(null, null) } function mutate_ (er, u) { if (!er && user && (!u || u.error)) er = new Error( "Couldn't get user data for " + user + ": " + JSON.stringify(u)) if (er) { log.error("owner mutate", "Error getting user data for %s", user) return cb(er) } if (u) u = { "name" : u.name, "email" : u.email } mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, data) { if (er) { log.error("owner mutate", "Error getting package data for %s", pkg) return cb(er) } // save the number of maintainers before mutation so that we can figure // out if maintainers were added or removed var beforeMutation = data.maintainers.length var m = mutation(u, data.maintainers) if (!m) return cb() // handled if (m instanceof Error) return cb(m) // error data = { _id : data._id, _rev : data._rev, maintainers : m } var dataPath = pkg.replace("/", "%2f") + "/-rev/" + data._rev mapToRegistry(dataPath, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { method : "PUT", body : data, auth : auth } npm.registry.request(uri, params, function (er, data) { if (!er && data.error) { er = new Error("Failed to update package metadata: "+JSON.stringify(data)) } if (er) { log.error("owner mutate", "Failed to update package metadata") } else if (m.length > beforeMutation) { console.log("+ %s (%s)", user, pkg) } else if (m.length < beforeMutation) { console.log("- %s (%s)", user, pkg) } cb(er, data) }) }) }) }) } } function readLocalPkg (cb) { if (npm.config.get("global")) return cb() var path = require("path") readJson(path.resolve(npm.prefix, "package.json"), function (er, d) { return cb(er, d && d.name) }) } function unknown (action, cb) { cb("Usage: \n" + owner.usage) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/pack.js���������������������������������������������000644 �000766 �000024 �00000003353 12455173731 022730� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm pack <pkg> // Packs the specified package into a .tgz file, which can then // be installed. module.exports = pack var npm = require("./npm.js") , install = require("./install.js") , cache = require("./cache.js") , fs = require("graceful-fs") , chain = require("slide").chain , path = require("path") , cwd = process.cwd() , writeStream = require('fs-write-stream-atomic') , cachedPackageRoot = require("./cache/cached-package-root.js") pack.usage = "npm pack <pkg>" // if it can be installed, it can be packed. 
pack.completion = install.completion function pack (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (args.length === 0) args = ["."] chain(args.map(function (arg) { return function (cb) { pack_(arg, cb) }}), function (er, files) { if (er || silent) return cb(er, files) printFiles(files, cb) }) } function printFiles (files, cb) { files = files.map(function (file) { return path.relative(cwd, file) }) console.log(files.join("\n")) cb() } // add to cache, then cp to the cwd function pack_ (pkg, cb) { cache.add(pkg, null, null, false, function (er, data) { if (er) return cb(er) // scoped packages get special treatment var name = data.name if (name[0] === "@") name = name.substr(1).replace(/\//g, "-") var fname = name + "-" + data.version + ".tgz" var cached = path.join(cachedPackageRoot(data), "package.tgz") , from = fs.createReadStream(cached) , to = writeStream(fname) , errState = null from.on("error", cb_) to.on("error", cb_) to.on("close", cb_) from.pipe(to) function cb_ (er) { if (errState) return if (er) return cb(errState = er) cb(null, fname) } }) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/prefix.js�������������������������������������������000644 �000766 �000024 �00000000503 12455173731 023301� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = prefix var npm = require("./npm.js") prefix.usage = "npm prefix\nnpm prefix -g\n(just prints the prefix folder)" function prefix (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (!silent) console.log(npm.prefix) process.nextTick(cb.bind(this, null, npm.prefix)) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/prune.js��������������������������������������������000644 �000766 �000024 �00000002745 12455173731 023147� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// prune extraneous packages. 
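// --- Editorial sketch, not part of npm: pack_() in pack.js above derives the
// --- output tarball name from the cached package data, flattening scoped names
// --- such as "@scope/pkg" into "scope-pkg". Standalone (names illustrative):
function tarballName (name, version) {
  if (name[0] === "@") name = name.substr(1).replace(/\//g, "-")
  return name + "-" + version + ".tgz"
}
// tarballName("@myorg/widget", "1.2.3") -> "myorg-widget-1.2.3.tgz"   (hypothetical package)
// tarballName("foo", "0.1.0")           -> "foo-0.1.0.tgz"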
module.exports = prune prune.usage = "npm prune" var readInstalled = require("read-installed") , npm = require("./npm.js") , path = require("path") , readJson = require("read-package-json") , log = require("npmlog") prune.completion = require("./utils/completion/installed-deep.js") function prune (args, cb) { //check if is a valid package.json file var jsonFile = path.resolve(npm.dir, "..", "package.json" ) readJson(jsonFile, log.warn, function (er) { if (er) return cb(er) next() }) function next() { var opt = { depth: npm.config.get("depth"), dev: !npm.config.get("production") || npm.config.get("dev") } readInstalled(npm.prefix, opt, function (er, data) { if (er) return cb(er) prune_(args, data, cb) }) } } function prune_ (args, data, cb) { npm.commands.unbuild(prunables(args, data, []), cb) } function prunables (args, data, seen) { var deps = data.dependencies || {} return Object.keys(deps).map(function (d) { if (typeof deps[d] !== "object" || seen.indexOf(deps[d]) !== -1) return null seen.push(deps[d]) if (deps[d].extraneous && (args.length === 0 || args.indexOf(d) !== -1)) { var extra = deps[d] delete deps[d] return extra.path } return prunables(args, deps[d], seen) }).filter(function (d) { return d !== null }) .reduce(function FLAT (l, r) { return l.concat(Array.isArray(r) ? r.reduce(FLAT,[]) : r) }, []) } ���������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/publish.js������������������������������������������000644 �000766 �000024 �00000011234 12455173731 023455� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = publish var npm = require("./npm.js") , log = require("npmlog") , path = require("path") , readJson = require("read-package-json") , lifecycle = require("./utils/lifecycle.js") , chain = require("slide").chain , Conf = require("./config/core.js").Conf , CachingRegClient = require("./cache/caching-client.js") , mapToRegistry = require("./utils/map-to-registry.js") , cachedPackageRoot = require("./cache/cached-package-root.js") , createReadStream = require("graceful-fs").createReadStream publish.usage = "npm publish <tarball>" + "\nnpm publish <folder>" + "\n\nPublishes '.' if no argument supplied" publish.completion = function (opts, cb) { // publish can complete to a folder with a package.json // or a tarball, or a tarball url. // for now, not yet implemented. return cb() } function publish (args, isRetry, cb) { if (typeof cb !== "function") { cb = isRetry isRetry = false } if (args.length === 0) args = ["."] if (args.length !== 1) return cb(publish.usage) log.verbose("publish", args) var arg = args[0] // if it's a local folder, then run the prepublish there, first. readJson(path.resolve(arg, "package.json"), function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (data) { if (!data.name) return cb(new Error("No name provided")) if (!data.version) return cb(new Error("No version provided")) } // Error is OK. Could be publishing a URL or tarball, however, that means // that we will not have automatically run the prepublish script, since // that gets run when adding a folder to the cache. 
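// --- Editorial sketch, not part of npm: prunables() in prune.js above walks the
// --- tree produced by read-installed, collects the paths of packages flagged
// --- "extraneous", and flattens the nested results while tracking visited nodes
// --- to avoid cycles. The same walk over a plain object tree (name illustrative):
function collectExtraneous (node, seen) {
  seen = seen || []
  var deps = node.dependencies || {}
  return Object.keys(deps).reduce(function (acc, name) {
    var dep = deps[name]
    if (typeof dep !== "object" || seen.indexOf(dep) !== -1) return acc
    seen.push(dep)
    if (dep.extraneous) return acc.concat(dep.path)
    return acc.concat(collectExtraneous(dep, seen))
  }, [])
}
// collectExtraneous({ dependencies: { a: { path: "/x/a", extraneous: true, dependencies: {} } } })
//   -> [ "/x/a" ]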
if (er) return cacheAddPublish(arg, false, isRetry, cb) else cacheAddPublish(arg, true, isRetry, cb) }) } // didPre in this case means that we already ran the prepublish script, // and that the "dir" is an actual directory, and not something silly // like a tarball or name@version thing. // That means that we can run publish/postpublish in the dir, rather than // in the cache dir. function cacheAddPublish (dir, didPre, isRetry, cb) { npm.commands.cache.add(dir, null, null, false, function (er, data) { if (er) return cb(er) log.silly("publish", data) var cachedir = path.resolve(cachedPackageRoot(data), "package") chain([ !didPre && [lifecycle, data, "prepublish", cachedir] , [publish_, dir, data, isRetry, cachedir] , [lifecycle, data, "publish", didPre ? dir : cachedir] , [lifecycle, data, "postpublish", didPre ? dir : cachedir] ] , cb ) }) } function publish_ (arg, data, isRetry, cachedir, cb) { if (!data) return cb(new Error("no package.json file found")) var registry = npm.registry var config = npm.config // check for publishConfig hash if (data.publishConfig) { config = new Conf(npm.config) config.save = npm.config.save.bind(npm.config) // don't modify the actual publishConfig object, in case we have // to set a login token or some other data. config.unshift(Object.keys(data.publishConfig).reduce(function (s, k) { s[k] = data.publishConfig[k] return s }, {})) registry = new CachingRegClient(config) } data._npmVersion = npm.version data._nodeVersion = process.versions.node delete data.modules if (data.private) return cb( new Error( "This package has been marked as private\n" + "Remove the 'private' field from the package.json to publish it." ) ) mapToRegistry(data.name, config, function (er, registryURI, auth, registryBase) { if (er) return cb(er) var tarballPath = cachedir + ".tgz" // we just want the base registry URL in this case log.verbose("publish", "registryBase", registryBase) log.silly("publish", "uploading", tarballPath) data._npmUser = { name : auth.username, email : auth.email } var params = { metadata : data, body : createReadStream(tarballPath), auth : auth } registry.publish(registryBase, params, function (er) { if (er && er.code === "EPUBLISHCONFLICT" && npm.config.get("force") && !isRetry) { log.warn("publish", "Forced publish over " + data._id) return npm.commands.unpublish([data._id], function (er) { // ignore errors. Use the force. Reach out with your feelings. // but if it fails again, then report the first error. 
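// --- Editorial sketch, not part of npm: publish_() above layers any
// --- "publishConfig" from package.json on top of the regular config before
// --- talking to the registry; the layering is a key-by-key copy into an
// --- override object that gets unshifted onto the config chain:
function publishOverrides (publishConfig) {
  return Object.keys(publishConfig || {}).reduce(function (s, k) {
    s[k] = publishConfig[k]
    return s
  }, {})
}
// publishOverrides({ registry: "https://registry.example.com/" })
//   -> { registry: "https://registry.example.com/" }      (URL is an example only)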
publish([arg], er || true, cb) }) } // report the unpublish error if this was a retry and unpublish failed if (er && isRetry && isRetry !== true) return cb(isRetry) if (er) return cb(er) console.log("+ " + data._id) cb() }) }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/rebuild.js������������������������������������������000644 �000766 �000024 �00000004054 12455173731 023437� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = rebuild var readInstalled = require("read-installed") , semver = require("semver") , log = require("npmlog") , npm = require("./npm.js") , npa = require("npm-package-arg") rebuild.usage = "npm rebuild [<name>[@<version>] [name[@<version>] ...]]" rebuild.completion = require("./utils/completion/installed-deep.js") function rebuild (args, cb) { var opt = { depth: npm.config.get("depth"), dev: true } readInstalled(npm.prefix, opt, function (er, data) { log.info("readInstalled", typeof data) if (er) return cb(er) var set = filter(data, args) , folders = Object.keys(set).filter(function (f) { return f !== npm.prefix }) if (!folders.length) return cb() log.silly("rebuild set", folders) cleanBuild(folders, set, cb) }) } function cleanBuild (folders, set, cb) { npm.commands.build(folders, function (er) { if (er) return cb(er) console.log(folders.map(function (f) { return set[f] + " " + f }).join("\n")) cb() }) } function filter (data, args, set, seen) { if (!set) set = {} if (!seen) seen = {} if (set.hasOwnProperty(data.path)) return set if (seen.hasOwnProperty(data.path)) return set seen[data.path] = true var pass if (!args.length) pass = true // rebuild everything else if (data.name && data._id) { for (var i = 0, l = args.length; i < l; i ++) { var arg = args[i] , nv = npa(arg) , n = nv.name , v = nv.rawSpec if (n !== data.name) continue if (!semver.satisfies(data.version, v, true)) continue pass = true break } } if (pass && data._id) { log.verbose("rebuild", "path, id", [data.path, data._id]) set[data.path] = data._id } // need to also dive through kids, always. // since this isn't an install these won't get auto-built unless // they're not dependencies. 
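// --- Editorial sketch, not part of npm: filter() in rebuild.js above matches
// --- installed packages against "name@range" arguments via semver. The same
// --- check, standalone, using the semver package but a naive name@range split
// --- instead of npm-package-arg (function name and examples are illustrative):
var semver = require("semver")
function matchesArg (name, version, arg) {
  var at = arg.lastIndexOf("@")
  var argName  = at > 0 ? arg.slice(0, at) : arg
  var argRange = at > 0 ? arg.slice(at + 1) : ""
  if (argName !== name) return false
  if (argRange === "") return true                   // bare name matches any version
  return semver.satisfies(version, argRange, true)
}
// matchesArg("tape", "3.0.3", "tape@^3.0.0") -> true
// matchesArg("tape", "2.0.0", "tape@^3.0.0") -> false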
Object.keys(data.dependencies || {}).forEach(function (d) { // return var dep = data.dependencies[d] if (typeof dep === "string") return filter(dep, args, set, seen) }) return set } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/repo.js���������������������������������������������000644 �000766 �000024 �00000004376 12455173731 022765� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = repo repo.usage = "npm repo <pkgname>" var npm = require("./npm.js") , opener = require("opener") , github = require("github-url-from-git") , githubUserRepo = require("github-url-from-username-repo") , path = require("path") , readJson = require("read-package-json") , fs = require("fs") , url_ = require("url") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") repo.completion = function (opts, cb) { if (opts.conf.argv.remain.length > 2) return cb() mapToRegistry("/-/short", npm.config, function (er, uri) { if (er) return cb(er) npm.registry.get(uri, { timeout : 60000 }, function (er, list) { return cb(null, list || []) }) }) } function repo (args, cb) { var n = args.length && npa(args[0]).name || "." fs.stat(n, function (er, s) { if (er && er.code === "ENOENT") return callRegistry(n, cb) else if (er) return cb(er) if (!s.isDirectory()) return callRegistry(n, cb) readJson(path.resolve(n, "package.json"), function (er, d) { if (er) return cb(er) getUrlAndOpen(d, cb) }) }) } function getUrlAndOpen (d, cb) { var r = d.repository if (!r) return cb(new Error("no repository")) // XXX remove this when npm@v1.3.10 from node 0.10 is deprecated // from https://github.com/npm/npm-www/issues/418 if (githubUserRepo(r.url)) r.url = githubUserRepo(r.url) var url = (r.url && ~r.url.indexOf("github")) ? github(r.url) : nonGithubUrl(r.url) if (!url) return cb(new Error("no repository: could not get url")) opener(url, { command: npm.config.get("browser") }, cb) } function callRegistry (n, cb) { mapToRegistry(n, npm.config, function (er, uri) { if (er) return cb(er) npm.registry.get(uri + "/latest", { timeout : 3600 }, function (er, d) { if (er) return cb(er) getUrlAndOpen(d, cb) }) }) } function nonGithubUrl (url) { try { var idx = url.indexOf("@") if (idx !== -1) { url = url.slice(idx+1).replace(/:([^\d]+)/, "/$1") } url = url_.parse(url) var protocol = url.protocol === "https:" ? 
"https:" : "http:" return protocol + "//" + (url.host || "") + url.path.replace(/\.git$/, "") } catch(e) {} } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/restart.js������������������������������������������000644 �000766 �000024 �00000000100 12455173731 023461� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./utils/lifecycle.js").cmd("restart") ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/root.js���������������������������������������������000644 �000766 �000024 �00000000461 12455173731 022772� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = root var npm = require("./npm.js") root.usage = "npm root\nnpm root -g\n(just prints the root folder)" function root (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (!silent) console.log(npm.dir) process.nextTick(cb.bind(this, null, npm.dir)) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/run-script.js���������������������������������������000644 �000766 �000024 �00000010605 12455173731 024116� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = runScript var lifecycle = require("./utils/lifecycle.js") , npm = require("./npm.js") , path = require("path") , readJson = require("read-package-json") , log = require("npmlog") , chain = require("slide").chain runScript.usage = "npm run-script <command> [-- <args>]" runScript.completion = function (opts, cb) { // see if there's already a package specified. 
var argv = opts.conf.argv.remain , installedShallow = require("./utils/completion/installed-shallow.js") if (argv.length >= 4) return cb() if (argv.length === 3) { // either specified a script locally, in which case, done, // or a package, in which case, complete against its scripts var json = path.join(npm.localPrefix, "package.json") return readJson(json, function (er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) d = {} var scripts = Object.keys(d.scripts || {}) console.error("local scripts", scripts) if (scripts.indexOf(argv[2]) !== -1) return cb() // ok, try to find out which package it was, then var pref = npm.config.get("global") ? npm.config.get("prefix") : npm.localPrefix var pkgDir = path.resolve( pref, "node_modules" , argv[2], "package.json" ) readJson(pkgDir, function (er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) d = {} var scripts = Object.keys(d.scripts || {}) return cb(null, scripts) }) }) } // complete against the installed-shallow, and the pwd's scripts. // but only packages that have scripts var installed , scripts installedShallow(opts, function (d) { return d.scripts }, function (er, inst) { installed = inst next() }) if (npm.config.get("global")) { scripts = [] next() } else readJson(path.join(npm.localPrefix, "package.json"), function (er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) d = d || {} scripts = Object.keys(d.scripts || {}) next() }) function next () { if (!installed || !scripts) return cb(null, scripts.concat(installed)) } } function runScript (args, cb) { if (!args.length) return list(cb) var pkgdir = npm.localPrefix , cmd = args.shift() readJson(path.resolve(pkgdir, "package.json"), function (er, d) { if (er) return cb(er) run(d, pkgdir, cmd, args, cb) }) } function list(cb) { var json = path.join(npm.localPrefix, "package.json") return readJson(json, function(er, d) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) d = {} var scripts = Object.keys(d.scripts || {}) if (log.level === "silent") { return cb(null, scripts) } if (npm.config.get("json")) { console.log(JSON.stringify(d.scripts || {}, null, 2)) return cb(null, scripts) } var s = ":" var prefix = "" if (!npm.config.get("parseable")) { s = "\n " prefix = " " console.log("Available scripts in the %s package:", d.name) } scripts.forEach(function(script) { console.log(prefix + script + s + d.scripts[script]) }) return cb(null, scripts) }) } function run (pkg, wd, cmd, args, cb) { if (!pkg.scripts) pkg.scripts = {} var cmds if (cmd === "restart") { cmds = [ "prestop", "stop", "poststop", "restart", "prestart", "start", "poststart" ] } else { if (!pkg.scripts[cmd]) { if (cmd === "test") { pkg.scripts.test = "echo \"Error: no test specified\""; } else { return cb(new Error("missing script: " + cmd)); } } cmds = [cmd] } if (!cmd.match(/^(pre|post)/)) { cmds = ["pre"+cmd].concat(cmds).concat("post"+cmd) } log.verbose("run-script", cmds) chain(cmds.map(function (c) { // pass cli arguments after -- to script. if (pkg.scripts[c] && c === cmd) pkg.scripts[c] = pkg.scripts[c] + joinArgs(args) // when running scripts explicitly, assume that they're trusted. return [lifecycle, pkg, c, wd, true] }), cb) } // join arguments after '--' and pass them to script, // handle special characters such as ', ", ' '. 
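// --- Editorial sketch, not part of npm: run() in run-script.js above expands a
// --- script name into its pre/post hooks unless the name itself already starts
// --- with "pre" or "post". Standalone (function name illustrative):
function expandHooks (cmd) {
  var cmds = [cmd]
  if (!cmd.match(/^(pre|post)/)) {
    cmds = ["pre" + cmd].concat(cmds).concat("post" + cmd)
  }
  return cmds
}
// expandHooks("test")    -> [ "pretest", "test", "posttest" ]
// expandHooks("pretest") -> [ "pretest" ]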
function joinArgs (args) { var joinedArgs = "" args.forEach(function(arg) { if (arg.match(/[ '"]/)) arg = '"' + arg.replace(/"/g, '\\"') + '"' joinedArgs += " " + arg }) return joinedArgs } ���������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/search.js�������������������������������������������000644 �000766 �000024 �00000017657 12455173731 023273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = exports = search var npm = require("./npm.js") , columnify = require("columnify") , mapToRegistry = require("./utils/map-to-registry.js") , updateIndex = require("./cache/update-index.js") search.usage = "npm search [some search terms ...]" search.completion = function (opts, cb) { var compl = {} , partial = opts.partialWord , ipartial = partial.toLowerCase() , plen = partial.length // get the batch of data that matches so far. // this is an example of using npm.commands.search programmatically // to fetch data that has been filtered by a set of arguments. search(opts.conf.argv.remain.slice(2), true, function (er, data) { if (er) return cb(er) Object.keys(data).forEach(function (name) { data[name].words.split(" ").forEach(function (w) { if (w.toLowerCase().indexOf(ipartial) === 0) { compl[partial + w.substr(plen)] = true } }) }) cb(null, Object.keys(compl)) }) } function search (args, silent, staleness, cb) { if (typeof cb !== "function") cb = staleness, staleness = 600 if (typeof cb !== "function") cb = silent, silent = false var searchopts = npm.config.get("searchopts") , searchexclude = npm.config.get("searchexclude") if (typeof searchopts !== "string") searchopts = "" searchopts = searchopts.split(/\s+/) if (typeof searchexclude === "string") { searchexclude = searchexclude.split(/\s+/) } else searchexclude = [] var opts = searchopts.concat(args).map(function (s) { return s.toLowerCase() }).filter(function (s) { return s }) searchexclude = searchexclude.map(function (s) { return s.toLowerCase() }) getFilteredData( staleness, opts, searchexclude, function (er, data) { // now data is the list of data that we want to show. // prettify and print it, and then provide the raw // data to the cb. if (er || silent) return cb(er, data) console.log(prettify(data, args)) cb(null, data) }) } function getFilteredData (staleness, args, notArgs, cb) { mapToRegistry("-/all", npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { timeout : staleness, follow : true, staleOk : true, auth : auth } updateIndex(uri, params, function (er, data) { if (er) return cb(er) return cb(null, filter(data, args, notArgs)) }) }) } function filter (data, args, notArgs) { // data={<name>:{package data}} return Object.keys(data).map(function (d) { return data[d] }).filter(function (d) { return typeof d === "object" }).map(stripData).map(getWords).filter(function (data) { return filterWords(data, args, notArgs) }).reduce(function (l, r) { l[r.name] = r return l }, {}) } function stripData (data) { return { name: data.name , description: npm.config.get("description") ? 
data.description : "" , maintainers: (data.maintainers || []).map(function (m) { return "=" + m.name }) , url: !Object.keys(data.versions || {}).length ? data.url : null , keywords: data.keywords || [] , version: Object.keys(data.versions || {})[0] || [] , time: data.time && data.time.modified && (new Date(data.time.modified).toISOString() .split("T").join(" ") .replace(/:[0-9]{2}\.[0-9]{3}Z$/, "")) .slice(0, -5) // remove time || "prehistoric" } } function getWords (data) { data.words = [ data.name ] .concat(data.description) .concat(data.maintainers) .concat(data.url && ("<" + data.url + ">")) .concat(data.keywords) .map(function (f) { return f && f.trim && f.trim() }) .filter(function (f) { return f }) .join(" ") .toLowerCase() return data } function filterWords (data, args, notArgs) { var words = data.words for (var i = 0, l = args.length; i < l; i ++) { if (!match(words, args[i])) return false } for (i = 0, l = notArgs.length; i < l; i ++) { if (match(words, notArgs[i])) return false } return true } function match (words, arg) { if (arg.charAt(0) === "/") { arg = arg.replace(/\/$/, "") arg = new RegExp(arg.substr(1, arg.length - 1)) return words.match(arg) } return words.indexOf(arg) !== -1 } function prettify (data, args) { var searchsort = (npm.config.get("searchsort") || "NAME").toLowerCase() , sortField = searchsort.replace(/^\-+/, "") , searchRev = searchsort.charAt(0) === "-" , truncate = !npm.config.get("long") if (Object.keys(data).length === 0) { return "No match found for "+(args.map(JSON.stringify).join(" ")) } var lines = Object.keys(data).map(function (d) { // strip keyname return data[d] }).map(function(dat) { dat.author = dat.maintainers delete dat.maintainers dat.date = dat.time delete dat.time return dat }).map(function(dat) { // split keywords on whitespace or , if (typeof dat.keywords === "string") { dat.keywords = dat.keywords.split(/[,\s]+/) } if (Array.isArray(dat.keywords)) { dat.keywords = dat.keywords.join(" ") } // split author on whitespace or , if (typeof dat.author === "string") { dat.author = dat.author.split(/[,\s]+/) } if (Array.isArray(dat.author)) { dat.author = dat.author.join(" ") } return dat }) lines.sort(function(a, b) { var aa = a[sortField].toLowerCase() , bb = b[sortField].toLowerCase() return aa === bb ? 0 : aa < bb ? -1 : 1 }) if (searchRev) lines.reverse() var columns = npm.config.get("description") ? 
["name", "description", "author", "date", "version", "keywords"] : ["name", "author", "date", "version", "keywords"] var output = columnify(lines, { include: columns , truncate: truncate , config: { name: { maxWidth: 40, truncate: false, truncateMarker: "" } , description: { maxWidth: 60 } , author: { maxWidth: 20 } , date: { maxWidth: 11 } , version: { maxWidth: 11 } , keywords: { maxWidth: Infinity } } }) output = trimToMaxWidth(output) output = highlightSearchTerms(output, args) return output } var colors = [31, 33, 32, 36, 34, 35 ] , cl = colors.length function addColorMarker (str, arg, i) { var m = i % cl + 1 , markStart = String.fromCharCode(m) , markEnd = String.fromCharCode(0) if (arg.charAt(0) === "/") { //arg = arg.replace(/\/$/, "") return str.replace( new RegExp(arg.substr(1, arg.length - 2), "gi") , function (bit) { return markStart + bit + markEnd } ) } // just a normal string, do the split/map thing var pieces = str.toLowerCase().split(arg.toLowerCase()) , p = 0 return pieces.map(function (piece) { piece = str.substr(p, piece.length) var mark = markStart + str.substr(p+piece.length, arg.length) + markEnd p += piece.length + arg.length return piece + mark }).join("") } function colorize (line) { for (var i = 0; i < cl; i ++) { var m = i + 1 var color = npm.color ? "\033["+colors[i]+"m" : "" line = line.split(String.fromCharCode(m)).join(color) } var uncolor = npm.color ? "\033[0m" : "" return line.split("\u0000").join(uncolor) } function getMaxWidth() { var cols try { var tty = require("tty") , stdout = process.stdout cols = !tty.isatty(stdout.fd) ? Infinity : process.stdout.getWindowSize()[0] cols = (cols === 0) ? Infinity : cols } catch (ex) { cols = Infinity } return cols } function trimToMaxWidth(str) { var maxWidth = getMaxWidth() return str.split("\n").map(function(line) { return line.slice(0, maxWidth) }).join("\n") } function highlightSearchTerms(str, terms) { terms.forEach(function (arg, i) { str = addColorMarker(str, arg, i) }) return colorize(str).trim() } ���������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/set.js����������������������������������������������000644 �000766 �000024 �00000000424 12455173731 022601� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = set set.usage = "npm set <key> <value> (See `npm config`)" var npm = require("./npm.js") set.completion = npm.commands.config.completion function set (args, cb) { if (!args.length) return cb(set.usage) npm.commands.config(["set"].concat(args), cb) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/shrinkwrap.js���������������������������������������000644 �000766 �000024 �00000004440 12455173731 024200� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
������������������������������������������������������������������������������������������������������������������������������������������������������������������������// emit JSON describing versions of all packages currently installed (for later // use with shrinkwrap install) module.exports = exports = shrinkwrap var npm = require("./npm.js") , log = require("npmlog") , fs = require("fs") , writeFileAtomic = require("write-file-atomic") , path = require("path") , readJson = require("read-package-json") , sortedObject = require("sorted-object") shrinkwrap.usage = "npm shrinkwrap" function shrinkwrap (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (args.length) { log.warn("shrinkwrap", "doesn't take positional args") } npm.commands.ls([], true, function (er, _, pkginfo) { if (er) return cb(er) shrinkwrap_(pkginfo, silent, npm.config.get("dev"), cb) }) } function shrinkwrap_ (pkginfo, silent, dev, cb) { if (pkginfo.problems) { return cb(new Error("Problems were encountered\n" +"Please correct and try again.\n" +pkginfo.problems.join("\n"))) } if (!dev) { // remove dev deps unless the user does --dev readJson(path.resolve(npm.prefix, "package.json"), function (er, data) { if (er) return cb(er) if (data.devDependencies) { Object.keys(data.devDependencies).forEach(function (dep) { if (data.dependencies && data.dependencies[dep]) { // do not exclude the dev dependency if it's also listed as a dependency return } log.warn("shrinkwrap", "Excluding devDependency: %s", dep) delete pkginfo.dependencies[dep] }) } save(pkginfo, silent, cb) }) } else { save(pkginfo, silent, cb) } } function save (pkginfo, silent, cb) { // copy the keys over in a well defined order // because javascript objects serialize arbitrarily pkginfo.dependencies = sortedObject(pkginfo.dependencies || {}) var swdata try { swdata = JSON.stringify(pkginfo, null, 2) + "\n" } catch (er) { log.error("shrinkwrap", "Error converting package info to json") return cb(er) } var file = path.resolve(npm.prefix, "npm-shrinkwrap.json") writeFileAtomic(file, swdata, function (er) { if (er) return cb(er) if (silent) return cb(null, pkginfo) console.log("wrote npm-shrinkwrap.json") cb(null, pkginfo) }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/star.js���������������������������������������������000644 �000766 �000024 �00000002366 12455173731 022766� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = star var npm = require("./npm.js") , log = require("npmlog") , asyncMap = require("slide").asyncMap , mapToRegistry = require("./utils/map-to-registry.js") star.usage = "npm star <package> [pkg, pkg, ...]\n" + "npm unstar <package> [pkg, pkg, ...]" star.completion = function (opts, cb) { mapToRegistry("-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { timeout : 60000, auth : auth } npm.registry.get(uri, params, function (er, list) { return cb(null, list || []) }) }) } function star (args, cb) { if (!args.length) return cb(star.usage) var 
s = npm.config.get("unicode") ? "\u2605 " : "(*)" , u = npm.config.get("unicode") ? "\u2606 " : "( )" , using = !(npm.command.match(/^un/)) if (!using) s = u asyncMap(args, function (pkg, cb) { mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { starred : using, auth : auth } npm.registry.star(uri, params, function (er, data, raw, req) { if (!er) { console.log(s + " "+pkg) log.verbose("star", data) } cb(er, data, raw, req) }) }) }, cb) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/stars.js��������������������������������������������000644 �000766 �000024 �00000001460 12455173731 023143� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = stars stars.usage = "npm stars [username]" var npm = require("./npm.js") , log = require("npmlog") , mapToRegistry = require("./utils/map-to-registry.js") function stars (args, cb) { npm.commands.whoami([], true, function (er, username) { var name = args.length === 1 ? args[0] : username mapToRegistry("", npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { username : name, auth : auth } npm.registry.stars(uri, params, showstars) }) }) function showstars (er, data) { if (er) return cb(er) if (data.rows.length === 0) { log.warn("stars", "user has not starred any packages.") } else { data.rows.forEach(function(a) { console.log(a.value) }) } cb() } } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/start.js��������������������������������������������000644 �000766 �000024 �00000000076 12455173731 023146� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = require("./utils/lifecycle.js").cmd("start") ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/stop.js���������������������������������������������000644 �000766 �000024 �00000000075 12455173731 022775� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 
module.exports = require("./utils/lifecycle.js").cmd("stop")

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/substack.js
module.exports = substack

var npm = require("./npm.js")

var isms =
  [ "\033[32mbeep \033[35mboop\033[m"
  , "Replace your configs with services"
  , "SEPARATE ALL THE CONCERNS!"
  , "MODULE ALL THE THINGS!"
  , "\\o/"
  , "but first, burritos"
  , "full time mad scientist here"
  , "c/,,\\" ]

function substack (args, cb) {
  var i = Math.floor(Math.random() * isms.length)
  console.log(isms[i])
  var c = args.shift()
  if (c) npm.commands[c](args, cb)
  else cb()
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/tag.js
// turns out tagging isn't very complicated
// all the smarts are in the couch.
module.exports = tag tag.usage = "npm tag <project>@<version> [<tag>]" tag.completion = require("./unpublish.js").completion var npm = require("./npm.js") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") , semver = require("semver") function tag (args, cb) { var thing = npa(args.shift() || "") , project = thing.name , version = thing.rawSpec , t = args.shift() || npm.config.get("tag") t = t.trim() if (!project || !version || !t) return cb("Usage:\n"+tag.usage) if (semver.validRange(t)) { var er = new Error("Tag name must not be a valid SemVer range: " + t) return cb(er) } mapToRegistry(project, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { version : version, tag : t, auth : auth } npm.registry.tag(uri, params, cb) }) } ������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/test.js���������������������������������������������000644 �000766 �000024 �00000000446 12455173731 022771� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = test var testCmd = require("./utils/lifecycle.js").cmd("test") function test (args, cb) { testCmd(args, function (er) { if (!er) return cb() if (er.code === "ELIFECYCLE") { return cb("Test failed. See above for more details.") } return cb(er) }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/unbuild.js������������������������������������������000644 �000766 �000024 �00000007144 12455173731 023456� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = unbuild unbuild.usage = "npm unbuild <folder>\n(this is plumbing)" var readJson = require("read-package-json") , gentlyRm = require("./utils/gently-rm.js") , npm = require("./npm.js") , path = require("path") , lifecycle = require("./utils/lifecycle.js") , asyncMap = require("slide").asyncMap , chain = require("slide").chain , log = require("npmlog") , build = require("./build.js") // args is a list of folders. // remove any bins/etc, and then delete the folder. function unbuild (args, silent, cb) { if (typeof silent === "function") cb = silent, silent = false asyncMap(args, unbuild_(silent), cb) } function unbuild_ (silent) { return function (folder, cb_) { function cb (er) { cb_(er, path.relative(npm.root, folder)) } folder = path.resolve(folder) delete build._didBuild[folder] log.verbose("unbuild", folder.substr(npm.prefix.length + 1)) readJson(path.resolve(folder, "package.json"), function (er, pkg) { // if no json, then just trash it, but no scripts or whatever. 
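// --- Editorial sketch, not part of npm: tag() in tag.js above rejects tag names
// --- that parse as semver ranges, since those would be ambiguous with version
// --- specifiers on install. Standalone check using the semver package
// --- (function name illustrative):
var semver = require("semver")
function isUsableTag (t) {
  return !semver.validRange(t)
}
// isUsableTag("latest") -> true
// isUsableTag("1.x")    -> false   ("1.x" is a valid range, so it cannot be a tag)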
if (er) return gentlyRm(folder, false, npm.prefix, cb) readJson.cache.del(folder) chain ( [ [lifecycle, pkg, "preuninstall", folder, false, true] , [lifecycle, pkg, "uninstall", folder, false, true] , !silent && function(cb) { console.log("unbuild " + pkg._id) cb() } , [rmStuff, pkg, folder] , [lifecycle, pkg, "postuninstall", folder, false, true] , [gentlyRm, folder, false, npm.prefix] ] , cb ) }) }} function rmStuff (pkg, folder, cb) { // if it's global, and folder is in {prefix}/node_modules, // then bins are in {prefix}/bin // otherwise, then bins are in folder/../.bin var parent = path.dirname(folder) , gnm = npm.dir , top = gnm === parent readJson.cache.del(path.resolve(folder, "package.json")) log.verbose("unbuild rmStuff", pkg._id, "from", gnm) if (!top) log.verbose("unbuild rmStuff", "in", parent) asyncMap([rmBins, rmMans], function (fn, cb) { fn(pkg, folder, parent, top, cb) }, cb) } function rmBins (pkg, folder, parent, top, cb) { if (!pkg.bin) return cb() var binRoot = top ? npm.bin : path.resolve(parent, ".bin") asyncMap(Object.keys(pkg.bin), function (b, cb) { if (process.platform === "win32") { chain([ [gentlyRm, path.resolve(binRoot, b) + ".cmd", true] , [gentlyRm, path.resolve(binRoot, b), true] ], cb) } else { gentlyRm(path.resolve(binRoot, b), true, cb) } }, cb) } function rmMans (pkg, folder, parent, top, cb) { if (!pkg.man || !top || process.platform === "win32" || !npm.config.get("global")) { return cb() } var manRoot = path.resolve(npm.config.get("prefix"), "share", "man") log.verbose("rmMans", "man files are", pkg.man, "in", manRoot) asyncMap(pkg.man, function (man, cb) { if (Array.isArray(man)) { man.forEach(rmMan) } else { rmMan(man) } function rmMan (man) { log.silly("rmMan", "preparing to remove", man) var parseMan = man.match(/(.*\.([0-9]+)(\.gz)?)$/) if (!parseMan) { log.error( "rmMan", man, "is not a valid name for a man file.", "Man files must end with a number, " + "and optionally a .gz suffix if they are compressed." ) return cb() } var stem = parseMan[1] var sxn = parseMan[2] var gz = parseMan[3] || "" var bn = path.basename(stem) var manDest = path.join( manRoot, "man"+sxn, (bn.indexOf(pkg.name) === 0 ? bn : pkg.name+"-"+bn)+"."+sxn+gz ) gentlyRm(manDest, true, cb) } }, cb) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/uninstall.js����������������������������������������000644 �000766 �000024 �00000007526 12455173731 024031� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // remove a package. 
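// --- Editorial sketch, not part of npm: rmMans() in unbuild.js above parses a
// --- man page filename into its stem, section number, and optional .gz suffix
// --- before computing which installed file to remove. Standalone
// --- (function name illustrative):
function parseManName (man) {
  var m = man.match(/(.*\.([0-9]+)(\.gz)?)$/)
  if (!m) return null                      // not a valid man file name
  return { stem: m[1], section: m[2], gz: m[3] || "" }
}
// parseManName("doc/foo.1")    -> { stem: "doc/foo.1",    section: "1", gz: "" }
// parseManName("doc/foo.1.gz") -> { stem: "doc/foo.1.gz", section: "1", gz: ".gz" }
// parseManName("README.md")    -> null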
module.exports = uninstall uninstall.usage = "npm uninstall <name>[@<version> [<name>[@<version>] ...]" + "\nnpm rm <name>[@<version> [<name>[@<version>] ...]" uninstall.completion = require("./utils/completion/installed-shallow.js") var fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , log = require("npmlog") , readJson = require("read-package-json") , path = require("path") , npm = require("./npm.js") , asyncMap = require("slide").asyncMap function uninstall (args, cb) { // this is super easy // get the list of args that correspond to package names in either // the global npm.dir, // then call unbuild on all those folders to pull out their bins // and mans and whatnot, and then delete the folder. var nm = npm.dir if (args.length === 1 && args[0] === ".") args = [] if (args.length) return uninstall_(args, nm, cb) // remove this package from the global space, if it's installed there if (npm.config.get("global")) return cb(uninstall.usage) readJson(path.resolve(npm.prefix, "package.json"), function (er, pkg) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) return cb(uninstall.usage) uninstall_( [pkg.name] , npm.dir , cb ) }) } function uninstall_ (args, nm, cb) { // if we've been asked to --save or --save-dev or --save-optional, // then also remove it from the associated dependencies hash. var s = npm.config.get('save') , d = npm.config.get('save-dev') , o = npm.config.get('save-optional') if (s || d || o) { cb = saver(args, nm, cb) } asyncMap(args, function (arg, cb) { // uninstall .. should not delete /usr/local/lib/node_modules/.. var p = path.join(path.resolve(nm), path.join("/", arg)) if (path.resolve(p) === nm) { log.warn("uninstall", "invalid argument: %j", arg) return cb(null, []) } fs.lstat(p, function (er) { if (er) { log.warn("uninstall", "not installed in %s: %j", nm, arg) return cb(null, []) } cb(null, p) }) }, function (er, folders) { if (er) return cb(er) asyncMap(folders, npm.commands.unbuild, cb) }) } function saver (args, nm, cb_) { return cb function cb (er, data) { var s = npm.config.get('save') , d = npm.config.get('save-dev') , o = npm.config.get('save-optional') if (er || !(s || d || o)) return cb_(er, data) var pj = path.resolve(nm, '..', 'package.json') // don't use readJson here, because we don't want all the defaults // filled in, for mans and other bs. 
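// --- Editorial sketch, not part of npm: the saver() callback above removes the
// --- uninstalled names from whichever dependency sections were requested via
// --- --save / --save-dev / --save-optional before rewriting package.json. The
// --- core of that edit on a plain object (names and example dep illustrative):
function removeFromFields (pkg, name, fields) {
  var changed = false
  fields.forEach(function (field) {
    if (pkg[field] && pkg[field].hasOwnProperty(name)) {
      delete pkg[field][name]
      changed = true
    }
  })
  return changed
}
// var pkg = { dependencies: { "example-dep": "^1.0.0" }, devDependencies: {} }
// removeFromFields(pkg, "example-dep", ["dependencies", "devDependencies"]) -> true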
fs.readFile(pj, 'utf8', function (er, json) { var pkg try { pkg = JSON.parse(json) } catch (_) {} if (!pkg) return cb_(null, data) var bundle if (npm.config.get('save-bundle')) { bundle = pkg.bundleDependencies || pkg.bundledDependencies if (!Array.isArray(bundle)) bundle = undefined } var changed = false args.forEach(function (a) { ; [ [s, 'dependencies'] , [o, 'optionalDependencies'] , [d, 'devDependencies'] ].forEach(function (f) { var flag = f[0] , field = f[1] if (!flag || !pkg[field] || !pkg[field].hasOwnProperty(a)) return changed = true if (bundle) { var i = bundle.indexOf(a) if (i !== -1) bundle.splice(i, 1) } delete pkg[field][a] }) }) if (!changed) return cb_(null, data) if (bundle) { delete pkg.bundledDependencies if (bundle.length) { pkg.bundleDependencies = bundle } else { delete pkg.bundleDependencies } } writeFileAtomic(pj, JSON.stringify(pkg, null, 2) + "\n", function (er) { return cb_(er, data) }) }) } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/unpublish.js����������������������������������������000644 �000766 �000024 �00000006240 12455173731 024021� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = unpublish var log = require("npmlog") , npm = require("./npm.js") , readJson = require("read-package-json") , path = require("path") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") unpublish.usage = "npm unpublish <project>[@<version>]" unpublish.completion = function (opts, cb) { if (opts.conf.argv.remain.length >= 3) return cb() npm.commands.whoami([], true, function (er, username) { if (er) return cb() var un = encodeURIComponent(username) if (!un) return cb() var byUser = "-/by-user/" + un mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, pkgs) { // do a bit of filtering at this point, so that we don't need // to fetch versions for more than one thing, but also don't // accidentally a whole project. pkgs = pkgs[un] if (!pkgs || !pkgs.length) return cb() var pp = npa(opts.partialWord).name pkgs = pkgs.filter(function (p) { return p.indexOf(pp) === 0 }) if (pkgs.length > 1) return cb(null, pkgs) mapToRegistry(pkgs[0], npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, d) { if (er) return cb(er) var vers = Object.keys(d.versions) if (!vers.length) return cb(null, pkgs) return cb(null, vers.map(function (v) { return pkgs[0] + "@" + v })) }) }) }) }) }) } function unpublish (args, cb) { if (args.length > 1) return cb(unpublish.usage) var thing = args.length ? 
npa(args[0]) : {} , project = thing.name , version = thing.rawSpec log.silly("unpublish", "args[0]", args[0]) log.silly("unpublish", "thing", thing) if (!version && !npm.config.get("force")) { return cb("Refusing to delete entire project.\n" + "Run with --force to do this.\n" + unpublish.usage) } if (!project || path.resolve(project) === npm.localPrefix) { // if there's a package.json in the current folder, then // read the package name and version out of that. var cwdJson = path.join(npm.localPrefix, "package.json") return readJson(cwdJson, function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (er) return cb("Usage:\n" + unpublish.usage) gotProject(data.name, data.version, cb) }) } return gotProject(project, version, cb) } function gotProject (project, version, cb_) { function cb (er) { if (er) return cb_(er) console.log("- " + project + (version ? "@" + version : "")) cb_() } // remove from the cache first npm.commands.cache(["clean", project, version], function (er) { if (er) { log.error("unpublish", "Failed to clean cache") return cb(er) } mapToRegistry(project, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { version : version, auth : auth } npm.registry.unpublish(uri, params, cb) }) }) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/update.js�������������������������������������������000644 �000766 �000024 �00000002161 12455173731 023270� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������/* for each pkg in prefix that isn't a git repo look for a new version of pkg that satisfies dep if so, install it. if not, then update it */ module.exports = update update.usage = "npm update [pkg]" var npm = require("./npm.js") , asyncMap = require("slide").asyncMap , log = require("npmlog") // load these, just so that we know that they'll be available, in case // npm itself is getting overwritten. 
, install = require("./install.js") , build = require("./build.js") update.completion = npm.commands.outdated.completion function update (args, cb) { npm.commands.outdated(args, true, function (er, outdated) { log.info("outdated", "updating", outdated) if (er) return cb(er) asyncMap(outdated, function (ww, cb) { // [[ dir, dep, has, want, req ]] var where = ww[0] , dep = ww[1] , want = ww[3] , what = dep + "@" + want , req = ww[5] , url = require('url') // use the initial installation method (repo, tar, git) for updating if (url.parse(req).protocol) what = req npm.commands.install(where, what, cb) }, cb) }) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022603� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/version.js������������������������������������������000644 �000766 �000024 �00000012143 12455173731 023474� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm version <newver> module.exports = version var semver = require("semver") , path = require("path") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , chain = require("slide").chain , log = require("npmlog") , npm = require("./npm.js") , git = require("./utils/git.js") , assert = require("assert") version.usage = "npm version [<newversion> | major | minor | patch | prerelease | preminor | premajor ]\n" + "\n(run in package dir)\n" + "'npm -v' or 'npm --version' to print npm version " + "("+npm.version+")\n" + "'npm view <pkg> version' to view a package's " + "published version\n" + "'npm ls' to inspect current package/dependency versions" function version (args, silent, cb_) { if (typeof cb_ !== "function") cb_ = silent, silent = false if (args.length > 1) return cb_(version.usage) var packagePath = path.join(npm.localPrefix, "package.json") fs.readFile(packagePath, function (er, data) { function cb (er) { if (!er && !silent) console.log("v" + data.version) cb_(er) } if (data) data = data.toString() try { data = JSON.parse(data) } catch (er) { log.error("version", "Bad package.json data", data) return cb_(er) } if (!args.length && data) return dump(data.name, data.version, cb_) if (er) { log.error("version", "No package.json found") return cb_(er) } var newVersion = semver.valid(args[0]) if (!newVersion) newVersion = semver.inc(data.version, args[0]) if (!newVersion) return cb_(version.usage) if (data.version === newVersion) return cb_(new Error("Version 
not changed")) data.version = newVersion checkGit(function (er, hasGit) { if (er) return cb_(er) write(data, "package.json", function (er) { if (er) return cb_(er) updateShrinkwrap(newVersion, function (er, hasShrinkwrap) { if (er || !hasGit) return cb(er) commit(data.version, hasShrinkwrap, cb) }) }) }) }) } function updateShrinkwrap (newVersion, cb) { fs.readFile(path.join(npm.localPrefix, "npm-shrinkwrap.json"), function (er, data) { if (er && er.code === "ENOENT") return cb(null, false) try { data = data.toString() data = JSON.parse(data) } catch (er) { log.error("version", "Bad npm-shrinkwrap.json data") return cb(er) } data.version = newVersion write(data, "npm-shrinkwrap.json", function (er) { if (er) { log.error("version", "Bad npm-shrinkwrap.json data") return cb(er) } cb(null, true) }) }) } function dump (name, version, cb) { assert(typeof name === "string", "package name must be passed to version dump") assert(typeof version === "string", "package version must be passed to version dump") var v = {} if (name) v[name] = version v.npm = npm.version Object.keys(process.versions).forEach(function (k) { v[k] = process.versions[k] }) if (npm.config.get("json")) v = JSON.stringify(v, null, 2) console.log(v) cb() } function checkGit (cb) { fs.stat(path.join(npm.localPrefix, ".git"), function (er, s) { var doGit = !er && s.isDirectory() && npm.config.get("git-tag-version") if (!doGit) { if (er) log.verbose("version", "error checking for .git", er) log.verbose("version", "not tagging in git") return cb(null, false) } // check for git git.whichAndExec( [ "status", "--porcelain" ], { env : process.env }, function (er, stdout) { if (er && er.code === "ENOGIT") { log.warn( "version", "This is a Git checkout, but the git command was not found.", "npm could not create a Git tag for this release!" ) return cb(null, false) } var lines = stdout.trim().split("\n").filter(function (line) { return line.trim() && !line.match(/^\?\? /) }).map(function (line) { return line.trim() }) if (lines.length) return cb(new Error( "Git working directory not clean.\n"+lines.join("\n") )) cb(null, true) } ) }) } function commit (version, hasShrinkwrap, cb) { var options = { env : process.env } var message = npm.config.get("message").replace(/%s/g, version) var sign = npm.config.get("sign-git-tag") var flag = sign ? 
"-sm" : "-am" chain( [ git.chainableExec([ "add", "package.json" ], options), hasShrinkwrap && git.chainableExec([ "add", "npm-shrinkwrap.json" ] , options), git.chainableExec([ "commit", "-m", message ], options), git.chainableExec([ "tag", "v" + version, flag, message ], options) ], cb ) } function write (data, file, cb) { assert(data && typeof data === "object", "must pass data to version write") assert(typeof file === "string", "must pass filename to write to version write") log.verbose("version.write", "data", data, "to", file) writeFileAtomic( path.join(npm.localPrefix, file), new Buffer(JSON.stringify(data, null, 2) + "\n"), cb ) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/view.js���������������������������������������������000644 �000766 �000024 �00000020117 12455173731 022761� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// npm view [pkg [pkg ...]] module.exports = view view.usage = "npm view pkg[@version] [<field>[.subfield]...]" var npm = require("./npm.js") , readJson = require("read-package-json") , log = require("npmlog") , util = require("util") , semver = require("semver") , mapToRegistry = require("./utils/map-to-registry.js") , npa = require("npm-package-arg") , path = require("path") view.completion = function (opts, cb) { if (opts.conf.argv.remain.length <= 2) { return mapToRegistry("-/short", npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, cb) }) } // have the package, get the fields. var tag = npm.config.get("tag") mapToRegistry(opts.conf.argv.remain[2], npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, d) { if (er) return cb(er) var dv = d.versions[d["dist-tags"][tag]] , fields = [] d.versions = Object.keys(d.versions).sort(semver.compareLoose) fields = getFields(d).concat(getFields(dv)) cb(null, fields) }) }) function getFields (d, f, pref) { f = f || [] if (!d) return f pref = pref || [] Object.keys(d).forEach(function (k) { if (k.charAt(0) === "_" || k.indexOf(".") !== -1) return var p = pref.concat(k).join(".") f.push(p) if (Array.isArray(d[k])) { d[k].forEach(function (val, i) { var pi = p + "[" + i + "]" if (val && typeof val === "object") getFields(val, f, [p]) else f.push(pi) }) return } if (typeof d[k] === "object") getFields(d[k], f, [p]) }) return f } } function view (args, silent, cb) { if (typeof cb !== "function") cb = silent, silent = false if (!args.length) args = ["."] var pkg = args.shift() , nv = npa(pkg) , name = nv.name , local = (name === "." 
|| !name) if (npm.config.get("global") && local) { return cb(new Error("Cannot use view command in global mode.")) } if (local) { var dir = npm.prefix readJson(path.resolve(dir, "package.json"), function (er, d) { d = d || {} if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (!d.name) return cb(new Error("Invalid package.json")) var p = d.name nv = npa(p) if (pkg && ~pkg.indexOf("@")) { nv.rawSpec = pkg.split("@")[pkg.indexOf("@")] } fetchAndRead(nv, args, silent, cb) }) } else { fetchAndRead(nv, args, silent, cb) } } function fetchAndRead (nv, args, silent, cb) { // get the data about this package var name = nv.name , version = nv.rawSpec || npm.config.get("tag") mapToRegistry(name, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth : auth }, function (er, data) { if (er) return cb(er) if (data["dist-tags"] && data["dist-tags"].hasOwnProperty(version)) { version = data["dist-tags"][version] } if (data.time && data.time.unpublished) { var u = data.time.unpublished er = new Error("Unpublished by " + u.name + " on " + u.time) er.statusCode = 404 er.code = "E404" er.pkgid = data._id return cb(er, data) } var results = [] , error = null , versions = data.versions || {} data.versions = Object.keys(versions).sort(semver.compareLoose) if (!args.length) args = [""] // remove readme unless we asked for it if (-1 === args.indexOf("readme")) { delete data.readme } Object.keys(versions).forEach(function (v) { if (semver.satisfies(v, version, true)) args.forEach(function (args) { // remove readme unless we asked for it if (-1 === args.indexOf("readme")) { delete versions[v].readme } results.push(showFields(data, versions[v], args)) }) }) results = results.reduce(reducer, {}) var retval = results if (args.length === 1 && args[0] === "") { retval = cleanBlanks(retval) log.silly("cleanup", retval) } if (error || silent) cb(error, retval) else printData(results, data._id, cb.bind(null, error, retval)) }) }) } function cleanBlanks (obj) { var clean = {} Object.keys(obj).forEach(function (version) { clean[version] = obj[version][""] }) return clean } function reducer (l, r) { if (r) Object.keys(r).forEach(function (v) { l[v] = l[v] || {} Object.keys(r[v]).forEach(function (t) { l[v][t] = r[v][t] }) }) return l } // return whatever was printed function showFields (data, version, fields) { var o = {} ;[data, version].forEach(function (s) { Object.keys(s).forEach(function (k) { o[k] = s[k] }) }) return search(o, fields.split("."), version.version, fields) } function search (data, fields, version, title) { var field , tail = fields while (!field && fields.length) field = tail.shift() fields = [field].concat(tail) var o if (!field && !tail.length) { o = {} o[version] = {} o[version][title] = data return o } var index = field.match(/(.+)\[([^\]]+)\]$/) if (index) { field = index[1] index = index[2] if (data.field && data.field.hasOwnProperty(index)) { return search(data[field][index], tail, version, title) } else { field = field + "[" + index + "]" } } if (Array.isArray(data)) { if (data.length === 1) { return search(data[0], fields, version, title) } var results = [] data.forEach(function (data, i) { var tl = title.length , newt = title.substr(0, tl-(fields.join(".").length) - 1) + "["+i+"]" + [""].concat(fields).join(".") results.push(search(data, fields.slice(), version, newt)) }) results = results.reduce(reducer, {}) return results } if (!data.hasOwnProperty(field)) return undefined data = data[field] if (tail.length) { if (typeof data === 
"object") { // there are more fields to deal with. return search(data, tail, version, title) } else { return new Error("Not an object: "+data) } } o = {} o[version] = {} o[version][title] = data return o } function printData (data, name, cb) { var versions = Object.keys(data) , msg = "" , includeVersions = versions.length > 1 , includeFields versions.forEach(function (v) { var fields = Object.keys(data[v]) includeFields = includeFields || (fields.length > 1) fields.forEach(function (f) { var d = cleanup(data[v][f]) if (includeVersions || includeFields || typeof d !== "string") { d = cleanup(data[v][f]) d = npm.config.get("json") ? JSON.stringify(d, null, 2) : util.inspect(d, false, 5, npm.color) } else if (typeof d === "string" && npm.config.get("json")) { d = JSON.stringify(d) } if (f && includeFields) f += " = " if (d.indexOf("\n") !== -1) d = " \n" + d msg += (includeVersions ? name + "@" + v + " " : "") + (includeFields ? f : "") + d + "\n" }) }) console.log(msg) cb(null, data) } function cleanup (data) { if (Array.isArray(data)) { if (data.length === 1) { data = data[0] } else { return data.map(cleanup) } } if (!data || typeof data !== "object") return data if (typeof data.versions === "object" && data.versions && !Array.isArray(data.versions)) { data.versions = Object.keys(data.versions || {}) } var keys = Object.keys(data) keys.forEach(function (d) { if (d.charAt(0) === "_") delete data[d] else if (typeof data[d] === "object") data[d] = cleanup(data[d]) }) keys = Object.keys(data) if (keys.length <= 3 && data.name && (keys.length === 1 || keys.length === 3 && data.email && data.url || keys.length === 2 && (data.email || data.url))) { data = unparsePerson(data) } return data } function unparsePerson (d) { if (typeof d === "string") return d return d.name + (d.email ? " <"+d.email+">" : "") + (d.url ? 
" ("+d.url+")" : "") } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/visnup.js�������������������������������������������000644 �000766 �000024 �00000007711 12455173731 023340� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = visnup var npm = require("./npm.js") var handsomeFace = [ [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 237, 236, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 235, 236, 235, 233, 237, 235, 233, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 235, 233, 232, 235, 235, 234, 233, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 237, 235, 232, 232, 234, 233, 233, 232, 232, 233, 232, 232, 235, 232, 233, 234, 234, 0, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 232, 232, 232, 239, 238, 235, 233, 232, 232, 232, 232, 232, 232, 232, 233, 235, 232, 233, 233, 232, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 234, 234, 232, 233, 234, 233, 234, 235, 233, 235, 60, 238, 238, 234, 234, 233, 234, 233, 238, 251, 246, 233, 233, 232, 0, 0, 0, 0, 0, 0] ,[0, 0, 233, 233, 233, 232, 232, 239, 249, 251, 252, 231, 231, 188, 250, 254, 59, 60, 255, 231, 231, 231, 252, 235, 239, 235, 232, 233, 0, 0, 0, 0, 0, 0] ,[0, 0, 232, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 254, 238, 254, 231, 231, 231, 231, 231, 252, 233, 235, 237, 233, 234, 0, 0, 0, 0, 0] ,[0, 0, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 251, 233, 233, 233, 236, 233, 0, 0, 0, 0] ,[232, 233, 233, 232, 232, 246, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 249, 233, 234, 234, 0, 0, 0, 0] ,[232, 232, 232, 232, 233, 249, 231, 255, 255, 255, 255, 254, 109, 60, 239, 237, 238, 237, 235, 235, 235, 235, 236, 235, 235, 235, 234, 232, 232, 232, 232, 232, 233, 0] ,[0, 232, 232, 233, 233, 233, 233, 233, 233, 233, 233, 233, 235, 236, 238, 238, 235, 188, 254, 254, 145, 236, 252, 254, 254, 254, 254, 249, 236, 235, 232, 232, 233, 0] ,[0, 0, 233, 237, 249, 239, 233, 252, 231, 231, 231, 231, 231, 231, 254, 235, 235, 254, 231, 231, 251, 235, 237, 231, 231, 231, 231, 7, 237, 235, 232, 233, 233, 0] ,[0, 0, 0, 0, 233, 248, 239, 233, 231, 231, 231, 231, 254, 233, 233, 235, 254, 255, 231, 254, 237, 236, 254, 239, 235, 235, 233, 233, 232, 232, 233, 232, 0, 0] ,[0, 0, 0, 232, 233, 246, 255, 255, 236, 236, 236, 236, 236, 255, 231, 231, 231, 231, 231, 231, 252, 234, 248, 231, 231, 231, 231, 248, 232, 232, 232, 0, 0, 0] ,[0, 0, 0, 0, 235, 237, 7, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 255, 238, 235, 7, 231, 231, 231, 246, 232, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 235, 103, 188, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 252, 232, 238, 
231, 231, 255, 244, 232, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 235, 236, 103, 146, 253, 255, 231, 231, 231, 231, 231, 253, 251, 250, 250, 250, 246, 232, 235, 152, 255, 146, 66, 233, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 233, 103, 146, 146, 146, 146, 254, 231, 231, 231, 109, 103, 146, 255, 188, 239, 240, 103, 255, 253, 103, 238, 234, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 232, 235, 109, 146, 146, 146, 146, 146, 252, 152, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 103, 235, 233, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 235, 235, 103, 146, 146, 146, 146, 146, 146, 188, 188, 188, 188, 188, 188, 152, 146, 146, 146, 66, 235, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 233, 235, 66, 146, 146, 146, 146, 152, 255, 146, 240, 239, 241, 109, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 234, 237, 109, 146, 146, 146, 146, 146, 254, 231, 231, 188, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 233, 237, 60, 103, 146, 146, 146, 146, 146, 103, 66, 60, 235, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0] ,[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 233, 233, 236, 235, 237, 235, 237, 237, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]] function visnup (args, cb) { handsomeFace.forEach(function (line) { console.log(line.map(function (ch) { return "\033[" + (ch ? "48;5;" + ch : ch) + "m" }).join(' ')) }) var c = args.shift() if (c) npm.commands[c](args, cb) else cb() } �������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/whoami.js�������������������������������������������000644 �000766 �000024 �00000002160 12455173731 023271� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var npm = require("./npm.js") module.exports = whoami whoami.usage = "npm whoami\n(just prints username according to given registry)" function whoami (args, silent, cb) { // FIXME: need tighter checking on this, but is a breaking change if (typeof cb !== "function") { cb = silent silent = false } var registry = npm.config.get("registry") if (!registry) return cb(new Error("no default registry set")) var auth = npm.config.getCredentialsByURI(registry) if (auth) { if (auth.username) { if (!silent) console.log(auth.username) return process.nextTick(cb.bind(this, null, auth.username)) } else if (auth.token) { return npm.registry.whoami(registry, { auth : auth }, function (er, username) { if (er) return cb(er) if (!silent) console.log(username) cb(null, username) }) } } // At this point, if they have a credentials object, it doesn't // have a token or auth in it. Probably just the default // registry. var msg = "Not authed. 
Run 'npm adduser'" if (!silent) console.log(msg) process.nextTick(cb.bind(this, null, msg)) } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/xmas.js���������������������������������������������000644 �000766 �000024 �00000002662 12455173731 022764� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// happy xmas var log = require("npmlog") module.exports = function (args, cb) { var s = process.platform === "win32" ? " *" : " \u2605" , f = "\uFF0F" , b = "\uFF3C" , x = process.platform === "win32" ? " " : "" , o = [ "\u0069" , "\u0020", "\u0020", "\u0020", "\u0020", "\u0020" , "\u0020", "\u0020", "\u0020", "\u0020", "\u0020", "\u0020" , "\u0020", "\u2E1B","\u2042","\u2E2E","&","@","\uFF61" ] , oc = [21,33,34,35,36,37] , l = "\u005e" function w (s) { process.stderr.write(s) } w("\n") ;(function T (H) { for (var i = 0; i < H; i ++) w(" ") w(x+"\033[33m"+s+"\n") var M = H * 2 - 1 for (var L = 1; L <= H; L ++) { var O = L * 2 - 2 var S = (M - O) / 2 for (i = 0; i < S; i ++) w(" ") w(x+"\033[32m"+f) for (i = 0; i < O; i ++) w( "\033["+oc[Math.floor(Math.random()*oc.length)]+"m"+ o[Math.floor(Math.random() * o.length)] ) w(x+"\033[32m"+b+"\n") } w(" ") for (i = 1; i < H; i ++) w("\033[32m"+l) w("| "+x+" |") for (i = 1; i < H; i ++) w("\033[32m"+l) if (H > 10) { w("\n ") for (i = 1; i < H; i ++) w(" ") w("| "+x+" |") for (i = 1; i < H; i ++) w(" ") } })(20) w("\n\n") log.heading = '' log.addLevel('npm', 100000, log.headingStyle) log.npm("loves you", "Happy Xmas, Noders!") cb() } var dg=false Object.defineProperty(module.exports, "usage", {get:function () { if (dg) module.exports([], function () {}) dg = true return " " }}) ������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/completion/�����������������������������������000755 �000766 �000024 �00000000000 12456115117 024754� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/completion.sh���������������������������������000755 �000766 �000024 �00000003014 12455173731 025316� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash ###-begin-npm-completion-### # # npm command completion script # # Installation: npm 
completion >> ~/.bashrc (or ~/.zshrc) # Or, maybe: npm completion > /usr/local/etc/bash_completion.d/npm # COMP_WORDBREAKS=${COMP_WORDBREAKS/=/} COMP_WORDBREAKS=${COMP_WORDBREAKS/@/} export COMP_WORDBREAKS if type complete &>/dev/null; then _npm_completion () { local si="$IFS" IFS=$'\n' COMPREPLY=($(COMP_CWORD="$COMP_CWORD" \ COMP_LINE="$COMP_LINE" \ COMP_POINT="$COMP_POINT" \ npm completion -- "${COMP_WORDS[@]}" \ 2>/dev/null)) || return $? IFS="$si" } complete -F _npm_completion npm elif type compdef &>/dev/null; then _npm_completion() { si=$IFS compadd -- $(COMP_CWORD=$((CURRENT-1)) \ COMP_LINE=$BUFFER \ COMP_POINT=0 \ npm completion -- "${words[@]}" \ 2>/dev/null) IFS=$si } compdef _npm_completion npm elif type compctl &>/dev/null; then _npm_completion () { local cword line point words si read -Ac words read -cn cword let cword-=1 read -l line read -ln point si="$IFS" IFS=$'\n' reply=($(COMP_CWORD="$cword" \ COMP_LINE="$line" \ COMP_POINT="$point" \ npm completion -- "${words[@]}" \ 2>/dev/null)) || return $? IFS="$si" } compctl -K _npm_completion npm fi ###-end-npm-completion-### ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/depr-check.js���������������������������������000644 �000766 �000024 �00000000606 12455173731 025155� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var log = require("npmlog") var deprecated = {} , deprWarned = {} module.exports = function deprCheck (data) { if (deprecated[data._id]) data.deprecated = deprecated[data._id] if (data.deprecated) deprecated[data._id] = data.deprecated else return if (!deprWarned[data._id]) { deprWarned[data._id] = true log.warn("deprecated", "%s: %s", data._id, data.deprecated) } } ��������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/error-handler.js������������������������������000644 �000766 �000024 �00000026615 12455173731 025724� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = errorHandler var cbCalled = false , log = require("npmlog") , npm = require("../npm.js") , rm = require("rimraf") , itWorked = false , path = require("path") , wroteLogFile = false , exitCode = 0 , rollbacks = npm.rollbacks , chain = require("slide").chain , writeStream = require("fs-write-stream-atomic") process.on("exit", function (code) { // console.error("exit", code) if (!npm.config || !npm.config.loaded) 
return if (code) itWorked = false if (itWorked) log.info("ok") else { if (!cbCalled) { log.error("", "cb() never called!") } if (wroteLogFile) { // just a line break if (log.levels[log.level] <= log.levels.error) console.error("") log.error("", ["Please include the following file with any support request:" ," " + path.resolve("npm-debug.log") ].join("\n")) wroteLogFile = false } if (code) { log.error("code", code) } } var doExit = npm.config.get("_exit") if (doExit) { // actually exit. if (exitCode === 0 && !itWorked) { exitCode = 1 } if (exitCode !== 0) process.exit(exitCode) } else { itWorked = false // ready for next exit } }) function exit (code, noLog) { exitCode = exitCode || process.exitCode || code var doExit = npm.config ? npm.config.get("_exit") : true log.verbose("exit", [code, doExit]) if (log.level === "silent") noLog = true if (rollbacks.length) { chain(rollbacks.map(function (f) { return function (cb) { npm.commands.unbuild([f], true, cb) } }), function (er) { if (er) { log.error("error rolling back", er) if (!code) errorHandler(er) else if (noLog) rm("npm-debug.log", reallyExit.bind(null, er)) else writeLogFile(reallyExit.bind(this, er)) } else { if (!noLog && code) writeLogFile(reallyExit) else rm("npm-debug.log", reallyExit) } }) rollbacks.length = 0 } else if (code && !noLog) writeLogFile(reallyExit) else rm("npm-debug.log", reallyExit) function reallyExit (er) { if (er && !code) code = typeof er.errno === "number" ? er.errno : 1 // truncate once it's been written. log.record.length = 0 itWorked = !code // just emit a fake exit event. // if we're really exiting, then let it exit on its own, so that // in-process stuff can finish or clean up first. if (!doExit) process.emit("exit", code) npm.spinner.stop() } } function errorHandler (er) { // console.error("errorHandler", er) if (!npm.config || !npm.config.loaded) { // logging won't work unless we pretend that it's ready er = er || new Error("Exit prior to config file resolving.") console.error(er.stack || er.message) } if (cbCalled) { er = er || new Error("Callback called more than once.") } cbCalled = true if (!er) return exit(0) if (typeof er === "string") { log.error("", er) return exit(1, true) } else if (!(er instanceof Error)) { log.error("weird error", er) return exit(1, true) } var m = er.code || er.message.match(/^(?:Error: )?(E[A-Z]+)/) if (m && !er.code) er.code = m ; [ "type" , "fstream_path" , "fstream_unc_path" , "fstream_type" , "fstream_class" , "fstream_finish_call" , "fstream_linkpath" , "stack" , "fstream_stack" , "statusCode" , "pkgid" ].forEach(function (k) { var v = er[k] if (!v) return if (k === "fstream_stack") v = v.join("\n") log.verbose(k, v) }) log.verbose("cwd", process.cwd()) var os = require("os") // log.error("System", os.type() + " " + os.release()) // log.error("command", process.argv.map(JSON.stringify).join(" ")) // log.error("node -v", process.version) // log.error("npm -v", npm.version) log.error("", os.type() + " " + os.release()) log.error("argv", process.argv.map(JSON.stringify).join(" ")) log.error("node", process.version) log.error("npm ", "v" + npm.version) ; [ "file" , "path" , "code" , "errno" , "syscall" ].forEach(function (k) { var v = er[k] if (v) log.error(k, v) }) // just a line break if (log.levels[log.level] <= log.levels.error) console.error("") switch (er.code) { case "ECONNREFUSED": log.error("", er) log.error("", ["\nIf you are behind a proxy, please make sure that the" ,"'proxy' config is set properly. 
See: 'npm help config'" ].join("\n")) break case "EACCES": case "EPERM": log.error("", er) log.error("", ["\nPlease try running this command again as root/Administrator." ].join("\n")) break case "ELIFECYCLE": log.error("", er.message) log.error("", ["","Failed at the "+er.pkgid+" "+er.stage+" script '"+er.script+"'." ,"This is most likely a problem with the "+er.pkgname+" package," ,"not with npm itself." ,"Tell the author that this fails on your system:" ," "+er.script ,"You can get their info via:" ," npm owner ls "+er.pkgname ,"There is likely additional logging output above." ].join("\n")) break case "ENOGIT": log.error("", er.message) log.error("", ["","Failed using git." ,"This is most likely not a problem with npm itself." ,"Please check if you have git installed and in your PATH." ].join("\n")) break case "EJSONPARSE": log.error("", er.message) log.error("", "File: "+er.file) log.error("", ["Failed to parse package.json data." ,"package.json must be actual JSON, not just JavaScript." ,"","This is not a bug in npm." ,"Tell the package author to fix their package.json file." ].join("\n"), "JSON.parse") break // TODO(isaacs) // Add a special case here for E401 and E403 explaining auth issues? case "E404": var msg = [er.message] if (er.pkgid && er.pkgid !== "-") { msg.push("", "'"+er.pkgid+"' is not in the npm registry." ,"You should bug the author to publish it (or use the name yourself!)") if (er.parent) { msg.push("It was specified as a dependency of '"+er.parent+"'") } msg.push("\nNote that you can also install from a" ,"tarball, folder, http url, or git url.") } // There's no need to have 404 in the message as well. msg[0] = msg[0].replace(/^404\s+/, "") log.error("404", msg.join("\n")) break case "EPUBLISHCONFLICT": log.error("publish fail", ["Cannot publish over existing version." ,"Update the 'version' field in package.json and try again." ,"" ,"To automatically increment version numbers, see:" ," npm help version" ].join("\n")) break case "EISGIT": log.error("git", [er.message ," "+er.path ,"Refusing to remove it. Update manually," ,"or move it out of the way first." ].join("\n")) break case "ECYCLE": log.error("cycle", [er.message ,"While installing: "+er.pkgid ,"Found a pathological dependency case that npm cannot solve." ,"Please report this to the package author." ].join("\n")) break case "EBADPLATFORM": log.error("notsup", [er.message ,"Not compatible with your operating system or architecture: "+er.pkgid ,"Valid OS: "+er.os.join(",") ,"Valid Arch: "+er.cpu.join(",") ,"Actual OS: "+process.platform ,"Actual Arch: "+process.arch ].join("\n")) break case "EEXIST": log.error([er.message ,"File exists: "+er.path ,"Move it away, and try again."].join("\n")) break case "ENEEDAUTH": log.error("need auth", [er.message ,"You need to authorize this machine using `npm adduser`" ].join("\n")) break case "EPEERINVALID": var peerErrors = Object.keys(er.peersDepending).map(function (peer) { return "Peer " + peer + " wants " + er.packageName + "@" + er.peersDepending[peer] }) log.error("peerinvalid", [er.message].concat(peerErrors).join("\n")) break case "ECONNRESET": case "ENOTFOUND": case "ETIMEDOUT": log.error("network", [er.message ,"This is most likely not a problem with npm itself" ,"and is related to network connectivity." ,"In most cases you are behind a proxy or have bad network settings." ,"\nIf you are behind a proxy, please make sure that the" ,"'proxy' config is set properly. 
See: 'npm help config'" ].join("\n")) break case "ENOPACKAGEJSON": log.error("package.json", [er.message ,"This is most likely not a problem with npm itself." ,"npm can't find a package.json file in your current directory." ].join("\n")) break case "ETARGET": log.error("notarget", [er.message ,"This is most likely not a problem with npm itself." ,"In most cases you or one of your dependencies are requesting" ,"a package version that doesn't exist." ].join("\n")) break case "ENOTSUP": if (er.required) { log.error("notsup", [er.message ,"Not compatible with your version of node/npm: "+er.pkgid ,"Required: "+JSON.stringify(er.required) ,"Actual: " +JSON.stringify({npm:npm.version ,node:npm.config.get("node-version")}) ].join("\n")) break } // else passthrough case "ENOSPC": log.error("nospc", [er.message ,"This is most likely not a problem with npm itself" ,"and is related to insufficient space on your system." ].join("\n")) break case "EROFS": log.error("rofs", [er.message ,"This is most likely not a problem with npm itself" ,"and is related to the file system being read-only." ,"\nOften virtualized file systems, or other file systems" ,"that don't support symlinks, give this error." ].join("\n")) break case "ENOENT": log.error("enoent", [er.message ,"This is most likely not a problem with npm itself" ,"and is related to npm not being able to find a file." ,er.file?"\nCheck if the file '"+er.file+"' is present.":"" ].join("\n")) break default: log.error("", er.message || er) log.error("", ["", "If you need help, you may report this error at:" ," <http://github.com/npm/npm/issues>" ].join("\n")) break } exit(typeof er.errno === "number" ? er.errno : 1) } var writingLogFile = false function writeLogFile (cb) { if (writingLogFile) return cb() writingLogFile = true wroteLogFile = true var fstr = writeStream("npm-debug.log") , os = require("os") , out = "" log.record.forEach(function (m) { var pref = [m.id, m.level] if (m.prefix) pref.push(m.prefix) pref = pref.join(" ") m.message.trim().split(/\r?\n/).map(function (line) { return (pref + " " + line).trim() }).forEach(function (line) { out += line + os.EOL }) }) fstr.end(out) fstr.on("close", cb) } �������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/gently-rm.js����������������������������������000644 �000766 �000024 �00000011722 12455173731 025067� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// only remove the thing if it's a symlink into a specific folder. // This is a very common use-case of npm's, but not so common elsewhere. module.exports = gentlyRm var npm = require("../npm.js") , log = require("npmlog") , resolve = require("path").resolve , dirname = require("path").dirname , lstat = require("graceful-fs").lstat , readlink = require("graceful-fs").readlink , isInside = require("path-is-inside") , vacuum = require("fs-vacuum") , some = require("async-some") , asyncMap = require("slide").asyncMap , normalize = require("path").normalize function gentlyRm (path, gently, base, cb) { if (!cb) { cb = base base = undefined } if (!cb) { cb = gently gently = false } // never rm the root, prefix, or bin dirs. 
// just a safety precaution. var prefixes = [ npm.dir, npm.root, npm.bin, npm.prefix, npm.globalDir, npm.globalRoot, npm.globalBin, npm.globalPrefix ] var resolved = normalize(resolve(path)) if (prefixes.indexOf(resolved) !== -1) { log.verbose("gentlyRm", resolved, "is part of npm and can't be removed") return cb(new Error("May not delete: "+resolved)) } var options = {log : log.silly.bind(log, "gentlyRm")} if (npm.config.get("force") || !gently) options.purge = true if (base) options.base = normalize(base) if (!gently) { log.verbose("gentlyRm", "vacuuming", resolved) return vacuum(resolved, options, cb) } var parent = options.base = normalize(base ? base : npm.prefix) log.verbose("gentlyRm", "verifying that", parent, "is managed by npm") some(prefixes, isManaged(parent), function (er, matched) { if (er) return cb(er) if (!matched) { log.verbose("gentlyRm", parent, "is not managed by npm") return clobberFail(resolved, parent, cb) } log.silly("gentlyRm", parent, "is managed by npm") if (isInside(resolved, parent)) { log.silly("gentlyRm", resolved, "is under", parent) log.verbose("gentlyRm", "vacuuming", resolved, "up to", parent) return vacuum(resolved, options, cb) } log.silly("gentlyRm", resolved, "is not under", parent) log.silly("gentlyRm", "checking to see if", resolved, "is a link") lstat(resolved, function (er, stat) { if (er) { if (er.code === "ENOENT") return cb(null) return cb(er) } if (!stat.isSymbolicLink()) { log.verbose("gentlyRm", resolved, "is outside", parent, "and not a link") return clobberFail(resolved, parent, cb) } log.silly("gentlyRm", resolved, "is a link") readlink(resolved, function (er, link) { if (er) { if (er.code === "ENOENT") return cb(null) return cb(er) } var source = resolve(dirname(resolved), link) if (isInside(source, parent)) { log.silly("gentlyRm", source, "inside", parent) log.verbose("gentlyRm", "vacuuming", resolved) return vacuum(resolved, options, cb) } log.silly("gentlyRm", "checking to see if", source, "is managed by npm") some(prefixes, isManaged(source), function (er, matched) { if (er) return cb(er) if (matched) { log.silly("gentlyRm", source, "is under", matched) log.verbose("gentlyRm", "removing", resolved) vacuum(resolved, options, cb) } log.verbose("gentlyRm", source, "is not managed by npm") return clobberFail(path, parent, cb) }) }) }) }) } var resolvedPaths = {} function isManaged (target) { return function predicate (path, cb) { if (!path) { log.verbose("isManaged", "no path") return cb(null, false) } asyncMap([path, target], resolveSymlink, function (er, results) { if (er) { if (er.code === "ENOENT") return cb(null, false) return cb(er) } var path = results[0] var target = results[1] var inside = isInside(target, path) log.silly("isManaged", target, inside ? 
"is" : "is not", "inside", path) return cb(null, inside && path) }) } function resolveSymlink (toResolve, cb) { var resolved = resolve(toResolve) // if the path has already been memoized, return immediately var cached = resolvedPaths[resolved] if (cached) return cb(null, cached) // otherwise, check the path lstat(resolved, function (er, stat) { if (er) return cb(er) // if it's not a link, cache & return the path itself if (!stat.isSymbolicLink()) { resolvedPaths[resolved] = resolved return cb(null, resolved) } // otherwise, cache & return the link's source readlink(resolved, function (er, source) { if (er) return cb(er) resolved = resolve(resolved, source) resolvedPaths[resolved] = resolved cb(null, resolved) }) }) } } function clobberFail (p, g, cb) { var er = new Error("Refusing to delete: "+p+" not in "+g) er.code = "EEXIST" er.path = p return cb(er) } ����������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/git.js����������������������������������������000644 �000766 �000024 �00000002221 12455173731 023726� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ // handle some git configuration for windows exports.spawn = spawnGit exports.chainableExec = chainableExec exports.whichAndExec = whichAndExec var exec = require("child_process").execFile , spawn = require("./spawn") , npm = require("../npm.js") , which = require("which") , git = npm.config.get("git") , assert = require("assert") , log = require("npmlog") function prefixGitArgs () { return process.platform === "win32" ? 
["-c", "core.longpaths=true"] : [] } function execGit (args, options, cb) { log.info("git", args) return exec(git, prefixGitArgs().concat(args || []), options, cb) } function spawnGit (args, options) { log.info("git", args) return spawn(git, prefixGitArgs().concat(args || []), options) } function chainableExec () { var args = Array.prototype.slice.call(arguments) return [execGit].concat(args) } function whichGit (cb) { return which(git, cb) } function whichAndExec (args, options, cb) { assert.equal(typeof cb, "function", "no callback provided") // check for git whichGit(function (err) { if (err) { err.code = "ENOGIT" return cb(err) } execGit(args, options, cb) }) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/lifecycle.js����������������������������������000644 �000766 �000024 �00000023054 12455173731 025111� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������exports = module.exports = lifecycle exports.cmd = cmd exports.makeEnv = makeEnv var log = require("npmlog") , spawn = require("./spawn") , npm = require("../npm.js") , path = require("path") , fs = require("graceful-fs") , chain = require("slide").chain , Stream = require("stream").Stream , PATH = "PATH" , uidNumber = require("uid-number") // windows calls it's path "Path" usually, but this is not guaranteed. if (process.platform === "win32") { PATH = "Path" Object.keys(process.env).forEach(function (e) { if (e.match(/^PATH$/i)) { PATH = e } }) } function lifecycle (pkg, stage, wd, unsafe, failOk, cb) { if (typeof cb !== "function") cb = failOk, failOk = false if (typeof cb !== "function") cb = unsafe, unsafe = false if (typeof cb !== "function") cb = wd, wd = null while (pkg && pkg._data) pkg = pkg._data if (!pkg) return cb(new Error("Invalid package data")) log.info(stage, pkg._id) if (!pkg.scripts || npm.config.get('ignore-scripts')) pkg.scripts = {} validWd(wd || path.resolve(npm.dir, pkg.name), function (er, wd) { if (er) return cb(er) unsafe = unsafe || npm.config.get("unsafe-perm") if ((wd.indexOf(npm.dir) !== 0 || path.basename(wd) !== pkg.name) && !unsafe && pkg.scripts[stage]) { log.warn( "cannot run in wd", "%s %s (wd=%s)" , pkg._id, pkg.scripts[stage], wd) return cb() } // set the env variables, then run scripts as a child process. var env = makeEnv(pkg) env.npm_lifecycle_event = stage env.npm_node_execpath = env.NODE = env.NODE || process.execPath env.npm_execpath = require.main.filename // "nobody" typically doesn't have permission to write to /tmp // even if it's never used, sh freaks out. 
if (!npm.config.get("unsafe-perm")) env.TMPDIR = wd lifecycle_(pkg, stage, wd, env, unsafe, failOk, cb) }) } function checkForLink (pkg, cb) { var f = path.join(npm.dir, pkg.name) fs.lstat(f, function (er, s) { cb(null, !(er || !s.isSymbolicLink())) }) } function lifecycle_ (pkg, stage, wd, env, unsafe, failOk, cb) { var pathArr = [] , p = wd.split("node_modules") , acc = path.resolve(p.shift()) p.forEach(function (pp) { pathArr.unshift(path.join(acc, "node_modules", ".bin")) acc = path.join(acc, "node_modules", pp) }) pathArr.unshift(path.join(acc, "node_modules", ".bin")) // we also unshift the bundled node-gyp-bin folder so that // the bundled one will be used for installing things. pathArr.unshift(path.join(__dirname, "..", "..", "bin", "node-gyp-bin")) if (env[PATH]) pathArr.push(env[PATH]) env[PATH] = pathArr.join(process.platform === "win32" ? ";" : ":") var packageLifecycle = pkg.scripts && pkg.scripts.hasOwnProperty(stage) if (packageLifecycle) { // define this here so it's available to all scripts. env.npm_lifecycle_script = pkg.scripts[stage] } function done (er) { if (er) { if (npm.config.get("force")) { log.info("forced, continuing", er) er = null } else if (failOk) { log.warn("continuing anyway", er.message) er = null } } cb(er) } chain ( [ packageLifecycle && [runPackageLifecycle, pkg, env, wd, unsafe] , [runHookLifecycle, pkg, env, wd, unsafe] ] , done ) } function validWd (d, cb) { fs.stat(d, function (er, st) { if (er || !st.isDirectory()) { var p = path.dirname(d) if (p === d) { return cb(new Error("Could not find suitable wd")) } return validWd(p, cb) } return cb(null, d) }) } function runPackageLifecycle (pkg, env, wd, unsafe, cb) { // run package lifecycle scripts in the package root, or the nearest parent. var stage = env.npm_lifecycle_event , cmd = env.npm_lifecycle_script var note = "\n> " + pkg._id + " " + stage + " " + wd + "\n> " + cmd + "\n" runCmd(note, cmd, pkg, env, stage, wd, unsafe, cb) } var running = false var queue = [] function dequeue() { running = false if (queue.length) { var r = queue.shift() runCmd.apply(null, r) } } function runCmd (note, cmd, pkg, env, stage, wd, unsafe, cb) { if (running) { queue.push([note, cmd, pkg, env, stage, wd, unsafe, cb]) return } running = true log.pause() var user = unsafe ? null : npm.config.get("user") , group = unsafe ? 
null : npm.config.get("group") if (log.level !== 'silent') { if (npm.spinner.int) { npm.config.get("logstream").write("\r \r") } console.log(note) } log.verbose("unsafe-perm in lifecycle", unsafe) if (process.platform === "win32") { unsafe = true } if (unsafe) { runCmd_(cmd, pkg, env, wd, stage, unsafe, 0, 0, cb) } else { uidNumber(user, group, function (er, uid, gid) { runCmd_(cmd, pkg, env, wd, stage, unsafe, uid, gid, cb) }) } } function runCmd_ (cmd, pkg, env, wd, stage, unsafe, uid, gid, cb_) { function cb (er) { cb_.apply(null, arguments) log.resume() process.nextTick(dequeue) } var conf = { cwd: wd , env: env , stdio: [ 0, 1, 2 ] } if (!unsafe) { conf.uid = uid ^ 0 conf.gid = gid ^ 0 } var sh = "sh" var shFlag = "-c" if (process.platform === "win32") { sh = "cmd" shFlag = "/c" conf.windowsVerbatimArguments = true } var proc = spawn(sh, [shFlag, cmd], conf) proc.on("error", procError) proc.on("close", function (code, signal) { if (signal) { process.kill(process.pid, signal); } else if (code) { var er = new Error("Exit status " + code) } procError(er) }) function procError (er) { if (er && !npm.ROLLBACK) { log.info(pkg._id, "Failed to exec "+stage+" script") er.message = pkg._id + " " + stage + ": `" + cmd +"`\n" + er.message if (er.code !== "EPERM") { er.code = "ELIFECYCLE" } er.pkgid = pkg._id er.stage = stage er.script = cmd er.pkgname = pkg.name return cb(er) } else if (er) { log.error(pkg._id+"."+stage, er) log.error(pkg._id+"."+stage, "continuing anyway") return cb() } cb(er) } } function runHookLifecycle (pkg, env, wd, unsafe, cb) { // check for a hook script, run if present. var stage = env.npm_lifecycle_event , hook = path.join(npm.dir, ".hooks", stage) , user = unsafe ? null : npm.config.get("user") , group = unsafe ? null : npm.config.get("group") , cmd = hook fs.stat(hook, function (er) { if (er) return cb() var note = "\n> " + pkg._id + " " + stage + " " + wd + "\n> " + cmd runCmd(note, hook, pkg, env, stage, wd, unsafe, cb) }) } function makeEnv (data, prefix, env) { prefix = prefix || "npm_package_" if (!env) { env = {} for (var i in process.env) if (!i.match(/^npm_/)) { env[i] = process.env[i] } // npat asks for tap output if (npm.config.get("npat")) env.TAP = 1 // express and others respect the NODE_ENV value. if (npm.config.get("production")) env.NODE_ENV = "production" } else if (!data.hasOwnProperty("_lifecycleEnv")) { Object.defineProperty(data, "_lifecycleEnv", { value : env , enumerable : false }) } for (var i in data) if (i.charAt(0) !== "_") { var envKey = (prefix+i).replace(/[^a-zA-Z0-9_]/g, '_') if (i === "readme") { continue } if (data[i] && typeof(data[i]) === "object") { try { // quick and dirty detection for cyclical structures JSON.stringify(data[i]) makeEnv(data[i], envKey+"_", env) } catch (ex) { // usually these are package objects. // just get the path and basic details. var d = data[i] makeEnv( { name: d.name, version: d.version, path:d.path } , envKey+"_", env) } } else { env[envKey] = String(data[i]) env[envKey] = -1 !== env[envKey].indexOf("\n") ? 
JSON.stringify(env[envKey]) : env[envKey] } } if (prefix !== "npm_package_") return env prefix = "npm_config_" var pkgConfig = {} , keys = npm.config.keys , pkgVerConfig = {} , namePref = data.name + ":" , verPref = data.name + "@" + data.version + ":" keys.forEach(function (i) { if (i.charAt(0) === "_" && i.indexOf("_"+namePref) !== 0) { return } var value = npm.config.get(i) if (value instanceof Stream || Array.isArray(value)) return if (!value) value = "" else if (typeof value === "number") value = "" + value else if (typeof value !== "string") value = JSON.stringify(value) value = -1 !== value.indexOf("\n") ? JSON.stringify(value) : value i = i.replace(/^_+/, "") if (i.indexOf(namePref) === 0) { var k = i.substr(namePref.length).replace(/[^a-zA-Z0-9_]/g, "_") pkgConfig[ k ] = value } else if (i.indexOf(verPref) === 0) { var k = i.substr(verPref.length).replace(/[^a-zA-Z0-9_]/g, "_") pkgVerConfig[ k ] = value } var envKey = (prefix+i).replace(/[^a-zA-Z0-9_]/g, "_") env[envKey] = value }) prefix = "npm_package_config_" ;[pkgConfig, pkgVerConfig].forEach(function (conf) { for (var i in conf) { var envKey = (prefix+i) env[envKey] = conf[i] } }) return env } function cmd (stage) { function CMD (args, cb) { npm.commands["run-script"]([stage].concat(args), cb) } CMD.usage = "npm "+stage+" [-- <args>]" var installedShallow = require("./completion/installed-shallow.js") CMD.completion = function (opts, cb) { installedShallow(opts, function (d) { return d.scripts && d.scripts[stage] }, cb) } return CMD } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/link.js���������������������������������������000644 �000766 �000024 �00000002036 12455173731 024104� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = link link.ifExists = linkIfExists var fs = require("graceful-fs") , chain = require("slide").chain , mkdir = require("mkdirp") , rm = require("./gently-rm.js") , path = require("path") , npm = require("../npm.js") function linkIfExists (from, to, gently, cb) { fs.stat(from, function (er) { if (er) return cb() link(from, to, gently, cb) }) } function link (from, to, gently, cb) { if (typeof cb !== "function") cb = gently, gently = null if (npm.config.get("force")) gently = false to = path.resolve(to) var target = from = path.resolve(from) if (process.platform !== "win32") { // junctions on windows must be absolute target = path.relative(path.dirname(to), from) // if there is no folder in common, then it will be much // longer, and using a relative link is dumb. 
    if (target.length >= from.length) target = from
  }

  chain
    ( [ [fs, "stat", from]
      , [rm, to, gently]
      , [mkdir, path.dirname(to)]
      , [fs, "symlink", target, to, "junction"] ]
    , cb)
}
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/locker.js
var crypto = require("crypto")
var resolve = require("path").resolve

var lockfile = require("lockfile")
var log = require("npmlog")
var mkdirp = require("mkdirp")

var npm = require("../npm.js")
var getStat = require("../cache/get-stat.js")

var installLocks = {}

function lockFileName (base, name) {
  var c = name.replace(/[^a-zA-Z0-9]+/g, "-").replace(/^-+|-+$/g, "")
    , p = resolve(base, name)
    , h = crypto.createHash("sha1").update(p).digest("hex")
    , l = resolve(npm.cache, "_locks")

  return resolve(l, c.substr(0, 24)+"-"+h.substr(0, 16)+".lock")
}

function lock (base, name, cb) {
  getStat(function (er) {
    var lockDir = resolve(npm.cache, "_locks")
    mkdirp(lockDir, function () {
      if (er) return cb(er)

      var opts = { stale: npm.config.get("cache-lock-stale")
                 , retries: npm.config.get("cache-lock-retries")
                 , wait: npm.config.get("cache-lock-wait") }

      var lf = lockFileName(base, name)
      lockfile.lock(lf, opts, function (er) {
        if (er) log.warn("locking", lf, "failed", er)

        if (!er) {
          log.verbose("lock", "using", lf, "for", resolve(base, name))
          installLocks[lf] = true
        }

        cb(er)
      })
    })
  })
}

function unlock (base, name, cb) {
  var lf = lockFileName(base, name)
    , locked = installLocks[lf]

  if (locked === false) {
    return process.nextTick(cb)
  } else if (locked === true) {
    lockfile.unlock(lf, function (er) {
      if (er) {
        log.warn("unlocking", lf, "failed", er)
      } else {
        installLocks[lf] = false
        log.verbose("unlock", "done using", lf, "for", resolve(base, name))
      }

      cb(er)
    })
  } else {
    throw new Error(
      "Attempt to unlock " + resolve(base, name) + ", which hasn't been locked"
    )
  }
}

module.exports = { lock : lock, unlock : unlock }
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/map-to-registry.js
var url = require("url")

var log = require("npmlog")
  , npa = require("npm-package-arg")
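// Editor's note (added; not part of the original npm source): the function
// below resolves a package name to a full registry URI plus auth credentials,
// checking scoped ":registry" config keys before falling back to the default
// "registry" setting. A minimal usage sketch, assuming npm's config is loaded
// and following the pattern used by other commands in this tree (the package
// name here is purely illustrative):
//
//   mapToRegistry("some-package", npm.config, function (er, uri, auth) {
//     if (er) return cb(er)
//     npm.registry.get(uri, { auth : auth }, cb)
//   })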
module.exports = mapToRegistry function mapToRegistry(name, config, cb) { log.silly("mapToRegistry", "name", name) var registry // the name itself takes precedence var data = npa(name) if (data.scope) { // the name is definitely scoped, so escape now name = name.replace("/", "%2f") log.silly("mapToRegistry", "scope (from package name)", data.scope) registry = config.get(data.scope + ":registry") if (!registry) { log.verbose("mapToRegistry", "no registry URL found in name for scope", data.scope) } } // ...then --scope=@scope or --scope=scope var scope = config.get("scope") if (!registry && scope) { // I'm an enabler, sorry if (scope.charAt(0) !== "@") scope = "@" + scope log.silly("mapToRegistry", "scope (from config)", scope) registry = config.get(scope + ":registry") if (!registry) { log.verbose("mapToRegistry", "no registry URL found in config for scope", scope) } } // ...and finally use the default registry if (!registry) { log.silly("mapToRegistry", "using default registry") registry = config.get("registry") } log.silly("mapToRegistry", "registry", registry) var auth = config.getCredentialsByURI(registry) // normalize registry URL so resolution doesn't drop a piece of registry URL var normalized = registry.slice(-1) !== "/" ? registry+"/" : registry var uri = url.resolve(normalized, name) log.silly("mapToRegistry", "uri", uri) cb(null, uri, auth, normalized) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/spawn.js��������������������������������������000644 �000766 �000024 �00000001060 12455173731 024273� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = spawn var _spawn = require("child_process").spawn var EventEmitter = require("events").EventEmitter function spawn (cmd, args, options) { var raw = _spawn(cmd, args, options) var cooked = new EventEmitter() raw.on("error", function (er) { er.file = cmd cooked.emit("error", er) }).on("close", function (code, signal) { cooked.emit("close", code, signal) }) cooked.stdin = raw.stdin cooked.stdout = raw.stdout cooked.stderr = raw.stderr cooked.kill = function (sig) { return raw.kill(sig) } return cooked } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/tar.js����������������������������������������000644 �000766 �000024 �00000021547 12455173731 023745� 
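// Usage sketch for the spawn() wrapper defined in utils/spawn.js above; the
// command shown is purely illustrative. The wrapper re-emits "error" (with
// er.file set to the command name) and "close" on a fresh EventEmitter so
// callers never hold the raw ChildProcess:
//
//   var spawn = require("./spawn.js")
//   var child = spawn("git", ["--version"], { stdio: "pipe" })
//   child.stdout.pipe(process.stderr)
//   child.on("close", function (code, signal) { /* finished */ })
//   child.on("error", function (er) { /* er.file === "git" */ })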
0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// commands for packing and unpacking tarballs // this file is used by lib/cache.js var npm = require("../npm.js") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , writeStreamAtomic = require("fs-write-stream-atomic") , path = require("path") , log = require("npmlog") , uidNumber = require("uid-number") , rm = require("./gently-rm.js") , readJson = require("read-package-json") , myUid = process.getuid && process.getuid() , myGid = process.getgid && process.getgid() , tar = require("tar") , zlib = require("zlib") , fstream = require("fstream") , Packer = require("fstream-npm") , lifecycle = require("./lifecycle.js") if (process.env.SUDO_UID && myUid === 0) { if (!isNaN(process.env.SUDO_UID)) myUid = +process.env.SUDO_UID if (!isNaN(process.env.SUDO_GID)) myGid = +process.env.SUDO_GID } exports.pack = pack exports.unpack = unpack function pack (tarball, folder, pkg, dfc, cb) { log.verbose("tar pack", [tarball, folder]) if (typeof cb !== "function") cb = dfc, dfc = false log.verbose("tarball", tarball) log.verbose("folder", folder) if (dfc) { // do fancy crap return lifecycle(pkg, "prepublish", folder, function (er) { if (er) return cb(er) pack_(tarball, folder, pkg, cb) }) } else { pack_(tarball, folder, pkg, cb) } } function pack_ (tarball, folder, pkg, cb) { new Packer({ path: folder, type: "Directory", isDirectory: true }) .on("error", function (er) { if (er) log.error("tar pack", "Error reading " + folder) return cb(er) }) // By default, npm includes some proprietary attributes in the // package tarball. This is sane, and allowed by the spec. // However, npm *itself* excludes these from its own package, // so that it can be more easily bootstrapped using old and // non-compliant tar implementations. 
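  // The chain below streams the package through four stages:
  //   fstream-npm Packer   (walks the folder, applying ignore rules)
  //     -> tar.Pack        (serializes entries; proprietary attribs optional)
  //     -> zlib.Gzip       (compresses)
  //     -> writeStreamAtomic (writes the tarball, only moved into place once
  //                           the stream finishes cleanly)
  // and forwards a descriptive error to the callback if any stage fails.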
.pipe(tar.Pack({ noProprietary: !npm.config.get("proprietary-attribs") })) .on("error", function (er) { if (er) log.error("tar.pack", "tar creation error", tarball) cb(er) }) .pipe(zlib.Gzip()) .on("error", function (er) { if (er) log.error("tar.pack", "gzip error "+tarball) cb(er) }) .pipe(writeStreamAtomic(tarball)) .on("error", function (er) { if (er) log.error("tar.pack", "Could not write "+tarball) cb(er) }) .on("close", cb) } function unpack (tarball, unpackTarget, dMode, fMode, uid, gid, cb) { log.verbose("tar", "unpack", tarball) log.verbose("tar", "unpacking to", unpackTarget) if (typeof cb !== "function") cb = gid, gid = null if (typeof cb !== "function") cb = uid, uid = null if (typeof cb !== "function") cb = fMode, fMode = npm.modes.file if (typeof cb !== "function") cb = dMode, dMode = npm.modes.exec uidNumber(uid, gid, function (er, uid, gid) { if (er) return cb(er) unpack_(tarball, unpackTarget, dMode, fMode, uid, gid, cb) }) } function unpack_ ( tarball, unpackTarget, dMode, fMode, uid, gid, cb ) { rm(unpackTarget, function (er) { if (er) return cb(er) // gzip {tarball} --decompress --stdout \ // | tar -mvxpf - --strip-components=1 -C {unpackTarget} gunzTarPerm( tarball, unpackTarget , dMode, fMode , uid, gid , function (er, folder) { if (er) return cb(er) readJson(path.resolve(folder, "package.json"), cb) }) }) } function gunzTarPerm (tarball, target, dMode, fMode, uid, gid, cb_) { if (!dMode) dMode = npm.modes.exec if (!fMode) fMode = npm.modes.file log.silly("gunzTarPerm", "modes", [dMode.toString(8), fMode.toString(8)]) var cbCalled = false function cb (er) { if (cbCalled) return cbCalled = true cb_(er, target) } var fst = fs.createReadStream(tarball) fst.on("open", function (fd) { fs.fstat(fd, function (er, st) { if (er) return fst.emit("error", er) if (st.size === 0) { er = new Error("0-byte tarball\n" + "Please run `npm cache clean`") fst.emit("error", er) } }) }) // figure out who we're supposed to be, if we're not pretending // to be a specific user. if (npm.config.get("unsafe-perm") && process.platform !== "win32") { uid = myUid gid = myGid } function extractEntry (entry) { log.silly("gunzTarPerm", "extractEntry", entry.path) // never create things that are user-unreadable, // or dirs that are user-un-listable. Only leads to headaches. var originalMode = entry.mode = entry.mode || entry.props.mode entry.mode = entry.mode | (entry.type === "Directory" ? dMode : fMode) entry.mode = entry.mode & (~npm.modes.umask) entry.props.mode = entry.mode if (originalMode !== entry.mode) { log.silly( "gunzTarPerm", "modified mode" , [entry.path, originalMode, entry.mode]) } // if there's a specific owner uid/gid that we want, then set that if (process.platform !== "win32" && typeof uid === "number" && typeof gid === "number") { entry.props.uid = entry.uid = uid entry.props.gid = entry.gid = gid } } var extractOpts = { type: "Directory", path: target, strip: 1 } if (process.platform !== "win32" && typeof uid === "number" && typeof gid === "number") { extractOpts.uid = uid extractOpts.gid = gid } var sawIgnores = {} extractOpts.filter = function () { // symbolic links are not allowed in packages. if (this.type.match(/^.*Link$/)) { log.warn( "excluding symbolic link" , this.path.substr(target.length + 1) + " -> " + this.linkpath ) return false } // Note: This mirrors logic in the fs read operations that are // employed during tarball creation, in the fstream-npm module. 
// It is duplicated here to handle tarballs that are created // using other means, such as system tar or git archive. if (this.type === "File") { var base = path.basename(this.path) if (base === ".npmignore") { sawIgnores[ this.path ] = true } else if (base === ".gitignore") { var npmignore = this.path.replace(/\.gitignore$/, ".npmignore") if (sawIgnores[npmignore]) { // Skip this one, already seen. return false } else { // Rename, may be clobbered later. this.path = npmignore this._path = npmignore } } } return true } fst .on("error", function (er) { if (er) log.error("tar.unpack", "error reading "+tarball) cb(er) }) .on("data", function OD (c) { // detect what it is. // Then, depending on that, we'll figure out whether it's // a single-file module, gzipped tarball, or naked tarball. // gzipped files all start with 1f8b08 if (c[0] === 0x1F && c[1] === 0x8B && c[2] === 0x08) { fst .pipe(zlib.Unzip()) .on("error", function (er) { if (er) log.error("tar.unpack", "unzip error "+tarball) cb(er) }) .pipe(tar.Extract(extractOpts)) .on("entry", extractEntry) .on("error", function (er) { if (er) log.error("tar.unpack", "untar error "+tarball) cb(er) }) .on("close", cb) } else if (hasTarHeader(c)) { // naked tar fst .pipe(tar.Extract(extractOpts)) .on("entry", extractEntry) .on("error", function (er) { if (er) log.error("tar.unpack", "untar error "+tarball) cb(er) }) .on("close", cb) } else { // naked js file var jsOpts = { path: path.resolve(target, "index.js") } if (process.platform !== "win32" && typeof uid === "number" && typeof gid === "number") { jsOpts.uid = uid jsOpts.gid = gid } fst .pipe(fstream.Writer(jsOpts)) .on("error", function (er) { if (er) log.error("tar.unpack", "copy error "+tarball) cb(er) }) .on("close", function () { var j = path.resolve(target, "package.json") readJson(j, function (er, d) { if (er) { log.error("not a package", tarball) return cb(er) } writeFileAtomic(j, JSON.stringify(d) + "\n", cb) }) }) } // now un-hook, and re-emit the chunk fst.removeListener("data", OD) fst.emit("data", c) }) } function hasTarHeader (c) { return c[257] === 0x75 && // tar archives have 7573746172 at position c[258] === 0x73 && // 257 and 003030 or 202000 at position 262 c[259] === 0x74 && c[260] === 0x61 && c[261] === 0x72 && ((c[262] === 0x00 && c[263] === 0x30 && c[264] === 0x30) || (c[262] === 0x20 && c[263] === 0x20 && c[264] === 0x00)) } ���������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/completion/file-completion.js�����������������000644 �000766 �000024 �00000001306 12455173731 030405� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = fileCompletion var mkdir = require("mkdirp") , path = require("path") , glob = require("glob") function fileCompletion (root, req, depth, cb) { if (typeof cb !== "function") cb = depth, depth = Infinity mkdir(root, function (er) { if (er) return cb(er) // can be either exactly the req, or a descendent var pattern = root + "/{" + req + "," + req + "/**/*}" , opts = { mark: true, dot: true, maxDepth: depth } glob(pattern, opts, function (er, files) { if (er) return cb(er) return 
cb(null, (files || []).map(function (f) { var tail = f.substr(root.length + 1).replace(/^\//, "") return path.join(req, tail) })) }) }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/completion/installed-deep.js������������������000644 �000766 �000024 �00000002135 12455173731 030212� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = installedDeep var npm = require("../../npm.js") , readInstalled = require("read-installed") function installedDeep (opts, cb) { var local , global , depth = npm.config.get("depth") , opt = { depth: depth, dev: true } if (npm.config.get("global")) local = [], next() else readInstalled(npm.prefix, opt, function (er, data) { local = getNames(data || {}) next() }) readInstalled(npm.config.get("prefix"), opt, function (er, data) { global = getNames(data || {}) next() }) function getNames_ (d, n) { if (d.realName && n) { if (n[d.realName]) return n n[d.realName] = true } if (!n) n = {} Object.keys(d.dependencies || {}).forEach(function (dep) { getNames_(d.dependencies[dep], n) }) return n } function getNames (d) { return Object.keys(getNames_(d)) } function next () { if (!local || !global) return if (!npm.config.get("global")) { global = global.map(function (g) { return [g, "-g"] }) } var names = local.concat(global) return cb(null, names) } } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/utils/completion/installed-shallow.js���������������000644 �000766 �000024 �00000003543 12455173731 030752� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ module.exports = installedShallow var npm = require("../../npm.js") , fs = require("graceful-fs") , path = require("path") , readJson = require("read-package-json") , asyncMap = require("slide").asyncMap function installedShallow (opts, filter, cb) { if (typeof cb !== "function") cb = filter, filter = null var conf = opts.conf , args = conf.argv.remain if (args.length > 3) return cb() var local , global , localDir = npm.dir , globalDir = npm.globalDir if (npm.config.get("global")) local = [], next() else fs.readdir(localDir, function (er, pkgs) { local = (pkgs || []).filter(function (p) { return p.charAt(0) !== "." 
}) next() }) fs.readdir(globalDir, function (er, pkgs) { global = (pkgs || []).filter(function (p) { return p.charAt(0) !== "." }) next() }) function next () { if (!local || !global) return filterInstalled(local, global, filter, cb) } } function filterInstalled (local, global, filter, cb) { var fl , fg if (!filter) { fl = local fg = global return next() } asyncMap(local, function (p, cb) { readJson(path.join(npm.dir, p, "package.json"), function (er, d) { if (!d || !filter(d)) return cb(null, []) return cb(null, d.name) }) }, function (er, local) { fl = local || [] next() }) var globalDir = npm.globalDir asyncMap(global, function (p, cb) { readJson(path.join(globalDir, p, "package.json"), function (er, d) { if (!d || !filter(d)) return cb(null, []) return cb(null, d.name) }) }, function (er, global) { fg = global || [] next() }) function next () { if (!fg || !fl) return if (!npm.config.get("global")) { fg = fg.map(function (g) { return [g, "-g"] }) } console.error("filtered", fl, fg) return cb(null, fl.concat(fg)) } } �������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/core.js��������������������������������������000644 �000766 �000024 �00000025274 12455173731 024215� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������ var CC = require("config-chain").ConfigChain var inherits = require("inherits") var configDefs = require("./defaults.js") var types = configDefs.types var once = require("once") var fs = require("fs") var path = require("path") var nopt = require("nopt") var ini = require("ini") var Octal = configDefs.Octal var mkdirp = require("mkdirp") exports.load = load exports.Conf = Conf exports.loaded = false exports.rootConf = null exports.usingBuiltin = false exports.defs = configDefs Object.defineProperty(exports, "defaults", { get: function () { return configDefs.defaults }, enumerable: true }) Object.defineProperty(exports, "types", { get: function () { return configDefs.types }, enumerable: true }) exports.validate = validate var myUid = process.env.SUDO_UID !== undefined ? process.env.SUDO_UID : (process.getuid && process.getuid()) var myGid = process.env.SUDO_GID !== undefined ? process.env.SUDO_GID : (process.getgid && process.getgid()) var loading = false var loadCbs = [] function load () { var cli, builtin, cb for (var i = 0; i < arguments.length; i++) switch (typeof arguments[i]) { case "string": builtin = arguments[i]; break case "object": cli = arguments[i]; break case "function": cb = arguments[i]; break } if (!cb) cb = function () {} if (exports.loaded) { var ret = exports.loaded if (cli) { ret = new Conf(ret) ret.unshift(cli) } return process.nextTick(cb.bind(null, null, ret)) } // either a fresh object, or a clone of the passed in obj if (!cli) cli = {} else cli = Object.keys(cli).reduce(function (c, k) { c[k] = cli[k] return c }, {}) loadCbs.push(cb) if (loading) return loading = true cb = once(function (er, conf) { if (!er) exports.loaded = conf loadCbs.forEach(function (fn) { fn(er, conf) }) loadCbs.length = 0 }) // check for a builtin if provided. 
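// The chain assembled below (and in load_) ends up with roughly this
// precedence, highest first: cli arguments, npm_config_* environment
// variables, the project .npmrc, the userconfig file, the globalconfig file,
// the builtin config shipped alongside npm, and finally the compiled-in
// defaults.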
exports.usingBuiltin = !!builtin var rc = exports.rootConf = new Conf() if (builtin) rc.addFile(builtin, "builtin") else rc.add({}, "builtin") rc.on("load", function () { load_(builtin, rc, cli, cb) }) rc.on("error", cb) } function load_(builtin, rc, cli, cb) { var defaults = configDefs.defaults var conf = new Conf(rc) conf.usingBuiltin = !!builtin conf.add(cli, "cli") conf.addEnv() conf.loadPrefix(function(er) { if (er) return cb(er) // If you're doing `npm --userconfig=~/foo.npmrc` then you'd expect // that ~/.npmrc won't override the stuff in ~/foo.npmrc (or, indeed // be used at all). // // However, if the cwd is ~, then ~/.npmrc is the home for the project // config, and will override the userconfig. // // If you're not setting the userconfig explicitly, then it will be loaded // twice, which is harmless but excessive. If you *are* setting the // userconfig explicitly then it will override your explicit intent, and // that IS harmful and unexpected. // // Solution: Do not load project config file that is the same as either // the default or resolved userconfig value. npm will log a "verbose" // message about this when it happens, but it is a rare enough edge case // that we don't have to be super concerned about it. var projectConf = path.resolve(conf.localPrefix, ".npmrc") var defaultUserConfig = rc.get("userconfig") var resolvedUserConfig = conf.get("userconfig") if (!conf.get("global") && projectConf !== defaultUserConfig && projectConf !== resolvedUserConfig) { conf.addFile(projectConf, "project") conf.once("load", afterPrefix) } else { conf.add({}, "project") afterPrefix() } }) function afterPrefix() { conf.addFile(conf.get("userconfig"), "user") conf.once("error", cb) conf.once("load", afterUser) } function afterUser () { // globalconfig and globalignorefile defaults // need to respond to the 'prefix' setting up to this point. // Eg, `npm config get globalconfig --prefix ~/local` should // return `~/local/etc/npmrc` // annoying humans and their expectations! if (conf.get("prefix")) { var etc = path.resolve(conf.get("prefix"), "etc") defaults.globalconfig = path.resolve(etc, "npmrc") defaults.globalignorefile = path.resolve(etc, "npmignore") } conf.addFile(conf.get("globalconfig"), "global") // move the builtin into the conf stack now. conf.root = defaults conf.add(rc.shift(), "builtin") conf.once("load", function () { conf.loadExtras(afterExtras) }) } function afterExtras(er) { if (er) return cb(er) // warn about invalid bits. validate(conf) var cafile = conf.get("cafile") if (cafile) { return conf.loadCAFile(cafile, finalize) } finalize() } function finalize(er) { if (er) { return cb(er) } exports.loaded = conf cb(er, conf) } } // Basically the same as CC, but: // 1. Always ini // 2. Parses environment variable names in field values // 3. Field values that start with ~/ are replaced with process.env.HOME // 4. Can inherit from another Conf object, using it as the base. 
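// For instance (illustrative values): a line such as
//   cache = ~/npm-cache
// has its value resolved against process.env.HOME for path-typed keys, and
//   registry = ${MY_REGISTRY}
// is replaced with the contents of the MY_REGISTRY environment variable; a
// literal "${" survives if escaped with a backslash (see envReplace below).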
inherits(Conf, CC) function Conf (base) { if (!(this instanceof Conf)) return new Conf(base) CC.apply(this) if (base) if (base instanceof Conf) this.root = base.list[0] || base.root else this.root = base else this.root = configDefs.defaults } Conf.prototype.loadPrefix = require("./load-prefix.js") Conf.prototype.loadCAFile = require("./load-cafile.js") Conf.prototype.loadUid = require("./load-uid.js") Conf.prototype.setUser = require("./set-user.js") Conf.prototype.findPrefix = require("./find-prefix.js") Conf.prototype.getCredentialsByURI = require("./get-credentials-by-uri.js") Conf.prototype.setCredentialsByURI = require("./set-credentials-by-uri.js") Conf.prototype.loadExtras = function(cb) { this.setUser(function(er) { if (er) return cb(er) this.loadUid(function(er) { if (er) return cb(er) // Without prefix, nothing will ever work mkdirp(this.prefix, cb) }.bind(this)) }.bind(this)) } Conf.prototype.save = function (where, cb) { var target = this.sources[where] if (!target || !(target.path || target.source) || !target.data) { if (where !== "builtin") var er = new Error("bad save target: " + where) if (cb) { process.nextTick(cb.bind(null, er)) return this } return this.emit("error", er) } if (target.source) { var pref = target.prefix || "" Object.keys(target.data).forEach(function (k) { target.source[pref + k] = target.data[k] }) if (cb) process.nextTick(cb) return this } var data = ini.stringify(target.data) then = then.bind(this) done = done.bind(this) this._saving ++ var mode = where === "user" ? "0600" : "0666" if (!data.trim()) { fs.unlink(target.path, function () { // ignore the possible error (e.g. the file doesn't exist) done(null) }) } else { mkdirp(path.dirname(target.path), function (er) { if (er) return then(er) fs.writeFile(target.path, data, "utf8", function (er) { if (er) return then(er) if (where === "user" && myUid && myGid) fs.chown(target.path, +myUid, +myGid, then) else then() }) }) } function then (er) { if (er) return done(er) fs.chmod(target.path, mode, done) } function done (er) { if (er) { if (cb) return cb(er) else return this.emit("error", er) } this._saving -- if (this._saving === 0) { if (cb) cb() this.emit("save") } } return this } Conf.prototype.addFile = function (file, name) { name = name || file var marker = {__source__:name} this.sources[name] = { path: file, type: "ini" } this.push(marker) this._await() fs.readFile(file, "utf8", function (er, data) { if (er) // just ignore missing files. return this.add({}, marker) this.addString(data, file, "ini", marker) }.bind(this)) return this } // always ini files. Conf.prototype.parse = function (content, file) { return CC.prototype.parse.call(this, content, file, "ini") } Conf.prototype.add = function (data, marker) { try { Object.keys(data).forEach(function (k) { data[k] = parseField(data[k], k) }) } catch (e) { this.emit("error", e) return this } return CC.prototype.add.call(this, data, marker) } Conf.prototype.addEnv = function (env) { env = env || process.env var conf = {} Object.keys(env) .filter(function (k) { return k.match(/^npm_config_/i) }) .forEach(function (k) { if (!env[k]) return // leave first char untouched, even if // it is a "_" - convert all other to "-" var p = k.toLowerCase() .replace(/^npm_config_/, "") .replace(/(?!^)_/g, "-") conf[p] = env[k] }) return CC.prototype.addEnv.call(this, "", conf, "env") } function parseField (f, k) { if (typeof f !== "string" && !(f instanceof String)) return f // type can be an array or single thing. 
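  // e.g. (from defaults.js later in this directory) types.registry is
  // [null, url] and types.depth is Number, so typeList below becomes those
  // values concatenated into an array for the checks that follow.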
var typeList = [].concat(types[k]) var isPath = -1 !== typeList.indexOf(path) var isBool = -1 !== typeList.indexOf(Boolean) var isString = -1 !== typeList.indexOf(String) var isOctal = -1 !== typeList.indexOf(Octal) var isNumber = isOctal || (-1 !== typeList.indexOf(Number)) f = (""+f).trim() if (f.match(/^".*"$/)) { try { f = JSON.parse(f) } catch (e) { throw new Error("Failed parsing JSON config key " + k + ": " + f) } } if (isBool && !isString && f === "") return true switch (f) { case "true": return true case "false": return false case "null": return null case "undefined": return undefined } f = envReplace(f) if (isPath) { var homePattern = process.platform === "win32" ? /^~(\/|\\)/ : /^~\// if (f.match(homePattern) && process.env.HOME) { f = path.resolve(process.env.HOME, f.substr(2)) } f = path.resolve(f) } if (isNumber && !isNaN(f)) f = isOctal ? parseInt(f, 8) : +f return f } function envReplace (f) { if (typeof f !== "string" || !f) return f // replace any ${ENV} values with the appropriate environ. var envExpr = /(\\*)\$\{([^}]+)\}/g return f.replace(envExpr, function (orig, esc, name) { esc = esc.length && esc.length % 2 if (esc) return orig if (undefined === process.env[name]) throw new Error("Failed to replace env in config: "+orig) return process.env[name] }) } function validate (cl) { // warn about invalid configs at every level. cl.list.forEach(function (conf) { nopt.clean(conf, configDefs.types) }) nopt.clean(cl.root, configDefs.types) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/defaults.js����������������������������������000644 �000766 �000024 �00000023760 12455173731 025072� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// defaults, types, and shorthands. var path = require("path") , url = require("url") , Stream = require("stream").Stream , semver = require("semver") , stableFamily = semver.parse(process.version) , nopt = require("nopt") , os = require("os") , osenv = require("osenv") var log try { log = require("npmlog") } catch (er) { var util = require("util") log = { warn: function (m) { console.warn(m + " " + util.format.apply(util, [].slice.call(arguments, 1))) } } } exports.Octal = Octal function Octal () {} function validateOctal (data, k, val) { // must be either an integer or an octal string. 
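  // For example (illustrative): given the string "022", which starts with "0"
  // and is numeric, the value is stored as parseInt("022", 8).toString(8),
  // i.e. "22"; a value that is already a number is accepted unchanged.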
if (typeof val === "number") { data[k] = val return true } if (typeof val === "string") { if (val.charAt(0) !== "0" || isNaN(val)) return false data[k] = parseInt(val, 8).toString(8) } } function validateSemver (data, k, val) { if (!semver.valid(val)) return false data[k] = semver.valid(val) } function validateTag (data, k, val) { val = ("" + val).trim() if (!val || semver.validRange(val)) return false data[k] = val } function validateStream (data, k, val) { if (!(val instanceof Stream)) return false data[k] = val } nopt.typeDefs.semver = { type: semver, validate: validateSemver } nopt.typeDefs.Octal = { type: Octal, validate: validateOctal } nopt.typeDefs.Stream = { type: Stream, validate: validateStream } // Don't let --tag=1.2.3 ever be a thing var tag = {} nopt.typeDefs.tag = { type: tag, validate: validateTag } nopt.invalidHandler = function (k, val, type) { log.warn("invalid config", k + "=" + JSON.stringify(val)) if (Array.isArray(type)) { if (type.indexOf(url) !== -1) type = url else if (type.indexOf(path) !== -1) type = path } switch (type) { case tag: log.warn("invalid config", "Tag must not be a SemVer range") break case Octal: log.warn("invalid config", "Must be octal number, starting with 0") break case url: log.warn("invalid config", "Must be a full url with 'http://'") break case path: log.warn("invalid config", "Must be a valid filesystem path") break case Number: log.warn("invalid config", "Must be a numeric value") break case Stream: log.warn("invalid config", "Must be an instance of the Stream class") break } } if (!stableFamily || (+stableFamily.minor % 2)) stableFamily = null else stableFamily = stableFamily.major + "." + stableFamily.minor var defaults var temp = osenv.tmpdir() var home = osenv.home() var uidOrPid = process.getuid ? process.getuid() : process.pid if (home) process.env.HOME = home else home = path.resolve(temp, "npm-" + uidOrPid) var cacheExtra = process.platform === "win32" ? "npm-cache" : ".npm" var cacheRoot = process.platform === "win32" && process.env.APPDATA || home var cache = path.resolve(cacheRoot, cacheExtra) var globalPrefix Object.defineProperty(exports, "defaults", {get: function () { if (defaults) return defaults if (process.env.PREFIX) { globalPrefix = process.env.PREFIX } else if (process.platform === "win32") { // c:\node\node.exe --> prefix=c:\node\ globalPrefix = path.dirname(process.execPath) } else { // /usr/local/bin/node --> prefix=/usr/local globalPrefix = path.dirname(path.dirname(process.execPath)) // destdir only is respected on Unix if (process.env.DESTDIR) { globalPrefix = path.join(process.env.DESTDIR, globalPrefix) } } defaults = { "always-auth" : false , "bin-links" : true , browser : null , ca: null , cafile: null , cache : cache , "cache-lock-stale": 60000 , "cache-lock-retries": 10 , "cache-lock-wait": 10000 , "cache-max": Infinity , "cache-min": 10 , cert: null , color : true , depth: Infinity , description : true , dev : false , editor : osenv.editor() , "engine-strict": false , force : false , "fetch-retries": 2 , "fetch-retry-factor": 10 , "fetch-retry-mintimeout": 10000 , "fetch-retry-maxtimeout": 60000 , git: "git" , "git-tag-version": true , global : false , globalconfig : path.resolve(globalPrefix, "etc", "npmrc") , group : process.platform === "win32" ? 
0 : process.env.SUDO_GID || (process.getgid && process.getgid()) , heading: "npm" , "ignore-scripts": false , "init-module": path.resolve(home, ".npm-init.js") , "init-author-name" : "" , "init-author-email" : "" , "init-author-url" : "" , "init-version": "1.0.0" , "init-license": "ISC" , json: false , key: null , link: false , "local-address" : undefined , loglevel : "warn" , logstream : process.stderr , long : false , message : "%s" , "node-version" : process.version , npat : false , "onload-script" : false , optional: true , parseable : false , prefix : globalPrefix , production: process.env.NODE_ENV === "production" , "proprietary-attribs": true , proxy : null , "https-proxy" : null , "user-agent" : "npm/{npm-version} " + "node/{node-version} " + "{platform} " + "{arch}" , "rebuild-bundle" : true , registry : "https://registry.npmjs.org/" , rollback : true , save : false , "save-bundle": false , "save-dev" : false , "save-exact" : false , "save-optional" : false , "save-prefix": "^" , scope : "" , searchopts: "" , searchexclude: null , searchsort: "name" , shell : osenv.shell() , shrinkwrap: true , "sign-git-tag": false , spin: true , "strict-ssl": true , tag : "latest" , tmp : temp , unicode : true , "unsafe-perm" : process.platform === "win32" || process.platform === "cygwin" || !( process.getuid && process.setuid && process.getgid && process.setgid ) || process.getuid() !== 0 , usage : false , user : process.platform === "win32" ? 0 : "nobody" , userconfig : path.resolve(home, ".npmrc") , umask: process.umask ? process.umask() : parseInt("022", 8) , version : false , versions : false , viewer: process.platform === "win32" ? "browser" : "man" , _exit : true } return defaults }}) exports.types = { "always-auth" : Boolean , "bin-links": Boolean , browser : [null, String] , ca: [null, String, Array] , cafile : path , cache : path , "cache-lock-stale": Number , "cache-lock-retries": Number , "cache-lock-wait": Number , "cache-max": Number , "cache-min": Number , cert: [null, String] , color : ["always", Boolean] , depth : Number , description : Boolean , dev : Boolean , editor : String , "engine-strict": Boolean , force : Boolean , "fetch-retries": Number , "fetch-retry-factor": Number , "fetch-retry-mintimeout": Number , "fetch-retry-maxtimeout": Number , git: String , "git-tag-version": Boolean , global : Boolean , globalconfig : path , group : [Number, String] , "https-proxy" : [null, url] , "user-agent" : String , "heading": String , "ignore-scripts": Boolean , "init-module": path , "init-author-name" : String , "init-author-email" : String , "init-author-url" : ["", url] , "init-license": String , "init-version": semver , json: Boolean , key: [null, String] , link: Boolean // local-address must be listed as an IP for a local network interface // must be IPv4 due to node bug , "local-address" : getLocalAddresses() , loglevel : ["silent","error","warn","http","info","verbose","silly"] , logstream : Stream , long : Boolean , message: String , "node-version" : [null, semver] , npat : Boolean , "onload-script" : [null, String] , optional: Boolean , parseable : Boolean , prefix: path , production: Boolean , "proprietary-attribs": Boolean , proxy : [null, url] , "rebuild-bundle" : Boolean , registry : [null, url] , rollback : Boolean , save : Boolean , "save-bundle": Boolean , "save-dev" : Boolean , "save-exact" : Boolean , "save-optional" : Boolean , "save-prefix": String , scope : String , searchopts : String , searchexclude: [null, String] , searchsort: [ "name", "-name" , 
"description", "-description" , "author", "-author" , "date", "-date" , "keywords", "-keywords" ] , shell : String , shrinkwrap: Boolean , "sign-git-tag": Boolean , spin: ["always", Boolean] , "strict-ssl": Boolean , tag : tag , tmp : path , unicode : Boolean , "unsafe-perm" : Boolean , usage : Boolean , user : [Number, String] , userconfig : path , umask: Octal , version : Boolean , versions : Boolean , viewer: String , _exit : Boolean } function getLocalAddresses() { Object.keys(os.networkInterfaces()).map(function (nic) { return os.networkInterfaces()[nic].filter(function (addr) { return addr.family === "IPv4" }) .map(function (addr) { return addr.address }) }).reduce(function (curr, next) { return curr.concat(next) }, []).concat(undefined) } exports.shorthands = { s : ["--loglevel", "silent"] , d : ["--loglevel", "info"] , dd : ["--loglevel", "verbose"] , ddd : ["--loglevel", "silly"] , noreg : ["--no-registry"] , N : ["--no-registry"] , reg : ["--registry"] , "no-reg" : ["--no-registry"] , silent : ["--loglevel", "silent"] , verbose : ["--loglevel", "verbose"] , quiet: ["--loglevel", "warn"] , q: ["--loglevel", "warn"] , h : ["--usage"] , H : ["--usage"] , "?" : ["--usage"] , help : ["--usage"] , v : ["--version"] , f : ["--force"] , gangster : ["--force"] , gangsta : ["--force"] , desc : ["--description"] , "no-desc" : ["--no-description"] , "local" : ["--no-global"] , l : ["--long"] , m : ["--message"] , p : ["--parseable"] , porcelain : ["--parseable"] , g : ["--global"] , S : ["--save"] , D : ["--save-dev"] , E : ["--save-exact"] , O : ["--save-optional"] , y : ["--yes"] , n : ["--no-yes"] , B : ["--save-bundle"] , C : ["--prefix"] } ����������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/find-prefix.js�������������������������������000644 �000766 �000024 �00000002473 12455173731 025474� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������// try to find the most reasonable prefix to use module.exports = findPrefix var fs = require("fs") var path = require("path") function findPrefix (p, cb_) { function cb (er, p) { process.nextTick(function () { cb_(er, p) }) } p = path.resolve(p) // if there's no node_modules folder, then // walk up until we hopefully find one. // if none anywhere, then use cwd. var walkedUp = false while (path.basename(p) === "node_modules") { p = path.dirname(p) walkedUp = true } if (walkedUp) return cb(null, p) findPrefix_(p, p, cb) } function findPrefix_ (p, original, cb) { if (p === "/" || (process.platform === "win32" && p.match(/^[a-zA-Z]:(\\|\/)?$/))) { return cb(null, original) } fs.readdir(p, function (er, files) { // an error right away is a bad sign. // unless the prefix was simply a non // existent directory. if (er && p === original) { if (er.code === "ENOENT") return cb(null, original); return cb(er) } // walked up too high or something. 
if (er) return cb(null, original) if (files.indexOf("node_modules") !== -1 || files.indexOf("package.json") !== -1) { return cb(null, p) } var d = path.dirname(p) if (d === p) return cb(null, original) return findPrefix_(d, original, cb) }) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/get-credentials-by-uri.js��������������������000644 �000766 �000024 �00000004026 12455173731 027534� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") var toNerfDart = require("./nerf-dart.js") module.exports = getCredentialsByURI function getCredentialsByURI (uri) { assert(uri && typeof uri === "string", "registry URL is required") var nerfed = toNerfDart(uri) var defnerf = toNerfDart(this.get("registry")) // hidden class micro-optimization var c = { scope : nerfed, token : undefined, password : undefined, username : undefined, email : undefined, auth : undefined, alwaysAuth : undefined } if (this.get(nerfed + ":_authToken")) { c.token = this.get(nerfed + ":_authToken") // the bearer token is enough, don't confuse things return c } // Handle the old-style _auth=<base64> style for the default // registry, if set. // // XXX(isaacs): Remove when npm 1.4 is no longer relevant var authDef = this.get("_auth") var userDef = this.get("username") var passDef = this.get("_password") if (authDef && !(userDef && passDef)) { authDef = new Buffer(authDef, "base64").toString() authDef = authDef.split(":") userDef = authDef.shift() passDef = authDef.join(":") } if (this.get(nerfed + ":_password")) { c.password = new Buffer(this.get(nerfed + ":_password"), "base64").toString("utf8") } else if (nerfed === defnerf && passDef) { c.password = passDef } if (this.get(nerfed + ":username")) { c.username = this.get(nerfed + ":username") } else if (nerfed === defnerf && userDef) { c.username = userDef } if (this.get(nerfed + ":email")) { c.email = this.get(nerfed + ":email") } else if (this.get("email")) { c.email = this.get("email") } if (this.get(nerfed + ":always-auth") !== undefined) { var val = this.get(nerfed + ":always-auth") c.alwaysAuth = val === "false" ? 
false : !!val } else if (this.get("always-auth") !== undefined) { c.alwaysAuth = this.get("always-auth") } if (c.username && c.password) { c.auth = new Buffer(c.username + ":" + c.password).toString("base64") } return c } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/load-cafile.js�������������������������������000644 �000766 �000024 �00000001054 12455173731 025413� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = loadCAFile var fs = require("fs") function loadCAFile(cafilePath, cb) { if (!cafilePath) return process.nextTick(cb) fs.readFile(cafilePath, "utf8", afterCARead.bind(this)) function afterCARead(er, cadata) { if (er) return cb(er) var delim = "-----END CERTIFICATE-----" var output output = cadata .split(delim) .filter(function(xs) { return !!xs.trim() }) .map(function(xs) { return xs.trimLeft() + delim }) this.set("ca", output) cb(null) } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/load-prefix.js�������������������������������000644 �000766 �000024 �00000002456 12455173731 025474� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = loadPrefix var findPrefix = require("./find-prefix.js") var path = require("path") function loadPrefix (cb) { var cli = this.list[0] Object.defineProperty(this, "prefix", { set : function (prefix) { var g = this.get("global") this[g ? "globalPrefix" : "localPrefix"] = prefix }.bind(this) , get : function () { var g = this.get("global") return g ? this.globalPrefix : this.localPrefix }.bind(this) , enumerable : true }) Object.defineProperty(this, "globalPrefix", { set : function (prefix) { this.set("prefix", prefix) }.bind(this) , get : function () { return path.resolve(this.get("prefix")) }.bind(this) , enumerable : true }) var p Object.defineProperty(this, "localPrefix", { set : function (prefix) { p = prefix }, get : function () { return p } , enumerable: true }) // try to guess at a good node_modules location. 
  // If we are *explicitly* given a prefix on the cli, then
  // always use that. otherwise, infer local prefix from cwd.
  if (Object.prototype.hasOwnProperty.call(cli, "prefix")) {
    p = path.resolve(cli.prefix)
    process.nextTick(cb)
  } else {
    findPrefix(process.cwd(), function (er, found) {
      p = found
      cb(er)
    })
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/load-uid.js

module.exports = loadUid

var getUid = require("uid-number")

// Call in the context of a npmconf object
function loadUid (cb) {
  // if we're not in unsafe-perm mode, then figure out who
  // to run stuff as. Do this first, to support `npm update npm -g`
  if (!this.get("unsafe-perm")) {
    getUid(this.get("user"), this.get("group"), cb)
  } else {
    process.nextTick(cb)
  }
}

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/nerf-dart.js

var url = require("url")

module.exports = toNerfDart

/**
 * Maps a URL to an identifier.
 *
 * Name courtesy schiffertronix media LLC, a New Jersey corporation
 *
 * @param {String} uri The URL to be nerfed.
 *
 * @returns {String} A nerfed URL.
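 *
 * @example
 * // (illustrative) toNerfDart("https://registry.npmjs.org/") returns
 * // "//registry.npmjs.org/", the key prefix used for per-registry settings
 * // in .npmrc such as "//registry.npmjs.org/:_authToken" (see
 * // get-credentials-by-uri.js).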
*/ function toNerfDart(uri) { var parsed = url.parse(uri) delete parsed.protocol delete parsed.auth delete parsed.query delete parsed.search delete parsed.hash return url.resolve(url.format(parsed), ".") } ������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/set-credentials-by-uri.js��������������������000644 �000766 �000024 �00000002455 12455173731 027554� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") var toNerfDart = require("./nerf-dart.js") module.exports = setCredentialsByURI function setCredentialsByURI (uri, c) { assert(uri && typeof uri === "string", "registry URL is required") assert(c && typeof c === "object", "credentials are required") var nerfed = toNerfDart(uri) if (c.token) { this.set(nerfed + ":_authToken", c.token, "user") this.del(nerfed + ":_password", "user") this.del(nerfed + ":username", "user") this.del(nerfed + ":email", "user") this.del(nerfed + ":always-auth", "user") } else if (c.username || c.password || c.email) { assert(c.username, "must include username") assert(c.password, "must include password") assert(c.email, "must include email address") this.del(nerfed + ":_authToken", "user") var encoded = new Buffer(c.password, "utf8").toString("base64") this.set(nerfed + ":_password", encoded, "user") this.set(nerfed + ":username", c.username, "user") this.set(nerfed + ":email", c.email, "user") if (c.alwaysAuth !== undefined) { this.set(nerfed + ":always-auth", c.alwaysAuth, "user") } else { this.del(nerfed + ":always-auth", "user") } } else { throw new Error("No credentials to set.") } } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/config/set-user.js����������������������������������000644 �000766 �000024 �00000001355 12455173731 025026� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = setUser var assert = require("assert") var path = require("path") var fs = require("fs") var mkdirp = require("mkdirp") function setUser (cb) { var defaultConf = this.root assert(defaultConf !== Object.prototype) // If global, leave it as-is. // If not global, then set the user to the owner of the prefix folder. // Just set the default, so it can be overridden. 
if (this.get("global")) return cb() if (process.env.SUDO_UID) { defaultConf.user = +(process.env.SUDO_UID) return cb() } var prefix = path.resolve(this.get("prefix")) mkdirp(prefix, function (er) { if (er) return cb(er) fs.stat(prefix, function (er, st) { defaultConf.user = st && st.uid return cb(er) }) }) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/add-local-tarball.js��������������������������000644 �000766 �000024 �00000013310 12455173731 026306� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var mkdir = require("mkdirp") , assert = require("assert") , fs = require("graceful-fs") , writeFileAtomic = require("write-file-atomic") , path = require("path") , sha = require("sha") , npm = require("../npm.js") , log = require("npmlog") , tar = require("../utils/tar.js") , pathIsInside = require("path-is-inside") , getCacheStat = require("./get-stat.js") , cachedPackageRoot = require("./cached-package-root.js") , chownr = require("chownr") , inflight = require("inflight") , once = require("once") , writeStream = require("fs-write-stream-atomic") , randomBytes = require("crypto").pseudoRandomBytes // only need uniqueness module.exports = addLocalTarball function addLocalTarball (p, pkgData, shasum, cb) { assert(typeof p === "string", "must have path") assert(typeof cb === "function", "must have callback") if (!pkgData) pkgData = {} // If we don't have a shasum yet, compute it. if (!shasum) { return sha.get(p, function (er, shasum) { if (er) return cb(er) log.silly("addLocalTarball", "shasum (computed)", shasum) addLocalTarball(p, pkgData, shasum, cb) }) } if (pathIsInside(p, npm.cache)) { if (path.basename(p) !== "package.tgz") { return cb(new Error("Not a valid cache tarball name: "+p)) } log.verbose("addLocalTarball", "adding from inside cache", p) return addPlacedTarball(p, pkgData, shasum, cb) } addTmpTarball(p, pkgData, shasum, function (er, data) { if (data) { data._resolved = p data._shasum = data._shasum || shasum } return cb(er, data) }) } function addPlacedTarball (p, pkgData, shasum, cb) { assert(pkgData, "should have package data by now") assert(typeof cb === "function", "cb function required") getCacheStat(function (er, cs) { if (er) return cb(er) return addPlacedTarball_(p, pkgData, cs.uid, cs.gid, shasum, cb) }) } function addPlacedTarball_ (p, pkgData, uid, gid, resolvedSum, cb) { var folder = path.join(cachedPackageRoot(pkgData), "package") // First, make sure we have the shasum, if we don't already. 
if (!resolvedSum) { sha.get(p, function (er, shasum) { if (er) return cb(er) addPlacedTarball_(p, pkgData, uid, gid, shasum, cb) }) return } mkdir(folder, function (er) { if (er) return cb(er) var pj = path.join(folder, "package.json") var json = JSON.stringify(pkgData, null, 2) writeFileAtomic(pj, json, function (er) { cb(er, pkgData) }) }) } function addTmpTarball (tgz, pkgData, shasum, cb) { assert(typeof cb === "function", "must have callback function") assert(shasum, "must have shasum by now") cb = inflight("addTmpTarball:" + tgz, cb) if (!cb) return log.verbose("addTmpTarball", tgz, "already in flight; not adding") log.verbose("addTmpTarball", tgz, "not in flight; adding") // we already have the package info, so just move into place if (pkgData && pkgData.name && pkgData.version) { log.verbose( "addTmpTarball", "already have metadata; skipping unpack for", pkgData.name + "@" + pkgData.version ) return addTmpTarball_(tgz, pkgData, shasum, cb) } // This is a tarball we probably downloaded from the internet. The shasum's // already been checked, but we haven't ever had a peek inside, so we unpack // it here just to make sure it is what it says it is. // // NOTE: we might not have any clue what we think it is, for example if the // user just did `npm install ./foo.tgz` // generate a unique filename randomBytes(6, function (er, random) { if (er) return cb(er) var target = path.join(npm.tmp, "unpack-" + random.toString("hex")) getCacheStat(function (er, cs) { if (er) return cb(er) log.verbose("addTmpTarball", "validating metadata from", tgz) tar.unpack(tgz, target, null, null, cs.uid, cs.gid, function (er, data) { if (er) return cb(er) // check that this is what we expected. if (!data.name) { return cb(new Error("No name provided")) } else if (pkgData.name && data.name !== pkgData.name) { return cb(new Error("Invalid Package: expected " + pkgData.name + " but found " + data.name)) } if (!data.version) { return cb(new Error("No version provided")) } else if (pkgData.version && data.version !== pkgData.version) { return cb(new Error("Invalid Package: expected " + pkgData.name + "@" + pkgData.version + " but found " + data.name + "@" + data.version)) } addTmpTarball_(tgz, data, shasum, cb) }) }) }) } function addTmpTarball_ (tgz, data, shasum, cb) { assert(typeof cb === "function", "must have callback function") cb = once(cb) assert(data.name, "should have package name by now") assert(data.version, "should have package version by now") var root = cachedPackageRoot(data) var pkg = path.resolve(root, "package") var target = path.resolve(root, "package.tgz") getCacheStat(function (er, cs) { if (er) return cb(er) mkdir(pkg, function (er, created) { // chown starting from the first dir created by mkdirp, // or the root dir, if none had to be created, so that // we know that we get all the children. function chown () { chownr(created || root, cs.uid, cs.gid, done) } if (er) return cb(er) var read = fs.createReadStream(tgz) var write = writeStream(target, { mode: npm.modes.file }) var fin = cs.uid && cs.gid ? 
chown : done read.on("error", cb).pipe(write).on("error", cb).on("close", fin) }) }) function done() { data._shasum = data._shasum || shasum cb(null, data) } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/add-local.js����������������������������������000644 �000766 �000024 �00000007432 12455173731 024677� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") , path = require("path") , mkdir = require("mkdirp") , chownr = require("chownr") , pathIsInside = require("path-is-inside") , readJson = require("read-package-json") , log = require("npmlog") , npm = require("../npm.js") , tar = require("../utils/tar.js") , deprCheck = require("../utils/depr-check.js") , getCacheStat = require("./get-stat.js") , cachedPackageRoot = require("./cached-package-root.js") , addLocalTarball = require("./add-local-tarball.js") , sha = require("sha") , inflight = require("inflight") module.exports = addLocal function addLocal (p, pkgData, cb_) { assert(typeof p === "object", "must have spec info") assert(typeof cb === "function", "must have callback") pkgData = pkgData || {} function cb (er, data) { if (er) { log.error("addLocal", "Could not install %s", p.spec) return cb_(er) } if (data && !data._fromGithub) { data._from = path.relative(npm.prefix, p.spec) || "." var resolved = path.relative(npm.prefix, p.spec) if (resolved) data._resolved = "file:"+resolved } return cb_(er, data) } if (p.type === "directory") { addLocalDirectory(p.spec, pkgData, null, cb) } else { addLocalTarball(p.spec, pkgData, null, cb) } } // At this point, if shasum is set, it's something that we've already // read and checked. Just stashing it in the data at this point. 
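// Conceptually, the "sha" module required above produces a hex SHA-1 digest
// of a file's contents -- roughly equivalent to this sketch (not the module's
// actual implementation; "file" is a placeholder path):
//
//   var crypto = require("crypto")
//   var fs = require("fs")
//   var hash = crypto.createHash("sha1")
//   fs.createReadStream(file).on("error", cb).pipe(hash).on("finish", function () {
//     cb(null, hash.read().toString("hex"))
//   })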
function addLocalDirectory (p, pkgData, shasum, cb) { assert(pkgData, "must pass package data") assert(typeof cb === "function", "must have callback") // if it's a folder, then read the package.json, // tar it to the proper place, and add the cache tar if (pathIsInside(p, npm.cache)) return cb(new Error( "Adding a cache directory to the cache will make the world implode.")) readJson(path.join(p, "package.json"), false, function (er, data) { if (er) return cb(er) if (!data.name) { return cb(new Error("No name provided in package.json")) } else if (pkgData.name && pkgData.name !== data.name) { return cb(new Error( "Invalid package: expected " + pkgData.name + " but found " + data.name )) } if (!data.version) { return cb(new Error("No version provided in package.json")) } else if (pkgData.version && pkgData.version !== data.version) { return cb(new Error( "Invalid package: expected " + pkgData.name + "@" + pkgData.version + " but found " + data.name + "@" + data.version )) } deprCheck(data) // pack to {cache}/name/ver/package.tgz var root = cachedPackageRoot(data) var tgz = path.resolve(root, "package.tgz") var pj = path.resolve(root, "package/package.json") var wrapped = inflight(tgz, next) if (!wrapped) return log.verbose("addLocalDirectory", tgz, "already in flight; waiting") log.verbose("addLocalDirectory", tgz, "not in flight; packing") getCacheStat(function (er, cs) { mkdir(path.dirname(pj), function (er, made) { if (er) return cb(er) var fancy = !pathIsInside(p, npm.tmp) tar.pack(tgz, p, data, fancy, function (er) { if (er) { log.error("addLocalDirectory", "Could not pack", p, "to", tgz) return cb(er) } if (!cs || isNaN(cs.uid) || isNaN(cs.gid)) wrapped() chownr(made || tgz, cs.uid, cs.gid, wrapped) }) }) }) function next (er) { if (er) return cb(er) // if we have the shasum already, just add it if (shasum) { return addLocalTarball(tgz, data, shasum, cb) } else { sha.get(tgz, function (er, shasum) { if (er) { return cb(er) } data._shasum = shasum return addLocalTarball(tgz, data, shasum, cb) }) } } }) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/add-named.js����������������������������������000644 �000766 �000024 �00000017712 12455173731 024673� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var path = require("path") , assert = require("assert") , fs = require("graceful-fs") , http = require("http") , log = require("npmlog") , semver = require("semver") , readJson = require("read-package-json") , url = require("url") , npm = require("../npm.js") , deprCheck = require("../utils/depr-check.js") , inflight = require("inflight") , addRemoteTarball = require("./add-remote-tarball.js") , cachedPackageRoot = require("./cached-package-root.js") , mapToRegistry = require("../utils/map-to-registry.js") module.exports = addNamed function getOnceFromRegistry (name, from, next, done) { mapToRegistry(name, npm.config, function (er, uri, auth) { if (er) return done(er) var key = "registry:" + uri next = inflight(key, next) if (!next) return log.verbose(from, 
key, "already in flight; waiting") else log.verbose(from, key, "not in flight; fetching") npm.registry.get(uri, { auth : auth }, next) }) } function addNamed (name, version, data, cb_) { assert(typeof name === "string", "must have module name") assert(typeof cb_ === "function", "must have callback") var key = name + "@" + version log.verbose("addNamed", key) function cb (er, data) { if (data && !data._fromGithub) data._from = key cb_(er, data) } log.silly("addNamed", "semver.valid", semver.valid(version)) log.silly("addNamed", "semver.validRange", semver.validRange(version)) var fn = ( semver.valid(version, true) ? addNameVersion : semver.validRange(version, true) ? addNameRange : addNameTag ) fn(name, version, data, cb) } function addNameTag (name, tag, data, cb) { log.info("addNameTag", [name, tag]) var explicit = true if (!tag) { explicit = false tag = npm.config.get("tag") } getOnceFromRegistry(name, "addNameTag", next, cb) function next (er, data, json, resp) { if (!er) er = errorResponse(name, resp) if (er) return cb(er) log.silly("addNameTag", "next cb for", name, "with tag", tag) engineFilter(data) if (data["dist-tags"] && data["dist-tags"][tag] && data.versions[data["dist-tags"][tag]]) { var ver = data["dist-tags"][tag] return addNamed(name, ver, data.versions[ver], cb) } if (!explicit && Object.keys(data.versions).length) { return addNamed(name, "*", data, cb) } er = installTargetsError(tag, data) return cb(er) } } function engineFilter (data) { var npmv = npm.version , nodev = npm.config.get("node-version") , strict = npm.config.get("engine-strict") if (!nodev || npm.config.get("force")) return data Object.keys(data.versions || {}).forEach(function (v) { var eng = data.versions[v].engines if (!eng) return if (!strict && !data.versions[v].engineStrict) return if (eng.node && !semver.satisfies(nodev, eng.node, true) || eng.npm && !semver.satisfies(npmv, eng.npm, true)) { delete data.versions[v] } }) } function addNameVersion (name, v, data, cb) { var ver = semver.valid(v, true) if (!ver) return cb(new Error("Invalid version: "+v)) var response if (data) { response = null return next() } getOnceFromRegistry(name, "addNameVersion", setData, cb) function setData (er, d, json, resp) { if (!er) { er = errorResponse(name, resp) } if (er) return cb(er) data = d && d.versions[ver] if (!data) { er = new Error("version not found: "+name+"@"+ver) er.package = name er.statusCode = 404 return cb(er) } response = resp next() } function next () { deprCheck(data) var dist = data.dist if (!dist) return cb(new Error("No dist in "+data._id+" package")) if (!dist.tarball) return cb(new Error( "No dist.tarball in " + data._id + " package")) if ((response && response.statusCode !== 304) || npm.config.get("force")) { return fetchit() } // we got cached data, so let's see if we have a tarball. 
var pkgroot = cachedPackageRoot({name : name, version : ver}) var pkgtgz = path.join(pkgroot, "package.tgz") var pkgjson = path.join(pkgroot, "package", "package.json") fs.stat(pkgtgz, function (er) { if (!er) { readJson(pkgjson, function (er, data) { if (er && er.code !== "ENOENT" && er.code !== "ENOTDIR") return cb(er) if (data) { if (!data.name) return cb(new Error("No name provided")) if (!data.version) return cb(new Error("No version provided")) // check the SHA of the package we have, to ensure it wasn't installed // from somewhere other than the registry (eg, a fork) if (data._shasum && dist.shasum && data._shasum !== dist.shasum) { return fetchit() } } if (er) return fetchit() else return cb(null, data) }) } else return fetchit() }) function fetchit () { mapToRegistry(name, npm.config, function (er, _, auth, ruri) { if (er) return cb(er) // Use the same protocol as the registry. https registry --> https // tarballs, but only if they're the same hostname, or else detached // tarballs may not work. var tb = url.parse(dist.tarball) var rp = url.parse(ruri) if (tb.hostname === rp.hostname && tb.protocol !== rp.protocol) { tb.protocol = rp.protocol delete tb.href } tb = url.format(tb) // Only add non-shasum'ed packages if --forced. Only ancient things // would lack this for good reasons nowadays. if (!dist.shasum && !npm.config.get("force")) { return cb(new Error("package lacks shasum: " + data._id)) } addRemoteTarball(tb, data, dist.shasum, auth, cb) }) } } } function addNameRange (name, range, data, cb) { range = semver.validRange(range, true) if (range === null) return cb(new Error( "Invalid version range: " + range )) log.silly("addNameRange", {name:name, range:range, hasData:!!data}) if (data) return next() getOnceFromRegistry(name, "addNameRange", setData, cb) function setData (er, d, json, resp) { if (!er) { er = errorResponse(name, resp) } if (er) return cb(er) data = d next() } function next () { log.silly( "addNameRange", "number 2" , {name:name, range:range, hasData:!!data}) engineFilter(data) log.silly("addNameRange", "versions" , [data.name, Object.keys(data.versions || {})]) // if the tagged version satisfies, then use that. var tagged = data["dist-tags"][npm.config.get("tag")] if (tagged && data.versions[tagged] && semver.satisfies(tagged, range, true)) { return addNamed(name, tagged, data.versions[tagged], cb) } // find the max satisfying version. var versions = Object.keys(data.versions || {}) var ms = semver.maxSatisfying(versions, range, true) if (!ms) { return cb(installTargetsError(range, data)) } // if we don't have a registry connection, try to see if // there's a cached copy that will be ok. addNamed(name, ms, data.versions[ms], cb) } } function installTargetsError (requested, data) { var targets = Object.keys(data["dist-tags"]).filter(function (f) { return (data.versions || {}).hasOwnProperty(f) }).concat(Object.keys(data.versions || {})) requested = data.name + (requested ? "@'" + requested + "'" : "") targets = targets.length ? "Valid install targets:\n" + JSON.stringify(targets) + "\n" : "No valid targets found.\n" + "Perhaps not compatible with your version of node?" 
var er = new Error( "No compatible version found: " + requested + "\n" + targets) er.code = "ETARGET" return er } function errorResponse (name, response) { var er if (response.statusCode >= 400) { er = new Error(http.STATUS_CODES[response.statusCode]) er.statusCode = response.statusCode er.code = "E" + er.statusCode er.pkgid = name } return er } ������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/add-remote-git.js�����������������������������000644 �000766 �000024 �00000022400 12455173731 025651� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var mkdir = require("mkdirp") , assert = require("assert") , git = require("../utils/git.js") , fs = require("graceful-fs") , log = require("npmlog") , path = require("path") , url = require("url") , chownr = require("chownr") , crypto = require("crypto") , npm = require("../npm.js") , rm = require("../utils/gently-rm.js") , inflight = require("inflight") , getCacheStat = require("./get-stat.js") , addLocal = require("./add-local.js") , realizePackageSpecifier = require("realize-package-specifier") , normalizeGitUrl = require("normalize-git-url") var remotes = path.resolve(npm.config.get("cache"), "_git-remotes") var templates = path.join(remotes, "_templates") var VALID_VARIABLES = [ "GIT_SSH", "GIT_SSL_NO_VERIFY", "GIT_PROXY_COMMAND", "GIT_SSL_CAINFO" ] // 1. cacheDir = path.join(cache,'_git-remotes',sha1(u)) // 2. checkGitDir(cacheDir) ? 4. : 3. (rm cacheDir if necessary) // 3. git clone --mirror u cacheDir // 4. cd cacheDir && git fetch -a origin // 5. git archive /tmp/random.tgz // 6. addLocalTarball(/tmp/random.tgz) <gitref> --format=tar --prefix=package/ // silent flag is used if this should error quietly module.exports = function addRemoteGit (u, silent, cb) { assert(typeof u === "string", "must have git URL") assert(typeof cb === "function", "must have callback") log.verbose("addRemoteGit", "u=%j silent=%j", u, silent) var normalized = normalizeGitUrl(u) log.silly("addRemoteGit", "normalized", normalized) var v = crypto.createHash("sha1").update(normalized.url).digest("hex").slice(0, 8) v = normalized.url.replace(/[^a-zA-Z0-9]+/g, "-")+"-"+v log.silly("addRemoteGit", "v", v) var p = path.join(remotes, v) cb = inflight(p, cb) if (!cb) return log.verbose("addRemoteGit", p, "already in flight; waiting") log.verbose("addRemoteGit", p, "not in flight; cloning") getGitDir(function (er) { if (er) return cb(er) checkGitDir(p, normalized.url, normalized.branch, u, silent, function (er, data) { if (er) return cb(er, data) addModeRecursive(p, npm.modes.file, function (er) { return cb(er, data) }) }) }) } function getGitDir (cb) { getCacheStat(function (er, st) { if (er) return cb(er) // We don't need global templates when cloning. Use an empty directory for // the templates, creating it (and setting its permissions) if necessary. mkdir(templates, function (er) { if (er) return cb(er) // Ensure that both the template and remotes directories have the correct // permissions. 
fs.chown(templates, st.uid, st.gid, function (er) { if (er) return cb(er) fs.chown(remotes, st.uid, st.gid, function (er) { cb(er, st) }) }) }) }) } function checkGitDir (p, u, co, origUrl, silent, cb) { fs.stat(p, function (er, s) { if (er) return cloneGitRemote(p, u, co, origUrl, silent, cb) if (!s.isDirectory()) return rm(p, function (er) { if (er) return cb(er) cloneGitRemote(p, u, co, origUrl, silent, cb) }) git.whichAndExec( [ "config", "--get", "remote.origin.url" ], { cwd : p, env : gitEnv }, function (er, stdout, stderr) { var stdoutTrimmed = (stdout + "\n" + stderr).trim() if (er || u !== stdout.trim()) { log.warn( "`git config --get remote.origin.url` returned " + "wrong result ("+u+")", stdoutTrimmed ) return rm(p, function (er){ if (er) return cb(er) cloneGitRemote(p, u, co, origUrl, silent, cb) }) } log.verbose("git remote.origin.url", stdoutTrimmed) fetchRemote(p, u, co, origUrl, cb) } ) }) } function cloneGitRemote (p, u, co, origUrl, silent, cb) { mkdir(p, function (er) { if (er) return cb(er) git.whichAndExec( [ "clone", "--template=" + templates, "--mirror", u, p ], { cwd : p, env : gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + "\n" + stderr).trim() if (er) { if (silent) { log.verbose("git clone " + u, stdout) } else { log.error("git clone " + u, stdout) } return cb(er) } log.verbose("git clone " + u, stdout) fetchRemote(p, u, co, origUrl, cb) } ) }) } function fetchRemote (p, u, co, origUrl, cb) { git.whichAndExec( [ "fetch", "-a", "origin" ], { cwd : p, env : gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + "\n" + stderr).trim() if (er) { log.error("git fetch -a origin ("+u+")", stdout) return cb(er) } log.verbose("git fetch -a origin ("+u+")", stdout) if (process.platform === "win32") { log.silly("verifyOwnership", "skipping for windows") resolveHead(p, u, co, origUrl, cb) } else { getGitDir(function (er, cs) { if (er) { log.error("Could not get cache stat") return cb(er) } chownr(p, cs.uid, cs.gid, function (er) { if (er) { log.error("Failed to change folder ownership under npm cache for %s", p) return cb(er) } resolveHead(p, u, co, origUrl, cb) }) }) } } ) } function resolveHead (p, u, co, origUrl, cb) { git.whichAndExec( [ "rev-list", "-n1", co ], { cwd : p, env : gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + "\n" + stderr).trim() if (er) { log.error("Failed resolving git HEAD (" + u + ")", stderr) return cb(er) } log.verbose("git rev-list -n1 " + co, stdout) var parsed = url.parse(origUrl) parsed.hash = stdout var resolved = url.format(parsed) if (parsed.protocol !== "git:") resolved = "git+" + resolved // https://github.com/npm/npm/issues/3224 // node incorrectly sticks a / at the start of the path We know that the // host won't change, so split and detect this var spo = origUrl.split(parsed.host) var spr = resolved.split(parsed.host) if (spo[1].charAt(0) === ":" && spr[1].charAt(0) === "/") { spr[1] = spr[1].slice(1) } resolved = spr.join(parsed.host) log.verbose("resolved git url", resolved) cache(p, u, stdout, resolved, cb) } ) } /** * Make an actual clone from the bare (mirrored) cache. There is no safe way to * do a one-step clone to a treeish that isn't guaranteed to be a branch, so * this has to be two steps. 
*/ function cache (p, u, treeish, resolved, cb) { var tmp = path.join(npm.tmp, Date.now()+"-"+Math.random(), treeish) git.whichAndExec( [ "clone", p, tmp ], { cwd : p, env : gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + "\n" + stderr).trim() if (er) { log.error("Failed to clone "+resolved+" from "+u, stderr) return cb(er) } log.verbose("git clone", "from", p) log.verbose("git clone", stdout) git.whichAndExec( [ "checkout", treeish ], { cwd : tmp, env : gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + "\n" + stderr).trim() if (er) { log.error("Failed to check out "+treeish, stderr) return cb(er) } log.verbose("git checkout", stdout) realizePackageSpecifier(tmp, function (er, spec) { if (er) { log.error("Failed to map", tmp, "to a package specifier") return cb(er) } // https://github.com/npm/npm/issues/6400 // ensure pack logic is applied addLocal(spec, null, function (er, data) { if (data) data._resolved = resolved cb(er, data) }) }) } ) } ) } var gitEnv_ function gitEnv () { // git responds to env vars in some weird ways in post-receive hooks // so don't carry those along. if (gitEnv_) return gitEnv_ gitEnv_ = {} for (var k in process.env) { if (!~VALID_VARIABLES.indexOf(k) && k.match(/^GIT/)) continue gitEnv_[k] = process.env[k] } return gitEnv_ } // similar to chmodr except it add permissions rather than overwriting them // adapted from https://github.com/isaacs/chmodr/blob/master/chmodr.js function addModeRecursive(p, mode, cb) { fs.readdir(p, function (er, children) { // Any error other than ENOTDIR means it's not readable, or doesn't exist. // Give up. if (er && er.code !== "ENOTDIR") return cb(er) if (er || !children.length) return addMode(p, mode, cb) var len = children.length var errState = null children.forEach(function (child) { addModeRecursive(path.resolve(p, child), mode, then) }) function then (er) { if (errState) return undefined if (er) return cb(errState = er) if (--len === 0) return addMode(p, dirMode(mode), cb) } }) } function addMode(p, mode, cb) { fs.stat(p, function (er, stats) { if (er) return cb(er) mode = stats.mode | mode fs.chmod(p, mode, cb) }) } // taken from https://github.com/isaacs/chmodr/blob/master/chmodr.js function dirMode(mode) { if (mode & parseInt("0400", 8)) mode |= parseInt("0100", 8) if (mode & parseInt( "040", 8)) mode |= parseInt( "010", 8) if (mode & parseInt( "04", 8)) mode |= parseInt( "01", 8) return mode } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/add-remote-tarball.js�������������������������000644 �000766 �000024 �00000006716 12455173731 026523� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var mkdir = require("mkdirp") , assert = require("assert") , log = require("npmlog") , path = require("path") , sha = require("sha") , retry = require("retry") , createWriteStream = require("fs-write-stream-atomic") , npm = require("../npm.js") , inflight = require("inflight") , addLocalTarball = require("./add-local-tarball.js") , cacheFile = 
require("npm-cache-filename") module.exports = addRemoteTarball function addRemoteTarball (u, pkgData, shasum, auth, cb_) { assert(typeof u === "string", "must have module URL") assert(typeof cb_ === "function", "must have callback") function cb (er, data) { if (data) { data._from = u data._shasum = data._shasum || shasum data._resolved = u } cb_(er, data) } cb_ = inflight(u, cb_) if (!cb_) return log.verbose("addRemoteTarball", u, "already in flight; waiting") log.verbose("addRemoteTarball", u, "not in flight; adding") // XXX Fetch direct to cache location, store tarballs under // ${cache}/registry.npmjs.org/pkg/-/pkg-1.2.3.tgz var tmp = cacheFile(npm.tmp, u) function next (er, resp, shasum) { if (er) return cb(er) addLocalTarball(tmp, pkgData, shasum, cb) } log.verbose("addRemoteTarball", [u, shasum]) mkdir(path.dirname(tmp), function (er) { if (er) return cb(er) addRemoteTarball_(u, tmp, shasum, auth, next) }) } function addRemoteTarball_ (u, tmp, shasum, auth, cb) { // Tuned to spread 3 attempts over about a minute. // See formula at <https://github.com/tim-kos/node-retry>. var operation = retry.operation({ retries: npm.config.get("fetch-retries") , factor: npm.config.get("fetch-retry-factor") , minTimeout: npm.config.get("fetch-retry-mintimeout") , maxTimeout: npm.config.get("fetch-retry-maxtimeout") }) operation.attempt(function (currentAttempt) { log.info("retry", "fetch attempt " + currentAttempt + " at " + (new Date()).toLocaleTimeString()) fetchAndShaCheck(u, tmp, shasum, auth, function (er, response, shasum) { // Only retry on 408, 5xx or no `response`. var sc = response && response.statusCode var statusRetry = !sc || (sc === 408 || sc >= 500) if (er && statusRetry && operation.retry(er)) { log.warn("retry", "will retry, error on last attempt: " + er) return } cb(er, response, shasum) }) }) } function fetchAndShaCheck (u, tmp, shasum, auth, cb) { npm.registry.fetch(u, { auth : auth }, function (er, response) { if (er) { log.error("fetch failed", u) return cb(er, response) } var tarball = createWriteStream(tmp, { mode : npm.modes.file }) tarball.on("error", function (er) { cb(er) tarball.destroy() }) tarball.on("finish", function () { if (!shasum) { // Well, we weren't given a shasum, so at least sha what we have // in case we want to compare it to something else later return sha.get(tmp, function (er, shasum) { log.silly("fetchAndShaCheck", "shasum", shasum) cb(er, response, shasum) }) } // validate that the url we just downloaded matches the expected shasum. 
log.silly("fetchAndShaCheck", "shasum", shasum) sha.check(tmp, shasum, function (er) { if (er && er.message) { // add original filename for better debuggability er.message = er.message + "\n" + "From: " + u } return cb(er, response, shasum) }) }) response.pipe(tarball) }) } ��������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/cached-package-root.js������������������������000644 �000766 �000024 �00000000603 12455173731 026631� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") var resolve = require("path").resolve var npm = require("../npm.js") module.exports = getCacheRoot function getCacheRoot (data) { assert(data, "must pass package metadata") assert(data.name, "package metadata must include name") assert(data.version, "package metadata must include version") return resolve(npm.cache, data.name, data.version) } �����������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/caching-client.js�����������������������������000644 �000766 �000024 �00000014236 12455173731 025727� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = CachingRegistryClient var path = require("path") , fs = require("graceful-fs") , url = require("url") , assert = require("assert") , inherits = require("util").inherits var RegistryClient = require("npm-registry-client") , npm = require("../npm.js") , log = require("npmlog") , getCacheStat = require("./get-stat.js") , cacheFile = require("npm-cache-filename") , mkdirp = require("mkdirp") , rimraf = require("rimraf") , chownr = require("chownr") , writeFile = require("write-file-atomic") function CachingRegistryClient (config) { RegistryClient.call(this, adaptConfig(config)) this._mapToCache = cacheFile(config.get("cache")) // swizzle in our custom cache invalidation logic this._request = this.request this.request = this._invalidatingRequest } inherits(CachingRegistryClient, RegistryClient) CachingRegistryClient.prototype._invalidatingRequest = function (uri, params, cb) { var client = this this._request.call(this, uri, params, function () { var args = arguments var method = params.method if (method !== "HEAD" && method !== "GET") { var invalidated = client._mapToCache(uri) // invalidate cache // // This is irrelevant for commands that do etag caching, but ls and // view also have a timed cache, so this keeps the user from thinking // that it didn't work when it did. // Note that failure is an acceptable option here, since the only // result will be a stale cache for some helper commands. 
client.log.verbose("request", "invalidating", invalidated, "on", method) return rimraf(invalidated, function () { cb.apply(undefined, args) }) } cb.apply(undefined, args) }) } CachingRegistryClient.prototype.get = function get (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to get") assert(params && typeof params === "object", "must pass params to get") assert(typeof cb === "function", "must pass callback to get") var parsed = url.parse(uri) assert( parsed.protocol === "http:" || parsed.protocol === "https:", "must have a URL that starts with http: or https:" ) var cacheBase = cacheFile(npm.config.get("cache"))(uri) var cachePath = path.join(cacheBase, ".cache.json") // If the GET is part of a write operation (PUT or DELETE), then // skip past the cache entirely, but still save the results. if (uri.match(/\?write=true$/)) return get_.call(this, uri, cachePath, params, cb) var client = this fs.stat(cachePath, function (er, stat) { if (!er) { fs.readFile(cachePath, function (er, data) { try { data = JSON.parse(data) } catch (ex) { data = null } params.stat = stat params.data = data get_.call(client, uri, cachePath, params, cb) }) } else { get_.call(client, uri, cachePath, params, cb) } }) } function get_ (uri, cachePath, params, cb) { var staleOk = params.staleOk === undefined ? false : params.staleOk , timeout = params.timeout === undefined ? -1 : params.timeout , data = params.data , stat = params.stat , etag timeout = Math.min(timeout, npm.config.get("cache-max") || 0) timeout = Math.max(timeout, npm.config.get("cache-min") || -Infinity) if (process.env.COMP_CWORD !== undefined && process.env.COMP_LINE !== undefined && process.env.COMP_POINT !== undefined) { timeout = Math.max(timeout, 60000) } if (data) { if (data._etag) etag = data._etag if (stat && timeout && timeout > 0) { if ((Date.now() - stat.mtime.getTime())/1000 < timeout) { log.verbose("get", uri, "not expired, no request") delete data._etag return cb(null, data, JSON.stringify(data), { statusCode : 304 }) } if (staleOk) { log.verbose("get", uri, "staleOk, background update") delete data._etag process.nextTick( cb.bind(null, null, data, JSON.stringify(data), { statusCode : 304 } ) ) cb = function () {} } } } var options = { etag : etag, follow : params.follow, auth : params.auth } this.request(uri, options, function (er, remoteData, raw, response) { // if we get an error talking to the registry, but we have it // from the cache, then just pretend we got it. if (er && cachePath && data && !data.error) { er = null response = { statusCode: 304 } } if (response) { log.silly("get", "cb", [response.statusCode, response.headers]) if (response.statusCode === 304 && etag) { remoteData = data log.verbose("etag", uri+" from cache") } } data = remoteData if (!data) er = er || new Error("failed to fetch from registry: " + uri) if (er) return cb(er, data, raw, response) saveToCache(cachePath, data, saved) // just give the write the old college try. if it fails, whatever. 
function saved () { delete data._etag cb(er, data, raw, response) } function saveToCache (cachePath, data, saved) { getCacheStat(function (er, st) { mkdirp(path.dirname(cachePath), function (er, made) { if (er) return saved() writeFile(cachePath, JSON.stringify(data), function (er) { if (er || st.uid === null || st.gid === null) return saved() chownr(made || cachePath, st.uid, st.gid, saved) }) }) }) } }) } function adaptConfig (config) { return { proxy : { http : config.get("proxy"), https : config.get("https-proxy"), localAddress : config.get("local-address") }, ssl : { certificate : config.get("cert"), key : config.get("key"), ca : config.get("ca"), strict : config.get("strict-ssl") }, retry : { retries : config.get("fetch-retries"), factor : config.get("fetch-retry-factor"), minTimeout : config.get("fetch-retry-mintimeout"), maxTimeout : config.get("fetch-retry-maxtimeout") }, userAgent : config.get("user-agent"), log : log, defaultTag : config.get("tag"), couchToken : config.get("_token") } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/get-stat.js�����������������������������������000644 �000766 �000024 �00000003500 12455173731 024577� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var mkdir = require("mkdirp") , fs = require("graceful-fs") , log = require("npmlog") , chownr = require("chownr") , npm = require("../npm.js") , inflight = require("inflight") // to maintain the cache dir's permissions consistently. var cacheStat = null module.exports = function getCacheStat (cb) { if (cacheStat) return cb(null, cacheStat) fs.stat(npm.cache, function (er, st) { if (er) return makeCacheDir(cb) if (!st.isDirectory()) { log.error("getCacheStat", "invalid cache dir %j", npm.cache) return cb(er) } return cb(null, cacheStat = st) }) } function makeCacheDir (cb) { cb = inflight("makeCacheDir", cb) if (!cb) return log.verbose("getCacheStat", "cache creation already in flight; waiting") log.verbose("getCacheStat", "cache creation not in flight; initializing") if (!process.getuid) return mkdir(npm.cache, function (er) { return cb(er, {}) }) var uid = +process.getuid() , gid = +process.getgid() if (uid === 0) { if (process.env.SUDO_UID) uid = +process.env.SUDO_UID if (process.env.SUDO_GID) gid = +process.env.SUDO_GID } if (uid !== 0 || !process.env.HOME) { cacheStat = {uid: uid, gid: gid} return mkdir(npm.cache, afterMkdir) } fs.stat(process.env.HOME, function (er, st) { if (er) { log.error("makeCacheDir", "homeless?") return cb(er) } cacheStat = st log.silly("makeCacheDir", "cache dir uid, gid", [st.uid, st.gid]) return mkdir(npm.cache, afterMkdir) }) function afterMkdir (er, made) { if (er || !cacheStat || isNaN(cacheStat.uid) || isNaN(cacheStat.gid)) { return cb(er, cacheStat) } if (!made) return cb(er, cacheStat) // ensure that the ownership is correct. 
chownr(made, cacheStat.uid, cacheStat.gid, function (er) { return cb(er, cacheStat) }) } } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/maybe-github.js�������������������������������000644 �000766 �000024 �00000001470 12455173731 025430� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������var assert = require("assert") , log = require("npmlog") , addRemoteGit = require("./add-remote-git.js") module.exports = function maybeGithub (p, cb) { assert(typeof p === "string", "must pass package name") assert(typeof cb === "function", "must pass callback") var u = "git://github.com/" + p log.info("maybeGithub", "Attempting %s from %s", p, u) return addRemoteGit(u, true, function (er, data) { if (er) { var upriv = "git+ssh://git@github.com:" + p log.info("maybeGithub", "Attempting %s from %s", p, upriv) return addRemoteGit(upriv, false, function (er, data) { if (er) return cb(er) success(upriv, data) }) } success(u, data) }) function success (u, data) { data._from = u data._fromGithub = true return cb(null, data) } } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/lib/cache/update-index.js�������������������������������000644 �000766 �000024 �00000005503 12455173731 025443� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������module.exports = updateIndex var fs = require("graceful-fs") , assert = require("assert") , path = require("path") , mkdir = require("mkdirp") , chownr = require("chownr") , url = require("url") , npm = require("../npm.js") , log = require("npmlog") , cacheFile = require("npm-cache-filename") , getCacheStat = require("./get-stat.js") /* /-/all is special. * It uses timestamp-based caching and partial updates, * because it is a monster. 
*/ function updateIndex (uri, params, cb) { assert(typeof uri === "string", "must pass registry URI to updateIndex") assert(params && typeof params === "object", "must pass params to updateIndex") assert(typeof cb === "function", "must pass callback to updateIndex") var parsed = url.parse(uri) assert( parsed.protocol === "http:" || parsed.protocol === "https:", "must have a URL that starts with http: or https:" ) var cacheBase = cacheFile(npm.config.get("cache"))(uri) var cachePath = path.join(cacheBase, ".cache.json") log.info("updateIndex", cachePath) getCacheStat(function (er, st) { if (er) return cb(er) mkdir(cacheBase, function (er, made) { if (er) return cb(er) fs.readFile(cachePath, function (er, data) { if (er) return updateIndex_(uri, params, 0, {}, cachePath, cb) try { data = JSON.parse(data) } catch (ex) { fs.writeFile(cachePath, "{}", function (er) { if (er) return cb(new Error("Broken cache.")) return updateIndex_(uri, params, 0, {}, cachePath, cb) }) } var t = +data._updated || 0 chownr(made || cachePath, st.uid, st.gid, function (er) { if (er) return cb(er) updateIndex_(uri, params, t, data, cachePath, cb) }) }) }) }) } function updateIndex_ (uri, params, t, data, cachePath, cb) { // use the cache and update in the background if it's not too old if (Date.now() - t < 60000) { cb(null, data) cb = function () {} } var full if (t === 0) { log.warn("", "Building the local index for the first time, please be patient") full = url.resolve(uri, "/-/all") } else { full = url.resolve(uri, "/-/all/since?stale=update_after&startkey=" + t) } npm.registry.request(full, params, function (er, updates, _, res) { if (er) return cb(er, data) var headers = res.headers var updated = updates._updated || Date.parse(headers.date) Object.keys(updates).forEach(function (p) { data[p] = updates[p] }) data._updated = updated getCacheStat(function (er, st) { if (er) return cb(er) fs.writeFile(cachePath, JSON.stringify(data), function (er) { delete data._updated if (er) return cb(er) chownr(cachePath, st.uid, st.gid, function (er) { cb(er, data) }) }) }) }) } ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/�����������������������������������������������000755 �000766 �000024 �00000000000 12456115117 022406� 5����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/docfoot.html���������������������������������������000644 �000766 �000024 �00000004157 12455173731 024200� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������</div> <table border=0 cellspacing=0 cellpadding=0 id=npmlogo> <tr><td style="width:180px;height:10px;background:rgb(237,127,127)" colspan=18> </td></tr> <tr><td rowspan=4 style="width:10px;height:10px;background:rgb(237,127,127)"> </td><td 
style="width:40px;height:10px;background:#fff" colspan=4> </td><td style="width:10px;height:10px;background:rgb(237,127,127)" rowspan=4> </td><td style="width:40px;height:10px;background:#fff" colspan=4> </td><td rowspan=4 style="width:10px;height:10px;background:rgb(237,127,127)"> </td><td colspan=6 style="width:60px;height:10px;background:#fff"> </td><td style="width:10px;height:10px;background:rgb(237,127,127)" rowspan=4> </td></tr> <tr><td colspan=2 style="width:20px;height:30px;background:#fff" rowspan=3> </td><td style="width:10px;height:10px;background:rgb(237,127,127)" rowspan=3> </td><td style="width:10px;height:10px;background:#fff" rowspan=3> </td><td style="width:20px;height:10px;background:#fff" rowspan=4 colspan=2> </td><td style="width:10px;height:20px;background:rgb(237,127,127)" rowspan=2> </td><td style="width:10px;height:10px;background:#fff" rowspan=3> </td><td style="width:20px;height:10px;background:#fff" rowspan=3 colspan=2> </td><td style="width:10px;height:10px;background:rgb(237,127,127)" rowspan=3> </td><td style="width:10px;height:10px;background:#fff" rowspan=3> </td><td style="width:10px;height:10px;background:rgb(237,127,127)" rowspan=3> </td></tr> <tr><td style="width:10px;height:10px;background:#fff" rowspan=2> </td></tr> <tr><td style="width:10px;height:10px;background:#fff"> </td></tr> <tr><td style="width:60px;height:10px;background:rgb(237,127,127)" colspan=6> </td><td colspan=10 style="width:10px;height:10px;background:rgb(237,127,127)"> </td></tr> <tr><td colspan=5 style="width:50px;height:10px;background:#fff"> </td><td style="width:40px;height:10px;background:rgb(237,127,127)" colspan=4> </td><td style="width:90px;height:10px;background:#fff" colspan=9> </td></tr> </table> <p id="footer">@NAME@ — npm@@VERSION@</p> �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/dochead.html���������������������������������������000644 �000766 �000024 �00000000511 12455173731 024120� 0����������������������������������������������������������������������������������������������������ustar�00iojs����������������������������staff���������������������������000000 �000000 ������������������������������������������������������������������������������������������������������������������������������������������������������������������������<!doctype html> <html> <title>@NAME@

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/favicon.ico
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/index.html

npm - Node Package Manager

npm

npm is a package manager for node. You can use it to install and publish your node programs. It manages dependencies and does other cool stuff.

Easy Zero Line Install

Install Node.js
(npm comes with it.)

Because a one-line install is one too many.

Fancy Install

  1. Get the code.
  2. Do what the README says to do.

There's a pretty thorough install script at https://npmjs.org/install.sh

For maximum security, make sure to thoroughly inspect every program that you run on your computer!

Other Cool Stuff

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/000755 000766 000024 00000000000 12456115117 023275 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/static/000755 000766 000024 00000000000 12456115117 023130 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/static/style.css000644 000766 000024 00000013003 12455173731 025004 0ustar00iojsstaff000000 000000 /* reset */ * { margin:0; padding:0; border:none; font-family:inherit; font-size:inherit; font-weight:inherit; } :target::before { content:" >>> "; position:absolute; display:block; opacity:0.5; color:#f00; margin:0 0 0 -2em; } abbr, acronym { border-bottom:1px dotted #aaa; } kbd, code, pre { font-family:monospace; margin:0; font-size:18px; line-height:24px; background:#eee; outline:1px solid #ccc; } kbd code, kbd pre, kbd kbd, pre code, pre pre, pre kbd, code code, code pre, code kbd { outline: none } .dollar::before { content:"$ "; display:inline; } p, ul, ol, dl, pre { margin:30px 0; line-height:30px; } hr { margin:30px auto 29px; width:66%; height:1px; background:#aaa; } pre { display:block; } dd :first-child { margin-top:0; } body { quotes:"“" "”" "‘" "’"; width:666px; margin:30px auto 120px; font-family:Times New Roman, serif; font-size:20px; background:#fff; line-height:30px; color:#111; } blockquote { position:relative; font-size:16px; line-height:30px; font-weight:bold; width:85%; margin:0 auto; } blockquote::before { font-size:90px; display:block; position:absolute; top:20px; right:100%; content:"“"; padding-right:10px; color:#ccc; } .source cite::before { content:"— "; } .source { padding-left:20%; margin-top:30px; } .source cite span { font-style:normal; } blockquote p { margin-bottom:0; } .quote blockquote { font-weight:normal; } h1, h2, h3, h4, h5, h6, dt, #header { font-family:serif; font-size:20px; font-weight:bold; } h2 { background:#eee; } h1, h2 { line-height:40px; } i, em, cite { font-style:italic; } b, strong { font-weight:bold; } i, em, cite, b, strong, small { line-height:28px; } small, .small, .small *, aside { font-style:italic; color:#669; font-size:18px; } small a, .small a { text-decoration:underline; } del { text-decoration:line-through; } ins { text-decoration:underline; } .alignright { display:block; float:right; margin-left:1em; } .alignleft { display:block; float:left; margin-right:1em; } q:before, q q q:before, q q q q q:before, q q q q q q q:before { content:"“"; } q q:before, q q q q:before, q q q q q q:before, q q q q q q q q:before { content:"‘"; } q:after, q q q:after, q q q q q:after, q q q q q q q:after { content:"”"; } q q:after, q q q q:after, q q q q q q:after, q q q q q q q q:after { content:"’"; } a { color:#00f; text-decoration:none; } a:visited { color:#636; } a:hover, a:active { color:#c00!important; text-decoration:underline; } h1 { font-weight:bold; background:#fff; } h1 a, h1 a:visited { font-family:monospace; font-size:60px; color:#c00; display:block; } h1 a:focus, h1 a:hover, h1 a:active { color:#f00!important; text-decoration:none; } .navigation { display:table; width:100%; margin:0 0 30px 0; position:relative; } #nav-above { margin-bottom:0; } .navigation .nav-previous { display:table-cell; text-align:left; width:50%; } /* hang the » and « off into the margins */ .navigation .nav-previous a:before, .navigation .nav-next a:after { content: "«"; display:block; height:30px; margin-bottom:-30px; text-decoration:none; margin-left:-15px; } .navigation .nav-next a:after { content: "»"; text-align:right; margin-left:0; 
margin-top:-30px; margin-right:-15px; } .navigation .nav-next { display:table-cell; text-align:right; width:50%; } .navigation a { display:block; width:100%; height:100%; } input, button, textarea { border:0; line-height:30px; } textarea { height:300px; } input { height:30px; line-height:30px; } input.submit, input#submit, input.button, button, input[type=submit] { cursor:hand; cursor:pointer; outline:1px solid #ccc; } #wrapper { margin-bottom:90px; position:relative; z-index:1; *zoom:1; background:#fff; } #wrapper:after { display:block; content:"."; visibility:hidden; width:0; height:0; clear:both; } .sidebar .xoxo > li { float:left; width:50%; } .sidebar li { list-style:none; } .sidebar #elsewhere { margin-left:-10%; margin-right:-10%; } .sidebar #rss-links, .sidebar #twitter-feeds { float:right; clear:right; width:20%; } .sidebar #comment { clear:both; float:none; width:100%; } .sidebar #search { clear:both; float:none; width:100%; } .sidebar #search h2 { margin-left:40%; } .sidebar #search #s { width:90%; float:left; } .sidebar #search #searchsubmit { width:10%; float:right; } .sidebar * { font-size:15px; line-height:30px; } #footer, #footer * { text-align:center; font-size:16px; color:#ccc; font-style:italic; word-spacing:1em; margin-top:0; } #toc { position:absolute; top:0; right:0; padding:40px 0 40px 20px; margin:0; width:200px; opacity:0.2; z-index:-1; } #toc:hover { opacity:1; background:#fff; z-index:999; } #toc ul { padding:0; margin:0; } #toc, #toc li { list-style-type:none; font-size:15px; line-height:15px; } #toc li { padding:0 0 0 10px; } #toc li a { position:relative; display:block; } table#npmlogo { line-height:10px; width:180px; margin:0 auto; } @media print { a[href] { color:inherit; } a[href]:after { white-space:nowrap; content:" " attr(href); } a[href^=\#], .navigation { display:none; } } iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/static/toc.js000644 000766 000024 00000001247 12455173731 024264 0ustar00iojsstaff000000 000000 ;(function () { var wrapper = document.getElementById("wrapper") var els = Array.prototype.slice.call(wrapper.getElementsByTagName("*"), 0) .filter(function (el) { return el.parentNode === wrapper && el.tagName.match(/H[1-6]/) && el.id }) var l = 2 , toc = document.createElement("ul") toc.innerHTML = els.map(function (el) { var i = el.tagName.charAt(1) , out = "" while (i > l) { out += "
    " l ++ } while (i < l) { out += "
" l -- } out += "
  • " + ( el.innerText || el.text || el.innerHTML) + "" return out }).join("\n") toc.id = "toc" document.body.appendChild(toc) })(); iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/000755 000766 000024 00000000000 12456115117 024042 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/000755 000766 000024 00000000000 12456115117 024613 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/000755 000766 000024 00000000000 12456115117 024611 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/files/000755 000766 000024 00000000000 12456115117 025144 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/index.html000644 000766 000024 00000026005 12455173731 026047 0ustar00iojsstaff000000 000000

    npm-index

    Index of all npm documentation

    README

    a JavaScript package manager

    Command Line Documentation

    Using npm on the command line

    npm(1)

    node package manager

    npm-adduser(1)

    Add a registry user account

    npm-bin(1)

    Display npm bin folder

    npm-bugs(1)

    Bugs for a package in a web browser maybe

    npm-build(1)

    Build a package

    npm-bundle(1)

    REMOVED

    npm-cache(1)

    Manipulates packages cache

    npm-completion(1)

    Tab Completion for npm

    npm-config(1)

    Manage the npm configuration files

    npm-dedupe(1)

    Reduce duplication

    npm-deprecate(1)

    Deprecate a version of a package

    npm-docs(1)

    Docs for a package in a web browser maybe

    npm-edit(1)

    Edit an installed package

    npm-explore(1)

    Browse an installed package

    npm-help-search(1)

    Search npm help documentation

    npm-help(1)

    Get help on npm

    npm-init(1)

    Interactively create a package.json file

    npm-install(1)

    Install a package

npm-link(1)

Symlink a package folder

    npm-ls(1)

    List installed packages

    npm-outdated(1)

    Check for outdated packages

    npm-owner(1)

    Manage package owners

    npm-pack(1)

    Create a tarball from a package

    npm-prefix(1)

    Display prefix

    npm-prune(1)

    Remove extraneous packages

    npm-publish(1)

    Publish a package

    npm-rebuild(1)

    Rebuild a package

    npm-repo(1)

    Open package repository page in the browser

    npm-restart(1)

    Restart a package

    npm-rm(1)

    Remove a package

    npm-root(1)

    Display npm root

    npm-run-script(1)

    Run arbitrary package scripts

    npm-search(1)

    Search for packages

    npm-shrinkwrap(1)

    Lock down dependency versions

    npm-star(1)

    Mark your favorite packages

    npm-stars(1)

    View packages marked as favorites

    npm-start(1)

    Start a package

    npm-stop(1)

    Stop a package

    npm-tag(1)

    Tag a published version

    npm-test(1)

    Test a package

    npm-uninstall(1)

    Remove a package

    npm-unpublish(1)

    Remove a package from the registry

    npm-update(1)

    Update a package

    npm-version(1)

    Bump a package version

    npm-view(1)

    View registry info

    npm-whoami(1)

    Display npm username

    API Documentation

    Using npm in your Node programs

    npm(3)

    node package manager

    npm-bin(3)

    Display npm bin folder

    npm-bugs(3)

    Bugs for a package in a web browser maybe

    npm-cache(3)

    manage the npm cache programmatically

    npm-commands(3)

    npm commands

    npm-config(3)

    Manage the npm configuration files

    npm-deprecate(3)

    Deprecate a version of a package

    npm-docs(3)

    Docs for a package in a web browser maybe

    npm-edit(3)

    Edit an installed package

    npm-explore(3)

    Browse an installed package

    npm-help-search(3)

    Search the help pages

    npm-init(3)

    Interactively create a package.json file

    npm-install(3)

    install a package programmatically

npm-link(3)

Symlink a package folder

    npm-load(3)

    Load config settings

    npm-ls(3)

    List installed packages

    npm-outdated(3)

    Check for outdated packages

    npm-owner(3)

    Manage package owners

    npm-pack(3)

    Create a tarball from a package

    npm-prefix(3)

    Display prefix

    npm-prune(3)

    Remove extraneous packages

    npm-publish(3)

    Publish a package

    npm-rebuild(3)

    Rebuild a package

    npm-repo(3)

    Open package repository page in the browser

    npm-restart(3)

    Restart a package

    npm-root(3)

    Display npm root

    npm-run-script(3)

    Run arbitrary package scripts

    npm-search(3)

    Search for packages

    npm-shrinkwrap(3)

    programmatically generate package shrinkwrap file

    npm-start(3)

    Start a package

    npm-stop(3)

    Stop a package

    npm-tag(3)

    Tag a published version

    npm-test(3)

    Test a package

    npm-uninstall(3)

    uninstall a package programmatically

    npm-unpublish(3)

    Remove a package from the registry

    npm-update(3)

    Update a package

    npm-version(3)

    Bump a package version

    npm-view(3)

    View registry info

    npm-whoami(3)

    Display npm username

    Files

    File system structures npm uses

    npm-folders(5)

    Folder Structures Used by npm

    npmrc(5)

    The npm config files

    package.json(5)

    Specifics of npm's package.json handling

    Misc

    Various other bits and bobs

    npm-coding-style(7)

    npm's "funny" coding style

    npm-config(7)

    More than you probably want to know about npm configuration

    npm-developers(7)

    Developer Guide

    npm-disputes(7)

    Handling Module Name Disputes

    npm-faq(7)

    Frequently Asked Questions

    npm-index(7)

    Index of all npm documentation

    npm-registry(7)

    The JavaScript Package Registry

    npm-scope(7)

    Scoped packages

    npm-scripts(7)

    How npm handles the "scripts" field

    removing-npm(7)

    Cleaning the Slate

    semver(7)

    The semantic versioner for npm

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/000755 000766 000024 00000000000 12456115117 024775 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/README.html000644 000766 000024 00000022147 12455173731 025700 0ustar00iojsstaff000000 000000

    npm

    a JavaScript package manager


    SYNOPSIS

    This is just enough info to get you up and running.

    Much more info available via npm help once it's installed.

    IMPORTANT

    You need node v0.8 or higher to run this program.

    To install an old and unsupported version of npm that works on node 0.3 and prior, clone the git repo and dig through the old tags and branches.

    Super Easy Install

    npm comes with node now.

    Windows Computers

    Get the MSI. npm is in it.

    Apple Macintosh Computers

    Get the pkg. npm is in it.

    Other Sorts of Unices

    Run make install. npm will be installed with node.

    If you want a more fancy pants install (a different version, customized paths, etc.) then read on.

    Fancy Install (Unix)

    There's a pretty robust install script at https://www.npmjs.com/install.sh. You can download that and run it.

    Here's an example using curl:

    curl -L https://npmjs.com/install.sh | sh
    

    Slightly Fancier

    You can set any npm configuration params with that script:

    npm_config_prefix=/some/path sh install.sh
    

    Or, you can run it in uber-debuggery mode:

    npm_debug=1 sh install.sh
    

    Even Fancier

    Get the code with git. Use make to build the docs and do other stuff. If you plan on hacking on npm, make link is your friend.

    If you've got the npm source code, you can also semi-permanently set arbitrary config keys using the ./configure --key=val ..., and then run npm commands by doing node cli.js <cmd> <args>. (This is helpful for testing, or running stuff without actually installing npm itself.)

    Windows Install or Upgrade

    You can download a zip file from https://github.com/npm/npm/releases, and unpack it in the same folder where node.exe lives.

    The latest version in a zip file is 1.4.12. To upgrade to npm 2, follow the Windows upgrade instructions in the npm Troubleshooting Guide:

    https://github.com/npm/npm/wiki/Troubleshooting#upgrading-on-windows

    If that's not fancy enough for you, then you can fetch the code with git, and mess with it directly.

    Installing on Cygwin

    No.

    Uninstalling

    So sad to see you go.

    sudo npm uninstall npm -g
    

    Or, if that fails,

    sudo make uninstall
    

    More Severe Uninstalling

    Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed.

    If you would like to remove all the packages that you have installed, then you can use the npm ls command to find them, and then npm rm to remove them.

    To remove cruft left behind by npm 0.x, you can use the included clean-old.sh script file. You can run it conveniently like this:

    npm explore npm -g -- sh scripts/clean-old.sh
    

    npm uses two configuration files, one for per-user configs, and another for global (every-user) configs. You can view them by doing:

    npm config get userconfig   # defaults to ~/.npmrc
    npm config get globalconfig # defaults to /usr/local/etc/npmrc
    

    Uninstalling npm does not remove configuration files by default. You must remove them yourself manually if you want them gone. Note that this means that future npm installs will not remember the settings that you have chosen.

    Using npm Programmatically

    If you would like to use npm programmatically, you can do that. It's not very well documented, but it is rather simple.

    Most of the time, unless you actually want to do all the things that npm does, you should try using one of npm's dependencies rather than using npm itself, if possible.

    Eventually, npm will be just a thin cli wrapper around the modules that it depends on, but for now, there are some things that you must use npm itself to do.

    var npm = require("npm")
    npm.load(myConfigObject, function (er) {
      if (er) return handleError(er)
      npm.commands.install(["some", "args"], function (er, data) {
        if (er) return commandFailed(er)
        // command succeeded, and data might have some info
      })
      npm.registry.log.on("log", function (message) { .... })
    })
    

    The load function takes an object hash of the command-line configs. The various npm.commands.<cmd> functions take an array of positional argument strings. The last argument to any npm.commands.<cmd> function is a callback. Some commands take other optional arguments. Read the source.

    You cannot set configs individually for any single npm function at this time. Since npm is a singleton, any call to npm.config.set will change the value for all npm commands in that process.
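    As a short sketch of that singleton behaviour, building on the example above (the loglevel config and the ls command are only illustrative choices):

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // npm is a singleton: this changes the loglevel for every command
      // run in this process from now on, not just the next call.
      npm.config.set("loglevel", "verbose")
      npm.commands.ls([], function (er, data) {
        if (er) return console.error(er)
      })
    })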

    See ./bin/npm-cli.js for an example of pulling config values off of the command line arguments using nopt. You may also want to check out npm help config to learn about all the options you can set there.

    More Docs

    Check out the docs, especially the faq.

    You can use the npm help command to read any of them.

    If you're a developer, and you want to use npm to publish your program, you should read the developer guide, npm-developers(7).

    "npm" and "The npm Registry" are owned by npm, Inc. All rights reserved. See the included LICENSE file for more details.

    "Node.js" and "node" are trademarks owned by Joyent, Inc.

    Modules published on the npm registry are not officially endorsed by npm, Inc. or the Node.js project.

    Data published to the npm registry is not part of npm itself, and is the sole property of the publisher. While every effort is made to ensure accountability, there is absolutely no guarantee, warrantee, or assertion expressed or implied as to the quality, fitness for a specific purpose, or lack of malice in any given npm package.

    If you have a complaint about a package in the public npm registry, and cannot resolve it with the package owner, please email support@npmjs.com and explain the situation.

    Any data published to The npm Registry (including user account information) may be removed or modified at the sole discretion of the npm server administrators.

    In plainer English

    npm is the property of npm, Inc.

    If you publish something, it's yours, and you are solely accountable for it.

    If other people publish something, it's theirs.

    Users can publish Bad Stuff. It will be removed promptly if reported. But there is no vetting process for published modules, and you use them at your own risk. Please inspect the source.

    If you publish Bad Stuff, we may delete it from the registry, or even ban your account in extreme cases. So don't do that.

    BUGS

    When you find issues, please report them:

    Be sure to include all of the output from the npm command that didn't work as expected. The npm-debug.log file is also helpful to provide.

    You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-coding-style.html

    npm-coding-style

    npm's "funny" coding style

    DESCRIPTION

    npm's coding style is a bit unconventional. It is not different for difference's sake, but rather a carefully crafted style that is designed to reduce visual clutter and make bugs more apparent.

    If you want to contribute to npm (which is very encouraged), you should make your code conform to npm's style.

    Note: this concerns npm's code, not the specific packages that you can download from the npm registry.

    Line Length

    Keep lines shorter than 80 characters. It's better for lines to be too short than to be too long. Break up long lists, objects, and other statements onto multiple lines.

    Indentation

    Two spaces. Tabs are better, but they look like hell in web browsers (and on GitHub), and node uses 2 spaces, so that's that.

    Configure your editor appropriately.

    Curly braces

    Curly braces belong on the same line as the thing that necessitates them.

    Bad:

    function ()
    {
    

    Good:

    function () {
    

    If a block needs to wrap to the next line, use a curly brace. Don't use it if it doesn't.

    Bad:

    if (foo) { bar() }
    while (foo)
      bar()
    

    Good:

    if (foo) bar()
    while (foo) {
      bar()
    }
    

    Semicolons

    Don't use them except in four situations:

    • for (;;) loops. They're actually required.
    • null loops like: while (something) ; (But you'd better have a good reason for doing that.)
    • case "foo": doSomething(); break
    • In front of a leading ( or [ at the start of the line. This prevents the expression from being interpreted as a function call or property access, respectively.

    Some examples of good semicolon usage:

    ;(x || y).doSomething()
    ;[a, b, c].forEach(doSomething)
    for (var i = 0; i < 10; i ++) {
      switch (state) {
        case "begin": start(); continue
        case "end": finish(); break
        default: throw new Error("unknown state")
      }
      end()
    }
    

    Note that starting lines with - and + also should be prefixed with a semicolon, but this is much less common.

    Comma First

    If there is a list of things separated by commas, and it wraps across multiple lines, put the comma at the start of the next line, directly below the token that starts the list. Put the final token in the list on a line by itself. For example:

    var magicWords = [ "abracadabra"
                     , "gesundheit"
                     , "ventrilo"
                     ]
      , spells = { "fireball" : function () { setOnFire() }
                 , "water" : function () { putOut() }
                 }
      , a = 1
      , b = "abc"
      , etc
      , somethingElse
    

    Whitespace

    Put a single space in front of ( for anything other than a function call. Also use a single space wherever it makes things more readable.

    Don't leave trailing whitespace at the end of lines. Don't indent empty lines. Don't use more spaces than are helpful.

    Functions

    Use named functions. They make stack traces a lot easier to read.
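    For example (the file name and handler name here are made up):

    var fs = require("fs")

    // Bad: a failure in here is reported from an anonymous function
    fs.readFile("package.json", function (er, data) {
      if (er) return console.error(er)
      console.log("read %d bytes", data.length)
    })

    // Good: "onPackageJson" shows up in stack traces and profiles
    fs.readFile("package.json", function onPackageJson (er, data) {
      if (er) return console.error(er)
      console.log("read %d bytes", data.length)
    })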

    Callbacks, Sync/async Style

    Use the asynchronous/non-blocking versions of things as much as possible. It might make more sense for npm to use the synchronous fs APIs, but this way, the fs and http and child process stuff all uses the same callback-passing methodology.

    The callback should always be the last argument in the list. Its first argument is the Error or null.

    Be very careful never to ever ever throw anything. It's worse than useless. Just send the error message back as the first argument to the callback.

    Errors

    Always create a new Error object with your message. Don't just return a string message to the callback. Stack traces are handy.
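    A small sketch combining the two rules above, error-first callbacks and real Error objects (readConfig and onRead are hypothetical names):

    var fs = require("fs")

    function readConfig (file, cb) {
      fs.readFile(file, "utf8", function onRead (er, data) {
        // Bad: cb("could not read config") -- a bare string has no stack trace.
        // Good: wrap the message in a real Error and pass it as the first argument.
        if (er) return cb(new Error("could not read config: " + er.message))
        cb(null, data)
      })
    }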

    Logging

    Logging is done using the npmlog utility.

    Please clean up logs when they are no longer helpful. In particular, logging the same object over and over again is not helpful. Logs should report what's happening so that it's easier to track down where a fault occurs.

    Use appropriate log levels. See npm-config(7) and search for "loglevel".

    Case, naming, etc.

    Use lowerCamelCase for multiword identifiers when they refer to objects, functions, methods, properties, or anything not specified in this section.

    Use UpperCamelCase for class names (things that you'd pass to "new").

    Use all-lower-hyphen-css-case for multiword filenames and config keys.

    Use named functions. They make stack traces easier to follow.

    Use CAPS_SNAKE_CASE for constants, things that should never change and are rarely used.

    Use a single uppercase letter for function names where the function would normally be anonymous, but needs to call itself recursively. It makes it clear that it's a "throwaway" function.
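    Pulling those conventions together in one hypothetical snippet (every name here is made up):

    // file: read-package-tree.js      <- all-lower-hyphen-css-case filename
    var MAX_DEPTH = 32                 // CAPS_SNAKE_CASE constant

    function PackageTree (root) {      // UpperCamelCase class name
      this.root = root
    }

    // lowerCamelCase for methods and other identifiers
    PackageTree.prototype.readTree = function readTree (cb) {
      // a single uppercase letter for a throwaway recursive helper
      ;(function R (dir, depth) {
        if (depth > MAX_DEPTH) return cb(new Error("tree too deep"))
        // walk dir here, recursing with R(childDir, depth + 1)
        cb(null, dir)
      })(this.root, 0)
    }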

    null, undefined, false, 0

    Boolean variables and functions should always be either true or false. Don't set them to 0 unless they're supposed to be numbers.

    When something is intentionally missing or removed, set it to null.

    Don't set things to undefined. Reserve that value to mean "not yet set to anything."

    Boolean objects are verboten.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-config.html

    npm-config

    More than you probably want to know about npm configuration

    DESCRIPTION

    npm gets its configuration values from 6 sources, in this priority:

    Command Line Flags

    Putting --foo bar on the command line sets the foo configuration parameter to "bar". A -- argument tells the cli parser to stop reading flags. A --flag parameter that is at the end of the command will be given the value of true.

    Environment Variables

    Any environment variables that start with npm_config_ will be interpreted as a configuration parameter. For example, putting npm_config_foo=bar in your environment will set the foo configuration parameter to bar. Any environment configurations that are not given a value will be given the value of true. Config values are case-insensitive, so NPM_CONFIG_FOO=bar will work the same.

    npmrc Files

    The four relevant files are:

    • per-project config file (/path/to/my/project/.npmrc)
    • per-user config file (~/.npmrc)
    • global config file ($PREFIX/npmrc)
    • npm builtin config file (/path/to/npm/npmrc)

    See npmrc(5) for more details.

    Default Configs

    A set of configuration parameters that are internal to npm, and are defaults if nothing else is specified.

    Shorthands and Other CLI Niceties

    The following shorthands are parsed on the command-line:

    • -v: --version
    • -h, -?, --help, -H: --usage
    • -s, --silent: --loglevel silent
    • -q, --quiet: --loglevel warn
    • -d: --loglevel info
    • -dd, --verbose: --loglevel verbose
    • -ddd: --loglevel silly
    • -g: --global
    • -C: --prefix
    • -l: --long
    • -m: --message
    • -p, --porcelain: --parseable
    • -reg: --registry
    • -v: --version
    • -f: --force
    • -desc: --description
    • -S: --save
    • -D: --save-dev
    • -O: --save-optional
    • -B: --save-bundle
    • -E: --save-exact
    • -y: --yes
    • -n: --yes false
    • ll and la commands: ls --long

    If the specified configuration param resolves unambiguously to a known configuration parameter, then it is expanded to that configuration parameter. For example:

    npm ls --par
    # same as:
    npm ls --parseable
    

    If multiple single-character shorthands are strung together, and the resulting combination is unambiguously not some other configuration param, then it is expanded to its various component pieces. For example:

    npm ls -gpld
    # same as:
    npm ls --global --parseable --long --loglevel info
    

    Per-Package Config Settings

    When running scripts (see npm-scripts(7)) the package.json "config" keys are overwritten in the environment if there is a config param of <name>[@<version>]:<key>. For example, if the package.json has this:

    { "name" : "foo"
    , "config" : { "port" : "8080" }
    , "scripts" : { "start" : "node server.js" } }
    

    and the server.js is this:

    http.createServer(...).listen(process.env.npm_package_config_port)
    

    then the user could change the behavior by doing:

    npm config set foo:port 80
    

    See package.json(5) for more information.

    Config Settings

    always-auth

    • Default: false
    • Type: Boolean

    Force npm to always require authentication when accessing the registry, even for GET requests.

    bin-links

    • Default: true
    • Type: Boolean

    Tells npm to create symlinks (or .cmd shims on Windows) for package executables.

    Set to false to have it not do this. This can be used to work around the fact that some file systems don't support symlinks, even on ostensibly Unix systems.

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm docs command to open websites.

    ca

    • Default: The npm CA certificate
    • Type: String, Array or null

    The Certificate Authority signing certificate that is trusted for SSL connections to the registry. Values should be in PEM format with newlines replaced by the string "\n". For example:

    ca="-----BEGIN CERTIFICATE-----\nXXXX\nXXXX\n-----END CERTIFICATE-----"
    

    Set to null to only allow "known" registrars, or to a specific CA cert to trust only that specific signing authority.

    Multiple CAs can be trusted by specifying an array of certificates:

    ca[]="..."
    ca[]="..."
    

    See also the strict-ssl config.

    cafile

    • Default: null
    • Type: path

    A path to a file containing one or multiple Certificate Authority signing certificates. Similar to the ca setting, but allows for multiple CAs, as well as for the CA information to be stored in a file on disk.

    cache

    • Default: Windows: %AppData%\npm-cache, Posix: ~/.npm
    • Type: path

    The location of npm's cache directory. See npm-cache(1)

    cache-lock-stale

    • Default: 60000 (1 minute)
    • Type: Number

    The number of ms before cache folder lockfiles are considered stale.

    cache-lock-retries

    • Default: 10
    • Type: Number

    Number of times to retry to acquire a lock on cache folder lockfiles.

    cache-lock-wait

    • Default: 10000 (10 seconds)
    • Type: Number

    Number of ms to wait for cache lock files to expire.

    cache-max

    • Default: Infinity
    • Type: Number

    The maximum time (in seconds) to keep items in the registry cache before re-checking against the registry.

    Note that no purging is done unless the npm cache clean command is explicitly used, and that only GET requests use the cache.

    cache-min

    • Default: 10
    • Type: Number

    The minimum time (in seconds) to keep items in the registry cache before re-checking against the registry.

    Note that no purging is done unless the npm cache clean command is explicitly used, and that only GET requests use the cache.

    cert

    • Default: null
    • Type: String

    A client certificate to pass when accessing the registry.

    color

    • Default: true on Posix, false on Windows
    • Type: Boolean or "always"

    If false, never shows colors. If "always" then always shows colors. If true, then only prints color codes for tty file descriptors.

    depth

    • Default: Infinity
    • Type: Number

    The depth to go when recursing directories for npm ls and npm cache ls.

    description

    • Default: true
    • Type: Boolean

    Show the description in npm search

    dev

    • Default: false
    • Type: Boolean

    Install dev-dependencies along with packages.

    Note that dev-dependencies are also installed if the npat flag is set.

    editor

    • Default: EDITOR environment variable if set, or "vi" on Posix, or "notepad" on Windows.
    • Type: path

    The command to run for npm edit or npm config edit.

    engine-strict

    • Default: false
    • Type: Boolean

    If set to true, then npm will stubbornly refuse to install (or even consider installing) any package that claims to not be compatible with the current Node.js version.

    force

    • Default: false
    • Type: Boolean

    Makes various commands more forceful.

    • lifecycle script failure does not block progress.
    • publishing clobbers previously published versions.
    • skips cache when requesting from the registry.
    • prevents checks against clobbering non-npm files.

    fetch-retries

    • Default: 2
    • Type: Number

    The "retries" config for the retry module to use when fetching packages from the registry.

    fetch-retry-factor

    • Default: 10
    • Type: Number

    The "factor" config for the retry module to use when fetching packages.

    fetch-retry-mintimeout

    • Default: 10000 (10 seconds)
    • Type: Number

    The "minTimeout" config for the retry module to use when fetching packages.

    fetch-retry-maxtimeout

    • Default: 60000 (1 minute)
    • Type: Number

    The "maxTimeout" config for the retry module to use when fetching packages.

    git

    • Default: "git"
    • Type: String

    The command to use for git commands. If git is installed on the computer, but is not in the PATH, then set this to the full path to the git binary.

    git-tag-version

    • Default: true
    • Type: Boolean

    Tag the commit when using the npm version command.

    global

    • Default: false
    • Type: Boolean

    Operates in "global" mode, so that packages are installed into the prefix folder instead of the current working directory. See npm-folders(5) for more on the differences in behavior.

    • packages are installed into the {prefix}/lib/node_modules folder, instead of the current working directory.
    • bin files are linked to {prefix}/bin
    • man pages are linked to {prefix}/share/man

    globalconfig

    • Default: {prefix}/etc/npmrc
    • Type: path

    The config file to read for global config options.

    group

    • Default: GID of the current process
    • Type: String or Number

    The group to use when running package scripts in global mode as the root user.

    heading

    • Default: "npm"
    • Type: String

    The string that starts all the debugging log output.

    https-proxy

    • Default: null
    • Type: url

    A proxy to use for outgoing https requests. If the HTTPS_PROXY or https_proxy or HTTP_PROXY or http_proxy environment variables are set, proxy settings will be honored by the underlying request library.

    ignore-scripts

    • Default: false
    • Type: Boolean

    If true, npm does not run scripts specified in package.json files.

    init-module

    • Default: ~/.npm-init.js
    • Type: path

    A module that will be loaded by the npm init command. See the documentation for the init-package-json module for more information, or npm-init(1).

    init-author-name

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's name.

    init-author-email

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's email.

    init-author-url

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's homepage.

    init-license

    • Default: "ISC"
    • Type: String

    The value npm init should use by default for the package license.

    init-version

    • Default: "0.0.0"
    • Type: semver

    The value that npm init should use by default for the package version number, if not already set in package.json.

    json

    • Default: false
    • Type: Boolean

    Whether or not to output JSON data, rather than the normal output.

    This feature is currently experimental, and the output data structures for many commands are either not implemented in JSON yet or are subject to change. Only the output from npm ls --json is currently valid.

    key

    • Default: null
    • Type: String

    A client key to pass when accessing the registry.

    link

    • Default: false
    • Type: Boolean

    If true, then local installs will link if there is a suitable globally installed package.

    Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions is met:

    • The package is not already installed globally, or
    • the globally installed version is identical to the version that is being installed locally.

    local-address

    • Default: undefined
    • Type: IP Address

    The IP address of the local interface to use when making connections to the npm registry. Must be IPv4 in versions of Node prior to 0.12.

    loglevel

    • Default: "warn"
    • Type: String
    • Values: "silent", "error", "warn", "http", "info", "verbose", "silly"

    What level of logs to report. On failure, all logs are written to npm-debug.log in the current working directory.

    Any logs of a higher level than the setting are shown. The default is "warn", which shows warn and error output.

    logstream

    • Default: process.stderr
    • Type: Stream

    This is the stream that is passed to the npmlog module at run time.

    It cannot be set from the command line, but if you are using npm programmatically, you may wish to send logs to somewhere other than stderr.

    If the color config is set to true, then this stream will receive colored output if it is a TTY.
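    For example, a programmatic caller might redirect npm's log output to a file, roughly like this (the file path is arbitrary, and this is only a sketch of the pattern described above):

    var fs = require("fs")
    var npm = require("npm")

    // Configs passed to npm.load() act like command-line configs, so
    // logstream can be supplied here even though it has no CLI flag.
    npm.load({ logstream: fs.createWriteStream("npm-output.log") }, function (er) {
      if (er) throw er
      // From here on, npmlog output goes to npm-output.log instead of stderr.
    })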

    long

    • Default: false
    • Type: Boolean

    Show extended information in npm ls and npm search.

    message

    • Default: "%s"
    • Type: String

    The commit message used by npm version when creating a version commit.

    Any "%s" in the message will be replaced with the version number.

    node-version

    • Default: process.version
    • Type: semver or false

    The node version to use when checking a package's engines map.

    npat

    • Default: false
    • Type: Boolean

    Run tests on installation.

    onload-script

    • Default: false
    • Type: path

    A node module to require() when npm loads. Useful for programmatic usage.

    optional

    • Default: true
    • Type: Boolean

    Attempt to install packages in the optionalDependencies object. Note that if these packages fail to install, the overall installation process is not aborted.

    parseable

    • Default: false
    • Type: Boolean

    Output parseable results from commands that write to standard output.

    prefix

    The location to install global items. If set on the command line, then it forces non-global commands to run in the specified folder.

    production

    • Default: false
    • Type: Boolean

    Set to true to run in "production" mode.

    1. devDependencies are not installed at the topmost level when running a local npm install without any arguments.
    2. NODE_ENV is set to "production" for lifecycle scripts.

    proprietary-attribs

    • Default: true
    • Type: Boolean

    Whether or not to include proprietary extended attributes in the tarballs created by npm.

    Unless you are expecting to unpack package tarballs with something other than npm -- particularly a very outdated tar implementation -- leave this as true.

    proxy

    • Default: null
    • Type: url

    A proxy to use for outgoing http requests. If the HTTP_PROXY or http_proxy environment variables are set, proxy settings will be honored by the underlying request library.

    rebuild-bundle

    • Default: true
    • Type: Boolean

    Rebuild bundled dependencies after installation.

    registry

    The base URL of the npm package registry.

    rollback

    • Default: true
    • Type: Boolean

    Remove failed installs.

    save

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as dependencies.

    When used with the npm rm command, it removes it from the dependencies object.

    Only works if there is already a package.json file present.

    save-bundle

    • Default: false
    • Type: Boolean

    If a package would be saved at install time by the use of --save, --save-dev, or --save-optional, then also put it in the bundleDependencies list.

    When used with the npm rm command, it removes it from the bundledDependencies list.

    save-dev

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as devDependencies.

    When used with the npm rm command, it removes it from the devDependencies object.

    Only works if there is already a package.json file present.

    save-exact

    • Default: false
    • Type: Boolean

    Dependencies saved to package.json using --save, --save-dev or --save-optional will be configured with an exact version rather than using npm's default semver range operator.

    save-optional

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as optionalDependencies.

    When used with the npm rm command, it removes the package from the optionalDependencies object.

    Only works if there is already a package.json file present.

    save-prefix

    • Default: '^'
    • Type: String

    Configure how versions of packages installed to a package.json file via --save or --save-dev get prefixed.

    For example, if a package has version 1.2.3, by default its version is set to ^1.2.3, which allows minor upgrades for that package, but after npm config set save-prefix='~' it would be set to ~1.2.3, which only allows patch upgrades.

    scope

    • Default: ""
    • Type: String

    Associate an operation with a scope for a scoped registry. Useful when logging in to a private registry for the first time: npm login --scope=@organization --registry=registry.organization.com, which will cause @organization to be mapped to the registry for future installation of packages specified according to the pattern @organization/package.

    searchopts

    • Default: ""
    • Type: String

    Space-separated options that are always passed to search.

    searchexclude

    • Default: ""
    • Type: String

    Space-separated options that limit the results from search.

    searchsort

    • Default: "name"
    • Type: String
    • Values: "name", "-name", "date", "-date", "description", "-description", "keywords", "-keywords"

    Indication of which field to sort search results by. Prefix with a - character to indicate reverse sort.

    shell

    • Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows
    • Type: path

    The shell to run for the npm explore command.

    shrinkwrap

    • Default: true
    • Type: Boolean

    If set to false, then ignore npm-shrinkwrap.json files when installing.

    sign-git-tag

    • Default: false
    • Type: Boolean

    If set to true, then the npm version command will tag the version using -s to add a signature.

    Note that git requires you to have set up GPG keys in your git configs for this to work properly.

    spin

    • Default: true
    • Type: Boolean or "always"

    When set to true, npm will display an ascii spinner while it is doing things, if process.stderr is a TTY.

    Set to false to suppress the spinner, or set to always to output the spinner even for non-TTY outputs.

    strict-ssl

    • Default: true
    • Type: Boolean

    Whether or not to do SSL key validation when making requests to the registry via https.

    See also the ca config.

    tag

    • Default: latest
    • Type: String

    If you ask npm to install a package and don't tell it a specific version, then it will install the specified tag.

    Also the tag that is added to the package@version specified by the npm tag command, if no explicit tag is given.

    tmp

    • Default: TMPDIR environment variable, or "/tmp"
    • Type: path

    Where to store temporary files and folders. All temp files are deleted on success, but left behind on failure for forensic purposes.

    unicode

    • Default: true
    • Type: Boolean

    When set to true, npm uses unicode characters in the tree output. When false, it uses ascii characters to draw trees.

    unsafe-perm

    • Default: false if running as root, true otherwise
    • Type: Boolean

    Set to true to suppress the UID/GID switching when running package scripts. If set explicitly to false, then installing as a non-root user will fail.

    usage

    • Default: false
    • Type: Boolean

    Set to show short usage output (like the -H output) instead of complete help when doing npm-help(1).

    user

    • Default: "nobody"
    • Type: String or Number

    The UID to set to when running package scripts as root.

    userconfig

    • Default: ~/.npmrc
    • Type: path

    The location of user-level configuration settings.

    umask

    • Default: 022
    • Type: Octal numeric string

    The "umask" value to use when setting the file creation mode on files and folders.

    Folders and executables are given a mode which is 0777 masked against this value. Other files are given a mode which is 0666 masked against this value. Thus, the defaults are 0755 and 0644 respectively.
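    A quick sanity check of that arithmetic in plain JavaScript (nothing npm-specific about it):

    var umask = parseInt("022", 8)
    console.log((parseInt("777", 8) & ~umask).toString(8))  // "755" -> folders and executables
    console.log((parseInt("666", 8) & ~umask).toString(8))  // "644" -> other files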

    user-agent

    • Default: node/{process.version} {process.platform} {process.arch}
    • Type: String

    Sets the User-Agent header on outgoing requests.

    version

    • Default: false
    • Type: boolean

    If true, output the npm version and exit successfully.

    Only relevant when specified explicitly on the command line.

    versions

    • Default: false
    • Type: boolean

    If true, output the npm version as well as node's process.versions map, and exit successfully.

    Only relevant when specified explicitly on the command line.

    viewer

    • Default: "man" on Posix, "browser" on Windows
    • Type: path

    The program to use to view help content.

    Set to "browser" to view html help content in the default web browser.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-developers.html

    npm-developers

    Developer Guide

    DESCRIPTION

    So, you've decided to use npm to develop (and maybe publish/deploy) your project.

    Fantastic!

    There are a few things that you need to do above the simple steps that your users will do to install your program.

    About These Documents

    These are man pages. If you install npm, you should be able to then do man npm-thing to get the documentation on a particular topic, or npm help thing to see the same information.

    What is a package

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a git url that, when cloned, results in (a).

    Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    The package.json File

    You need to have a package.json file in the root of your project to do much of anything with npm. That is basically the whole interface.

    See package.json(5) for details about what goes in that file. At the very least, you need:

    • name: This should be a string that identifies your project. Please do not use the name to specify that it runs on node, or is in JavaScript. You can use the "engines" field to explicitly state the versions of node (or whatever else) that your program requires, and it's pretty well assumed that it's javascript.

      It does not necessarily need to match your github repository name.

      So, node-foo and bar-js are bad names. foo or bar are better.

    • version: A semver-compatible version.

    • engines: Specify the versions of node (or whatever else) that your program runs on. The node API changes a lot, and there may be bugs or new functionality that you depend on. Be explicit.

    • author: Take some credit.

    • scripts: If you have a special compilation or installation script, then you should put it in the scripts object. You should definitely have at least a basic smoke-test command as the "scripts.test" field. See npm-scripts(7).

    • main: If you have a single module that serves as the entry point to your program (like what the "foo" package gives you at require("foo")), then you need to specify that in the "main" field.

    • directories: This is an object mapping names to folders. The best ones to include are "lib" and "doc", but if you use "man" to specify a folder full of man pages, they'll get installed just like these ones.

    You can use npm init in the root of your package in order to get you started with a pretty basic package.json file. See npm-init(1) for more info.
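    Putting the minimum fields above together, a bare-bones package.json might look something like this (every name and value is only an example):

    { "name" : "foo-parser"
    , "version" : "1.0.0"
    , "author" : "Jane Doe <jane@example.com>"
    , "main" : "lib/foo-parser.js"
    , "engines" : { "node" : ">=0.10" }
    , "scripts" : { "test" : "node test/basic.js" }
    }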

    Keeping files out of your package

    Use a .npmignore file to keep stuff out of your package. If there's no .npmignore file, but there is a .gitignore file, then npm will ignore the stuff matched by the .gitignore file. If you want to include something that is excluded by your .gitignore file, you can create an empty .npmignore file to override it.

    .npmignore files follow the same pattern rules as .gitignore files:

    • Blank lines or lines starting with # are ignored.
    • Standard glob patterns work.
    • You can end patterns with a forward slash / to specify a directory.
    • You can negate a pattern by starting it with an exclamation point !.

    By default, the following paths and files are ignored, so there's no need to add them to .npmignore explicitly:

    • .*.swp
    • ._*
    • .DS_Store
    • .git
    • .hg
    • .lock-wscript
    • .svn
    • .wafpickle-*
    • CVS
    • npm-debug.log

    Additionally, everything in node_modules is ignored, except for bundled dependencies. npm automatically handles this for you, so don't bother adding node_modules to .npmignore.

    The following paths and files are never ignored, so adding them to .npmignore is pointless:

    Link packages

    npm link is designed to install a development package and see the changes in real time without having to keep re-installing it. (You do need to either re-link or npm rebuild -g to update compiled packages, of course.)

    More info at npm-link(1).

    Before Publishing: Make Sure Your Package Installs and Works

    This is important.

    If you can not install it locally, you'll have problems trying to publish it. Or, worse yet, you'll be able to publish it, but you'll be publishing a broken or pointless package. So don't do that.

    In the root of your package, do this:

    npm install . -g
    

    That'll show you that it's working. If you'd rather just create a symlink package that points to your working directory, then do this:

    npm link
    

    Use npm ls -g to see if it's there.

    To test a local install, go into some other folder, and then do:

    cd ../some-other-folder
    npm install ../my-package
    

    to install it locally into the node_modules folder in that other place.

    Then go into the node-repl, and try using require("my-thing") to bring in your module's main module.

    Create a User Account

    Create a user with the adduser command. It works like this:

    npm adduser
    

    and then follow the prompts.

    This is documented better in npm-adduser(1).

    Publish your package

    This part's easy. In the root of your folder, do this:

    npm publish
    

    You can give publish a url to a tarball, or a filename of a tarball, or a path to a folder.

    Note that pretty much everything in that folder will be exposed by default. So, if you have secret stuff in there, use a .npmignore file to list out the globs to ignore, or publish from a fresh checkout.

    Brag about it

    Send emails, write blogs, blab in IRC.

    Tell the world how easy it is to install your program!

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-disputes.html

    npm-disputes

    Handling Module Name Disputes

    SYNOPSIS

    1. Get the author email with npm owner ls <pkgname>
    2. Email the author, CC support@npmjs.com
    3. After a few weeks, if there's no resolution, we'll sort it out.

    Don't squat on package names. Publish code or move out of the way.

    DESCRIPTION

    There sometimes arise cases where a user publishes a module, and then later, some other user wants to use that name. Here are some common ways that happens (each of these is based on actual events.)

    1. Joe writes a JavaScript module foo, which is not node-specific. Joe doesn't use node at all. Bob wants to use foo in node, so he wraps it in an npm module. Some time later, Joe starts using node, and wants to take over management of his program.
    2. Bob writes an npm module foo, and publishes it. Perhaps much later, Joe finds a bug in foo, and fixes it. He sends a pull request to Bob, but Bob doesn't have the time to deal with it, because he has a new job and a new baby and is focused on his new erlang project, and kind of not involved with node any more. Joe would like to publish a new foo, but can't, because the name is taken.
    3. Bob writes a 10-line flow-control library, and calls it foo, and publishes it to the npm registry. Being a simple little thing, it never really has to be updated. Joe works for Foo Inc, the makers of the critically acclaimed and widely-marketed foo JavaScript toolkit framework. They publish it to npm as foojs, but people are routinely confused when npm install foo is some different thing.
    4. Bob writes a parser for the widely-known foo file format, because he needs it for work. Then, he gets a new job, and never updates the prototype. Later on, Joe writes a much more complete foo parser, but can't publish, because Bob's foo is in the way.

    The validity of Joe's claim in each situation can be debated. However, Joe's appropriate course of action in each case is the same.

    1. npm owner ls foo. This will tell Joe the email address of the owner (Bob).
    2. Joe emails Bob, explaining the situation as respectfully as possible, and what he would like to do with the module name. He adds the npm support staff support@npmjs.com to the CC list of the email. Mention in the email that Bob can run npm owner add joe foo to add Joe as an owner of the foo package.
    3. After a reasonable amount of time, if Bob has not responded, or if Bob and Joe can't come to any sort of resolution, email support support@npmjs.com and we'll sort it out. ("Reasonable" is usually at least 4 weeks, but extra time is allowed around common holidays.)

    REASONING

    In almost every case so far, the parties involved have been able to reach an amicable resolution without any major intervention. Most people really do want to be reasonable, and are probably not even aware that they're in your way.

    Module ecosystems are most vibrant and powerful when they are as self-directed as possible. If an admin one day deletes something you had worked on, then that is going to make most people quite upset, regardless of the justification. When humans solve their problems by talking to other humans with respect, everyone has the chance to end up feeling good about the interaction.

    EXCEPTIONS

    Some things are not allowed, and will be removed without discussion if they are brought to the attention of the npm registry admins, including but not limited to:

    1. Malware (that is, a package designed to exploit or harm the machine on which it is installed).
    2. Violations of copyright or licenses (for example, cloning an MIT-licensed program, and then removing or changing the copyright and license statement).
    3. Illegal content.
    4. "Squatting" on a package name that you plan to use, but aren't actually using. Sorry, I don't care how great the name is, or how perfect a fit it is for the thing that someday might happen. If someone wants to use it today, and you're just taking up space with an empty tarball, you're going to be evicted.
    5. Putting empty packages in the registry. Packages must have SOME functionality. It can be silly, but it can't be nothing. (See also: squatting.)
    6. Doing weird things with the registry, like using it as your own personal application database or otherwise putting non-packagey things into it.

    If you see bad behavior like this, please report it right away.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-faq.html

    npm-faq

    Frequently Asked Questions

    Where can I find these docs in HTML?

    https://docs.npmjs.com/, or run:

    npm config set viewer browser
    

    to open these documents in your default web browser rather than man.

    It didn't work.

    That's not really a question.

    Why didn't it work?

    I don't know yet.

    Read the error output, and if you can't figure out what it means, do what it says and post a bug with all the information it asks for.

    Where does npm put stuff?

    See npm-folders(5)

    tl;dr:

    • Use the npm root command to see where modules go, and the npm bin command to see where executables go
    • Global installs are different from local installs. If you install something with the -g flag, then its executables go in npm bin -g and its modules go in npm root -g.

    How do I install something on my computer in a central location?

    Install it globally by tacking -g or --global to the command. (This is especially important for command line utilities that need to add their bins to the global system PATH.)

    I installed something globally, but I can't require() it

    Install it locally.

    The global install location is a place for command-line utilities to put their bins in the system PATH. It's not for use with require().

    If you require() a module in your code, then that means it's a dependency, and a part of your program. You need to install it locally in your program.

    Why can't npm just put everything in one place, like other package managers?

    Not every change is an improvement, but every improvement is a change. This would be like asking git to do network IO for every commit. It's not going to happen, because it's a terrible idea that causes more problems than it solves.

    It is much harder to avoid dependency conflicts without nesting dependencies. This is fundamental to the way that npm works, and has proven to be an extremely successful approach. See npm-folders(5) for more details.

    If you want a package to be installed in one place, and have all your programs reference the same copy of it, then use the npm link command. That's what it's for. Install it globally, then link it into each program that uses it.

    Whatever, I really want the old style 'everything global' style.

    Write your own package manager. You could probably even wrap up npm in a shell script if you really wanted to.

    npm will not help you do something that is known to be a bad idea.

    Should I check my node_modules folder into git?

    Usually, no. Allow npm to resolve dependencies for your packages.

    For packages you deploy, such as websites and apps, you should use npm shrinkwrap to lock down your full dependency tree:

    https://docs.npmjs.com/cli/shrinkwrap

    If you are paranoid about depending on the npm ecosystem, you should run a private npm mirror or a private cache.

    If you want 100% confidence in being able to reproduce the specific bytes included in a deployment, you should use an additional mechanism that can verify contents rather than versions. For example, Amazon machine images, DigitalOcean snapshots, Heroku slugs, or simple tarballs.

    Is it 'npm' or 'NPM' or 'Npm'?

    npm should never be capitalized unless it is being displayed in a location that is customarily all-caps (such as the title of man pages.)

    If 'npm' is an acronym, why is it never capitalized?

    Contrary to the belief of many, "npm" is not in fact an abbreviation for "Node Package Manager". It is a recursive bacronymic abbreviation for "npm is not an acronym". (If it was "ninaa", then it would be an acronym, and thus incorrectly named.)

    "NPM", however, is an acronym (more precisely, a capitonym) for the National Association of Pastoral Musicians. You can learn more about them at http://npm.org/.

    In software, "NPM" is a Non-Parametric Mapping utility written by Chris Rorden. You can analyze pictures of brains with it. Learn more about the (capitalized) NPM program at http://www.cabiatl.com/mricro/npm/.

    The first seed that eventually grew into this flower was a bash utility named "pm", which was a shortened descendant of "pkgmakeinst", a bash function that was used to install various different things on different platforms, most often using Yahoo's yinst. If npm was ever an acronym for anything, it was node pm or maybe new pm.

    So, in all seriousness, the "npm" project is named after its command-line utility, which was organically selected to be easily typed by a right-handed programmer using a US QWERTY keyboard layout, ending with the right-ring-finger in a position to type the - key for flags and other command-line arguments. That command-line utility is always lower-case, though it starts most sentences it is a part of.

    How do I list installed packages?

    npm ls

    How do I search for packages?

    npm search

    Arguments are greps. npm search jsdom shows jsdom packages.

    How do I update npm?

    npm install npm -g
    

    You can also update all outdated local packages by doing npm update without any arguments, or global packages by doing npm update -g.

    Occasionally, the version of npm will progress such that the current version cannot be properly installed with the version that you have installed already. (Consider, if there is ever a bug in the update command.)

    In those cases, you can do this:

    curl https://www.npmjs.com/install.sh | sh
    

    What is a package?

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a git url that, when cloned, results in (a).

    Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    What is a module?

    A module is anything that can be loaded with require() in a Node.js program. The following things are all examples of things that can be loaded as modules:

    • A folder with a package.json file containing a main field.
    • A folder with an index.js file in it.
    • A JavaScript file.

    Most npm packages are modules, because they are libraries that you load with require. However, there's no requirement that an npm package be a module! Some only contain an executable command-line interface, and don't provide a main field for use in Node programs.

    Almost all npm packages (at least, those that are Node programs) contain many modules within them (because every file they load with require() is a module).

    In the context of a Node program, the module is also the thing that was loaded from a file. For example, in the following program:

    var req = require('request')
    

    we might say that "The variable req refers to the request module".

    So, why is it the "node_modules" folder, but "package.json" file? Why not node_packages or module.json?

    The package.json file defines the package. (See "What is a package?" above.)

    The node_modules folder is the place Node.js looks for modules. (See "What is a module?" above.)

    For example, if you create a file at node_modules/foo.js and then have a program that does var f = require('foo.js'), it will load the module. However, foo.js is not a "package" in this case, because it does not have a package.json.

    Alternatively, if you create a package which does not have an index.js or a "main" field in the package.json file, then it is not a module. Even if it's installed in node_modules, it can't be an argument to require().

    "node_modules" is the name of my deity's arch-rival, and a Forbidden Word in my religion. Can I configure npm to use a different folder?

    No. This will never happen. This question comes up sometimes, because it seems silly from the outside that npm couldn't just be configured to put stuff somewhere else, and then npm could load them from there. It's an arbitrary spelling choice, right? What's the big deal?

    At the time of this writing, the string 'node_modules' appears 151 times in 53 separate files in npm and node core (excluding tests and documentation).

    Some of these references are in node's built-in module loader. Since npm is not involved at all at run-time, node itself would have to be configured to know where you've decided to stick stuff. Complexity hurdle #1. Since the Node module system is locked, this cannot be changed, and is enough to kill this request. But I'll continue, in deference to your deity's delicate feelings regarding spelling.

    Many of the others are in dependencies that npm uses, which are not necessarily tightly coupled to npm (in the sense that they do not read npm's configuration files, etc.) Each of these would have to be configured to take the name of the node_modules folder as a parameter. Complexity hurdle #2.

    Furthermore, npm has the ability to "bundle" dependencies by adding the dep names to the "bundledDependencies" list in package.json, which causes the folder to be included in the package tarball. What if the author of a module bundles its dependencies, and they use a different spelling for node_modules? npm would have to rename the folder at publish time, and then be smart enough to unpack it using your locally configured name. Complexity hurdle #3.

    Furthermore, what happens when you change this name? Fine, it's easy enough the first time, just rename the node_modules folders to ./blergyblerp/ or whatever name you choose. But what about when you change it again? npm doesn't currently track any state about past configuration settings, so this would be rather difficult to do properly. It would have to track every previous value for this config, and always accept any of them, or else yesterday's install may be broken tomorrow. Complexity hurdle #4.

    Never going to happen. The folder is named node_modules. It is written indelibly in the Node Way, handed down from the ancient times of Node 0.3.

    How do I install node with npm?

    You don't. Try one of these node version managers:

    Unix:

    Windows:

    How can I use npm for development?

    See npm-developers(7) and package.json(5).

    You'll most likely want to npm link your development folder. That's awesomely handy.

    To set up your own private registry, check out npm-registry(7).

    Can I list a url as a dependency?

    Yes. It should be a url to a gzipped tarball containing a single folder that has a package.json in its root, or a git url. (See "what is a package?" above.)

    See npm-link(1)

    The package registry website. What is that exactly?

    See npm-registry(7).

    I forgot my password, and can't publish. How do I reset it?

    Go to https://npmjs.com/forgot.

    I get ECONNREFUSED a lot. What's up?

    Either the registry is down, or node's DNS isn't able to reach out.

    To check if the registry is down, open up https://registry.npmjs.org/ in a web browser. This will also tell you if you are just unable to access the internet for some reason.

    If the registry IS down, let us know by emailing support@npmjs.com or posting an issue at https://github.com/npm/npm/issues. If it's down for the world (and not just on your local network) then we're probably already being pinged about it.

    You can also often get a faster response by visiting the #npm channel on Freenode IRC.

    Why no namespaces?

    npm has only one global namespace. If you want to namespace your own packages, you may simply use the - character to separate the names. npm is a mostly anarchic system. There is not sufficient need to impose namespace rules on everyone.

    As of 2.0, npm supports scoped packages, which allow you to publish a group of related modules without worrying about name collisions.

    Every npm user owns the scope associated with their username. For example, the user named npm owns the scope @npm. Scoped packages are published inside a scope by naming them as if they were files under the scope directory, e.g., by setting name in package.json to @npm/npm.

    Scoped packages can coexist with public npm packages in a private npm registry. At present (2014-11-04) scoped packages may NOT be published to the public npm registry.

    Unscoped packages can only depend on other unscoped packages. Scoped packages can depend on packages from their own scope, a different scope, or the public registry (unscoped).

    For the current documentation of scoped packages, see https://docs.npmjs.com/misc/scope

    References:

    1. For the reasoning behind the "one global namespace", please see this discussion: https://github.com/npm/npm/issues/798 (TL;DR: It doesn't actually make things better, and can make them worse.)

    2. For the pre-implementation discussion of the scoped package feature, see this discussion: https://github.com/npm/npm/issues/5239

    Who does npm?

    npm was originally written by Isaac Z. Schlueter, and many others have contributed to it, some of them quite substantially.

    The npm open source project, The npm Registry, and the community website are maintained and operated by the good folks at npm, Inc.

    I have a question or request not addressed here. Where should I put it?

    Post an issue on the github project: https://github.com/npm/npm/issues

    Why does npm hate me?

    npm is not capable of hatred. It loves everyone, especially you.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-index.html

    npm-index

    Index of all npm documentation

    README

    a JavaScript package manager

    Command Line Documentation

    Using npm on the command line

    npm(1)

    node package manager

    npm-adduser(1)

    Add a registry user account

    npm-bin(1)

    Display npm bin folder

    npm-bugs(1)

    Bugs for a package in a web browser maybe

    npm-build(1)

    Build a package

    npm-bundle(1)

    REMOVED

    npm-cache(1)

    Manipulates packages cache

    npm-completion(1)

    Tab Completion for npm

    npm-config(1)

    Manage the npm configuration files

    npm-dedupe(1)

    Reduce duplication

    npm-deprecate(1)

    Deprecate a version of a package

    npm-docs(1)

    Docs for a package in a web browser maybe

    npm-edit(1)

    Edit an installed package

    npm-explore(1)

    Browse an installed package

    npm-help-search(1)

    Search npm help documentation

    npm-help(1)

    Get help on npm

    npm-init(1)

    Interactively create a package.json file

    npm-install(1)

    Install a package

    npm-link(1)

    Symlink a package folder

    npm-ls(1)

    List installed packages

    npm-outdated(1)

    Check for outdated packages

    npm-owner(1)

    Manage package owners

    npm-pack(1)

    Create a tarball from a package

    npm-prefix(1)

    Display prefix

    npm-prune(1)

    Remove extraneous packages

    npm-publish(1)

    Publish a package

    npm-rebuild(1)

    Rebuild a package

    npm-repo(1)

    Open package repository page in the browser

    npm-restart(1)

    Restart a package

    npm-rm(1)

    Remove a package

    npm-root(1)

    Display npm root

    npm-run-script(1)

    Run arbitrary package scripts

    npm-search(1)

    Search for packages

    npm-shrinkwrap(1)

    Lock down dependency versions

    npm-star(1)

    Mark your favorite packages

    npm-stars(1)

    View packages marked as favorites

    npm-start(1)

    Start a package

    npm-stop(1)

    Stop a package

    npm-tag(1)

    Tag a published version

    npm-test(1)

    Test a package

    npm-uninstall(1)

    Remove a package

    npm-unpublish(1)

    Remove a package from the registry

    npm-update(1)

    Update a package

    npm-version(1)

    Bump a package version

    npm-view(1)

    View registry info

    npm-whoami(1)

    Display npm username

    API Documentation

    Using npm in your Node programs

    npm(3)

    node package manager

    npm-bin(3)

    Display npm bin folder

    npm-bugs(3)

    Bugs for a package in a web browser maybe

    npm-cache(3)

    manage the npm cache programmatically

    npm-commands(3)

    npm commands

    npm-config(3)

    Manage the npm configuration files

    npm-deprecate(3)

    Deprecate a version of a package

    npm-docs(3)

    Docs for a package in a web browser maybe

    npm-edit(3)

    Edit an installed package

    npm-explore(3)

    Browse an installed package

    npm-help-search(3)

    Search the help pages

    npm-init(3)

    Interactively create a package.json file

    npm-install(3)

    install a package programmatically

    npm-link(3)

    Symlink a package folder

    npm-load(3)

    Load config settings

    npm-ls(3)

    List installed packages

    npm-outdated(3)

    Check for outdated packages

    npm-owner(3)

    Manage package owners

    npm-pack(3)

    Create a tarball from a package

    npm-prefix(3)

    Display prefix

    npm-prune(3)

    Remove extraneous packages

    npm-publish(3)

    Publish a package

    npm-rebuild(3)

    Rebuild a package

    npm-repo(3)

    Open package repository page in the browser

    npm-restart(3)

    Restart a package

    npm-root(3)

    Display npm root

    npm-run-script(3)

    Run arbitrary package scripts

    npm-search(3)

    Search for packages

    npm-shrinkwrap(3)

    programmatically generate package shrinkwrap file

    npm-start(3)

    Start a package

    npm-stop(3)

    Stop a package

    npm-tag(3)

    Tag a published version

    npm-test(3)

    Test a package

    npm-uninstall(3)

    uninstall a package programmatically

    npm-unpublish(3)

    Remove a package from the registry

    npm-update(3)

    Update a package

    npm-version(3)

    Bump a package version

    npm-view(3)

    View registry info

    npm-whoami(3)

    Display npm username

    Files

    File system structures npm uses

    npm-folders(5)

    Folder Structures Used by npm

    npmrc(5)

    The npm config files

    package.json(5)

    Specifics of npm's package.json handling

    Misc

    Various other bits and bobs

    npm-coding-style(7)

    npm's "funny" coding style

    npm-config(7)

    More than you probably want to know about npm configuration

    npm-developers(7)

    Developer Guide

    npm-disputes(7)

    Handling Module Name Disputes

    npm-faq(7)

    Frequently Asked Questions

    npm-index(7)

    Index of all npm documentation

    npm-registry(7)

    The JavaScript Package Registry

    npm-scope(7)

    Scoped packages

    npm-scripts(7)

    How npm handles the "scripts" field

    removing-npm(7)

    Cleaning the Slate

    semver(7)

    The semantic versioner for npm

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-registry.html

    npm-registry

    The JavaScript Package Registry

    DESCRIPTION

    To resolve packages by name and version, npm talks to a registry website that implements the CommonJS Package Registry specification for reading package info.

    Additionally, npm's package registry implementation supports several write APIs as well, to allow for publishing packages and managing user account information.

    The official public npm registry is at http://registry.npmjs.org/. It is powered by a CouchDB database, of which there is a public mirror at http://skimdb.npmjs.com/registry. The code for the couchapp is available at http://github.com/npm/npm-registry-couchapp.

    The registry URL used is determined by the scope of the package (see npm-scope(7)). If no scope is specified, the default registry is used, which is supplied by the registry config parameter. See npm-config(1), npmrc(5), and npm-config(7) for more on managing npm's configuration.

    Can I run my own private registry?

    Yes!

    The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs.

    If you set up continuous replication from the official CouchDB, and then set your internal CouchDB as the registry config, then you'll be able to read any published packages, in addition to your private ones, and by default will only publish internally. If you then want to publish a package for the whole world to see, you can simply override the --registry config for that command.
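
    As a sketch (the internal registry URL below is a placeholder), that workflow might look like:

    npm config set registry http://registry.internal.example.com/
    npm publish                                        # publishes to the internal registry
    npm publish --registry=https://registry.npmjs.org/ # explicitly publishes to the public registry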

    I don't want my package published in the official registry. It's private.

    Set "private": true in your package.json to prevent it from being published at all, or "publishConfig":{"registry":"http://my-internal-registry.local"} to force it to be published only to your internal registry.

    See package.json(5) for more info on what goes in the package.json file.
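
    As a complete (hypothetical) package.json using the second option, with the internal registry URL as a placeholder:

    { "name" : "my-internal-tool"
    , "version" : "1.0.0"
    , "publishConfig" : { "registry" : "http://my-internal-registry.local" }
    }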

    Will you replicate from my registry into the public one?

    No. If you want things to be public, then publish them into the public registry using npm. What little security there is would be for nought otherwise.

    Do I have to use couchdb to build a registry that npm can talk to?

    No, but it's way easier. Basically, yes, you do, or you have to effectively implement the entire CouchDB API anyway.

    Is there a website or something to see package docs and such?

    Yes, head over to https://npmjs.com/

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-scope.html

    npm-scope

    Scoped packages

    DESCRIPTION

    All npm packages have a name. Some package names also have a scope. A scope follows the usual rules for package names (url-safe characters, no leading dots or underscores). When used in package names, a scope is preceded by an @-symbol and followed by a slash, e.g.

    @somescope/somepackagename
    

    Scopes are a way of grouping related packages together, and also affect a few things about the way npm treats the package.

    As of 2014-09-03, scoped packages are not supported by the public npm registry. However, the npm client is backwards-compatible with un-scoped registries, so it can be used to work with scoped and un-scoped registries at the same time.

    Installing scoped packages

    Scoped packages are installed to a sub-folder of the regular installation folder, e.g. if your other packages are installed in node_modules/packagename, scoped modules will be in node_modules/@myorg/packagename. The scope folder (@myorg) is simply the name of the scope preceded by an @-symbol, and can contain any number of scoped packages.

    A scoped package is installed by referencing it by name, preceded by an @-symbol, in npm install:

    npm install @myorg/mypackage
    

    Or in package.json:

    "dependencies": {
      "@myorg/mypackage": "^1.3.0"
    }
    

    Note that if the @-symbol is omitted in either case npm will instead attempt to install from GitHub; see npm-install(1).

    Requiring scoped packages

    Because scoped packages are installed into a scope folder, you have to include the name of the scope when requiring them in your code, e.g.

    require('@myorg/mypackage')
    

    There is nothing special about the way Node treats scope folders; this simply requires the module mypackage from the folder called @myorg.

    Publishing scoped packages

    Scoped packages can be published to any registry that supports them. As of 2014-09-03, the public npm registry does not support scoped packages, so attempting to publish a scoped package to the registry will fail unless you have associated that scope with a different registry, see below.

    Associating a scope with a registry

    Scopes can be associated with a separate registry. This allows you to seamlessly use a mix of packages from the public npm registry and one or more private registries, such as npm Enterprise.

    You can associate a scope with a registry at login, e.g.

    npm login --registry=http://reg.example.com --scope=@myco
    

    Scopes have a many-to-one relationship with registries: one registry can host multiple scopes, but a scope only ever points to one registry.

    You can also associate a scope with a registry using npm config:

    npm config set @myco:registry http://reg.example.com
    

    Once a scope is associated with a registry, any npm install for a package with that scope will request packages from that registry instead. Any npm publish for a package name that contains the scope will be published to that registry instead.
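
    The association is stored in your npmrc(5) config, so you could also add it by hand; for instance, the npm config command above would result in a line like this sketch:

    @myco:registry=http://reg.example.com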

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/npm-scripts.html

    npm-scripts

    How npm handles the "scripts" field

    DESCRIPTION

    npm supports the "scripts" property of the package.json script, for the following scripts:

    • prepublish: Run BEFORE the package is published. (Also run on local npm install without any arguments.)
    • publish, postpublish: Run AFTER the package is published.
    • preinstall: Run BEFORE the package is installed
    • install, postinstall: Run AFTER the package is installed.
    • preuninstall, uninstall: Run BEFORE the package is uninstalled.
    • postuninstall: Run AFTER the package is uninstalled.
    • pretest, test, posttest: Run by the npm test command.
    • prestop, stop, poststop: Run by the npm stop command.
    • prestart, start, poststart: Run by the npm start command.
    • prerestart, restart, postrestart: Run by the npm restart command. Note: npm restart will run the stop and start scripts if no restart script is provided.

    Additionally, arbitrary scripts can be executed by running npm run-script <pkg> <stage>. Pre and post commands with matching names will be run for those as well (e.g. premyscript, myscript, postmyscript).
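
    For instance, a hypothetical "scripts" section with a custom stage and its pre/post hooks might look like this, and would be invoked with npm run-script myscript:

    { "scripts" :
      { "premyscript" : "echo about to run"
      , "myscript" : "echo running"
      , "postmyscript" : "echo done"
      }
    }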

    NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN

    tl;dr Don't use install. Use a .gyp file for compilation, and prepublish for anything else.

    You should almost never have to explicitly set a preinstall or install script. If you are doing this, please consider if there is another option.

    The only valid use of install or preinstall scripts is for compilation which must be done on the target architecture. In early versions of node, this was often done using the node-waf scripts, or a standalone Makefile, and early versions of npm required that it be explicitly set in package.json. This was not portable, and harder to do properly.

    In the current version of node, the standard way to do this is using a .gyp file. If you have a file with a .gyp extension in the root of your package, then npm will run the appropriate node-gyp commands automatically at install time. This is the only officially supported method for compiling binary addons, and does not require that you add anything to your package.json file.

    If you have to do other things before your package is used, in a way that is not dependent on the operating system or architecture of the target system, then use a prepublish script instead. This includes tasks such as:

    • Compile CoffeeScript source code into JavaScript.
    • Create minified versions of JavaScript source code.
    • Fetch remote resources that your package will use.

    The advantage of doing these things at prepublish time instead of preinstall or install time is that they can be done once, in a single place, and thus greatly reduce complexity and variability. Additionally, this means that:

    • You can depend on coffee-script as a devDependency, and thus your users don't need to have it installed.
    • You don't need to include the minifiers in your package, reducing the size for your users.
    • You don't need to rely on your users having curl or wget or other system tools on the target machines.

    DEFAULT VALUES

    npm will default some script values based on package contents.

    • "start": "node server.js":

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "preinstall": "node-waf clean || true; node-waf configure build":

      If there is a wscript file in the root of your package, npm will default the preinstall command to compile using node-waf.

    USER

    If npm was invoked with root privileges, then it will change the uid to the user account or uid specified by the user config, which defaults to nobody. Set the unsafe-perm flag to run scripts with root privileges.

    ENVIRONMENT

    Package scripts run in an environment where many pieces of information are made available regarding the setup of npm and the current state of the process.

    path

    If you depend on modules that define executable scripts, like test suites, then those executables will be added to the PATH for executing the scripts. So, if your package.json has this:

    { "name" : "foo"
    , "dependencies" : { "bar" : "0.1.x" }
    , "scripts": { "start" : "bar ./test" } }
    

    then you could run npm start to execute the bar script, which is exported into the node_modules/.bin directory on npm install.

    package.json vars

    The package.json fields are tacked onto the npm_package_ prefix. So, for instance, if you had {"name":"foo", "version":"1.2.5"} in your package.json file, then your package scripts would have the npm_package_name environment variable set to "foo", and the npm_package_version set to "1.2.5"
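
    As a quick sketch, a package script that runs a hypothetical show-env.js could read those values like so:

    // show-env.js (illustrative only)
    console.log(process.env.npm_package_name)    // "foo"
    console.log(process.env.npm_package_version) // "1.2.5"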

    configuration

    Configuration parameters are put in the environment with the npm_config_ prefix. For instance, you can view the effective root config by checking the npm_config_root environment variable.

    Special: package.json "config" object

    The package.json "config" keys are overwritten in the environment if there is a config param of <name>[@<version>]:<key>. For example, if the package.json has this:

    { "name" : "foo"
    , "config" : { "port" : "8080" }
    , "scripts" : { "start" : "node server.js" } }
    

    and the server.js is this:

    http.createServer(...).listen(process.env.npm_package_config_port)
    

    then the user could change the behavior by doing:

    npm config set foo:port 80
    

    current lifecycle event

    Lastly, the npm_lifecycle_event environment variable is set to whichever stage of the cycle is being executed. So, you could have a single script used for different parts of the process which switches based on what's currently happening.

    Objects are flattened following this format, so if you had {"scripts":{"install":"foo.js"}} in your package.json, then you'd see this in the script:

    process.env.npm_package_scripts_install === "foo.js"
    

    EXAMPLES

    For example, if your package.json contains this:

    { "scripts" :
      { "install" : "scripts/install.js"
      , "postinstall" : "scripts/install.js"
      , "uninstall" : "scripts/uninstall.js"
      }
    }
    

    then the scripts/install.js will be called for the install and post-install stages of the lifecycle, and the scripts/uninstall.js would be called when the package is uninstalled. Since scripts/install.js is running for two different phases, it would be wise in this case to look at the npm_lifecycle_event environment variable.
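
    A sketch of what scripts/install.js might do to distinguish those stages (illustrative, not part of the example package):

    // scripts/install.js
    var stage = process.env.npm_lifecycle_event
    if (stage === 'install') {
      // work that only makes sense during the install stage
    } else if (stage === 'postinstall') {
      // work that only makes sense during the post-install stage
    }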

    If you want to run a make command, you can do so. This works just fine:

    { "scripts" :
      { "preinstall" : "./configure"
      , "install" : "make && make install"
      , "test" : "make test"
      }
    }
    

    EXITING

    Scripts are run by passing the line as a script argument to sh.

    If the script exits with a code other than 0, then this will abort the process.

    Note that these script files don't have to be nodejs or even javascript programs. They just have to be some kind of executable file.

    HOOK SCRIPTS

    If you want to run a specific script at a specific lifecycle event for ALL packages, then you can use a hook script.

    Place an executable file at node_modules/.hooks/{eventname}, and it'll get run for all packages installed in that root when they go through that point in the package lifecycle.

    Hook scripts are run exactly the same way as package.json scripts. That is, they are in a separate child process, with the env described above.
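
    For example, a hypothetical hook that logs every package going through the postinstall stage could be an executable file like this:

    #!/usr/bin/env node
    // node_modules/.hooks/postinstall (sketch; must be executable)
    console.log('installed ' + process.env.npm_package_name +
                '@' + process.env.npm_package_version)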

    BEST PRACTICES

    • Don't exit with a non-zero error code unless you really mean it. Except for uninstall scripts, this will cause the npm action to fail, and potentially be rolled back. If the failure is minor or only will prevent some optional features, then it's better to just print a warning and exit successfully.
    • Try not to use scripts to do what npm can do for you. Read through package.json(5) to see all the things that you can specify and enable by simply describing your package appropriately. In general, this will lead to a more robust and consistent state.
    • Inspect the env to determine where to put things. For instance, if the npm_config_binroot environment variable is set to /home/user/bin, then don't try to install executables into /usr/local/bin. The user probably set it up that way for a reason.
    • Don't prefix your script commands with "sudo". If root permissions are required for some reason, then it'll fail with that error, and the user will sudo the npm command in question.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/removing-npm.html

    npm-removal

    Cleaning the Slate

    SYNOPSIS

    So sad to see you go.

    sudo npm uninstall npm -g
    

    Or, if that fails, get the npm source code, and do:

    sudo make uninstall
    

    More Severe Uninstalling

    Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed.

    If that doesn't work, or if you require more drastic measures, continue reading.

    Note that this is only necessary for globally-installed packages. Local installs are completely contained within a project's node_modules folder. Delete that folder, and everything is gone (unless a package's install script is particularly ill-behaved).

    This assumes that you installed node and npm in the default place. If you configured node with a different --prefix, or installed npm with a different prefix setting, then adjust the paths accordingly, replacing /usr/local with your install prefix.

    To remove everything npm-related manually:

    rm -rf /usr/local/{lib/node{,/.npm,_modules},bin,share/man}/npm*
    

    If you installed things with npm, then your best bet is to uninstall them with npm first, and then install them again once you have a proper install. This can help find any symlinks that are lying around:

    ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm
    

    Prior to version 0.3, npm used shim files for executables and node modules. To track those down, you can do the following:

    find /usr/local/{lib/node,bin} -exec grep -l npm \{\} \; ;
    

    (This is also in the README file.)

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/misc/semver.html

    semver

    The semantic versioner for npm

    Usage

    $ npm install semver
    
    semver.valid('1.2.3') // '1.2.3'
    semver.valid('a.b.c') // null
    semver.clean('  =v1.2.3   ') // '1.2.3'
    semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true
    semver.gt('1.2.3', '9.8.7') // false
    semver.lt('1.2.3', '9.8.7') // true
    

    As a command-line utility:

    $ semver -h
    
    Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | --preid <identifier> | -l | -rv]
    Test if version(s) satisfy the supplied range(s), and sort them.
    
    Multiple versions or ranges may be supplied, unless increment
    option is specified.  In that case, only a single version may
    be used, and it is incremented by the specified level
    
    Program exits successfully if any valid version satisfies
    all supplied ranges, and prints all satisfying versions.
    
    If no versions are valid, or ranges are not satisfied,
    then exits failure.
    
    Versions are printed in ascending order, so supplying
    multiple versions to the utility will just sort them.
    

    Versions

    A "version" is described by the v2.0.0 specification found at http://semver.org/.

    A leading "=" or "v" character is stripped off and ignored.

    Ranges

    A version range is a set of comparators which specify versions that satisfy the range.

    A comparator is composed of an operator and a version. The set of primitive operators is:

    • < Less than
    • <= Less than or equal to
    • > Greater than
    • >= Greater than or equal to
    • = Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included.

    For example, the comparator >=1.2.7 would match the versions 1.2.7, 1.2.8, 2.5.3, and 1.3.9, but not the versions 1.2.6 or 1.1.0.

    Comparators can be joined by whitespace to form a comparator set, which is satisfied by the intersection of all of the comparators it includes.

    A range is composed of one or more comparator sets, joined by ||. A version matches a range if and only if every comparator in at least one of the ||-separated comparator sets is satisfied by the version.

    For example, the range >=1.2.7 <1.3.0 would match the versions 1.2.7, 1.2.8, and 1.2.99, but not the versions 1.2.6, 1.3.0, or 1.1.0.

    The range 1.2.7 || >=1.2.9 <2.0.0 would match the versions 1.2.7, 1.2.9, and 1.4.6, but not the versions 1.2.8 or 2.0.0.
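
    Expressed with the library API shown in the Usage section, those examples read roughly as:

    semver.satisfies('1.2.8', '>=1.2.7 <1.3.0')          // true
    semver.satisfies('1.3.0', '>=1.2.7 <1.3.0')          // false
    semver.satisfies('1.4.6', '1.2.7 || >=1.2.9 <2.0.0') // true
    semver.satisfies('1.2.8', '1.2.7 || >=1.2.9 <2.0.0') // false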

    Prerelease Tags

    If a version has a prerelease tag (for example, 1.2.3-alpha.3) then it will only be allowed to satisfy comparator sets if at least one comparator with the same [major, minor, patch] tuple also has a prerelease tag.

    For example, the range >1.2.3-alpha.3 would be allowed to match the version 1.2.3-alpha.7, but it would not be satisfied by 3.4.5-alpha.9, even though 3.4.5-alpha.9 is technically "greater than" 1.2.3-alpha.3 according to the SemVer sort rules. The version range only accepts prerelease tags on the 1.2.3 version. The version 3.4.5 would satisfy the range, because it does not have a prerelease flag, and 3.4.5 is greater than 1.2.3-alpha.7.
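
    In terms of the library API, that behaves roughly like:

    semver.satisfies('1.2.3-alpha.7', '>1.2.3-alpha.3') // true
    semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3') // false (prerelease of a different tuple)
    semver.satisfies('3.4.5', '>1.2.3-alpha.3')         // true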

    The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics.

    Second, a user who has opted into using a prerelease version has clearly indicated the intent to use that specific set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the next set of prerelease versions.

    Prerelease Identifiers

    The method .inc takes an additional identifier string argument that will append the value of the string as a prerelease identifier:

    > semver.inc('1.2.3', 'pre', 'beta')
    '1.2.4-beta.0'
    

    command-line example:

    $ semver 1.2.3 -i prerelease --preid beta
    1.2.4-beta.0
    

    Which then can be used to increment further:

    $ semver 1.2.4-beta.0 -i prerelease
    1.2.4-beta.1
    

    Advanced Range Syntax

    Advanced range syntax desugars to primitive comparators in deterministic ways.

    Advanced ranges may be combined in the same way as primitive comparators using white space or ||.

    Hyphen Ranges X.Y.Z - A.B.C

    Specifies an inclusive set.

    • 1.2.3 - 2.3.4 := >=1.2.3 <=2.3.4

    If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes.

    • 1.2 - 2.3.4 := >=1.2.0 <=2.3.4

    If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts.

    • 1.2.3 - 2.3 := >=1.2.3 <2.4.0
    • 1.2.3 - 2 := >=1.2.3 <3.0.0

    X-Ranges 1.2.x 1.X 1.2.* *

    Any of X, x, or * may be used to "stand in" for one of the numeric values in the [major, minor, patch] tuple.

    • * := >=0.0.0 (Any version satisfies)
    • 1.x := >=1.0.0 <2.0.0 (Matching major version)
    • 1.2.x := >=1.2.0 <1.3.0 (Matching major and minor versions)

    A partial version range is treated as an X-Range, so the special character is in fact optional.

    • "" (empty string) := * := >=0.0.0
    • 1 := 1.x.x := >=1.0.0 <2.0.0
    • 1.2 := 1.2.x := >=1.2.0 <1.3.0

    Tilde Ranges ~1.2.3 ~1.2 ~1

    Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not.

    • ~1.2.3 := >=1.2.3 <1.(2+1).0 := >=1.2.3 <1.3.0
    • ~1.2 := >=1.2.0 <1.(2+1).0 := >=1.2.0 <1.3.0 (Same as 1.2.x)
    • ~1 := >=1.0.0 <(1+1).0.0 := >=1.0.0 <2.0.0 (Same as 1.x)
    • ~0.2.3 := >=0.2.3 <0.(2+1).0 := >=0.2.3 <0.3.0
    • ~0.2 := >=0.2.0 <0.(2+1).0 := >=0.2.0 <0.3.0 (Same as 0.2.x)
    • ~0 := >=0.0.0 <(0+1).0.0 := >=0.0.0 <1.0.0 (Same as 0.x)
    • ~1.2.3-beta.2 := >=1.2.3-beta.2 <1.3.0 Note that prereleases in the 1.2.3 version will be allowed, if they are greater than or equal to beta.2. So, 1.2.3-beta.4 would be allowed, but 1.2.4-beta.2 would not, because it is a prerelease of a different [major, minor, patch] tuple.

    Caret Ranges ^1.2.3 ^0.2.5 ^0.0.4

    Allows changes that do not modify the left-most non-zero digit in the [major, minor, patch] tuple. In other words, this allows patch and minor updates for versions 1.0.0 and above, patch updates for versions 0.X >=0.1.0, and no updates for versions 0.0.X.

    Many authors treat a 0.x version as if the x were the major "breaking-change" indicator.

    Caret ranges are ideal when an author may make breaking changes between 0.2.4 and 0.3.0 releases, which is a common practice. However, it presumes that there will not be breaking changes between 0.2.4 and 0.2.5. It allows for changes that are presumed to be additive (but non-breaking), according to commonly observed practices.

    • ^1.2.3 := >=1.2.3 <2.0.0
    • ^0.2.3 := >=0.2.3 <0.3.0
    • ^0.0.3 := >=0.0.3 <0.0.4
    • ^1.2.3-beta.2 := >=1.2.3-beta.2 <2.0.0 Note that prereleases in the 1.2.3 version will be allowed, if they are greater than or equal to beta.2. So, 1.2.3-beta.4 would be allowed, but 1.2.4-beta.2 would not, because it is a prerelease of a different [major, minor, patch] tuple.
    • ^0.0.3-beta := >=0.0.3-beta <0.0.4 Note that prereleases in the 0.0.3 version only will be allowed, if they are greater than or equal to beta. So, 0.0.3-pr.2 would be allowed.

    When parsing caret ranges, a missing patch value desugars to the number 0, but will allow flexibility within that value, even if the major and minor versions are both 0.

    • ^1.2.x := >=1.2.0 <2.0.0
    • ^0.0.x := >=0.0.0 <0.1.0
    • ^0.0 := >=0.0.0 <0.1.0

    Missing minor and patch values desugar to zero, but also allow flexibility within those values, even if the major version is zero.

    • ^1.x := >=1.0.0 <2.0.0
    • ^0.x := >=0.0.0 <1.0.0
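
    One way to see these tilde and caret desugarings is through the validRange function described below, which returns the primitive comparator form; a quick sketch:

    semver.validRange('~1.2.3') // '>=1.2.3 <1.3.0'
    semver.validRange('^1.2.3') // '>=1.2.3 <2.0.0'
    semver.validRange('^0.0.3') // '>=0.0.3 <0.0.4'
    semver.validRange('^0.0.x') // '>=0.0.0 <0.1.0'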

    Functions

    All methods and classes take a final loose boolean argument that, if true, will be more forgiving about not-quite-valid semver strings. The resulting output will always be 100% strict, of course.

    Strict-mode Comparators and Ranges will be strict about the SemVer strings that they parse.

    • valid(v): Return the parsed version, or null if it's not valid.
    • inc(v, release): Return the version incremented by the release type (major, premajor, minor, preminor, patch, prepatch, or prerelease), or null if it's not valid
      • premajor in one call will bump the version up to the next major version and down to a prerelease of that major version. preminor and prepatch work the same way.
      • If called from a non-prerelease version, the prerelease will work the same as prepatch. It increments the patch version, then makes a prerelease. If the input version is already a prerelease it simply increments it.

    Comparison

    • gt(v1, v2): v1 > v2
    • gte(v1, v2): v1 >= v2
    • lt(v1, v2): v1 < v2
    • lte(v1, v2): v1 <= v2
    • eq(v1, v2): v1 == v2 This is true if they're logically equivalent, even if they're not the exact same string. You already know how to compare strings.
    • neq(v1, v2): v1 != v2 The opposite of eq.
    • cmp(v1, comparator, v2): Pass in a comparison string, and it'll call the corresponding function above. "===" and "!==" do simple string comparison, but are included for completeness. Throws if an invalid comparison string is provided.
    • compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if v2 is greater. Sorts in ascending order if passed to Array.sort().
    • rcompare(v1, v2): The reverse of compare. Sorts an array of versions in descending order when passed to Array.sort().
    • diff(v1, v2): Returns difference between two versions by the release type (major, premajor, minor, preminor, patch, prepatch, or prerelease), or null if the versions are the same.

    Ranges

    • validRange(range): Return the valid range or null if it's not valid
    • satisfies(version, range): Return true if the version satisfies the range.
    • maxSatisfying(versions, range): Return the highest version in the list that satisfies the range, or null if none of them do.
    • gtr(version, range): Return true if version is greater than all the versions possible in the range.
    • ltr(version, range): Return true if version is less than all the versions possible in the range.
    • outside(version, range, hilo): Return true if the version is outside the bounds of the range in either the high or low direction. The hilo argument must be either the string '>' or '<'. (This is the function called by gtr and ltr.)

    Note that, since ranges may be non-contiguous, a version might not be greater than a range, less than a range, or satisfy a range! For example, the range 1.2 <1.2.9 || >2.0.0 would have a hole from 1.2.9 until 2.0.0, so the version 1.2.10 would not be greater than the range (because 2.0.1 satisfies, which is higher), nor less than the range (since 1.2.8 satisfies, which is lower), and it also does not satisfy the range.

    If you want to know if a version satisfies or does not satisfy a range, use the satisfies(version, range) function.
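
    Sketched with the functions above, the example works out to:

    var range = '1.2 <1.2.9 || >2.0.0'
    semver.gtr('1.2.10', range)       // false: 2.0.1 satisfies the range and is higher
    semver.ltr('1.2.10', range)       // false: 1.2.8 satisfies the range and is lower
    semver.satisfies('1.2.10', range) // false: it falls in the hole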

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/files/npm-folders.html

    npm-folders

    Folder Structures Used by npm

    DESCRIPTION

    npm puts various things on your computer. That's its job.

    This document will tell you what it puts where.

    tl;dr

    • Local install (default): puts stuff in ./node_modules of the current package root.
    • Global install (with -g): puts stuff in /usr/local or wherever node is installed.
    • Install it locally if you're going to require() it.
    • Install it globally if you're going to run it on the command line.
    • If you need both, then install it in both places, or use npm link.

    prefix Configuration

    The prefix config defaults to the location where node is installed. On most systems, this is /usr/local, and most of the time is the same as node's process.installPrefix.

    On windows, this is the exact location of the node.exe binary. On Unix systems, it's one level up, since node is typically installed at {prefix}/bin/node rather than {prefix}/node.exe.

    When the global flag is set, npm installs things into this prefix. When it is not set, it uses the root of the current package, or the current working directory if not in a package already.

    Node Modules

    Packages are dropped into the node_modules folder under the prefix. When installing locally, this means that you can require("packagename") to load its main module, or require("packagename/lib/path/to/sub/module") to load other modules.

    Global installs on Unix systems go to {prefix}/lib/node_modules. Global installs on Windows go to {prefix}/node_modules (that is, no lib folder.)

    Scoped packages are installed the same way, except they are grouped together in a sub-folder of the relevant node_modules folder with the name of that scope prefixed by the @ symbol, e.g. npm install @myorg/package would place the package in {prefix}/node_modules/@myorg/package. See scopes(7) for more details.

    If you wish to require() a package, then install it locally.

    Executables

    When in global mode, executables are linked into {prefix}/bin on Unix, or directly into {prefix} on Windows.

    When in local mode, executables are linked into ./node_modules/.bin so that they can be made available to scripts run through npm. (For example, so that a test runner will be in the path when you run npm test.)

    Man Pages

    When in global mode, man pages are linked into {prefix}/share/man.

    When in local mode, man pages are not installed.

    Man pages are not installed on Windows systems.

    Cache

    See npm-cache(1). Cache files are stored in ~/.npm on Posix, or ~/npm-cache on Windows.

    This is controlled by the cache configuration param.

    Temp Files

    Temporary files are stored by default in the folder specified by the tmp config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or /tmp on Unix and c:\windows\temp on Windows.

    Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit.

    More Information

    When installing locally, npm first tries to find an appropriate prefix folder. This is so that npm install foo@1.2.3 will install to the sensible root of your package, even if you happen to have cded into some other folder.

    Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a package.json file, or a node_modules folder. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands. (This behavior is inspired by and similar to git's .git-folder seeking logic when running git commands in a working dir.)

    If no package root is found, then the current folder is used.
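
    A rough sketch of that walk-up logic (illustrative only, not npm's actual implementation):

    var fs = require('fs')
    var path = require('path')

    // Walk up from dir until a package.json or node_modules folder is found.
    function findEffectiveRoot (dir) {
      for (;;) {
        if (fs.existsSync(path.join(dir, 'package.json')) ||
            fs.existsSync(path.join(dir, 'node_modules'))) return dir
        var parent = path.dirname(dir)
        if (parent === dir) return process.cwd() // hit the filesystem root; fall back
        dir = parent
      }
    }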

    When you run npm install foo@1.2.3, then the package is loaded into the cache, and then unpacked into ./node_modules/foo. Then, any of foo's dependencies are similarly unpacked into ./node_modules/foo/node_modules/....

    Any bin files are symlinked to ./node_modules/.bin/, so that they may be found by npm scripts when necessary.

    Global Installation

    If the global configuration is set to true, then npm will install packages "globally".

    For global installation, packages are installed roughly the same way, but using the folders described above.

    Cycles, Conflicts, and Folder Parsimony

    Cycles are handled using the property of node's module system that it walks up the directories looking for node_modules folders. So, at every stage, if a package is already installed in an ancestor node_modules folder, then it is not installed at the current location.

    Consider the case above, where foo -> bar -> baz. Imagine if, in addition to that, baz depended on bar, so you'd have: foo -> bar -> baz -> bar -> baz .... However, since the folder structure is: foo/node_modules/bar/node_modules/baz, there's no need to put another copy of bar into .../baz/node_modules, since when it calls require("bar"), it will get the copy that is installed in foo/node_modules/bar.

    This shortcut is only used if the exact same version would be installed in multiple nested node_modules folders. It is still possible to have a/node_modules/b/node_modules/a if the two "a" packages are different versions. However, an infinite regress will always be prevented, without having to repeat the exact same package multiple times.

    Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder.

    Example

    Consider this dependency graph:

    foo
    +-- blerg@1.2.5
    +-- bar@1.2.3
    |   +-- blerg@1.x (latest=1.3.7)
    |   +-- baz@2.x
    |   |   `-- quux@3.x
    |   |       `-- bar@1.2.3 (cycle)
    |   `-- asdf@*
    `-- baz@1.2.3
        `-- quux@3.x
            `-- bar
    

    In this case, we might expect a folder structure like this:

    foo
    +-- node_modules
        +-- blerg (1.2.5) <---[A]
        +-- bar (1.2.3) <---[B]
        |   `-- node_modules
        |       +-- baz (2.0.2) <---[C]
        |       |   `-- node_modules
        |       |       `-- quux (3.2.0)
        |       `-- asdf (2.3.4)
        `-- baz (1.2.3) <---[D]
            `-- node_modules
                `-- quux (3.2.0) <---[E]
    

    Since foo depends directly on bar@1.2.3 and baz@1.2.3, those are installed in foo's node_modules folder.

    Even though the latest copy of blerg is 1.3.7, foo has a specific dependency on version 1.2.5. So, that gets installed at [A]. Since the parent installation of blerg satisfies bar's dependency on blerg@1.x, it does not install another copy under [B].

    Bar [B] also has dependencies on baz and asdf, so those are installed in bar's node_modules folder. Because it depends on baz@2.x, it cannot re-use the baz@1.2.3 installed in the parent node_modules folder [D], and must install its own copy [C].

    Underneath bar, the baz -> quux -> bar dependency creates a cycle. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder.

    Underneath foo -> baz [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B].

    For a graphical breakdown of what is installed where, use npm ls.

    Publishing

    Upon publishing, npm will look in the node_modules folder. If any of the items there are not in the bundledDependencies array, then they will not be included in the package tarball.

    This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re-publish those items that cannot be found elsewhere. See package.json(5) for more information.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/files/npm-json.html

    package.json

    Specifics of npm's package.json handling

    DESCRIPTION

    This document is all you need to know about what's required in your package.json file. It must be actual JSON, not just a JavaScript object literal.

    A lot of the behavior described in this document is affected by the config settings described in npm-config(7).

    name

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    The name is what your thing is called. Some tips:

    • Don't put "js" or "node" in the name. It's assumed that it's js, since you're writing a package.json file, and you can specify the engine using the "engines" field. (See below.)
    • The name ends up being part of a URL, an argument on the command line, and a folder name. Any name with non-url-safe characters will be rejected. Also, it can't start with a dot or an underscore.
    • The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive.
    • You may want to check the npm registry to see if there's something by that name already, before you get too attached to it. http://registry.npmjs.org/

    A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See npm-scope(7) for more detail.

    version

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    Version must be parseable by node-semver, which is bundled with npm as a dependency. (npm install semver to use it yourself.)

    More on version numbers and ranges at semver(7).
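
    A quick way to check that your version will be accepted (a sketch using the bundled semver module):

    var semver = require('semver')
    var pkg = require('./package.json')
    console.log(semver.valid(pkg.version)) // null means npm will reject this version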

    description

    Put a description in it. It's a string. This helps people discover your package, as it's listed in npm search.

    keywords

    Put keywords in it. It's an array of strings. This helps people discover your package as it's listed in npm search.

    homepage

    The url to the project homepage.

    NOTE: This is not the same as "url". If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you.

    Literally. Spit. I'm so not kidding.

    bugs

    The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package.

    It should look like this:

    { "url" : "http://github.com/owner/project/issues"
    , "email" : "project@hostname.com"
    }
    

    You can specify either one or both values. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object.

    If a url is provided, it will be used by the npm bugs command.
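
    For example, the string form might simply be:

    { "bugs" : "http://github.com/owner/project/issues" }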

    license

    You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it.

    The simplest way, assuming you're using a common license such as BSD-3-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this:

    { "license" : "BSD-3-Clause" }
    

    You can check the full list of SPDX license IDs. Ideally you should pick one that is OSI approved.

    It's also a good idea to include a LICENSE file at the top level in your package.

    people fields: author, contributors

    The "author" is one person. "contributors" is an array of people. A "person" is an object with a "name" field and optionally "url" and "email", like this:

    { "name" : "Barney Rubble"
    , "email" : "b@rubble.com"
    , "url" : "http://barnyrubble.tumblr.com/"
    }
    

    Or you can shorten that all into a single string, and npm will parse it for you:

    "Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
    

    Both email and url are optional either way.

    npm also sets a top-level "maintainers" field with your npm user info.

    files

    The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)

    You can also provide a ".npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array. The ".npmignore" file works just like a ".gitignore".

    main

    The main field is a module ID that is the primary entry point to your program. That is, if your package is named foo, and a user installs it, and then does require("foo"), then your main module's exports object will be returned.

    This should be a module ID relative to the root of your package folder.

    For most modules, it makes the most sense to have a main script and often not much else.

    bin

    A lot of packages have one or more executable files that they'd like to install into the PATH. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable.)

    To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.

    For example, npm has this:

    { "bin" : { "npm" : "./cli.js" } }
    

    So, when you install npm, it'll create a symlink from the cli.js script to /usr/local/bin/npm.

    If you have a single executable, and its name should be the name of the package, then you can just supply it as a string. For example:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin": "./path/to/program" }
    

    would be the same as this:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin" : { "my-program" : "./path/to/program" } }
    

    man

    Specify either a single file or an array of filenames to put in place for the man program to find.

    If only a single file is provided, then it's installed such that it is the result from man <pkgname>, regardless of its actual filename. For example:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : "./man/doc.1"
    }
    

    would link the ./man/doc.1 file in such that it is the target for man foo

    If the filename doesn't start with the package name, then it's prefixed. So, this:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/bar.1" ]
    }
    

    will create files to do man foo and man foo-bar.

    Man files must end with a number, and optionally a .gz suffix if they are compressed. The number dictates which man section the file is installed into.

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/foo.2" ]
    }
    

    will create entries for man foo and man 2 foo

    directories

    The CommonJS Packages spec details a few ways that you can indicate the structure of your package using a directories object. If you look at npm's package.json, you'll see that it has directories for doc, lib, and man.

    In the future, this information may be used in other creative ways.

    directories.lib

    Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info.

    directories.bin

    If you specify a bin directory, then all the files in that folder will be added as children of the bin path.

    If you have a bin path already, then this has no effect.

    directories.man

    A folder that is full of man pages. Sugar to generate a "man" array by walking the folder.

    directories.doc

    Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday.

    directories.example

    Put example scripts in here. Someday, it might be exposed in some clever way.

    repository

    Specify the place where your code lives. This is helpful for people who want to contribute. If the git repo is on GitHub, then the npm docs command will be able to find you.

    Do it like this:

    "repository" :
      { "type" : "git"
      , "url" : "http://github.com/npm/npm.git"
      }
    
    "repository" :
      { "type" : "svn"
      , "url" : "http://v8.googlecode.com/svn/trunk/"
      }
    

    The URL should be a publicly available (perhaps read-only) url that can be handed directly to a VCS program without any modification. It should not be a url to an html project page that you put in your browser. It's for computers.

    scripts

    The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point.

    See npm-scripts(7) to find out more about writing package scripts.
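
    For illustration only (the test command is a hypothetical script, not an npm default), a scripts object might look like this:

    { "scripts" :
      { "start" : "node server.js"
      , "test" : "node test/run.js"
      }
    }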

    config

    A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades. For instance, if a package had the following:

    { "name" : "foo"
    , "config" : { "port" : "8080" } }
    

    and then had a "start" command that then referenced the npm_package_config_port environment variable, then the user could override that by doing npm config set foo:port 8001.
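
    As a sketch of the script side (this server.js is illustrative and not part of the package above), the value is read from the environment that npm sets up for package scripts:

    // server.js, started via "start": "node server.js"
    // npm exposes "config" values to scripts as npm_package_config_* environment variables
    var http = require('http')
    var port = parseInt(process.env.npm_package_config_port, 10) || 8080

    http.createServer(function (req, res) {
      res.end('ok\n')
    }).listen(port)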

    See npm-config(7) and npm-scripts(7) for more on package configs.

    dependencies

    Dependencies are specified in a simple object that maps a package name to a version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL.

    Please do not put test harnesses or transpilers in your dependencies object. See devDependencies, below.

    See semver(7) for more details about specifying version ranges.

    • version Must match version exactly
    • >version Must be greater than version
    • >=version etc
    • <version
    • <=version
    • ~version "Approximately equivalent to version" See semver(7)
    • ^version "Compatible with version" See semver(7)
    • 1.2.x 1.2.0, 1.2.1, etc., but not 1.3.0
    • http://... See 'URLs as Dependencies' below
    • * Matches any version
    • "" (just an empty string) Same as *
    • version1 - version2 Same as >=version1 <=version2.
    • range1 || range2 Passes if either range1 or range2 are satisfied.
    • git... See 'Git URLs as Dependencies' below
    • user/repo See 'GitHub URLs' below
    • tag A specific version tagged and published as tag See npm-tag(1)
    • path/path/path See Local Paths below

    For example, these are all valid:

    { "dependencies" :
      { "foo" : "1.0.0 - 2.9999.9999"
      , "bar" : ">=1.0.2 <2.1.2"
      , "baz" : ">1.0.2 <=2.3.4"
      , "boo" : "2.0.1"
      , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0"
      , "asd" : "http://asdf.com/asdf.tar.gz"
      , "til" : "~1.2"
      , "elf" : "~1.2.3"
      , "two" : "2.x"
      , "thr" : "3.3.x"
      , "lat" : "latest"
      , "dyl" : "file:../dyl"
      }
    }
    

    URLs as Dependencies

    You may specify a tarball URL in place of a version range.

    This tarball will be downloaded and installed locally to your package at install time.

    Git URLs as Dependencies

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+ssh://user@hostname/project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.
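
    For example (the host and project here are hypothetical), such a URL simply takes the place of a version range:

    { "dependencies" :
      { "private-thing" : "git+ssh://git@example.com/owner/private-thing.git#v1.0.0"
      }
    }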

    GitHub URLs

    As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". Just as with git URLs, a commit-ish suffix can be included. For example:

    {
      "name": "foo",
      "version": "0.0.0",
      "dependencies": {
        "express": "visionmedia/express",
        "mocha": "visionmedia/mocha#4727d357ea"
      }
    }
    

    Local Paths

    As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be saved using npm install --save, using any of these forms:

    ../foo/bar
    ~/foo/bar
    ./foo/bar
    /foo/bar
    

    in which case they will be normalized to a relative path and added to your package.json. For example:

    {
      "name": "baz",
      "dependencies": {
        "bar": "file:../foo/bar"
      }
    }
    

    This feature is helpful for local offline development and for creating tests that require npm install to run without hitting an external server, but it should not be used when publishing packages to the public registry.

    devDependencies

    If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use.

    In this case, it's best to map these additional items in a devDependencies object.

    These things will be installed when doing npm link or npm install from the root of a package, and can be managed like any other npm configuration param. See npm-config(7) for more on the topic.

    For build steps that are not platform-specific, such as compiling CoffeeScript or other languages to JavaScript, use the prepublish script to do this, and make the required package a devDependency.

    For example:

    { "name": "ethopia-waza",
      "description": "a delightfully fruity coffee varietal",
      "version": "1.2.3",
      "devDependencies": {
        "coffee-script": "~1.6.3"
      },
      "scripts": {
        "prepublish": "coffee -o lib/ -c src/waza.coffee"
      },
      "main": "lib/waza.js"
    }
    

    The prepublish script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves. In dev mode (ie, locally running npm install), it'll run this script as well, so that you can test it easily.

    peerDependencies

    In some cases, you want to express the compatibility of your package with a host tool or library, while not necessarily doing a require of this host. This is usually referred to as a plugin. Notably, your module may be exposing a specific interface, expected and specified by the host documentation.

    For example:

    {
      "name": "tea-latte",
      "version": "1.3.5",
      "peerDependencies": {
        "tea": "2.x"
      }
    }
    

    This ensures your package tea-latte can be installed along with the second major version of the host package tea only. The host package is automatically installed if needed. npm install tea-latte could possibly yield the following dependency graph:

    ├── tea-latte@1.3.5
    └── tea@2.2.0
    

    Trying to install another plugin with a conflicting requirement will cause an error. For this reason, make sure your plugin requirement is as broad as possible, and do not lock it down to specific patch versions.

    Assuming the host complies with semver, only changes in the host package's major version will break your plugin. Thus, if you've worked with every 1.x version of the host package, use "^1.0" or "1.x" to express this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".

    bundledDependencies

    Array of package names that will be bundled when publishing the package.

    If this is spelled "bundleDependencies", then that is also honored.
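
    For example (the package names are illustrative):

    { "name" : "my-package"
    , "version" : "1.0.0"
    , "bundledDependencies" : [ "renderized", "super-streams" ]
    }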

    optionalDependencies

    If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the optionalDependencies object. This is a map of package name to version or url, just like the dependencies object. The difference is that build failures do not cause installation to fail.

    It is still your program's responsibility to handle the lack of the dependency. For example, something like this:

    try {
      var foo = require('foo')
      var fooVersion = require('foo/package.json').version
    } catch (er) {
      foo = null
    }
    if ( notGoodFooVersion(fooVersion) ) {
      foo = null
    }
    
    // .. then later in your program ..
    
    if (foo) {
      foo.doFooThings()
    }
    

    Entries in optionalDependencies will override entries of the same name in dependencies, so it's usually best to only put it in one place.

    engines

    You can specify the version of node that your stuff works on:

    { "engines" : { "node" : ">=0.10.3 <0.12" } }
    

    And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do.

    If you specify an "engines" field, then npm will require that "node" be somewhere on that list. If "engines" is omitted, then npm will just assume that it works on node.

    You can also use the "engines" field to specify which versions of npm are capable of properly installing your program. For example:

    { "engines" : { "npm" : "~1.0.20" } }
    

    Note that, unless the user has set the engine-strict config flag, this field is advisory only.

    engineStrict

    If you are sure that your module will definitely not run properly on versions of Node/npm other than those specified in the engines object, then you can set "engineStrict": true in your package.json file. This will override the user's engine-strict config setting.

    Please do not do this unless you are really very very sure. If your engines object is overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node. Consider this choice carefully. If people abuse it, it will be removed in a future version of npm.

    os

    You can specify which operating systems your module will run on:

    "os" : [ "darwin", "linux" ]
    

    You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!':

    "os" : [ "!win32" ]
    

    The host operating system is determined by process.platform.

    It is allowed to both blacklist and whitelist, although there isn't any good reason to do this.

    cpu

    If your code only runs on certain cpu architectures, you can specify which ones.

    "cpu" : [ "x64", "ia32" ]
    

    Like the os option, you can also blacklist architectures:

    "cpu" : [ "!arm", "!mips" ]
    

    The host architecture is determined by process.arch.

    preferGlobal

    If your package is primarily a command-line application that should be installed globally, then set this value to true to provide a warning if it is installed locally.

    It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected.
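
    For example, a command-line tool (the name and bin path are illustrative) might combine preferGlobal with a bin entry:

    { "name" : "my-cli"
    , "version" : "1.0.0"
    , "preferGlobal" : true
    , "bin" : { "my-cli" : "./bin/cli.js" }
    }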

    private

    If you set "private": true in your package.json, then npm will refuse to publish it.

    This is a way to prevent accidental publication of private repositories. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the publishConfig dictionary described below to override the registry config param at publish-time.

    publishConfig

    This is a set of config values that will be used at publish-time. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default.

    Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing.

    See npm-config(7) to see the list of config options that can be overridden.
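
    For example, a sketch that publishes to a hypothetical internal registry with a non-default tag by default:

    { "publishConfig" :
      { "registry" : "https://registry.internal.example.com/"
      , "tag" : "beta"
      }
    }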

    DEFAULT VALUES

    npm will default some values based on package contents.

    • "scripts": {"start": "node server.js"}

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "scripts":{"preinstall": "node-gyp rebuild"}

      If there is a binding.gyp file in the root of your package, npm will default the preinstall command to compile using node-gyp.

    • "contributors": [...]

      If there is an AUTHORS file in the root of your package, npm will treat each line as being in Name <email> (url) format, where email and url are optional. Lines which start with a # or are blank will be ignored.

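    For example, an AUTHORS file (the names are made up) might contain:

    # contributors
    Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
    Fred Flintstone <fred@example.com>
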
    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/files/npmrc.html000644 000766 000024 00000006013 12455173731 027156 0ustar00iojsstaff000000 000000

    npmrc

    The npm config files

    DESCRIPTION

    npm gets its config settings from the command line, environment variables, and npmrc files.

    The npm config command can be used to update and edit the contents of the user and global npmrc files.

    For a list of available configuration options, see npm-config(7).

    FILES

    The four relevant files are:

    • per-project config file (/path/to/my/project/.npmrc)
    • per-user config file (~/.npmrc)
    • global config file ($PREFIX/etc/npmrc)
    • npm builtin config file (/path/to/npm/npmrc)

    All npm config files are an ini-formatted list of key = value parameters. Environment variables can be replaced using ${VARIABLE_NAME}. For example:

    prefix = ${HOME}/.npm-packages
    

    Each of these files is loaded, and config options are resolved in priority order. For example, a setting in the userconfig file would override the setting in the globalconfig file.

    Array values are specified by adding "[]" after the key name. For example:

    key[] = "first value"
    key[] = "second value"
    

    Per-project config file

    When working locally in a project, a .npmrc file in the root of the project (ie, a sibling of node_modules and package.json) will set config values specific to this project.

    Note that this only applies to the root of the project that you're running npm in. It has no effect when your module is published. For example, you can't publish a module that forces itself to install globally, or in a different location.
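
    For example (the values are illustrative), a project-level .npmrc might contain:

    registry = https://registry.internal.example.com/
    save-exact = true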

    Per-user config file

    $HOME/.npmrc (or the userconfig param, if set in the environment or on the command line)

    Global config file

    $PREFIX/etc/npmrc (or the globalconfig param, if set above): This file is an ini-file formatted list of key = value parameters. Environment variables can be replaced as above.

    Built-in config file

    path/to/npm/itself/npmrc

    This is an unchangeable "builtin" configuration file that npm keeps consistent across updates. Set fields in here using the ./configure script that comes with npm. This is primarily for distribution maintainers to override default configs in a standard and consistent manner.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/files/package.json.html000644 000766 000024 00000065375 12455173731 030422 0ustar00iojsstaff000000 000000

    package.json

    Specifics of npm's package.json handling

    DESCRIPTION

    This document is all you need to know about what's required in your package.json file. It must be actual JSON, not just a JavaScript object literal.

    A lot of the behavior described in this document is affected by the config settings described in npm-config(7).

    name

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    The name is what your thing is called. Some tips:

    • Don't put "js" or "node" in the name. It's assumed that it's js, since you're writing a package.json file, and you can specify the engine using the "engines" field. (See below.)
    • The name ends up being part of a URL, an argument on the command line, and a folder name. Any name with non-url-safe characters will be rejected. Also, it can't start with a dot or an underscore.
    • The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive.
    • You may want to check the npm registry to see if there's something by that name already, before you get too attached to it. http://registry.npmjs.org/

    A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See npm-scope(7) for more detail.

    version

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    Version must be parseable by node-semver, which is bundled with npm as a dependency. (npm install semver to use it yourself.)

    More on version numbers and ranges at semver(7).

    description

    Put a description in it. It's a string. This helps people discover your package, as it's listed in npm search.

    keywords

    Put keywords in it. It's an array of strings. This helps people discover your package as it's listed in npm search.
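
    For example (the values are illustrative):

    { "description" : "A packaged foo fooer for fooing foos"
    , "keywords" : [ "foo", "fooer", "fooing" ]
    }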

    homepage

    The url to the project homepage.

    NOTE: This is not the same as "url". If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you.

    Literally. Spit. I'm so not kidding.

    bugs

    The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package.

    It should look like this:

    { "url" : "http://github.com/owner/project/issues"
    , "email" : "project@hostname.com"
    }
    

    You can specify either one or both values. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object.

    If a url is provided, it will be used by the npm bugs command.

    license

    You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it.

    The simplest way, assuming you're using a common license such as BSD-3-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this:

    { "license" : "BSD-3-Clause" }
    

    You can check the full list of SPDX license IDs. Ideally you should pick one that is OSI approved.

    It's also a good idea to include a LICENSE file at the top level in your package.

    people fields: author, contributors

    The "author" is one person. "contributors" is an array of people. A "person" is an object with a "name" field and optionally "url" and "email", like this:

    { "name" : "Barney Rubble"
    , "email" : "b@rubble.com"
    , "url" : "http://barnyrubble.tumblr.com/"
    }
    

    Or you can shorten that all into a single string, and npm will parse it for you:

    "Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
    

    Both email and url are optional either way.

    npm also sets a top-level "maintainers" field with your npm user info.

    files

    The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)

    You can also provide a ".npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array. The ".npmignore" file works just like a ".gitignore".
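
    For example (file and folder names here are illustrative):

    { "files" : [ "lib", "bin/cli.js", "README.md" ] }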

    main

    The main field is a module ID that is the primary entry point to your program. That is, if your package is named foo, and a user installs it, and then does require("foo"), then your main module's exports object will be returned.

    This should be a module ID relative to the root of your package folder.

    For most modules, it makes the most sense to have a main script and often not much else.
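
    For example (the path is illustrative):

    { "name" : "foo"
    , "main" : "./lib/foo.js" }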

    bin

    A lot of packages have one or more executable files that they'd like to install into the PATH. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable.)

    To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.

    For example, npm has this:

    { "bin" : { "npm" : "./cli.js" } }
    

    So, when you install npm, it'll create a symlink from the cli.js script to /usr/local/bin/npm.

    If you have a single executable, and its name should be the name of the package, then you can just supply it as a string. For example:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin": "./path/to/program" }
    

    would be the same as this:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin" : { "my-program" : "./path/to/program" } }
    

    man

    Specify either a single file or an array of filenames to put in place for the man program to find.

    If only a single file is provided, then it's installed such that it is the result from man <pkgname>, regardless of its actual filename. For example:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : "./man/doc.1"
    }
    

    would link the ./man/doc.1 file in such that it is the target for man foo.

    If the filename doesn't start with the package name, then it's prefixed. So, this:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/bar.1" ]
    }
    

    will create files to do man foo and man foo-bar.

    Man files must end with a number, and optionally a .gz suffix if they are compressed. The number dictates which man section the file is installed into.

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/foo.2" ]
    }
    

    will create entries for man foo and man 2 foo.

    directories

    The CommonJS Packages spec details a few ways that you can indicate the structure of your package using a directories object. If you look at npm's package.json, you'll see that it has directories for doc, lib, and man.

    In the future, this information may be used in other creative ways.

    directories.lib

    Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info.

    directories.bin

    If you specify a bin directory, then all the files in that folder will be added as children of the bin path.

    If you have a bin path already, then this has no effect.

    directories.man

    A folder that is full of man pages. Sugar to generate a "man" array by walking the folder.

    directories.doc

    Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday.

    directories.example

    Put example scripts in here. Someday, it might be exposed in some clever way.

    repository

    Specify the place where your code lives. This is helpful for people who want to contribute. If the git repo is on GitHub, then the npm docs command will be able to find you.

    Do it like this:

    "repository" :
      { "type" : "git"
      , "url" : "http://github.com/npm/npm.git"
      }
    
    "repository" :
      { "type" : "svn"
      , "url" : "http://v8.googlecode.com/svn/trunk/"
      }
    

    The URL should be a publicly available (perhaps read-only) url that can be handed directly to a VCS program without any modification. It should not be a url to an html project page that you put in your browser. It's for computers.

    scripts

    The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point.

    See npm-scripts(7) to find out more about writing package scripts.

    config

    A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades. For instance, if a package had the following:

    { "name" : "foo"
    , "config" : { "port" : "8080" } }
    

    and then had a "start" command that then referenced the npm_package_config_port environment variable, then the user could override that by doing npm config set foo:port 8001.

    See npm-config(7) and npm-scripts(7) for more on package configs.

    dependencies

    Dependencies are specified in a simple object that maps a package name to a version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL.

    Please do not put test harnesses or transpilers in your dependencies object. See devDependencies, below.

    See semver(7) for more details about specifying version ranges.

    • version Must match version exactly
    • >version Must be greater than version
    • >=version etc
    • <version
    • <=version
    • ~version "Approximately equivalent to version" See semver(7)
    • ^version "Compatible with version" See semver(7)
    • 1.2.x 1.2.0, 1.2.1, etc., but not 1.3.0
    • http://... See 'URLs as Dependencies' below
    • * Matches any version
    • "" (just an empty string) Same as *
    • version1 - version2 Same as >=version1 <=version2.
    • range1 || range2 Passes if either range1 or range2 are satisfied.
    • git... See 'Git URLs as Dependencies' below
    • user/repo See 'GitHub URLs' below
    • tag A specific version tagged and published as tag See npm-tag(1)
    • path/path/path See Local Paths below

    For example, these are all valid:

    { "dependencies" :
      { "foo" : "1.0.0 - 2.9999.9999"
      , "bar" : ">=1.0.2 <2.1.2"
      , "baz" : ">1.0.2 <=2.3.4"
      , "boo" : "2.0.1"
      , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0"
      , "asd" : "http://asdf.com/asdf.tar.gz"
      , "til" : "~1.2"
      , "elf" : "~1.2.3"
      , "two" : "2.x"
      , "thr" : "3.3.x"
      , "lat" : "latest"
      , "dyl" : "file:../dyl"
      }
    }
    

    URLs as Dependencies

    You may specify a tarball URL in place of a version range.

    This tarball will be downloaded and installed locally to your package at install time.

    Git URLs as Dependencies

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+ssh://user@hostname/project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    GitHub URLs

    As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". Just as with git URLs, a commit-ish suffix can be included. For example:

    {
      "name": "foo",
      "version": "0.0.0",
      "dependencies": {
        "express": "visionmedia/express",
        "mocha": "visionmedia/mocha#4727d357ea"
      }
    }
    

    Local Paths

    As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be saved using npm install --save, using any of these forms:

    ../foo/bar
    ~/foo/bar
    ./foo/bar
    /foo/bar
    

    in which case they will be normalized to a relative path and added to your package.json. For example:

    {
      "name": "baz",
      "dependencies": {
        "bar": "file:../foo/bar"
      }
    }
    

    This feature is helpful for local offline development and for creating tests that require npm install to run without hitting an external server, but it should not be used when publishing packages to the public registry.

    devDependencies

    If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use.

    In this case, it's best to map these additional items in a devDependencies object.

    These things will be installed when doing npm link or npm install from the root of a package, and can be managed like any other npm configuration param. See npm-config(7) for more on the topic.

    For build steps that are not platform-specific, such as compiling CoffeeScript or other languages to JavaScript, use the prepublish script to do this, and make the required package a devDependency.

    For example:

    { "name": "ethopia-waza",
      "description": "a delightfully fruity coffee varietal",
      "version": "1.2.3",
      "devDependencies": {
        "coffee-script": "~1.6.3"
      },
      "scripts": {
        "prepublish": "coffee -o lib/ -c src/waza.coffee"
      },
      "main": "lib/waza.js"
    }
    

    The prepublish script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves. In dev mode (ie, locally running npm install), it'll run this script as well, so that you can test it easily.

    peerDependencies

    In some cases, you want to express the compatibility of your package with a host tool or library, while not necessarily doing a require of this host. This is usually referred to as a plugin. Notably, your module may be exposing a specific interface, expected and specified by the host documentation.

    For example:

    {
      "name": "tea-latte",
      "version": "1.3.5",
      "peerDependencies": {
        "tea": "2.x"
      }
    }
    

    This ensures your package tea-latte can be installed along with the second major version of the host package tea only. The host package is automatically installed if needed. npm install tea-latte could possibly yield the following dependency graph:

    ├── tea-latte@1.3.5
    └── tea@2.2.0
    

    Trying to install another plugin with a conflicting requirement will cause an error. For this reason, make sure your plugin requirement is as broad as possible, and do not lock it down to specific patch versions.

    Assuming the host complies with semver, only changes in the host package's major version will break your plugin. Thus, if you've worked with every 1.x version of the host package, use "^1.0" or "1.x" to express this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".

    bundledDependencies

    Array of package names that will be bundled when publishing the package.

    If this is spelled "bundleDependencies", then that is also honored.

    optionalDependencies

    If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the optionalDependencies object. This is a map of package name to version or url, just like the dependencies object. The difference is that build failures do not cause installation to fail.

    It is still your program's responsibility to handle the lack of the dependency. For example, something like this:

    try {
      var foo = require('foo')
      var fooVersion = require('foo/package.json').version
    } catch (er) {
      foo = null
    }
    if ( notGoodFooVersion(fooVersion) ) {
      foo = null
    }
    
    // .. then later in your program ..
    
    if (foo) {
      foo.doFooThings()
    }
    

    Entries in optionalDependencies will override entries of the same name in dependencies, so it's usually best to only put it in one place.

    engines

    You can specify the version of node that your stuff works on:

    { "engines" : { "node" : ">=0.10.3 <0.12" } }
    

    And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do.

    If you specify an "engines" field, then npm will require that "node" be somewhere on that list. If "engines" is omitted, then npm will just assume that it works on node.

    You can also use the "engines" field to specify which versions of npm are capable of properly installing your program. For example:

    { "engines" : { "npm" : "~1.0.20" } }
    

    Note that, unless the user has set the engine-strict config flag, this field is advisory only.

    engineStrict

    If you are sure that your module will definitely not run properly on versions of Node/npm other than those specified in the engines object, then you can set "engineStrict": true in your package.json file. This will override the user's engine-strict config setting.

    Please do not do this unless you are really very very sure. If your engines object is overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node. Consider this choice carefully. If people abuse it, it will be removed in a future version of npm.

    os

    You can specify which operating systems your module will run on:

    "os" : [ "darwin", "linux" ]
    

    You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!':

    "os" : [ "!win32" ]
    

    The host operating system is determined by process.platform.

    It is allowed to both blacklist and whitelist, although there isn't any good reason to do this.

    cpu

    If your code only runs on certain cpu architectures, you can specify which ones.

    "cpu" : [ "x64", "ia32" ]
    

    Like the os option, you can also blacklist architectures:

    "cpu" : [ "!arm", "!mips" ]
    

    The host architecture is determined by process.arch.

    preferGlobal

    If your package is primarily a command-line application that should be installed globally, then set this value to true to provide a warning if it is installed locally.

    It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected.

    private

    If you set "private": true in your package.json, then npm will refuse to publish it.

    This is a way to prevent accidental publication of private repositories. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the publishConfig dictionary described below to override the registry config param at publish-time.

    publishConfig

    This is a set of config values that will be used at publish-time. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default.

    Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing.

    See npm-config(7) to see the list of config options that can be overridden.

    DEFAULT VALUES

    npm will default some values based on package contents.

    • "scripts": {"start": "node server.js"}

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "scripts":{"preinstall": "node-gyp rebuild"}

      If there is a binding.gyp file in the root of your package, npm will default the preinstall command to compile using node-gyp.

    • "contributors": [...]

      If there is an AUTHORS file in the root of your package, npm will treat each line as being in Name <email> (url) format, where email and url are optional. Lines which start with a # or are blank will be ignored.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-adduser.html000644 000766 000024 00000006130 12455173731 027723 0ustar00iojsstaff000000 000000

    npm-adduser

    Add a registry user account

    SYNOPSIS

    npm adduser [--registry=url] [--scope=@orgname] [--always-auth]
    

    DESCRIPTION

    Create or verify a user named <username> in the specified registry, and save the credentials to the .npmrc file. If no registry is specified, the default registry will be used (see npm-config(7)).

    The username, password, and email are read in from prompts.

    To reset your password, go to https://www.npmjs.com/forgot

    To change your email address, go to https://www.npmjs.com/email-edit

    You may use this command multiple times with the same user account to authorize on a new machine. When authenticating on a new machine, the username, password and email address must all match with your existing record.

    npm login is an alias to adduser and behaves exactly the same way.

    CONFIGURATION

    registry

    Default: http://registry.npmjs.org/

    The base URL of the npm package registry. If scope is also specified, this registry will only be used for packages with that scope. See npm-scope(7).

    scope

    Default: none

    If specified, the user and login credentials given will be associated with the specified scope. See npm-scope(7). You can use both at the same time, e.g.

    npm adduser --registry=http://myregistry.example.com --scope=@myco
    

    This will set a registry for the given scope and login or create a user for that registry at the same time.

    always-auth

    Default: false

    If specified, save configuration indicating that all requests to the given registry should include authorization information. Useful for private registries. Can be used with --registry and / or --scope, e.g.

    npm adduser --registry=http://private-registry.example.com --always-auth
    

    This will ensure that all requests to that registry (including for tarballs) include an authorization header. See always-auth in npm-config(7) for more details on always-auth. Registry-specific configuration of always-auth takes precedence over any global configuration.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-bin.html000644 000766 000024 00000001200 12455173731 027035 0ustar00iojsstaff000000 000000

    npm-bin

    Display npm bin folder

    SYNOPSIS

    npm bin
    

    DESCRIPTION

    Print the folder where npm will install executables.
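
    For example, one common use (shell syntax; my-local-tool is a hypothetical locally installed executable) is to put that folder on the PATH for a single invocation:

    PATH=$(npm bin):$PATH my-local-tool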

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-bugs.html000644 000766 000024 00000003127 12455173731 027237 0ustar00iojsstaff000000 000000

    npm-bugs

    Bugs for a package in a web browser maybe

    SYNOPSIS

    npm bugs <pkgname>
    npm bugs (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the --browser config param. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm bugs command to open websites.

    registry

    The base URL of the npm package registry.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-build.html000644 000766 000024 00000001400 12455173731 027366 0ustar00iojsstaff000000 000000

    npm-build

    Build a package

    SYNOPSIS

    npm build <package-folder>
    
    • <package-folder>: A folder containing a package.json file in its root.

    DESCRIPTION

    This is the plumbing command called by npm link and npm install.

    It should generally not be called directly.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-bundle.html000644 000766 000024 00000000767 12455173731 027557 0ustar00iojsstaff000000 000000

    npm-bundle

    REMOVED

    DESCRIPTION

    The npm bundle command has been removed in 1.0, for the simple reason that it is no longer necessary, as the default behavior is now to install packages into the local space.

    Just use npm install now to do what npm bundle used to do.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-cache.html000644 000766 000024 00000005121 12455173731 027336 0ustar00iojsstaff000000 000000

    npm-cache

    Manipulates packages cache

    SYNOPSIS

    npm cache add <tarball file>
    npm cache add <folder>
    npm cache add <tarball url>
    npm cache add <name>@<version>
    
    npm cache ls [<path>]
    
    npm cache clean [<path>]
    

    DESCRIPTION

    Used to add, list, or clear the npm cache folder.

    • add: Add the specified package to the local cache. This command is primarily intended to be used internally by npm, but it can provide a way to add data to the local installation cache explicitly.

    • ls: Show the data in the cache. Argument is a path to show in the cache folder. Works a bit like the find program, but limited by the depth config.

    • clean: Delete data out of the cache folder. If an argument is provided, then it specifies a subpath to delete. If no argument is provided, then the entire cache is cleared.

    DETAILS

    npm stores cache data in the directory specified in npm config get cache. For each package that is added to the cache, three pieces of information are stored in {cache}/{name}/{version}:

    • .../package/package.json: The package.json file, as npm sees it.
    • .../package.tgz: The tarball for that version.

    Additionally, whenever a registry request is made, a .cache.json file is placed at the corresponding URI, to store the ETag and the requested data. This is stored in {cache}/{hostname}/{path}/.cache.json.

    Commands that make non-essential registry requests (such as search and view, or the completion scripts) generally specify a minimum timeout. If the .cache.json file is younger than the specified timeout, then they do not make an HTTP request to the registry.
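
    For example (sax is just an example package name), to see where the cache lives and inspect what has been cached for a package:

    npm config get cache
    npm cache ls sax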

    CONFIGURATION

    cache

    Default: ~/.npm on Posix, or %AppData%/npm-cache on Windows.

    The root cache folder.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-completion.html000644 000766 000024 00000002042 12455173731 030443 0ustar00iojsstaff000000 000000

    npm-completion

    Tab Completion for npm

    SYNOPSIS

    . <(npm completion)
    

    DESCRIPTION

    Enables tab-completion in all npm commands.

    The synopsis above loads the completions into your current shell. Adding it to your ~/.bashrc or ~/.zshrc will make the completions available everywhere.

    You may of course also pipe the output of npm completion to a file such as /usr/local/etc/bash_completion.d/npm if you have a system that will read that file for you.
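
    For example, to make the completions permanent for bash (a sketch; adjust the target file for your shell):

    npm completion >> ~/.bashrc
    . ~/.bashrc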

    When COMP_CWORD, COMP_LINE, and COMP_POINT are defined in the environment, npm completion acts in "plumbing mode", and outputs completions based on the arguments.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-config.html000644 000766 000024 00000004002 12455173731 027535 0ustar00iojsstaff000000 000000

    npm-config

    Manage the npm configuration files

    SYNOPSIS

    npm config set <key> <value> [--global]
    npm config get <key>
    npm config delete <key>
    npm config list
    npm config edit
    npm c [set|get|delete|list]
    npm get <key>
    npm set <key> <value> [--global]
    

    DESCRIPTION

    npm gets its config settings from the command line, environment variables, npmrc files, and in some cases, the package.json file.

    See npmrc(5) for more information about the npmrc files.

    See npm-config(7) for a more thorough discussion of the mechanisms involved.

    The npm config command can be used to update and edit the contents of the user and global npmrc files.

    Sub-commands

    Config supports the following sub-commands:

    set

    npm config set key value
    

    Sets the config key to the value.

    If value is omitted, then it sets it to "true".

    get

    npm config get key
    

    Echo the config value to stdout.

    list

    npm config list
    

    Show all the config settings.

    delete

    npm config delete key
    

    Deletes the key from all configuration files.

    edit

    npm config edit
    

    Opens the config file in an editor. Use the --global flag to edit the global config.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-dedupe.html000644 000766 000024 00000003572 12455173731 027551 0ustar00iojsstaff000000 000000

    npm-dedupe

    Reduce duplication

    SYNOPSIS

    npm dedupe [package names...]
    npm ddp [package names...]
    

    DESCRIPTION

    Searches the local package tree and attempts to simplify the overall structure by moving dependencies further up the tree, where they can be more effectively shared by multiple dependent packages.

    For example, consider this dependency graph:

    a
    +-- b <-- depends on c@1.0.x
    |   `-- c@1.0.3
    `-- d <-- depends on c@~1.0.9
        `-- c@1.0.10
    

    In this case, npm-dedupe(1) will transform the tree to:

    a
    +-- b
    +-- d
    `-- c@1.0.10
    

    Because of the hierarchical nature of node's module lookup, b and d will both get their dependency met by the single c package at the root level of the tree.

    If a suitable version exists at the target location in the tree already, then it will be left untouched, but the other duplicates will be deleted.

    If no suitable version can be found, then a warning is printed, and nothing is done.

    If any arguments are supplied, then they are filters, and only the named packages will be touched.

    Note that this operation transforms the dependency tree, and may result in packages getting updated versions, perhaps from the npm registry.

    This feature is experimental, and may change in future versions.

    The --tag argument will apply to all of the affected dependencies. If a tag with the given name exists, the tagged version is preferred over newer versions.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-deprecate.html000644 000766 000024 00000002015 12455173731 030226 0ustar00iojsstaff000000 000000

    npm-deprecate

    Deprecate a version of a package

    SYNOPSIS

    npm deprecate <name>[@<version>] <message>
    

    DESCRIPTION

    This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it.

    It works on version ranges as well as specific versions, so you can do something like this:

    npm deprecate my-thing@"< 0.2.3" "critical bug fixed in v0.2.3"
    

    Note that you must be the package owner to deprecate something. See the owner and adduser help topics.

    To un-deprecate a package, specify an empty string ("") for the message argument.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-docs.html000644 000766 000024 00000003300 12455173731 027220 0ustar00iojsstaff000000 000000

    npm-docs

    Docs for a package in a web browser maybe

    SYNOPSIS

    npm docs [<pkgname> [<pkgname> ...]]
    npm docs (with no args in a package dir)
    npm home [<pkgname> [<pkgname> ...]]
    npm home (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the --browser config param. You can pass multiple package names at once. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm docs command to open websites.

    registry

    The base URL of the npm package registry.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-edit.html000644 000766 000024 00000002677 12455173731 027235 0ustar00iojsstaff000000 000000

    npm-edit

    Edit an installed package

    SYNOPSIS

    npm edit <name>[@<version>]
    

    DESCRIPTION

    Opens the package folder in the default editor (or whatever you've configured as the npm editor config -- see npm-config(7).)

    After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages.

    For instance, you can do npm install connect to install connect into your package, and then npm edit connect to make a few changes to your locally installed copy.

    CONFIGURATION

    editor

    • Default: EDITOR environment variable if set, or "vi" on Posix, or "notepad" on Windows.
    • Type: path

    The command to run for npm edit or npm config edit.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-explore.html000644 000766 000024 00000002530 12455173731 027752 0ustar00iojsstaff000000 000000

    npm-explore

    Browse an installed package

    SYNOPSIS

    npm explore <name> [ -- <cmd>]
    

    DESCRIPTION

    Spawn a subshell in the directory of the installed package specified.

    If a command is specified, then it is run in the subshell, which then immediately terminates.

    This is particularly handy in the case of git submodules in the node_modules folder:

    npm explore some-dependency -- git pull origin master
    

    Note that the package is not automatically rebuilt afterwards, so be sure to use npm rebuild <pkg> if you make any changes.

    CONFIGURATION

    shell

    • Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows
    • Type: path

    The shell to run for the npm explore command.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-help-search.html000644 000766 000024 00000002215 12455173731 030467 0ustar00iojsstaff000000 000000

    npm-help-search

    Search npm help documentation

    SYNOPSIS

    npm help-search some search terms
    

    DESCRIPTION

    This command will search the npm markdown documentation files for the terms provided, and then list the results, sorted by relevance.

    If only one result is found, then it will show that help topic.

    If the argument to npm help is not a known help topic, then it will call help-search. It is rarely if ever necessary to call this command directly.

    CONFIGURATION

    long

    • Type: Boolean
    • Default false

    If true, the "long" flag will cause help-search to output context around where the terms were found in the documentation.

    If false, then help-search will just list out the help topics found.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-help.html000644 000766 000024 00000003025 12455173731 027224 0ustar00iojsstaff000000 000000

    npm-help

    Get help on npm

    SYNOPSIS

    npm help <topic>
    npm help some search terms
    

    DESCRIPTION

    If supplied a topic, then show the appropriate documentation page.

    If the topic does not exist, or if multiple terms are provided, then run the help-search command to find a match. Note that, if help-search finds a single subject, then it will run help on that topic, so unique matches are equivalent to specifying a topic name.

    CONFIGURATION

    viewer

    • Default: "man" on Posix, "browser" on Windows
    • Type: path

    The program to use to view help content.

    Set to "browser" to view html help content in the default web browser.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-init.html000644 000766 000024 00000002212 12455173731 027234 0ustar00iojsstaff000000 000000

    npm-init

    Interactively create a package.json file

    SYNOPSIS

    npm init [-f|--force|-y|--yes]
    

    DESCRIPTION

    This will ask you a bunch of questions, and then write a package.json for you.

    It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected.

    If you already have a package.json file, it'll read that first, and default to the options in there.

    It is strictly additive, so it does not delete options from your package.json without a really good reason to do so.

    If you invoke it with -f, --force, -y, or --yes, it will use only defaults and not prompt you for any options.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-install.html000644 000766 000024 00000027650 12455173731 027754 0ustar00iojsstaff000000 000000

    npm-install

    Install a package

    SYNOPSIS

    npm install (with no args in a package dir)
    npm install <tarball file>
    npm install <tarball url>
    npm install <folder>
    npm install [@<scope>/]<name> [--save|--save-dev|--save-optional] [--save-exact]
    npm install [@<scope>/]<name>@<tag>
    npm install [@<scope>/]<name>@<version>
    npm install [@<scope>/]<name>@<version range>
    npm i (with any of the previous argument usage)
    

    DESCRIPTION

    This command installs a package, and any packages that it depends on. If the package has a shrinkwrap file, the installation of dependencies will be driven by that. See npm-shrinkwrap(1).

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry (see npm-registry(7)) with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a <git remote url> that resolves to (b)

    Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    • npm install (in package directory, no arguments):

      Install the dependencies in the local node_modules folder.

      In global mode (ie, with -g or --global appended to the command), it installs the current package context (ie, the current working directory) as a global package.

      By default, npm install will install all modules listed as dependencies. With the --production flag, npm will not install modules listed in devDependencies.

    • npm install <folder>:

      Install a package that is sitting in a folder on the filesystem.

    • npm install <tarball file>:

      Install a package that is sitting on the filesystem. Note: if you just want to link a dev directory into your npm root, you can do this more easily by using npm link.

      Example:

          npm install ./package.tgz
      
    • npm install <tarball url>:

      Fetch the tarball url, and then install it. In order to distinguish between this and other options, the argument must start with "http://" or "https://".

      Example:

          npm install https://github.com/indexzero/forever/tarball/v0.5.6
      
    • npm install [@<scope>/]<name> [--save|--save-dev|--save-optional]:

      Do a <name>@<tag> install, where <tag> is the "tag" config. (See npm-config(7).)

      In most cases, this will install the latest version of the module published on npm.

      Example:

          npm install sax
      

      npm install takes 3 exclusive, optional flags which save or update the package version in your main package.json:

      • --save: Package will appear in your dependencies.

      • --save-dev: Package will appear in your devDependencies.

      • --save-optional: Package will appear in your optionalDependencies.

        When using any of the above options to save dependencies to your package.json, there is an additional, optional flag:

      • --save-exact: Saved dependencies will be configured with an exact version rather than using npm's default semver range operator.

        <scope> is optional. The package will be downloaded from the registry associated with the specified scope. If no registry is associated with the given scope the default registry is assumed. See npm-scope(7).

        Note: if you do not include the @-symbol on your scope name, npm will interpret this as a GitHub repository instead, see below. Scope names must also be followed by a slash.

        Examples:

        npm install sax --save
        npm install githubname/reponame
        npm install @myorg/privatepackage
        npm install node-tap --save-dev
        npm install dtrace-provider --save-optional
        npm install readable-stream --save --save-exact
        
    Note: If there is a file or folder named <name> in the current working directory, then it will try to install that, and only try to fetch the package by name if it is not valid.
    
    • npm install [@<scope>/]<name>@<tag>:

      Install the version of the package that is referenced by the specified tag. If the tag does not exist in the registry data for that package, then this will fail.

      Example:

          npm install sax@latest
          npm install @myorg/mypackage@latest
      
    • npm install [@<scope>/]<name>@<version>:

      Install the specified version of the package. This will fail if the version has not been published to the registry.

      Example:

          npm install sax@0.1.1
          npm install @myorg/privatepackage@1.5.0
      
    • npm install [@<scope>/]<name>@<version range>:

      Install a version of the package matching the specified version range. This will follow the same rules for resolving dependencies described in package.json(5).

      Note that most version ranges must be put in quotes so that your shell will treat it as a single argument.

      Example:

          npm install sax@">=0.1.0 <0.2.0"
          npm install @myorg/privatepackage@">=0.1.0 <0.2.0"
      
    • npm install <githubname>/<githubrepo>:

      Install the package at https://github.com/githubname/githubrepo by attempting to clone it using git.

      Example:

          npm install mygithubuser/myproject
      

      To reference a package in a git repo that is not on GitHub, see git remote urls below.

    • npm install <git remote url>:

      Install a package by cloning a git remote url. The format of the git url is:

          <protocol>://[<user>@]<hostname><separator><path>[#<commit-ish>]
      

      <protocol> is one of git, git+ssh, git+http, or git+https. If no <commit-ish> is specified, then master is used.

      Examples:

          git+ssh://git@github.com:npm/npm.git#v1.0.27
          git+https://isaacs@github.com/npm/npm.git
          git://github.com/npm/npm.git#v1.0.27
      

    You may combine multiple arguments, and even multiple types of arguments. For example:

    npm install sax@">=0.1.0 <0.2.0" bench supervisor
    

    The --tag argument will apply to all of the specified install targets. If a tag with the given name exists, the tagged version is preferred over newer versions.
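    For example, to prefer a hypothetical "beta" tag for every install target:

    npm install --tag beta sax supervisor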

    The --force argument will force npm to fetch remote resources even if a local copy exists on disk.

    npm install sax --force
    

    The --global argument will cause npm to install the package globally rather than locally. See npm-folders(5).

    The --link argument will cause npm to link global installs into the local space in some cases.

    The --no-bin-links argument will prevent npm from creating symlinks for any binaries the package might contain.

    The --no-optional argument will prevent optional dependencies from being installed.

    The --no-shrinkwrap argument will ignore an available shrinkwrap file and use the package.json instead.

    The --nodedir=/path/to/node/source argument will allow npm to find the node source code so that npm can compile native modules.
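    A few usage sketches of the arguments above (reusing the sax package from earlier examples; the nodedir path is a placeholder):

    npm install sax --global
    npm install sax --no-bin-links --no-optional
    npm install sax --no-shrinkwrap --nodedir=/path/to/node/source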

    See npm-config(7). Many of the configuration params have some effect on installation, since that's most of what npm does.

    ALGORITHM

    To install a package, npm uses the following algorithm:

    install(where, what, family, ancestors)
    fetch what, unpack to <where>/node_modules/<what>
    for each dep in what.dependencies
      resolve dep to precise version
    for each dep@version in what.dependencies
        not in <where>/node_modules/<what>/node_modules/*
        and not in <family>
      add precise version deps to <family>
      install(<where>/node_modules/<what>, dep, family)
    

    For this package{dep} structure: A{B,C}, B{C}, C{D}, this algorithm produces:

    A
    +-- B
    `-- C
        `-- D
    

    That is, the dependency from B to C is satisfied by the fact that A already caused C to be installed at a higher level.
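    The same idea as a small, runnable JavaScript sketch. This is a toy model of the algorithm above, not npm's actual implementation: the "registry" below is a hypothetical in-memory map, versions are not really resolved against semver ranges, and nothing is written to disk.

    // Toy model of the install algorithm (illustrative only).
    function install(where, what, family, registry) {
      var pkg = registry[what];                       // "fetch" <what>
      where[what] = { version: pkg.version, node_modules: {} };
      // keep only deps not already satisfied by an ancestor (<family>),
      // then add them all to <family> before recursing so siblings see them
      var deps = Object.keys(pkg.dependencies).filter(function (d) {
        return !family[d];
      });
      var nextFamily = Object.create(family);
      deps.forEach(function (d) { nextFamily[d] = true; });
      deps.forEach(function (d) {
        install(where[what].node_modules, d, nextFamily, registry);
      });
      return where;
    }

    var registry = {
      A: { version: '0.1.0', dependencies: { B: '*', C: '*' } },
      B: { version: '0.0.1', dependencies: { C: '*' } },
      C: { version: '0.0.1', dependencies: { D: '*' } },
      D: { version: '0.0.1', dependencies: {} }
    };

    // Prints A with B and C nested one level down and D under C -- B's
    // dependency on C is satisfied by the copy installed at the higher level.
    console.log(JSON.stringify(install({}, 'A', { A: true }, registry), null, 2));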

    See npm-folders(5) for a more detailed description of the specific folder structures that npm creates.

    Limitations of npm's Install Algorithm

    There are some very rare and pathological edge-cases where a cycle can cause npm to try to install a never-ending tree of packages. Here is the simplest case:

    A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ...
    

    where A is some version of a package, and A' is a different version of the same package. Because B depends on a different version of A than the one that is already in the tree, it must install a separate copy. The same is true of A', which must install B'. Because B' depends on the original version of A, which has been overridden, the cycle falls into infinite regress.

    To avoid this situation, npm flat-out refuses to install any name@version that is already present anywhere in the tree of package folder ancestors. A more correct, but more complex, solution would be to symlink the existing version into the new location. If this ever affects a real use-case, it will be investigated.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-link.html000644 000766 000024 00000005766 12455173731 027247 0ustar00iojsstaff000000 000000

    npm-link

    Symlink a package folder

    SYNOPSIS

    npm link (in package folder)
    npm link [@<scope>/]<pkgname>
    npm ln (with any of the previous argument usage)
    

    DESCRIPTION

    Package linking is a two-step process.

    First, npm link in a package folder will create a globally-installed symbolic link from prefix/package-name to the current folder (see npm-config(7) for the value of prefix).

    Next, in some other location, npm link package-name will create a symlink from the local node_modules folder to the global symlink.

    Note that package-name is taken from package.json, not from directory name.

    The package name can be optionally prefixed with a scope. See npm-scope(7). The scope must be preceded by an @-symbol and followed by a slash.

    When creating tarballs for npm publish, the linked packages are "snapshotted" to their current state by resolving the symbolic links.

    This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.

    For example:

    cd ~/projects/node-redis    # go into the package directory
    npm link                    # creates global link
    cd ~/projects/node-bloggy   # go into some other package directory.
    npm link redis              # link-install the package
    

    Now, any changes to ~/projects/node-redis will be reflected in ~/projects/node-bloggy/node_modules/redis/

    You may also shortcut the two steps in one. For example, to do the above use-case in a shorter way:

    cd ~/projects/node-bloggy  # go into the dir of your main project
    npm link ../node-redis     # link the dir of your dependency
    

    The second line is the equivalent of doing:

    (cd ../node-redis; npm link)
    npm link redis
    

    That is, it first creates a global link, and then links the global installation target into your project's node_modules folder.

    If your linked package is scoped (see npm-scope(7)) your link command must include that scope, e.g.

    npm link @myorg/privatepackage
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-ls.html000644 000766 000024 00000004756 12455173731 026726 0ustar00iojsstaff000000 000000

    npm-ls

    List installed packages

    SYNOPSIS

    npm list [[@<scope>/]<pkg> ...]
    npm ls [[@<scope>/]<pkg> ...]
    npm la [[@<scope>/]<pkg> ...]
    npm ll [[@<scope>/]<pkg> ...]
    

    DESCRIPTION

    This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure.

    Positional arguments are name@version-range identifiers, which will limit the results to only the paths to the packages named. Note that nested packages will also show the paths to the specified packages. For example, running npm ls promzard in npm's source tree will show:

    npm@2.1.18 /path/to/npm
    └─┬ init-package-json@0.0.4
      └── promzard@0.1.5
    

    It will print out extraneous, missing, and invalid packages.

    If a project specifies git urls for dependencies, these are shown in parentheses after the name@version to make it easier for users to recognize potential forks of a project.

    When run as ll or la, it shows extended information by default.

    CONFIGURATION

    json

    • Default: false
    • Type: Boolean

    Show information in JSON format.

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    List packages in the global install prefix instead of in the current project.

    depth

    • Type: Int

    Max display depth of the dependency tree.
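    For example, combining these configuration flags on the command line:

    npm ls --depth 0
    npm ls --global --json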

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-outdated.html000644 000766 000024 00000002704 12455173731 030110 0ustar00iojsstaff000000 000000

    npm-outdated

    Check for outdated packages

    SYNOPSIS

    npm outdated [<name> [<name> ...]]
    

    DESCRIPTION

    This command will check the registry to see if any (or, specific) installed packages are currently outdated.

    The resulting field 'wanted' shows the latest version that satisfies the version range specified in package.json; the field 'latest' shows the very latest version of the package.

    CONFIGURATION

    json

    • Default: false
    • Type: Boolean

    Show information in JSON format.

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    Check packages in the global install prefix instead of in the current project.

    depth

    • Type: Int

    Max depth for checking dependency tree.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-owner.html000644 000766 000024 00000002350 12455173731 027426 0ustar00iojsstaff000000 000000

    npm-owner

    Manage package owners

    SYNOPSIS

    npm owner ls <package name>
    npm owner add <user> <package name>
    npm owner rm <user> <package name>
    

    DESCRIPTION

    Manage ownership of published packages.

    • ls: List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help.
    • add: Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners.
    • rm: Remove a user from the package owner list. This immediately revokes their privileges.

    Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.
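    For example (foo is a hypothetical package name; alice and bob are hypothetical usernames):

    npm owner ls foo
    npm owner add alice foo
    npm owner rm bob foo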

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-pack.html000644 000766 000024 00000002060 12455173731 027210 0ustar00iojsstaff000000 000000

    npm-pack

    Create a tarball from a package

    SYNOPSIS

    npm pack [<pkg> [<pkg> ...]]
    

    DESCRIPTION

    For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as <name>-<version>.tgz, and then write the filenames out to stdout.

    If the same package is specified multiple times, then the file will be overwritten the second time.

    If no arguments are supplied, then npm packs the current package folder.
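    For example, to pack the current package folder, or a named package from the registry (sax, as in earlier examples):

    npm pack
    npm pack sax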

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-prefix.html000644 000766 000024 00000001614 12455173731 027573 0ustar00iojsstaff000000 000000

    npm-prefix

    Display prefix

    SYNOPSIS

    npm prefix [-g]
    

    DESCRIPTION

    Print the local prefix to standard out. This is the closest parent directory to contain a package.json file unless -g is also specified.

    If -g is specified, this will be the value of the global prefix. See npm-config(7) for more detail.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-prune.html000644 000766 000024 00000001606 12455173731 027430 0ustar00iojsstaff000000 000000

    npm-prune

    Remove extraneous packages

    SYNOPSIS

    npm prune [<name> [<name> ...]]
    npm prune [<name> [<name> ...]] [--production]
    

    DESCRIPTION

    This command removes "extraneous" packages. If a package name is provided, then only packages matching one of the supplied names are removed.

    Extraneous packages are packages that are not listed on the parent package's dependencies list.

    If the --production flag is specified, this command will remove the packages specified in your devDependencies.
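    For example, to remove extraneous packages and, in addition, everything listed in devDependencies:

    npm prune --production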

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-publish.html000644 000766 000024 00000003774 12455173731 027755 0ustar00iojsstaff000000 000000

    npm-publish

    Publish a package

    SYNOPSIS

    npm publish <tarball> [--tag <tag>]
    npm publish <folder> [--tag <tag>]
    

    DESCRIPTION

    Publishes a package to the registry so that it can be installed by name. See npm-developers(7) for details on what's included in the published package, as well as details on how the package is built.

    By default npm will publish to the public registry. This can be overridden by specifying a different default registry or using a npm-scope(7) in the name (see package.json(5)).

    • <folder>: A folder containing a package.json file

    • <tarball>: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside.

    • [--tag <tag>] Registers the published package with the given tag, such that npm install <name>@<tag> will install this version. By default, npm publish updates and npm install installs the latest tag.

    Fails if the package name and version combination already exists in the specified registry.

    Once a package is published with a given name and version, that specific name and version combination can never be used again, even if it is removed with npm-unpublish(1).
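    Usage sketches (the folder path and the "beta" tag are placeholders):

    npm publish ./my-package
    npm publish ./my-package --tag beta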

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-rebuild.html000644 000766 000024 00000001273 12455173731 027725 0ustar00iojsstaff000000 000000

    npm-rebuild

    Rebuild a package

    SYNOPSIS

    npm rebuild [<name> [<name> ...]]
    npm rb [<name> [<name> ...]]
    
    • <name>: The package to rebuild

    DESCRIPTION

    This command runs the npm build command on the matched folders. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-repo.html000644 000766 000024 00000002047 12455173731 027244 0ustar00iojsstaff000000 000000

    npm-repo

    Open package repository page in the browser

    SYNOPSIS

    npm repo <pkgname>
    npm repo (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the --browser config param. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm repo command to open websites.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-restart.html000644 000766 000024 00000002453 12455173731 027764 0ustar00iojsstaff000000 000000

    npm-restart

    Restart a package

    SYNOPSIS

    npm restart [-- <args>]
    

    DESCRIPTION

    This restarts a package.

    This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below:

    1. prerestart
    2. prestop
    3. stop
    4. poststop
    5. restart
    6. prestart
    7. start
    8. poststart
    9. postrestart

    NOTE

    Note that the "restart" script is run in addition to the "stop" and "start" scripts, not instead of them.

    This is the behavior as of npm major version 2. A change in this behavior will be accompanied by an increase in major version number.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-rm.html000644 000766 000024 00000001354 12455173731 026715 0ustar00iojsstaff000000 000000

    npm-rm

    Remove a package

    SYNOPSIS

    npm rm <name>
    npm r <name>
    npm uninstall <name>
    npm un <name>
    

    DESCRIPTION

    This uninstalls a package, completely removing everything npm installed on its behalf.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-root.html000644 000766 000024 00000001214 12455173731 027255 0ustar00iojsstaff000000 000000

    npm-root

    Display npm root

    SYNOPSIS

    npm root
    

    DESCRIPTION

    Print the effective node_modules folder to standard out.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-run-script.html000644 000766 000024 00000003126 12455173731 030404 0ustar00iojsstaff000000 000000

    npm-run-script

    Run arbitrary package scripts

    SYNOPSIS

    npm run-script [command] [-- <args>]
    npm run [command] [-- <args>]
    

    DESCRIPTION

    This runs an arbitrary command from a package's "scripts" object. If no package name is provided, it will search for a package.json in the current folder and use its "scripts" object. If no "command" is provided, it will list the available top level scripts.

    It is used by the test, start, restart, and stop commands, but can be called directly, as well.
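    For example, given a hypothetical "scripts" object like this in package.json:

    {
      "scripts": {
        "test": "node test/all.js",
        "lint": "jshint lib/*.js"
      }
    }

    then npm run lint runs the lint command, and npm run test (or simply npm test) runs the test command.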

    As of npm@2.0.0, you can use custom arguments when executing scripts. The special option -- is used by getopt to delimit the end of the options. npm will pass all the arguments after the -- directly to your script:

    npm run test -- --grep="pattern"
    

    The arguments will only be passed to the script specified after npm run and not to any pre or post script.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-search.html000644 000766 000024 00000002367 12455173731 027551 0ustar00iojsstaff000000 000000

    npm-search

    Search for packages

    SYNOPSIS

    npm search [--long] [search terms ...]
    npm s [search terms ...]
    npm se [search terms ...]
    

    DESCRIPTION

    Search the registry for packages matching the search terms.

    If a term starts with /, then it's interpreted as a regular expression. A trailing / will be ignored in this case. (Note that many regular expression characters must be escaped or quoted in most shells.)
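    For example, to search with a regular expression term (quoted so the shell does not expand it):

    npm search '/^connect/'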

    CONFIGURATION

    long

    • Default: false
    • Type: Boolean

    Display full package descriptions and other long text across multiple lines. When disabled (default) search results are truncated to fit neatly on a single line. Modules with extremely long names will fall on multiple lines.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-shrinkwrap.html000644 000766 000024 00000015643 12455173731 030475 0ustar00iojsstaff000000 000000

    npm-shrinkwrap

    Lock down dependency versions

    SYNOPSIS

    npm shrinkwrap
    

    DESCRIPTION

    This command locks down the versions of a package's dependencies so that you can control exactly which versions of each dependency will be used when your package is installed. The "package.json" file is still required if you want to use "npm install".

    By default, "npm install" recursively installs the target's dependencies (as specified in package.json), choosing the latest available version that satisfies the dependency's semver pattern. In some situations, particularly when shipping software where each change is tightly managed, it's desirable to fully specify each version of each dependency recursively so that subsequent builds and deploys do not inadvertently pick up newer versions of a dependency that satisfy the semver pattern. Specifying specific semver patterns in each dependency's package.json would facilitate this, but that's not always possible or desirable, as when another author owns the npm package. It's also possible to check dependencies directly into source control, but that may be undesirable for other reasons.

    As an example, consider package A:

    {
      "name": "A",
      "version": "0.1.0",
      "dependencies": {
        "B": "<0.1.0"
      }
    }
    

    package B:

    {
      "name": "B",
      "version": "0.0.1",
      "dependencies": {
        "C": "<0.1.0"
      }
    }
    

    and package C:

    {
      "name": "C",
      "version": "0.0.1"
    }
    

    If these are the only versions of A, B, and C available in the registry, then a normal "npm install A" will install:

    A@0.1.0
    `-- B@0.0.1
        `-- C@0.0.1
    

    However, if B@0.0.2 is published, then a fresh "npm install A" will install:

    A@0.1.0
    `-- B@0.0.2
        `-- C@0.0.1
    

    assuming the new version did not modify B's dependencies. Of course, the new version of B could include a new version of C and any number of new dependencies. If such changes are undesirable, the author of A could specify a dependency on B@0.0.1. However, if A's author and B's author are not the same person, there's no way for A's author to say that he or she does not want to pull in newly published versions of C when B hasn't changed at all.

    In this case, A's author can run

    npm shrinkwrap
    

    This generates npm-shrinkwrap.json, which will look something like this:

    {
      "name": "A",
      "version": "0.1.0",
      "dependencies": {
        "B": {
          "version": "0.0.1",
          "dependencies": {
            "C": {
              "version": "0.0.1"
            }
          }
        }
      }
    }
    

    The shrinkwrap command has locked down the dependencies based on what's currently installed in node_modules. When "npm install" installs a package with an npm-shrinkwrap.json file in the package root, the shrinkwrap file (rather than package.json files) completely drives the installation of that package and all of its dependencies (recursively). So now the author publishes A@0.1.0, and subsequent installs of this package will use B@0.0.1 and C@0.0.1, regardless of the dependencies and versions listed in A's, B's, and C's package.json files.

    Using shrinkwrapped packages

    Using a shrinkwrapped package is no different than using any other package: you can "npm install" it by hand, or add a dependency to your package.json file and "npm install" it.

    Building shrinkwrapped packages

    To shrinkwrap an existing package:

    1. Run "npm install" in the package root to install the current versions of all dependencies.
    2. Validate that the package works as expected with these versions.
    3. Run "npm shrinkwrap", add npm-shrinkwrap.json to git, and publish your package.

    To add or update a dependency in a shrinkwrapped package:

    1. Run "npm install" in the package root to install the current versions of all dependencies.
    2. Add or update dependencies. "npm install" each new or updated package individually and then update package.json. Note that they must be explicitly named in order to be installed: running npm install with no arguments will merely reproduce the existing shrinkwrap.
    3. Validate that the package works as expected with the new dependencies.
    4. Run "npm shrinkwrap", commit the new npm-shrinkwrap.json, and publish your package.

    You can use npm-outdated(1) to view dependencies with newer versions available.

    Other Notes

    A shrinkwrap file must be consistent with the package's package.json file. "npm shrinkwrap" will fail if required dependencies are not already installed, since that would result in a shrinkwrap that wouldn't actually work. Similarly, the command will fail if there are extraneous packages (not referenced by package.json), since that would indicate that package.json is not correct.

    Since "npm shrinkwrap" is intended to lock down your dependencies for production use, devDependencies will not be included unless you explicitly set the --dev flag when you run npm shrinkwrap. If installed devDependencies are excluded, then npm will print a warning. If you want them to be installed with your module by default, please consider adding them to dependencies instead.

    If shrinkwrapped package A depends on shrinkwrapped package B, B's shrinkwrap will not be used as part of the installation of A. However, because A's shrinkwrap is constructed from a valid installation of B and recursively specifies all dependencies, the contents of B's shrinkwrap will implicitly be included in A's shrinkwrap.

    Caveats

    If you wish to lock down the specific bytes included in a package, for example to have 100% confidence in being able to reproduce a deployment or build, then you ought to check your dependencies into source control, or pursue some other mechanism that can verify contents rather than versions.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-star.html000644 000766 000024 00000001406 12455173731 027246 0ustar00iojsstaff000000 000000

    npm-star

    Mark your favorite packages

    SYNOPSIS

    npm star <pkgname> [<pkg>, ...]
    npm unstar <pkgname> [<pkg>, ...]
    

    DESCRIPTION

    "Starring" a package means that you have some interest in it. It's a vaguely positive way to show that you care.

    "Unstarring" is the same thing, but in reverse.

    It's a boolean thing. Starring repeatedly has no additional effect.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-stars.html000644 000766 000024 00000001351 12455173731 027430 0ustar00iojsstaff000000 000000

    npm-stars

    View packages marked as favorites

    SYNOPSIS

    npm stars
    npm stars [username]
    

    DESCRIPTION

    If you have starred a lot of neat things and want to find them again quickly this command lets you do just that.

    You may also want to see your friend's favorite packages; in this case you will most certainly enjoy this command.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-start.html000644 000766 000024 00000001165 12455173731 027434 0ustar00iojsstaff000000 000000

    npm-start

    Start a package

    SYNOPSIS

    npm start [-- <args>]
    

    DESCRIPTION

    This runs a package's "start" script, if one was provided.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-stop.html000644 000766 000024 00000001162 12455173731 027261 0ustar00iojsstaff000000 000000

    npm-stop

    Stop a package

    SYNOPSIS

    npm stop [-- <args>]
    

    DESCRIPTION

    This runs a package's "stop" script, if one was provided.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-submodule.html000644 000766 000024 00000002171 12455173731 030274 0ustar00iojsstaff000000 000000

    npm-submodule

    Add a package as a git submodule

    SYNOPSIS

    npm submodule <pkg>
    

    DESCRIPTION

    If the specified package has a git repository url in its package.json description, then this command will add it as a git submodule at node_modules/<pkg name>.

    This is a convenience only. From then on, it's up to you to manage updates by using the appropriate git commands. npm will stubbornly refuse to update, modify, or remove anything with a .git subfolder in it.

    This command also does not install missing dependencies, if the package does not include them in its git repository. If npm ls reports that things are missing, you can either install, link, or submodule them yourself, or you can do npm explore <pkgname> -- npm install to install the dependencies into the submodule folder.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-tag.html000644 000766 000024 00000002402 12455173731 027045 0ustar00iojsstaff000000 000000

    npm-tag

    Tag a published version

    SYNOPSIS

    npm tag <name>@<version> [<tag>]
    

    DESCRIPTION

    Tags the specified version of the package with the specified tag, or the --tag config if not specified.
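    For example, to tag a hypothetical, already-published version as "beta":

    npm tag foo@1.2.3 beta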

    A tag can be used when installing packages as a reference to a version instead of using a specific version number:

    npm install <name>@<tag>
    

    When installing dependencies, a preferred tagged version may be specified:

    npm install --tag <tag>
    

    This also applies to npm dedupe.

    Publishing a package always sets the "latest" tag to the published version.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-test.html000644 000766 000024 00000001356 12455173731 027260 0ustar00iojsstaff000000 000000

    npm-test

    Test a package

    SYNOPSIS

      npm test [-- <args>]
      npm tst [-- <args>]
    

    DESCRIPTION

    This runs a package's "test" script, if one was provided.

    To run tests as a condition of installation, set the npat config to true.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-uninstall.html000644 000766 000024 00000003400 12455173731 030302 0ustar00iojsstaff000000 000000

    npm-rm

    Remove a package

    SYNOPSIS

    npm uninstall [@<scope>/]<package> [--save|--save-dev|--save-optional]
    npm rm (with any of the previous argument usage)
    

    DESCRIPTION

    This uninstalls a package, completely removing everything npm installed on its behalf.

    Example:

    npm uninstall sax
    

    In global mode (ie, with -g or --global appended to the command), it uninstalls the current package context as a global package.

    npm uninstall takes 3 exclusive, optional flags which save or update the package version in your main package.json:

    • --save: Package will be removed from your dependencies.

    • --save-dev: Package will be removed from your devDependencies.

    • --save-optional: Package will be removed from your optionalDependencies.

    Scope is optional and follows the usual rules for npm-scope(7).

    Examples:

    npm uninstall sax --save
    npm uninstall @myorg/privatepackage --save
    npm uninstall node-tap --save-dev
    npm uninstall dtrace-provider --save-optional
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-unpublish.html000644 000766 000024 00000002715 12455173731 030312 0ustar00iojsstaff000000 000000

    npm-unpublish

    Remove a package from the registry

    SYNOPSIS

    npm unpublish [@<scope>/]<name>[@<version>]
    

    WARNING

    It is generally considered bad behavior to remove versions of a library that others are depending on!

    Consider using the deprecate command instead, if your intent is to encourage users to upgrade.

    There is plenty of room on the registry.

    DESCRIPTION

    This removes a package version from the registry, deleting its entry and removing the tarball.

    If no version is specified, or if all versions are removed then the root package entry is removed from the registry entirely.

    Even if a package version is unpublished, that specific name and version combination can never be reused. In order to publish the package again, a new version number must be used.

    The scope is optional and follows the usual rules for npm-scope(7).

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-update.html000644 000766 000024 00000001701 12455173731 027555 0ustar00iojsstaff000000 000000

    npm-update

    Update a package

    SYNOPSIS

    npm update [-g] [<name> [<name> ...]]
    

    DESCRIPTION

    This command will update all the packages listed to the latest version (specified by the tag config).

    It will also install missing packages.

    If the -g flag is specified, this command will update globally installed packages.

    If no package name is specified, all packages in the specified location (global or local) will be updated.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-version.html000644 000766 000024 00000003756 12455173731 027774 0ustar00iojsstaff000000 000000

    npm-version

    Bump a package version

    SYNOPSIS

    npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease]
    

    DESCRIPTION

    Run this in a package directory to bump the version and write the new data back to package.json and, if present, npm-shrinkwrap.json.

    The newversion argument should be a valid semver string, or a valid second argument to semver.inc (one of "patch", "minor", "major", "prepatch", "preminor", "premajor", "prerelease"). In the second case, the existing version will be incremented by 1 in the specified field.

    If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean.

    If supplied with the --message (shorthand: -m) config option, npm will use it as a commit message when creating a version commit. If the message config contains %s then that will be replaced with the resulting version number. For example:

    npm version patch -m "Upgrade to %s for reasons"
    

    If the sign-git-tag config is set, then the tag will be signed using the -s flag to git. Note that you must have a default GPG key set up in your git config for this to work properly. For example:

    $ npm config set sign-git-tag true
    $ npm version patch
    
    You need a passphrase to unlock the secret key for
    user: "isaacs (http://blog.izs.me/) <i@izs.me>"
    2048-bit RSA key, ID 6C481CF6, created 2010-08-31
    
    Enter passphrase:
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-view.html000644 000766 000024 00000007263 12455173731 027256 0ustar00iojsstaff000000 000000

    npm-view

    View registry info

    SYNOPSIS

    npm view [@<scope>/]<name>[@<version>] [<field>[.<subfield>]...]
    npm v [@<scope>/]<name>[@<version>] [<field>[.<subfield>]...]
    

    DESCRIPTION

    This command shows data about a package and prints it to the stream referenced by the outfd config, which defaults to stdout.

    To show the package registry entry for the connect package, you can do this:

    npm view connect
    

    The default version is "latest" if unspecified.

    Field names can be specified after the package descriptor. For example, to show the dependencies of the ronn package at version 0.3.5, you could do the following:

    npm view ronn@0.3.5 dependencies
    

    You can view child fields by separating them with a period. To view the git repository URL for the latest version of npm, you could do this:

    npm view npm repository.url
    

    This makes it easy to view information about a dependency with a bit of shell scripting. For example, to view all the data about the version of opts that ronn depends on, you can do this:

    npm view opts@$(npm view ronn dependencies.opts)
    

    For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor email addresses for the "express" project, you can do this:

    npm view express contributors.email
    

    You may also use numeric indices in square braces to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this:

    npm view express contributors[0].email
    

    Multiple fields may be specified, and will be printed one after another. For example, to get all the contributor names and email addresses, you can do this:

    npm view express contributors.name contributors.email
    

    "Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See package.json(5) for more on this.)

    npm view npm contributors
    

    If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3:

    npm view yui3@'>0.5.4' dependencies.jsdom
    

    OUTPUT

    If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command. If the field is an object, it will be output as a JavaScript object literal.

    If the --json flag is given, the outputted fields will be JSON.

    If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to.

    If multiple fields are requested, then each of them is prefixed with the field name.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm-whoami.html000644 000766 000024 00000001027 12455173731 027560 0ustar00iojsstaff000000 000000

    npm-whoami

    Display npm username

    SYNOPSIS

    npm whoami
    

    DESCRIPTION

    Print the username config to standard output.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/cli/npm.html000644 000766 000024 00000017135 12455173731 026305 0ustar00iojsstaff000000 000000

    npm

    node package manager

    SYNOPSIS

    npm <command> [args]
    

    VERSION

    2.1.18

    DESCRIPTION

    npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently.

    It is extremely configurable to support a wide variety of use cases. Most commonly, it is used to publish, discover, install, and develop node programs.

    Run npm help to get a list of available commands.

    INTRODUCTION

    You probably got npm because you want to install stuff.

    Use npm install blerg to install the latest version of "blerg". Check out npm-install(1) for more info. It can do a lot of stuff.

    Use the npm search command to show everything that's available. Use npm ls to show everything you've installed.

    DEPENDENCIES

    If a package references another package with a git URL, npm depends on a preinstalled git.

    If one of the packages npm tries to install is a native node module and requires compiling C++ code, npm will use node-gyp for that task. For a Unix system, node-gyp needs Python, make and a buildchain like GCC. On Windows, Python and Microsoft Visual Studio C++ are needed. Python 3 is not supported by node-gyp. For more information visit the node-gyp repository and the node-gyp Wiki.

    DIRECTORIES

    See npm-folders(5) to learn about where npm puts stuff.

    In particular, npm has two modes of operation:

    • global mode:
      npm installs packages into the install prefix at prefix/lib/node_modules and bins are installed in prefix/bin.
    • local mode:
      npm installs packages into the current project directory, which defaults to the current working directory. Packages are installed to ./node_modules, and bins are installed to ./node_modules/.bin.

    Local mode is the default. Use --global or -g on any command to operate in global mode instead.

    DEVELOPER USAGE

    If you're using npm to develop and publish your code, check out the following help topics:

    • json: Make a package.json file. See package.json(5).
    • link: For linking your current working code into Node's path, so that you don't have to reinstall every time you make a change. Use npm link to do this.
    • install: It's a good idea to install things if you don't need the symbolic link. In particular, installing other people's code from the registry is done via npm install.
    • adduser: Create an account or log in. Credentials are stored in the user config file.
    • publish: Use the npm publish command to upload your code to the registry.

    CONFIGURATION

    npm is extremely configurable. It reads its configuration options from 5 places.

    • Command line switches:
      Set a config with --key val. All keys take a value, even if they are booleans (the config parser doesn't know what the options are at the time of parsing.) If no value is provided, then the option is set to boolean true.
    • Environment Variables:
      Set any config by prefixing the name in an environment variable with npm_config_. For example, export npm_config_key=val.
    • User Configs:
      The file at $HOME/.npmrc is an ini-formatted list of configs. If present, it is parsed. If the userconfig option is set in the cli or env, then that will be used instead.
    • Global Configs:
      The file found at ../etc/npmrc (from the node executable, by default this resolves to /usr/local/etc/npmrc) will be parsed if it is found. If the globalconfig option is set in the cli, env, or user config, then that file is parsed instead.
    • Defaults:
      npm's default configuration options are defined in lib/utils/config-defs.js. These must not be changed.
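    For example, the same registry config (https://registry.example.com/ is a placeholder) can be supplied through the first three mechanisms like this:

    npm install sax --registry https://registry.example.com/
    export npm_config_registry=https://registry.example.com/
    echo 'registry=https://registry.example.com/' >> $HOME/.npmrc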

    See npm-config(7) for much much more information.

    CONTRIBUTIONS

    Patches welcome!

    • code: Read through npm-coding-style(7) if you plan to submit code. You don't have to agree with it, but you do have to follow it.
    • docs: If you find an error in the documentation, edit the appropriate markdown file in the "doc" folder. (Don't worry about generating the man page.)

    Contributors are listed in npm's package.json file. You can view them easily by doing npm view npm contributors.

    If you would like to contribute, but don't know what to work on, check the issues list or ask on the mailing list.

    BUGS

    When you find issues, please report them:

    Be sure to include all of the output from the npm command that didn't work as expected. The npm-debug.log file is also helpful to provide.

    You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email.

    AUTHOR

    Isaac Z. Schlueter :: isaacs :: @izs :: i@izs.me

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-bin.html000644 000766 000024 00000000574 12455173731 027054 0ustar00iojsstaff000000 000000

    npm-bin

    Display npm bin folder

    SYNOPSIS

    npm.commands.bin(args, cb)
    

    DESCRIPTION

    Print the folder where npm will install executables.

    This function should not be used programmatically. Instead, just refer to the npm.bin property.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-bugs.html000644 000766 000024 00000001256 12455173731 027242 0ustar00iojsstaff000000 000000

    npm-bugs

    Bugs for a package in a web browser maybe

    SYNOPSIS

    npm.commands.bugs(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-cache.html000644 000766 000024 00000002132 12455173731 027337 0ustar00iojsstaff000000 000000

    npm-cache

    manage the npm cache programmatically

    SYNOPSIS

    npm.commands.cache([args], callback)
    
    // helpers
    npm.commands.cache.clean([args], callback)
    npm.commands.cache.add([args], callback)
    npm.commands.cache.read(name, version, forceBypass, callback)
    

    DESCRIPTION

    This acts much the same ways as the npm-cache(1) command line functionality.

    The callback is called with the package.json data of the thing that is eventually added to or read from the cache.

    The top level npm.commands.cache(...) functionality is a public interface, and like all commands on the npm.commands object, it will match the command line behavior exactly.

    However, the cache folder structure and the cache helper functions are considered internal API surface, and as such, may change in future releases of npm, potentially without warning or significant version incrementation.

    Use at your own risk.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-commands.html000644 000766 000024 00000001521 12455173731 030076 0ustar00iojsstaff000000 000000

    npm-commands

    npm commands

    SYNOPSIS

    npm.commands[<command>](args, callback)
    

    DESCRIPTION

    npm comes with a full set of commands, and each of the commands takes a similar set of arguments.

    In general, all commands on the command object take an array of positional argument strings. The last argument to any function is a callback. Some commands are special and take other optional arguments.

    All commands have their own man page. See man npm-<command> for command-line usage, or man 3 npm-<command> for programmatic usage.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-config.html000644 000766 000024 00000003024 12455173731 027542 0ustar00iojsstaff000000 000000

    npm-config

    Manage the npm configuration files

    SYNOPSIS

    npm.commands.config(args, callback)
    var val = npm.config.get(key)
    npm.config.set(key, val)
    

    DESCRIPTION

    This function acts much the same way as the command-line version. The first element in the array tells config what to do. Possible values are:

    • set

      Sets a config parameter. The second element in args is interpreted as the key, and the third element is interpreted as the value.

    • get

      Gets the value of a config parameter. The second element in args is the key to get the value of.

    • delete (rm or del)

      Deletes a parameter from the config. The second element in args is the key to delete.

    • list (ls)

      Show all configs that aren't secret. No parameters necessary.

    • edit:

      Opens the config file in the default editor. This command isn't very useful programmatically, but it is made available.

    To programmatically access npm configuration settings, or set them for the duration of a program, use the npm.config.set and npm.config.get functions instead.
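    For example (this sketch assumes npm.load() has already completed, as required before any other calls):

    var registry = npm.config.get('registry');
    npm.config.set('loglevel', 'silent');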

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-deprecate.html000644 000766 000024 00000002346 12455173731 030237 0ustar00iojsstaff000000 000000

    npm-deprecate

    Deprecate a version of a package

    SYNOPSIS

    npm.commands.deprecate(args, callback)
    

    DESCRIPTION

    This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it.

    The 'args' parameter must have exactly two elements:

    • package[@version]

      The version portion is optional, and may be either a range, or a specific version, or a tag.

    • message

      The warning message that will be printed whenever a user attempts to install the package.

    Note that you must be the package owner to deprecate something. See the owner and adduser help topics.

    To un-deprecate a package, specify an empty string ("") for the message argument.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-docs.html000644 000766 000024 00000001260 12455173731 027225 0ustar00iojsstaff000000 000000

    npm-docs

    Docs for a package in a web browser maybe

    SYNOPSIS

    npm.commands.docs(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-edit.html000644 000766 000024 00000001715 12455173731 027227 0ustar00iojsstaff000000 000000

    npm-edit

    Edit an installed package

    SYNOPSIS

    npm.commands.edit(package, callback)
    

    DESCRIPTION

    Opens the package folder in the default editor (or whatever you've configured as the npm editor config -- see npm help config.)

    After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages.

    For instance, you can do npm install connect to install connect into your package, and then npm.commands.edit(["connect"], callback) to make a few changes to your locally installed copy.

    The first parameter is a string array with a single element, the package to open. The package can optionally have a version number attached.

    Since this command opens an editor in a new process, be careful about where and how this is used.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-explore.html000644 000766 000024 00000001424 12455173731 027755 0ustar00iojsstaff000000 000000

    npm-explore

    Browse an installed package

    SYNOPSIS

    npm.commands.explore(args, callback)
    

    DESCRIPTION

    Spawn a subshell in the directory of the installed package specified.

    If a command is specified, then it is run in the subshell, which then immediately terminates.

    Note that the package is not automatically rebuilt afterwards, so be sure to use npm rebuild <pkg> if you make any changes.

    The first element in the 'args' parameter must be a package name. After that is the optional command, which can be any number of strings. All of the strings will be combined into one, space-delimited command.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-help-search.html000644 000766 000024 00000002065 12455173731 030474 0ustar00iojsstaff000000 000000

    npm-help-search

    Search the help pages

    SYNOPSIS

    npm.commands.helpSearch(args, [silent,] callback)
    

    DESCRIPTION

    This command is rarely useful, but it exists in the rare case that it is.

    This command takes an array of search terms and returns the help pages that match in order of best match.

    If there is only one match, then npm displays that help section. If there are multiple results, the results are printed to the screen formatted and the array of results is returned. Each result is an object with these properties:

    • hits: A map of args to number of hits on that arg. For example, {"npm": 3}
    • found: Total number of unique args that matched.
    • totalHits: Total number of hits.
    • lines: An array of all matching lines (and some adjacent lines).
    • file: Name of the file that matched

    The silent parameter is not currently used, but it may be in the future.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-init.html000644 000766 000024 00000002220 12455173731 027235 0ustar00iojsstaff000000 000000

    npm init

    Interactively create a package.json file

    SYNOPSIS

    npm.commands.init(args, callback)
    

    DESCRIPTION

    This will ask you a bunch of questions, and then write a package.json for you.

    It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected.

    If you already have a package.json file, it'll read that first, and default to the options in there.

    It is strictly additive, so it does not delete options from your package.json without a really good reason to do so.

    Since this function expects to be run on the command-line, it doesn't work very well programmatically. The best option is to roll your own, and since JavaScript makes it stupid simple to output formatted JSON, that is the preferred method. If you're sure you want to handle command-line prompting, then go ahead and use this programmatically.

    SEE ALSO

    package.json(5)

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-install.html000644 000766 000024 00000001332 12455173731 027743 0ustar00iojsstaff000000 000000

    npm-install

    install a package programmatically

    SYNOPSIS

    npm.commands.install([where,] packages, callback)
    

    DESCRIPTION

    This acts much the same ways as installing on the command-line.

    The 'where' parameter is optional and only used internally, and it specifies where the packages should be installed to.

    The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be installed.

    Finally, 'callback' is a function that will be called when all packages have been installed or when an error has been encountered.
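    A minimal usage sketch, combining this with npm.load() (see npm-load below); the config value and package name are only illustrative:

    var npm = require('npm');

    npm.load({ loglevel: 'warn' }, function (err) {
      if (err) return console.error(err);
      // install the sax package into the current project
      npm.commands.install(['sax'], function (err) {
        if (err) return console.error(err);
        console.log('install finished');
      });
    });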

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-link.html000644 000766 000024 00000002265 12455173731 027240 0ustar00iojsstaff000000 000000

    npm-link

    Symlink a package folder

    SYNOPSIS

    npm.commands.link(callback)
    npm.commands.link(packages, callback)
    

    DESCRIPTION

    Package linking is a two-step process.

    Without parameters, link will create a globally-installed symbolic link from prefix/package-name to the current folder.

    With parameters, link will create a symlink from the local node_modules folder to the global symlink.

    When creating tarballs for npm publish, the linked packages are "snapshotted" to their current state by resolving the symbolic links.

    This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.

    For example:

    npm.commands.link(cb)           // creates global link from the cwd
                                    // (say redis package)
    npm.commands.link('redis', cb)  // link-install the package
    

    Now, any changes to the redis package will be reflected in the package in the current working directory

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-load.html000644 000766 000024 00000001471 12455173731 027220 0ustar00iojsstaff000000 000000

    npm-load

    Load config settings

    SYNOPSIS

    npm.load(conf, cb)
    

    DESCRIPTION

    npm.load() must be called before any other function call. Both parameters are optional, but the second is recommended.

    The first parameter is an object containing command-line config params, and the second parameter is a callback that will be called when npm is loaded and ready to serve.

    The first parameter should follow a similar structure as the package.json config object.

    For example, to emulate the --dev flag, pass an object that looks like this:

    {
      "dev": true
    }
    

    For a list of all the available command-line configs, see npm help config

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-ls.html000644 000766 000024 00000003602 12455173731 026715 0ustar00iojsstaff000000 000000

    npm-ls

    List installed packages

    SYNOPSIS

    npm.commands.ls(args, [silent,] callback)
    

    DESCRIPTION

    This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure. It will also return that data using the callback.

    This command does not take any arguments, but args must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments, though you may set config flags like with any other command, such as global to list global packages.

    It will print out extraneous, missing, and invalid packages.

    If the silent parameter is set to true, nothing will be output to the screen, but the data will still be returned.

    Callback is provided an error if one occurred, the full data about which packages are installed and which dependencies they will receive, and a "lite" data object which just shows which versions are installed where. Note that the full data object is a circular structure, so care must be taken if it is serialized to JSON.

    CONFIGURATION

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    List packages in the global install prefix instead of in the current project.

    Note, if parseable is set or long isn't set, then duplicates will be trimmed. This means that if a submodule has the same dependency as a parent module, then the dependency will only be output once.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-outdated.html000644 000766 000024 00000000645 12455173731 030114 0ustar00iojsstaff000000 000000

    npm-outdated

    Check for outdated packages

    SYNOPSIS

    npm.commands.outdated([packages,] callback)
    

    DESCRIPTION

    This command will check the registry to see if the specified packages are currently outdated.

    If the 'packages' parameter is left out, npm will check all packages.
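
    For example (a sketch, assuming npm.load() has already been called; the package name is illustrative):

    npm.commands.outdated(["connect"], function (er, results) {
      if (er) return console.error(er)
      console.log(results)  // data about packages that are behind the registry
    })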

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-owner.html000644 000766 000024 00000002330 12455173731 027426 0ustar00iojsstaff000000 000000

    npm-owner

    Manage package owners

    SYNOPSIS

    npm.commands.owner(args, callback)
    

    DESCRIPTION

    The first element of the 'args' parameter defines what to do, and the subsequent elements depend on the action. Possible values for the action are (the order of parameters is given in parentheses):

    • ls (package): List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help.
    • add (user, package): Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners.
    • rm (user, package): Remove a user from the package owner list. This immediately revokes their privileges.

    Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.
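
    As an illustrative sketch (the user and package names are hypothetical, and npm.load() must already have completed):

    // list the current owners of a package
    npm.commands.owner(["ls", "some-package"], function (er, data) {
      if (er) return console.error(er)
      // any data passed to the callback depends on the action performed
    })
    
    // add a new maintainer
    npm.commands.owner(["add", "some-user", "some-package"], function (er) {
      if (er) return console.error(er)
    })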

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-pack.html000644 000766 000024 00000001337 12455173731 027220 0ustar00iojsstaff000000 000000

    npm-pack

    Create a tarball from a package

    SYNOPSIS

    npm.commands.pack([packages,] callback)
    

    DESCRIPTION

    For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as <name>-<version>.tgz, and then write the filenames out to stdout.

    If the same package is specified multiple times, then the file will be overwritten the second time.

    If no arguments are supplied, then npm packs the current package folder.
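
    A minimal sketch (assuming npm.load() has completed in a package folder):

    // an empty array packs the package in the current working directory
    npm.commands.pack([], function (er) {
      if (er) return console.error(er)
      // a <name>-<version>.tgz file has been written to the cwd
    })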

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-prefix.html000644 000766 000024 00000000660 12455173731 027575 0ustar00iojsstaff000000 000000

    npm-prefix

    Display prefix

    SYNOPSIS

    npm.commands.prefix(args, callback)
    

    DESCRIPTION

    Print the prefix to standard out.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-prune.html000644 000766 000024 00000001037 12455173731 027430 0ustar00iojsstaff000000 000000

    npm-prune

    Remove extraneous packages

    SYNOPSIS

    npm.commands.prune([packages,] callback)
    

    DESCRIPTION

    This command removes "extraneous" packages.

    The first parameter is optional, and it specifies packages to be removed.

    If no packages are specified, then all packages will be checked.

    Extraneous packages are packages that are not listed on the parent package's dependencies list.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-publish.html000644 000766 000024 00000002116 12455173731 027744 0ustar00iojsstaff000000 000000

    npm-publish

    Publish a package

    SYNOPSIS

    npm.commands.publish([packages,] callback)
    

    DESCRIPTION

    Publishes a package to the registry so that it can be installed by name. Possible values in the 'packages' array are:

    • <folder>: A folder containing a package.json file

    • <tarball>: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside.

    If the package array is empty, npm will try to publish something in the current working directory.

    This command could fail if one of the packages specified already exists in the registry. It overwrites the existing version when the "force" flag is set.
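
    A minimal sketch (assuming npm.load() has completed and the current working directory contains a publishable package):

    // an empty array publishes the package in the current working directory
    npm.commands.publish([], function (er) {
      if (er) return console.error(er)
      console.log("published")
    })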

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-rebuild.html000644 000766 000024 00000001116 12455173731 027723 0ustar00iojsstaff000000 000000

    npm-rebuild

    Rebuild a package

    SYNOPSIS

    npm.commands.rebuild([packages,] callback)
    

    DESCRIPTION

    This command runs the npm build command on each of the matched packages. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary. If no 'packages' parameter is specified, every package will be rebuilt.

    CONFIGURATION

    See npm help build

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-repo.html000644 000766 000024 00000001257 12455173731 027250 0ustar00iojsstaff000000 000000

    npm-repo

    Open package repository page in the browser

    SYNOPSIS

    npm.commands.repo(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-restart.html000644 000766 000024 00000002431 12455173731 027762 0ustar00iojsstaff000000 000000

    npm-restart

    Restart a package

    SYNOPSIS

    npm.commands.restart(packages, callback)
    

    DESCRIPTION

    This restarts a package (or multiple packages).

    This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below:

    1. prerestart
    2. prestop
    3. stop
    4. poststop
    5. restart
    6. prestart
    7. start
    8. poststart
    9. postrestart

    If no version is specified, then it restarts the "active" version.

    npm can restart multiple packages. Just specify multiple packages in the packages parameter.

    NOTE

    Note that the "restart" script is run in addition to the "stop" and "start" scripts, not instead of them.

    This is the behavior as of npm major version 2. A change in this behavior will be accompanied by an increase in major version number.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-root.html000644 000766 000024 00000000721 12455173731 027261 0ustar00iojsstaff000000 000000

    npm-root

    Display npm root

    SYNOPSIS

    npm.commands.root(args, callback)
    

    DESCRIPTION

    Print the effective node_modules folder to standard out.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-run-script.html000644 000766 000024 00000002174 12455173731 030410 0ustar00iojsstaff000000 000000

    npm-run-script

    Run arbitrary package scripts

    SYNOPSIS

    npm.commands.run-script(args, callback)
    

    DESCRIPTION

    This runs an arbitrary command from a package's "scripts" object.

    It is used by the test, start, restart, and stop commands, but can be called directly, as well.

    The 'args' parameter is an array of strings. Behavior depends on the number of elements. If there is only one element, npm assumes that the element represents a command to be run on the local repository. If there is more than one element, then the first is assumed to be the package and the second is assumed to be the command to run. All other elements are ignored.
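
    Because the command name contains a hyphen, one way to call it from JavaScript is with bracket notation. A sketch (the script and package names are illustrative):

    // run the "test" script of the package in the current project
    npm.commands["run-script"](["test"], function (er) {
      if (er) return console.error(er)
    })
    
    // run the "build" script of an installed package named "some-package"
    npm.commands["run-script"](["some-package", "build"], function (er) {
      if (er) return console.error(er)
    })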

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-search.html000644 000766 000024 00000002632 12455173731 027546 0ustar00iojsstaff000000 000000

    npm-search

    Search for packages

    SYNOPSIS

    npm.commands.search(searchTerms, [silent,] [staleness,] callback)
    

    DESCRIPTION

    Search the registry for packages matching the search terms. The available parameters are:

    • searchTerms: Array of search terms. These terms are case-insensitive.
    • silent: If true, npm will not log anything to the console.
    • staleness: This is the threshold for stale packages. "Fresh" packages are not refreshed from the registry. This value is measured in seconds.
    • callback: Returns an object where each key is the name of a package, and the value is information about that package along with a 'words' property, which is a space-delimited string of all of the interesting words in that package. The only properties included are those that are searched, which generally include:

      • name
      • description
      • maintainers
      • url
      • keywords

    A search on the registry excludes any result that does not match all of the search terms. It also removes any items from the results that contain an excluded term (the "searchexclude" config). The search is case insensitive and doesn't try to read your mind (it doesn't do any verb tense matching or the like).
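
    For example, a sketch that searches quietly with a one-hour staleness threshold (the search term and values are illustrative):

    npm.commands.search(["connect"], true, 3600, function (er, results) {
      if (er) return console.error(er)
      // results is keyed by package name, as described above
      console.log(Object.keys(results))
    })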

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-shrinkwrap.html000644 000766 000024 00000001422 12455173731 030465 0ustar00iojsstaff000000 000000

    npm-shrinkwrap

    programmatically generate package shrinkwrap file

    SYNOPSIS

    npm.commands.shrinkwrap(args, [silent,] callback)
    

    DESCRIPTION

    This acts much the same way as shrinkwrapping on the command-line.

    This command does not take any arguments, but 'args' must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments.

    If the 'silent' parameter is set to true, nothing will be output to the screen, but the shrinkwrap file will still be written.

    Finally, 'callback' is a function that will be called when the shrinkwrap has been saved.
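
    A minimal sketch (assuming npm.load() has completed in a project folder):

    // 'args' must be present even though it is not used
    npm.commands.shrinkwrap([], true, function (er) {
      if (er) return console.error(er)
      // the shrinkwrap file has now been written
    })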

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-start.html000644 000766 000024 00000000623 12455173731 027434 0ustar00iojsstaff000000 000000

    npm-start

    Start a package

    SYNOPSIS

    npm.commands.start(packages, callback)
    

    DESCRIPTION

    This runs a package's "start" script, if one was provided.

    npm can start multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-stop.html000644 000766 000024 00000000624 12455173731 027265 0ustar00iojsstaff000000 000000

    npm-stop

    Stop a package

    SYNOPSIS

    npm.commands.stop(packages, callback)
    

    DESCRIPTION

    This runs a package's "stop" script, if one was provided.

    npm can run stop on multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-submodule.html000644 000766 000024 00000002144 12455173731 030276 0ustar00iojsstaff000000 000000

    npm-submodule

    Add a package as a git submodule

    SYNOPSIS

    npm.commands.submodule(packages, callback)
    

    DESCRIPTION

    For each package specified, npm will check if it has a git repository url in its package.json description then add it as a git submodule at node_modules/<pkg name>.

    This is a convenience only. From then on, it's up to you to manage updates by using the appropriate git commands. npm will stubbornly refuse to update, modify, or remove anything with a .git subfolder in it.

    This command also does not install missing dependencies, if the package does not include them in its git repository. If npm ls reports that things are missing, you can either install, link, or submodule them yourself, or you can do npm explore <pkgname> -- npm install to install the dependencies into the submodule folder.

    SEE ALSO

    • npm help json
    • git help submodule
    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-tag.html000644 000766 000024 00000001676 12455173731 027063 0ustar00iojsstaff000000 000000

    npm-tag

    Tag a published version

    SYNOPSIS

    npm.commands.tag(package@version, tag, callback)
    

    DESCRIPTION

    Tags the specified version of the package with the specified tag, or the --tag config if not specified.

    The 'package@version' is an array of strings, but only the first two elements are currently used.

    The first element must be in the form package@version, where package is the package name and version is the version number (much like installing a specific version).

    The second element is the name of the tag to tag this version with. If this parameter is missing or falsy (empty), the default from the config will be used. For more information about how to set this config, check man 3 npm-config for programmatic usage or man npm-config for cli usage.
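
    For example (a sketch; the package name and tag are hypothetical):

    // tag version 1.2.3 of "some-package" as "beta"
    npm.commands.tag(["some-package@1.2.3", "beta"], function (er) {
      if (er) return console.error(er)
    })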

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-test.html000644 000766 000024 00000000765 12455173731 027265 0ustar00iojsstaff000000 000000

    npm-test

    Test a package

    SYNOPSIS

      npm.commands.test(packages, callback)
    

    DESCRIPTION

    This runs a package's "test" script, if one was provided.

    To run tests as a condition of installation, set the npat config to true.

    npm can run tests on multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-uninstall.html000644 000766 000024 00000001130 12455173731 030302 0ustar00iojsstaff000000 000000

    npm-uninstall

    uninstall a package programmatically

    SYNOPSIS

    npm.commands.uninstall(packages, callback)
    

    DESCRIPTION

    This acts much the same way as uninstalling on the command-line.

    The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be uninstalled.

    Finally, 'callback' is a function that will be called when all packages have been uninstalled or when an error has been encountered.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-unpublish.html000644 000766 000024 00000001277 12455173731 030316 0ustar00iojsstaff000000 000000

    npm-unpublish

    Remove a package from the registry

    SYNOPSIS

    npm.commands.unpublish(package, callback)
    

    DESCRIPTION

    This removes a package version from the registry, deleting its entry and removing the tarball.

    The package parameter must be defined.

    Only the first element in the package parameter is used. If there is no first element, then npm assumes that the package at the current working directory is what is meant.

    If no version is specified, or if all versions are removed then the root package entry is removed from the registry entirely.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-update.html000644 000766 000024 00000000732 12455173731 027562 0ustar00iojsstaff000000 000000

    npm-update

    Update a package

    SYNOPSIS

    npm.commands.update(packages, callback)
    

    DESCRIPTION

    Updates a package, upgrading it to the latest version. It also installs any missing packages.

    The 'packages' argument is an array of packages to update. The 'callback' parameter will be called when done or when an error occurs.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-version.html000644 000766 000024 00000001247 12455173731 027767 0ustar00iojsstaff000000 000000

    npm-version

    Bump a package version

    SYNOPSIS

    npm.commands.version(newversion, callback)
    

    DESCRIPTION

    Run this in a package directory to bump the version and write the new data back to the package.json file.

    If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean.

    Like all other commands, this function takes a string array as its first parameter. The difference, however, is this function will fail if it does not have exactly one element. The only element should be a version number.
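
    For example, a sketch run from within a package directory after npm.load() has completed:

    // the array must contain exactly one element: the new version number
    npm.commands.version(["1.2.3"], function (er) {
      if (er) return console.error(er)
      // package.json has been updated; a commit and tag were created if run in a clean git repo
    })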

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-view.html000644 000766 000024 00000007606 12455173731 027261 0ustar00iojsstaff000000 000000

    npm-view

    View registry info

    SYNOPSIS

    npm.commands.view(args, [silent,] callback)
    

    DESCRIPTION

    This command shows data about a package and prints it to the stream referenced by the outfd config, which defaults to stdout.

    The "args" parameter is an ordered list that closely resembles the command-line usage. The elements should be ordered such that the first element is the package and version (package@version). The version is optional. After that, the rest of the parameters are fields with optional subfields ("field.subfield") which can be used to get only the information desired from the registry.

    The callback will be passed all of the data returned by the query.

    For example, to get the package registry entry for the connect package, you can do this:

    npm.commands.view(["connect"], callback)
    

    If no version is specified, "latest" is assumed.

    Field names can be specified after the package descriptor. For example, to show the dependencies of the ronn package at version 0.3.5, you could do the following:

    npm.commands.view(["ronn@0.3.5", "dependencies"], callback)
    

    You can view child fields by separating them with a period. To view the git repository URL for the latest version of npm, you could do this:

    npm.commands.view(["npm", "repository.url"], callback)
    

    For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor email addresses for the "express" project, you can do this:

    npm.commands.view(["express", "contributors.email"], callback)
    

    You may also use numeric indices in square brackets to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this:

    npm.commands.view(["express", "contributors[0].email"], callback)
    

    Multiple fields may be specified, and will be printed one after another. For example, to get all the contributor names and email addresses, you can do this:

    npm.commands.view(["express", "contributors.name", "contributors.email"], callback)
    

    "Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See npm help json for more on this.)

    npm.commands.view(["npm", "contributors"], callback)
    

    If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3:

    npm.commands.view(["yui3@'>0.5.4'", "dependencies.jsdom"], callback)
    

    OUTPUT

    If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command.

    If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to.

    If multiple fields are requested, then each of them is prefixed with the field name.

    Console output can be disabled by setting the 'silent' parameter to true.

    RETURN VALUE

    The data returned will be an object in this format:

    { <version>:
      { <field>: <value>
      , ... }
    , ... }
    

    corresponding to the list of fields selected.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm-whoami.html000644 000766 000024 00000000717 12455173731 027567 0ustar00iojsstaff000000 000000

    npm-whoami

    Display npm username

    SYNOPSIS

    npm.commands.whoami(args, callback)
    

    DESCRIPTION

    Print the username config to standard output.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/partial/doc/api/npm.html000644 000766 000024 00000010357 12455173731 026306 0ustar00iojsstaff000000 000000

    npm

    node package manager

    SYNOPSIS

    var npm = require("npm")
    npm.load([configObject, ]function (er, npm) {
      // use the npm object, now that it's loaded.
    
      npm.config.set(key, val)
      val = npm.config.get(key)
    
      console.log("prefix = %s", npm.prefix)
    
      npm.commands.install(["package"], cb)
    })
    

    VERSION

    2.1.18

    DESCRIPTION

    This is the API documentation for npm. To find documentation of the command line client, see npm(1).

    Prior to using npm's commands, npm.load() must be called. If you provide configObject as an object map of top-level configs, they override the values stored in the various config locations. In the npm command line client, this set of configs is parsed from the command line options. Additional configuration params are loaded from two configuration files. See npm-config(1), npm-config(7), and npmrc(5) for more information.

    After that, each of the functions are accessible in the commands object: npm.commands.<cmd>. See npm-index(7) for a list of all possible commands.

    All commands on the command object take an array of positional argument strings. The last argument to any function is a callback. Some commands take other optional arguments.

    Configs cannot currently be set on a per function basis, as each call to npm.config.set will change the value for all npm commands in that process.

    To find API documentation for a specific command, run the npm apihelp command.

    METHODS AND PROPERTIES

    • npm.load(configs, cb)

      Load the configuration params, and call the cb function once the globalconfig and userconfig files have been loaded as well, or on nextTick if they've already been loaded.

    • npm.config

      An object for accessing npm configuration parameters.

      • npm.config.get(key)
      • npm.config.set(key, val)
      • npm.config.del(key)
    • npm.dir or npm.root

      The node_modules directory where npm will operate.

    • npm.prefix

      The prefix where npm is operating. (Most often the current working directory.)

    • npm.cache

      The place where npm keeps JSON and tarballs it fetches from the registry (or uploads to the registry).

    • npm.tmp

      npm's temporary working directory.

    • npm.deref

      Get the "real" name for a command that has either an alias or abbreviation.

    MAGIC

    For each of the methods in the npm.commands object, a method is added to the npm object, which takes a set of positional string arguments rather than an array and a callback.

    If the last argument is a callback, then it will use the supplied callback. However, if no callback is provided, then it will print out the error or results.

    For example, this would work in a node repl:

    > npm = require("npm")
    > npm.load()  // wait a sec...
    > npm.install("dnode", "express")
    

    Note that that won't work in a node program, since the install method will get called before the configuration load is completed.

    ABBREVS

    In order to support npm ins foo instead of npm install foo, the npm.commands object has a set of abbreviations as well as the full method names. Use the npm.deref method to find the real name.

    For example:

    var cmd = npm.deref("unp") // cmd === "unpublish"
    
    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/000755 000766 000024 00000000000 12456115117 023157 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/000755 000766 000024 00000000000 12456115117 023155 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/000755 000766 000024 00000000000 12456115117 023510 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/index.html000644 000766 000024 00000041372 12455173731 024417 0ustar00iojsstaff000000 000000 index

    npm-index

    Index of all npm documentation

    README

    a JavaScript package manager

    Command Line Documentation

    Using npm on the command line

    npm(1)

    node package manager

    npm-adduser(1)

    Add a registry user account

    npm-bin(1)

    Display npm bin folder

    npm-bugs(1)

    Bugs for a package in a web browser maybe

    npm-build(1)

    Build a package

    npm-bundle(1)

    REMOVED

    npm-cache(1)

    Manipulates packages cache

    npm-completion(1)

    Tab Completion for npm

    npm-config(1)

    Manage the npm configuration files

    npm-dedupe(1)

    Reduce duplication

    npm-deprecate(1)

    Deprecate a version of a package

    npm-docs(1)

    Docs for a package in a web browser maybe

    npm-edit(1)

    Edit an installed package

    npm-explore(1)

    Browse an installed package

    npm-help-search(1)

    Search npm help documentation

    npm-help(1)

    Get help on npm

    npm-init(1)

    Interactively create a package.json file

    npm-install(1)

    Install a package

    Symlink a package folder

    npm-ls(1)

    List installed packages

    npm-outdated(1)

    Check for outdated packages

    npm-owner(1)

    Manage package owners

    npm-pack(1)

    Create a tarball from a package

    npm-prefix(1)

    Display prefix

    npm-prune(1)

    Remove extraneous packages

    npm-publish(1)

    Publish a package

    npm-rebuild(1)

    Rebuild a package

    npm-repo(1)

    Open package repository page in the browser

    npm-restart(1)

    Restart a package

    npm-rm(1)

    Remove a package

    npm-root(1)

    Display npm root

    npm-run-script(1)

    Run arbitrary package scripts

    npm-search(1)

    Search for packages

    npm-shrinkwrap(1)

    Lock down dependency versions

    npm-star(1)

    Mark your favorite packages

    npm-stars(1)

    View packages marked as favorites

    npm-start(1)

    Start a package

    npm-stop(1)

    Stop a package

    npm-tag(1)

    Tag a published version

    npm-test(1)

    Test a package

    npm-uninstall(1)

    Remove a package

    npm-unpublish(1)

    Remove a package from the registry

    npm-update(1)

    Update a package

    npm-version(1)

    Bump a package version

    npm-view(1)

    View registry info

    npm-whoami(1)

    Display npm username

    API Documentation

    Using npm in your Node programs

    npm(3)

    node package manager

    npm-bin(3)

    Display npm bin folder

    npm-bugs(3)

    Bugs for a package in a web browser maybe

    npm-cache(3)

    manage the npm cache programmatically

    npm-commands(3)

    npm commands

    npm-config(3)

    Manage the npm configuration files

    npm-deprecate(3)

    Deprecate a version of a package

    npm-docs(3)

    Docs for a package in a web browser maybe

    npm-edit(3)

    Edit an installed package

    npm-explore(3)

    Browse an installed package

    npm-help-search(3)

    Search the help pages

    npm-init(3)

    Interactively create a package.json file

    npm-install(3)

    install a package programmatically

    Symlink a package folder

    npm-load(3)

    Load config settings

    npm-ls(3)

    List installed packages

    npm-outdated(3)

    Check for outdated packages

    npm-owner(3)

    Manage package owners

    npm-pack(3)

    Create a tarball from a package

    npm-prefix(3)

    Display prefix

    npm-prune(3)

    Remove extraneous packages

    npm-publish(3)

    Publish a package

    npm-rebuild(3)

    Rebuild a package

    npm-repo(3)

    Open package repository page in the browser

    npm-restart(3)

    Restart a package

    npm-root(3)

    Display npm root

    npm-run-script(3)

    Run arbitrary package scripts

    npm-search(3)

    Search for packages

    npm-shrinkwrap(3)

    programmatically generate package shrinkwrap file

    npm-start(3)

    Start a package

    npm-stop(3)

    Stop a package

    npm-tag(3)

    Tag a published version

    npm-test(3)

    Test a package

    npm-uninstall(3)

    uninstall a package programmatically

    npm-unpublish(3)

    Remove a package from the registry

    npm-update(3)

    Update a package

    npm-version(3)

    Bump a package version

    npm-view(3)

    View registry info

    npm-whoami(3)

    Display npm username

    Files

    File system structures npm uses

    npm-folders(5)

    Folder Structures Used by npm

    npmrc(5)

    The npm config files

    package.json(5)

    Specifics of npm's package.json handling

    Misc

    Various other bits and bobs

    npm-coding-style(7)

    npm's "funny" coding style

    npm-config(7)

    More than you probably want to know about npm configuration

    npm-developers(7)

    Developer Guide

    npm-disputes(7)

    Handling Module Name Disputes

    npm-faq(7)

    Frequently Asked Questions

    npm-index(7)

    Index of all npm documentation

    npm-registry(7)

    The JavaScript Package Registry

    npm-scope(7)

    Scoped packages

    npm-scripts(7)

    How npm handles the "scripts" field

    removing-npm(7)

    Cleaning the Slate

    semver(7)

    The semantic versioner for npm

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/000755 000766 000024 00000000000 12456115117 023341 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/README.html000644 000766 000024 00000027301 12455173731 024241 0ustar00iojsstaff000000 000000 README

    npm

    a JavaScript package manager

    SYNOPSIS

    This is just enough info to get you up and running.

    Much more info available via npm help once it's installed.

    IMPORTANT

    You need node v0.8 or higher to run this program.

    To install an old and unsupported version of npm that works on node 0.3 and prior, clone the git repo and dig through the old tags and branches.

    Super Easy Install

    npm comes with node now.

    Windows Computers

    Get the MSI. npm is in it.

    Apple Macintosh Computers

    Get the pkg. npm is in it.

    Other Sorts of Unices

    Run make install. npm will be installed with node.

    If you want a more fancy pants install (a different version, customized paths, etc.) then read on.

    Fancy Install (Unix)

    There's a pretty robust install script at https://www.npmjs.com/install.sh. You can download that and run it.

    Here's an example using curl:

    curl -L https://npmjs.com/install.sh | sh
    

    Slightly Fancier

    You can set any npm configuration params with that script:

    npm_config_prefix=/some/path sh install.sh
    

    Or, you can run it in uber-debuggery mode:

    npm_debug=1 sh install.sh
    

    Even Fancier

    Get the code with git. Use make to build the docs and do other stuff. If you plan on hacking on npm, make link is your friend.

    If you've got the npm source code, you can also semi-permanently set arbitrary config keys using the ./configure --key=val ..., and then run npm commands by doing node cli.js <cmd> <args>. (This is helpful for testing, or running stuff without actually installing npm itself.)

    Windows Install or Upgrade

    You can download a zip file from https://github.com/npm/npm/releases, and unpack it in the same folder where node.exe lives.

    The latest version in a zip file is 1.4.12. To upgrade to npm 2, follow the Windows upgrade instructions in the npm Troubleshooting Guide:

    https://github.com/npm/npm/wiki/Troubleshooting#upgrading-on-windows

    If that's not fancy enough for you, then you can fetch the code with git, and mess with it directly.

    Installing on Cygwin

    No.

    Uninstalling

    So sad to see you go.

    sudo npm uninstall npm -g
    

    Or, if that fails,

    sudo make uninstall
    

    More Severe Uninstalling

    Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed.

    If you would like to remove all the packages that you have installed, then you can use the npm ls command to find them, and then npm rm to remove them.

    To remove cruft left behind by npm 0.x, you can use the included clean-old.sh script file. You can run it conveniently like this:

    npm explore npm -g -- sh scripts/clean-old.sh
    

    npm uses two configuration files, one for per-user configs, and another for global (every-user) configs. You can view them by doing:

    npm config get userconfig   # defaults to ~/.npmrc
    npm config get globalconfig # defaults to /usr/local/etc/npmrc
    

    Uninstalling npm does not remove configuration files by default. You must remove them yourself manually if you want them gone. Note that this means that future npm installs will not remember the settings that you have chosen.

    Using npm Programmatically

    If you would like to use npm programmatically, you can do that. It's not very well documented, but it is rather simple.

    Most of the time, unless you actually want to do all the things that npm does, you should try using one of npm's dependencies rather than using npm itself, if possible.

    Eventually, npm will be just a thin cli wrapper around the modules that it depends on, but for now, there are some things that you must use npm itself to do.

    var npm = require("npm")
    npm.load(myConfigObject, function (er) {
      if (er) return handleError(er)
      npm.commands.install(["some", "args"], function (er, data) {
        if (er) return commandFailed(er)
        // command succeeded, and data might have some info
      })
      npm.registry.log.on("log", function (message) { .... })
    })
    

    The load function takes an object hash of the command-line configs. The various npm.commands.<cmd> functions take an array of positional argument strings. The last argument to any npm.commands.<cmd> function is a callback. Some commands take other optional arguments. Read the source.

    You cannot set configs individually for any single npm function at this time. Since npm is a singleton, any call to npm.config.set will change the value for all npm commands in that process.

    See ./bin/npm-cli.js for an example of pulling config values off of the command line arguments using nopt. You may also want to check out npm help config to learn about all the options you can set there.

    More Docs

    Check out the docs, especially the faq.

    You can use the npm help command to read any of them.

    If you're a developer, and you want to use npm to publish your program, you should read this

    "npm" and "The npm Registry" are owned by npm, Inc. All rights reserved. See the included LICENSE file for more details.

    "Node.js" and "node" are trademarks owned by Joyent, Inc.

    Modules published on the npm registry are not officially endorsed by npm, Inc. or the Node.js project.

    Data published to the npm registry is not part of npm itself, and is the sole property of the publisher. While every effort is made to ensure accountability, there is absolutely no guarantee, warranty, or assertion expressed or implied as to the quality, fitness for a specific purpose, or lack of malice in any given npm package.

    If you have a complaint about a package in the public npm registry, and cannot resolve it with the package owner, please email support@npmjs.com and explain the situation.

    Any data published to The npm Registry (including user account information) may be removed or modified at the sole discretion of the npm server administrators.

    In plainer English

    npm is the property of npm, Inc.

    If you publish something, it's yours, and you are solely accountable for it.

    If other people publish something, it's theirs.

    Users can publish Bad Stuff. It will be removed promptly if reported. But there is no vetting process for published modules, and you use them at your own risk. Please inspect the source.

    If you publish Bad Stuff, we may delete it from the registry, or even ban your account in extreme cases. So don't do that.

    BUGS

    When you find issues, please report them:

    Be sure to include all of the output from the npm command that didn't work as expected. The npm-debug.log file is also helpful to provide.

    You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-coding-style.html000644 000766 000024 00000021547 12455173731 027436 0ustar00iojsstaff000000 000000 npm-coding-style

    npm-coding-style

    npm's "funny" coding style

    DESCRIPTION

    npm's coding style is a bit unconventional. It is not different for difference's sake, but rather a carefully crafted style that is designed to reduce visual clutter and make bugs more apparent.

    If you want to contribute to npm (which is very encouraged), you should make your code conform to npm's style.

    Note: this concerns npm's code, not the specific packages that you can download from the npm registry.

    Line Length

    Keep lines shorter than 80 characters. It's better for lines to be too short than to be too long. Break up long lists, objects, and other statements onto multiple lines.

    Indentation

    Two-spaces. Tabs are better, but they look like hell in web browsers (and on GitHub), and node uses 2 spaces, so that's that.

    Configure your editor appropriately.

    Curly braces

    Curly braces belong on the same line as the thing that necessitates them.

    Bad:

    function ()
    {
    

    Good:

    function () {
    

    If a block needs to wrap to the next line, use a curly brace. Don't use it if it doesn't.

    Bad:

    if (foo) { bar() }
    while (foo)
      bar()
    

    Good:

    if (foo) bar()
    while (foo) {
      bar()
    }
    

    Semicolons

    Don't use them except in four situations:

    • for (;;) loops. They're actually required.
    • null loops like: while (something) ; (But you'd better have a good reason for doing that.)
    • case "foo": doSomething(); break
    • In front of a leading ( or [ at the start of the line. This prevents the expression from being interpreted as a function call or property access, respectively.

    Some examples of good semicolon usage:

    ;(x || y).doSomething()
    ;[a, b, c].forEach(doSomething)
    for (var i = 0; i < 10; i ++) {
      switch (state) {
        case "begin": start(); continue
        case "end": finish(); break
        default: throw new Error("unknown state")
      }
      end()
    }
    

    Note that starting lines with - and + also should be prefixed with a semicolon, but this is much less common.

    Comma First

    If there is a list of things separated by commas, and it wraps across multiple lines, put the comma at the start of the next line, directly below the token that starts the list. Put the final token in the list on a line by itself. For example:

    var magicWords = [ "abracadabra"
                     , "gesundheit"
                     , "ventrilo"
                     ]
      , spells = { "fireball" : function () { setOnFire() }
                 , "water" : function () { putOut() }
                 }
      , a = 1
      , b = "abc"
      , etc
      , somethingElse
    

    Whitespace

    Put a single space in front of ( for anything other than a function call. Also use a single space wherever it makes things more readable.

    Don't leave trailing whitespace at the end of lines. Don't indent empty lines. Don't use more spaces than are helpful.

    Functions

    Use named functions. They make stack traces a lot easier to read.

    Callbacks, Sync/async Style

    Use the asynchronous/non-blocking versions of things as much as possible. It might make more sense for npm to use the synchronous fs APIs, but this way, the fs and http and child process stuff all uses the same callback-passing methodology.

    The callback should always be the last argument in the list. Its first argument is the Error or null.

    Be very careful never to ever ever throw anything. It's worse than useless. Just send the error message back as the first argument to the callback.

    Errors

    Always create a new Error object with your message. Don't just return a string message to the callback. Stack traces are handy.

    Logging

    Logging is done using the npmlog utility.

    Please clean up logs when they are no longer helpful. In particular, logging the same object over and over again is not helpful. Logs should report what's happening so that it's easier to track down where a fault occurs.

    Use appropriate log levels. See npm-config(7) and search for "loglevel".

    Case, naming, etc.

    Use lowerCamelCase for multiword identifiers when they refer to objects, functions, methods, properties, or anything not specified in this section.

    Use UpperCamelCase for class names (things that you'd pass to "new").

    Use all-lower-hyphen-css-case for multiword filenames and config keys.

    Use named functions. They make stack traces easier to follow.

    Use CAPS_SNAKE_CASE for constants, things that should never change and are rarely used.

    Use a single uppercase letter for function names where the function would normally be anonymous, but needs to call itself recursively. It makes it clear that it's a "throwaway" function.

    null, undefined, false, 0

    Boolean variables and functions should always be either true or false. Don't set them to 0 unless they're supposed to be numbers.

    When something is intentionally missing or removed, set it to null.

    Don't set things to undefined. Reserve that value to mean "not yet set to anything."

    Boolean objects are verboten.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-config.html000644 000766 000024 00000075616 12455173731 026310 0ustar00iojsstaff000000 000000 npm-config

    npm-config

    More than you probably want to know about npm configuration

    DESCRIPTION

    npm gets its configuration values from 6 sources, in this priority:

    Command Line Flags

    Putting --foo bar on the command line sets the foo configuration parameter to "bar". A -- argument tells the cli parser to stop reading flags. A --flag parameter that is at the end of the command will be given the value of true.

    Environment Variables

    Any environment variables that start with npm_config_ will be interpreted as a configuration parameter. For example, putting npm_config_foo=bar in your environment will set the foo configuration parameter to bar. Any environment configurations that are not given a value will be given the value of true. Config values are case-insensitive, so NPM_CONFIG_FOO=bar will work the same.

    npmrc Files

    The four relevant files are:

    • per-project config file (/path/to/my/project/.npmrc)
    • per-user config file (~/.npmrc)
    • global config file ($PREFIX/npmrc)
    • npm builtin config file (/path/to/npm/npmrc)

    See npmrc(5) for more details.

    Default Configs

    A set of configuration parameters that are internal to npm, and are defaults if nothing else is specified.

    Shorthands and Other CLI Niceties

    The following shorthands are parsed on the command-line:

    • -v: --version
    • -h, -?, --help, -H: --usage
    • -s, --silent: --loglevel silent
    • -q, --quiet: --loglevel warn
    • -d: --loglevel info
    • -dd, --verbose: --loglevel verbose
    • -ddd: --loglevel silly
    • -g: --global
    • -C: --prefix
    • -l: --long
    • -m: --message
    • -p, --porcelain: --parseable
    • -reg: --registry
    • -v: --version
    • -f: --force
    • -desc: --description
    • -S: --save
    • -D: --save-dev
    • -O: --save-optional
    • -B: --save-bundle
    • -E: --save-exact
    • -y: --yes
    • -n: --yes false
    • ll and la commands: ls --long

    If the specified configuration param resolves unambiguously to a known configuration parameter, then it is expanded to that configuration parameter. For example:

    npm ls --par
    # same as:
    npm ls --parseable
    

    If multiple single-character shorthands are strung together, and the resulting combination is unambiguously not some other configuration param, then it is expanded to its various component pieces. For example:

    npm ls -gpld
    # same as:
    npm ls --global --parseable --long --loglevel info
    

    Per-Package Config Settings

    When running scripts (see npm-scripts(7)) the package.json "config" keys are overwritten in the environment if there is a config param of <name>[@<version>]:<key>. For example, if the package.json has this:

    { "name" : "foo"
    , "config" : { "port" : "8080" }
    , "scripts" : { "start" : "node server.js" } }
    

    and the server.js is this:

    http.createServer(...).listen(process.env.npm_package_config_port)
    

    then the user could change the behavior by doing:

    npm config set foo:port 80
    

    See package.json(5) for more information.

    Config Settings

    always-auth

    • Default: false
    • Type: Boolean

    Force npm to always require authentication when accessing the registry, even for GET requests.

    bin-links

    • Default: true
    • Type: Boolean

    Tells npm to create symlinks (or .cmd shims on Windows) for package executables.

    Set to false to have it not do this. This can be used to work around the fact that some file systems don't support symlinks, even on ostensibly Unix systems.

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm docs command to open websites.

    ca

    • Default: The npm CA certificate
    • Type: String, Array or null

    The Certificate Authority signing certificate that is trusted for SSL connections to the registry. Values should be in PEM format with newlines replaced by the string "\n". For example:

    ca="-----BEGIN CERTIFICATE-----\nXXXX\nXXXX\n-----END CERTIFICATE-----"
    

    Set to null to only allow "known" registrars, or to a specific CA cert to trust only that specific signing authority.

    Multiple CAs can be trusted by specifying an array of certificates:

    ca[]="..."
    ca[]="..."
    

    See also the strict-ssl config.

    cafile

    • Default: null
    • Type: path

    A path to a file containing one or multiple Certificate Authority signing certificates. Similar to the ca setting, but allows for multiple CAs, as well as for the CA information to be stored in a file on disk.

    cache

    • Default: Windows: %AppData%\npm-cache, Posix: ~/.npm
    • Type: path

    The location of npm's cache directory. See npm-cache(1)

    cache-lock-stale

    • Default: 60000 (1 minute)
    • Type: Number

    The number of ms before cache folder lockfiles are considered stale.

    cache-lock-retries

    • Default: 10
    • Type: Number

    Number of times to retry to acquire a lock on cache folder lockfiles.

    cache-lock-wait

    • Default: 10000 (10 seconds)
    • Type: Number

    Number of ms to wait for cache lock files to expire.

    cache-max

    • Default: Infinity
    • Type: Number

    The maximum time (in seconds) to keep items in the registry cache before re-checking against the registry.

    Note that no purging is done unless the npm cache clean command is explicitly used, and that only GET requests use the cache.

    cache-min

    • Default: 10
    • Type: Number

    The minimum time (in seconds) to keep items in the registry cache before re-checking against the registry.

    Note that no purging is done unless the npm cache clean command is explicitly used, and that only GET requests use the cache.

    cert

    • Default: null
    • Type: String

    A client certificate to pass when accessing the registry.

    color

    • Default: true on Posix, false on Windows
    • Type: Boolean or "always"

    If false, never shows colors. If "always" then always shows colors. If true, then only prints color codes for tty file descriptors.

    depth

    • Default: Infinity
    • Type: Number

    The depth to go when recursing directories for npm ls and npm cache ls.

    description

    • Default: true
    • Type: Boolean

    Show the description in npm search

    dev

    • Default: false
    • Type: Boolean

    Install dev-dependencies along with packages.

    Note that dev-dependencies are also installed if the npat flag is set.

    editor

    • Default: EDITOR environment variable if set, or "vi" on Posix, or "notepad" on Windows.
    • Type: path

    The command to run for npm edit or npm config edit.

    engine-strict

    • Default: false
    • Type: Boolean

    If set to true, then npm will stubbornly refuse to install (or even consider installing) any package that claims to not be compatible with the current Node.js version.

    force

    • Default: false
    • Type: Boolean

    Makes various commands more forceful.

    • lifecycle script failure does not block progress.
    • publishing clobbers previously published versions.
    • skips cache when requesting from the registry.
    • prevents checks against clobbering non-npm files.

    fetch-retries

    • Default: 2
    • Type: Number

    The "retries" config for the retry module to use when fetching packages from the registry.

    fetch-retry-factor

    • Default: 10
    • Type: Number

    The "factor" config for the retry module to use when fetching packages.

    fetch-retry-mintimeout

    • Default: 10000 (10 seconds)
    • Type: Number

    The "minTimeout" config for the retry module to use when fetching packages.

    fetch-retry-maxtimeout

    • Default: 60000 (1 minute)
    • Type: Number

    The "maxTimeout" config for the retry module to use when fetching packages.

    git

    • Default: "git"
    • Type: String

    The command to use for git commands. If git is installed on the computer, but is not in the PATH, then set this to the full path to the git binary.

    git-tag-version

    • Default: true
    • Type: Boolean

    Tag the commit when using the npm version command.

    global

    • Default: false
    • Type: Boolean

    Operates in "global" mode, so that packages are installed into the prefix folder instead of the current working directory. See npm-folders(5) for more on the differences in behavior.

    • packages are installed into the {prefix}/lib/node_modules folder, instead of the current working directory.
    • bin files are linked to {prefix}/bin
    • man pages are linked to {prefix}/share/man

    globalconfig

    • Default: {prefix}/etc/npmrc
    • Type: path

    The config file to read for global config options.

    group

    • Default: GID of the current process
    • Type: String or Number

    The group to use when running package scripts in global mode as the root user.

    heading

    • Default: "npm"
    • Type: String

    The string that starts all the debugging log output.

    https-proxy

    • Default: null
    • Type: url

    A proxy to use for outgoing https requests. If the HTTPS_PROXY or https_proxy or HTTP_PROXY or http_proxy environment variables are set, proxy settings will be honored by the underlying request library.

    ignore-scripts

    • Default: false
    • Type: Boolean

    If true, npm does not run scripts specified in package.json files.

    init-module

    • Default: ~/.npm-init.js
    • Type: path

    A module that will be loaded by the npm init command. See the documentation for the init-package-json module for more information, or npm-init(1).

    init-author-name

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's name.

    init-author-email

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's email.

    init-author-url

    • Default: ""
    • Type: String

    The value npm init should use by default for the package author's homepage.

    init-license

    • Default: "ISC"
    • Type: String

    The value npm init should use by default for the package license.

    init-version

    • Default: "0.0.0"
    • Type: semver

    The value that npm init should use by default for the package version number, if not already set in package.json.

    json

    • Default: false
    • Type: Boolean

    Whether or not to output JSON data, rather than the normal output.

    This feature is currently experimental, and the output data structures for many commands are either not implemented in JSON yet or are subject to change. Only the output from npm ls --json is currently valid.

    key

    • Default: null
    • Type: String

    A client key to pass when accessing the registry.

    link

    • Default: false
    • Type: Boolean

    If true, then local installs will link if there is a suitable globally installed package.

    Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions is met:

    • The package is not already installed globally, or
    • the globally installed version is identical to the version that is being installed locally.

    local-address

    • Default: undefined
    • Type: IP Address

    The IP address of the local interface to use when making connections to the npm registry. Must be IPv4 in versions of Node prior to 0.12.

    loglevel

    • Default: "warn"
    • Type: String
    • Values: "silent", "error", "warn", "http", "info", "verbose", "silly"

    What level of logs to report. On failure, all logs are written to npm-debug.log in the current working directory.

    Any logs of a higher level than the setting are shown. The default is "warn", which shows warn and error output.

    logstream

    • Default: process.stderr
    • Type: Stream

    This is the stream that is passed to the npmlog module at run time.

    It cannot be set from the command line, but if you are using npm programmatically, you may wish to send logs to somewhere other than stderr.

    If the color config is set to true, then this stream will receive colored output if it is a TTY.

    long

    • Default: false
    • Type: Boolean

    Show extended information in npm ls and npm search.

    message

    • Default: "%s"
    • Type: String

    The commit message used by npm version when creating a version commit.

    Any "%s" in the message will be replaced with the version number.

    node-version

    • Default: process.version
    • Type: semver or false

    The node version to use when checking a package's engines map.

    npat

    • Default: false
    • Type: Boolean

    Run tests on installation.

    onload-script

    • Default: false
    • Type: path

    A node module to require() when npm loads. Useful for programmatic usage.

    optional

    • Default: true
    • Type: Boolean

    Attempt to install packages in the optionalDependencies object. Note that if these packages fail to install, the overall installation process is not aborted.

    parseable

    • Default: false
    • Type: Boolean

    Output parseable results from commands that write to standard output.

    prefix

    The location to install global items. If set on the command line, then it forces non-global commands to run in the specified folder.

    production

    • Default: false
    • Type: Boolean

    Set to true to run in "production" mode.

    1. devDependencies are not installed at the topmost level when running local npm install without any arguments.
    2. NODE_ENV="production" is set for lifecycle scripts.

    proprietary-attribs

    • Default: true
    • Type: Boolean

    Whether or not to include proprietary extended attributes in the tarballs created by npm.

    Unless you are expecting to unpack package tarballs with something other than npm -- particularly a very outdated tar implementation -- leave this as true.

    proxy

    • Default: null
    • Type: url

    A proxy to use for outgoing http requests. If the HTTP_PROXY or http_proxy environment variables are set, proxy settings will be honored by the underlying request library.

    rebuild-bundle

    • Default: true
    • Type: Boolean

    Rebuild bundled dependencies after installation.

    registry

    • Default: https://registry.npmjs.org/
    • Type: url

    The base URL of the npm package registry.

    rollback

    • Default: true
    • Type: Boolean

    Remove failed installs.

    save

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as dependencies.

    When used with the npm rm command, it removes the package from the dependencies object.

    Only works if there is already a package.json file present.

    save-bundle

    • Default: false
    • Type: Boolean

    If a package would be saved at install time by the use of --save, --save-dev, or --save-optional, then also put it in the bundleDependencies list.

    When used with the npm rm command, it removes the package from the bundledDependencies list.

    save-dev

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as devDependencies.

    When used with the npm rm command, it removes the package from the devDependencies object.

    Only works if there is already a package.json file present.

    save-exact

    • Default: false
    • Type: Boolean

    Dependencies saved to package.json using --save, --save-dev or --save-optional will be configured with an exact version rather than using npm's default semver range operator.

    save-optional

    • Default: false
    • Type: Boolean

    Save installed packages to a package.json file as optionalDependencies.

    When used with the npm rm command, it removes the package from the optionalDependencies object.

    Only works if there is already a package.json file present.

    save-prefix

    • Default: '^'
    • Type: String

    Configure how versions of packages installed to a package.json file via --save or --save-dev get prefixed.

    For example if a package has version 1.2.3, by default its version is set to ^1.2.3 which allows minor upgrades for that package, but after npm config set save-prefix='~' it would be set to ~1.2.3 which only allows patch upgrades.

    scope

    • Default: ""
    • Type: String

    Associate an operation with a scope for a scoped registry. Useful when logging in to a private registry for the first time: npm login --scope=@organization --registry=registry.organization.com, which will cause @organization to be mapped to the registry for future installation of packages specified according to the pattern @organization/package.

    searchopts

    • Default: ""
    • Type: String

    Space-separated options that are always passed to search.

    searchexclude

    • Default: ""
    • Type: String

    Space-separated options that limit the results from search.

    searchsort

    • Default: "name"
    • Type: String
    • Values: "name", "-name", "date", "-date", "description", "-description", "keywords", "-keywords"

    Indication of which field to sort search results by. Prefix with a - character to indicate reverse sort.

    shell

    • Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows
    • Type: path

    The shell to run for the npm explore command.

    shrinkwrap

    • Default: true
    • Type: Boolean

    If set to false, then ignore npm-shrinkwrap.json files when installing.

    sign-git-tag

    • Default: false
    • Type: Boolean

    If set to true, then the npm version command will tag the version using -s to add a signature.

    Note that git requires you to have set up GPG keys in your git configs for this to work properly.

    spin

    • Default: true
    • Type: Boolean or "always"

    When set to true, npm will display an ascii spinner while it is doing things, if process.stderr is a TTY.

    Set to false to suppress the spinner, or set to always to output the spinner even for non-TTY outputs.

    strict-ssl

    • Default: true
    • Type: Boolean

    Whether or not to do SSL key validation when making requests to the registry via https.

    See also the ca config.

    tag

    • Default: latest
    • Type: String

    If you ask npm to install a package and don't tell it a specific version, then it will install the specified tag.

    Also the tag that is added to the package@version specified by the npm tag command, if no explicit tag is given.

    tmp

    • Default: TMPDIR environment variable, or "/tmp"
    • Type: path

    Where to store temporary files and folders. All temp files are deleted on success, but left behind on failure for forensic purposes.

    unicode

    • Default: true
    • Type: Boolean

    When set to true, npm uses unicode characters in the tree output. When false, it uses ascii characters to draw trees.

    unsafe-perm

    • Default: false if running as root, true otherwise
    • Type: Boolean

    Set to true to suppress the UID/GID switching when running package scripts. If set explicitly to false, then installing as a non-root user will fail.

    usage

    • Default: false
    • Type: Boolean

    Set to show short usage output (like the -H output) instead of complete help when doing npm-help(1).

    user

    • Default: "nobody"
    • Type: String or Number

    The UID to set to when running package scripts as root.

    userconfig

    • Default: ~/.npmrc
    • Type: path

    The location of user-level configuration settings.

    umask

    • Default: 022
    • Type: Octal numeric string

    The "umask" value to use when setting the file creation mode on files and folders.

    Folders and executables are given a mode which is 0777 masked against this value. Other files are given a mode which is 0666 masked against this value. Thus, the defaults are 0755 and 0644 respectively.
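
    As a rough sketch (not npm's actual code), the masking can be computed like this:

    // Illustrative only: how a umask of 022 yields the default modes.
    var umask = parseInt('022', 8)
    var dirMode  = (parseInt('777', 8) & ~umask).toString(8) // '755' for folders and executables
    var fileMode = (parseInt('666', 8) & ~umask).toString(8) // '644' for other files
    console.log(dirMode, fileMode)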

    user-agent

    • Default: node/{process.version} {process.platform} {process.arch}
    • Type: String

    Sets the User-Agent request header.

    version

    • Default: false
    • Type: boolean

    If true, output the npm version and exit successfully.

    Only relevant when specified explicitly on the command line.

    versions

    • Default: false
    • Type: boolean

    If true, output the npm version as well as node's process.versions map, and exit successfully.

    Only relevant when specified explicitly on the command line.

    viewer

    • Default: "man" on Posix, "browser" on Windows
    • Type: path

    The program to use to view help content.

    Set to "browser" to view html help content in the default web browser.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-developers.html

    npm-developers

    Developer Guide

    DESCRIPTION

    So, you've decided to use npm to develop (and maybe publish/deploy) your project.

    Fantastic!

    There are a few things that you need to do above the simple steps that your users will do to install your program.

    About These Documents

    These are man pages. If you install npm, you should be able to then do man npm-thing to get the documentation on a particular topic, or npm help thing to see the same information.

    What is a package

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a git url that, when cloned, results in (a).

    Even if you never publish your package, you can still get a lot of the benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    The package.json File

    You need to have a package.json file in the root of your project to do much of anything with npm. That is basically the whole interface.

    See package.json(5) for details about what goes in that file. At the very least, you need:

    • name: This should be a string that identifies your project. Please do not use the name to specify that it runs on node, or is in JavaScript. You can use the "engines" field to explicitly state the versions of node (or whatever else) that your program requires, and it's pretty well assumed that it's javascript.

      It does not necessarily need to match your github repository name.

      So, node-foo and bar-js are bad names. foo or bar are better.

    • version: A semver-compatible version.

    • engines: Specify the versions of node (or whatever else) that your program runs on. The node API changes a lot, and there may be bugs or new functionality that you depend on. Be explicit.

    • author: Take some credit.

    • scripts: If you have a special compilation or installation script, then you should put it in the scripts object. You should definitely have at least a basic smoke-test command as the "scripts.test" field. See npm-scripts(7).

    • main: If you have a single module that serves as the entry point to your program (like what the "foo" package gives you at require("foo")), then you need to specify that in the "main" field.

    • directories: This is an object mapping names to folders. The best ones to include are "lib" and "doc", but if you use "man" to specify a folder full of man pages, they'll get installed just like these ones.

    You can use npm init in the root of your package in order to get you started with a pretty basic package.json file. See npm-init(1) for more info.
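
    For illustration only, a minimal package.json touching the fields above might look like this (all names and values here are made up):

    {
      "name": "foo",
      "version": "1.2.3",
      "author": "Jane Doe <jane@example.com>",
      "main": "lib/foo.js",
      "scripts": {
        "test": "node test/basic.js"
      },
      "engines": {
        "node": ">=0.10"
      },
      "directories": {
        "lib": "lib",
        "doc": "doc"
      }
    }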

    Keeping files out of your package

    Use a .npmignore file to keep stuff out of your package. If there's no .npmignore file, but there is a .gitignore file, then npm will ignore the stuff matched by the .gitignore file. If you want to include something that is excluded by your .gitignore file, you can create an empty .npmignore file to override it.

    .npmignore files follow the same pattern rules as .gitignore files:

    • Blank lines or lines starting with # are ignored.
    • Standard glob patterns work.
    • You can end patterns with a forward slash / to specify a directory.
    • You can negate a pattern by starting it with an exclamation point !.
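
    A short, hypothetical .npmignore illustrating these rules:

    # scratch space and logs (comments and blank lines are ignored)
    build/
    *.log
    examples/big-fixture-*
    !examples/big-fixture-keep.json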

    By default, the following paths and files are ignored, so there's no need to add them to .npmignore explicitly:

    • .*.swp
    • ._*
    • .DS_Store
    • .git
    • .hg
    • .lock-wscript
    • .svn
    • .wafpickle-*
    • CVS
    • npm-debug.log

    Additionally, everything in node_modules is ignored, except for bundled dependencies. npm automatically handles this for you, so don't bother adding node_modules to .npmignore.

    The following paths and files are never ignored, so adding them to .npmignore is pointless:

    • package.json
    • README (and its variants)
    • CHANGELOG (and its variants)
    • LICENSE / LICENCE

    Link Packages

    npm link is designed to install a development package and see the changes in real time without having to keep re-installing it. (You do need to either re-link or npm rebuild -g to update compiled packages, of course.)

    More info at npm-link(1).

    Before Publishing: Make Sure Your Package Installs and Works

    This is important.

    If you cannot install it locally, you'll have problems trying to publish it. Or, worse yet, you'll be able to publish it, but you'll be publishing a broken or pointless package. So don't do that.

    In the root of your package, do this:

    npm install . -g
    

    That'll show you that it's working. If you'd rather just create a symlink package that points to your working directory, then do this:

    npm link
    

    Use npm ls -g to see if it's there.

    To test a local install, go into some other folder, and then do:

    cd ../some-other-folder
    npm install ../my-package
    

    to install it locally into the node_modules folder in that other place.

    Then go into the node-repl, and try using require("my-thing") to bring in your module's main module.

    Create a User Account

    Create a user with the adduser command. It works like this:

    npm adduser
    

    and then follow the prompts.

    This is documented better in npm-adduser(1).

    Publish your package

    This part's easy. In the root of your folder, do this:

    npm publish
    

    You can give publish a url to a tarball, or a filename of a tarball, or a path to a folder.

    Note that pretty much everything in that folder will be exposed by default. So, if you have secret stuff in there, use a .npmignore file to list out the globs to ignore, or publish from a fresh checkout.

    Brag about it

    Send emails, write blogs, blab in IRC.

    Tell the world how easy it is to install your program!

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-disputes.html

    npm-disputes

    Handling Module Name Disputes

    SYNOPSIS

    1. Get the author email with npm owner ls <pkgname>
    2. Email the author, CC support@npmjs.com
    3. After a few weeks, if there's no resolution, we'll sort it out.

    Don't squat on package names. Publish code or move out of the way.

    DESCRIPTION

    There sometimes arise cases where a user publishes a module, and then later, some other user wants to use that name. Here are some common ways that happens (each of these is based on actual events.)

    1. Joe writes a JavaScript module foo, which is not node-specific. Joe doesn't use node at all. Bob wants to use foo in node, so he wraps it in an npm module. Some time later, Joe starts using node, and wants to take over management of his program.
    2. Bob writes an npm module foo, and publishes it. Perhaps much later, Joe finds a bug in foo, and fixes it. He sends a pull request to Bob, but Bob doesn't have the time to deal with it, because he has a new job and a new baby and is focused on his new erlang project, and kind of not involved with node any more. Joe would like to publish a new foo, but can't, because the name is taken.
    3. Bob writes a 10-line flow-control library, and calls it foo, and publishes it to the npm registry. Being a simple little thing, it never really has to be updated. Joe works for Foo Inc, the makers of the critically acclaimed and widely-marketed foo JavaScript toolkit framework. They publish it to npm as foojs, but people are routinely confused when npm install foo is some different thing.
    4. Bob writes a parser for the widely-known foo file format, because he needs it for work. Then, he gets a new job, and never updates the prototype. Later on, Joe writes a much more complete foo parser, but can't publish, because Bob's foo is in the way.

    The validity of Joe's claim in each situation can be debated. However, Joe's appropriate course of action in each case is the same.

    1. npm owner ls foo. This will tell Joe the email address of the owner (Bob).
    2. Joe emails Bob, explaining the situation as respectfully as possible, and what he would like to do with the module name. He adds the npm support staff support@npmjs.com to the CC list of the email. Mention in the email that Bob can run npm owner add joe foo to add Joe as an owner of the foo package.
    3. After a reasonable amount of time, if Bob has not responded, or if Bob and Joe can't come to any sort of resolution, email support support@npmjs.com and we'll sort it out. ("Reasonable" is usually at least 4 weeks, but extra time is allowed around common holidays.)

    REASONING

    In almost every case so far, the parties involved have been able to reach an amicable resolution without any major intervention. Most people really do want to be reasonable, and are probably not even aware that they're in your way.

    Module ecosystems are most vibrant and powerful when they are as self-directed as possible. If an admin one day deletes something you had worked on, then that is going to make most people quite upset, regardless of the justification. When humans solve their problems by talking to other humans with respect, everyone has the chance to end up feeling good about the interaction.

    EXCEPTIONS

    Some things are not allowed, and will be removed without discussion if they are brought to the attention of the npm registry admins, including but not limited to:

    1. Malware (that is, a package designed to exploit or harm the machine on which it is installed).
    2. Violations of copyright or licenses (for example, cloning an MIT-licensed program, and then removing or changing the copyright and license statement).
    3. Illegal content.
    4. "Squatting" on a package name that you plan to use, but aren't actually using. Sorry, I don't care how great the name is, or how perfect a fit it is for the thing that someday might happen. If someone wants to use it today, and you're just taking up space with an empty tarball, you're going to be evicted.
    5. Putting empty packages in the registry. Packages must have SOME functionality. It can be silly, but it can't be nothing. (See also: squatting.)
    6. Doing weird things with the registry, like using it as your own personal application database or otherwise putting non-packagey things into it.

    If you see bad behavior like this, please report it right away.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-faq.html

    npm-faq

    Frequently Asked Questions

    Where can I find these docs in HTML?

    https://docs.npmjs.com/, or run:

    npm config set viewer browser
    

    to open these documents in your default web browser rather than man.

    It didn't work.

    That's not really a question.

    Why didn't it work?

    I don't know yet.

    Read the error output, and if you can't figure out what it means, do what it says and post a bug with all the information it asks for.

    Where does npm put stuff?

    See npm-folders(5)

    tl;dr:

    • Use the npm root command to see where modules go, and the npm bin command to see where executables go
    • Global installs are different from local installs. If you install something with the -g flag, then its executables go in npm bin -g and its modules go in npm root -g.

    How do I install something on my computer in a central location?

    Install it globally by tacking -g or --global to the command. (This is especially important for command line utilities that need to add their bins to the global system PATH.)

    I installed something globally, but I can't require() it

    Install it locally.

    The global install location is a place for command-line utilities to put their bins in the system PATH. It's not for use with require().

    If you require() a module in your code, then that means it's a dependency, and a part of your program. You need to install it locally in your program.

    Why can't npm just put everything in one place, like other package managers?

    Not every change is an improvement, but every improvement is a change. This would be like asking git to do network IO for every commit. It's not going to happen, because it's a terrible idea that causes more problems than it solves.

    It is much harder to avoid dependency conflicts without nesting dependencies. This is fundamental to the way that npm works, and has proven to be an extremely successful approach. See npm-folders(5) for more details.

    If you want a package to be installed in one place, and have all your programs reference the same copy of it, then use the npm link command. That's what it's for. Install it globally, then link it into each program that uses it.

    Whatever, I really want the old style 'everything global' style.

    Write your own package manager. You could probably even wrap up npm in a shell script if you really wanted to.

    npm will not help you do something that is known to be a bad idea.

    Should I check my node_modules folder into git?

    Usually, no. Allow npm to resolve dependencies for your packages.

    For packages you deploy, such as websites and apps, you should use npm shrinkwrap to lock down your full dependency tree:

    https://docs.npmjs.com/cli/shrinkwrap

    If you are paranoid about depending on the npm ecosystem, you should run a private npm mirror or a private cache.

    If you want 100% confidence in being able to reproduce the specific bytes included in a deployment, you should use an additional mechanism that can verify contents rather than versions. For example, Amazon machine images, DigitalOcean snapshots, Heroku slugs, or simple tarballs.

    Is it 'npm' or 'NPM' or 'Npm'?

    npm should never be capitalized unless it is being displayed in a location that is customarily all-caps (such as the title of man pages.)

    If 'npm' is an acronym, why is it never capitalized?

    Contrary to the belief of many, "npm" is not in fact an abbreviation for "Node Package Manager". It is a recursive bacronymic abbreviation for "npm is not an acronym". (If it was "ninaa", then it would be an acronym, and thus incorrectly named.)

    "NPM", however, is an acronym (more precisely, a capitonym) for the National Association of Pastoral Musicians. You can learn more about them at http://npm.org/.

    In software, "NPM" is a Non-Parametric Mapping utility written by Chris Rorden. You can analyze pictures of brains with it. Learn more about the (capitalized) NPM program at http://www.cabiatl.com/mricro/npm/.

    The first seed that eventually grew into this flower was a bash utility named "pm", which was a shortened descendant of "pkgmakeinst", a bash function that was used to install various different things on different platforms, most often using Yahoo's yinst. If npm was ever an acronym for anything, it was node pm or maybe new pm.

    So, in all seriousness, the "npm" project is named after its command-line utility, which was organically selected to be easily typed by a right-handed programmer using a US QWERTY keyboard layout, ending with the right ring finger in a position to type the - key for flags and other command-line arguments. That command-line utility is always lower-case, though it starts most sentences it is a part of.

    How do I list installed packages?

    npm ls

    How do I search for packages?

    npm search

    Arguments are greps. npm search jsdom shows jsdom packages.

    How do I update npm?

    npm install npm -g
    

    You can also update all outdated local packages by doing npm update without any arguments, or global packages by doing npm update -g.

    Occasionally, the version of npm will progress such that the current version cannot be properly installed with the version that you have installed already. (Consider, if there is ever a bug in the update command.)

    In those cases, you can do this:

    curl https://www.npmjs.com/install.sh | sh
    

    What is a package?

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a git url that, when cloned, results in (a).

    Even if you never publish your package, you can still get a lot of the benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    What is a module?

    A module is anything that can be loaded with require() in a Node.js program. The following things are all examples of things that can be loaded as modules:

    • A folder with a package.json file containing a main field.
    • A folder with an index.js file in it.
    • A JavaScript file.

    Most npm packages are modules, because they are libraries that you load with require. However, there's no requirement that an npm package be a module! Some only contain an executable command-line interface, and don't provide a main field for use in Node programs.

    Almost all npm packages (at least, those that are Node programs) contain many modules within them (because every file they load with require() is a module).

    In the context of a Node program, the module is also the thing that was loaded from a file. For example, in the following program:

    var req = require('request')
    

    we might say that "The variable req refers to the request module".

    So, why is it the "node_modules" folder, but "package.json" file? Why not node_packages or module.json?

    The package.json file defines the package. (See "What is a package?" above.)

    The node_modules folder is the place Node.js looks for modules. (See "What is a module?" above.)

    For example, if you create a file at node_modules/foo.js and then have a program that does var f = require('foo.js'), it will load the module. However, foo.js is not a "package" in this case, because it does not have a package.json.

    Alternatively, if you create a package which does not have an index.js or a "main" field in the package.json file, then it is not a module. Even if it's installed in node_modules, it can't be an argument to require().

    "node_modules" is the name of my deity's arch-rival, and a Forbidden Word in my religion. Can I configure npm to use a different folder?

    No. This will never happen. This question comes up sometimes, because it seems silly from the outside that npm couldn't just be configured to put stuff somewhere else, and then npm could load them from there. It's an arbitrary spelling choice, right? What's the big deal?

    At the time of this writing, the string 'node_modules' appears 151 times in 53 separate files in npm and node core (excluding tests and documentation).

    Some of these references are in node's built-in module loader. Since npm is not involved at all at run-time, node itself would have to be configured to know where you've decided to stick stuff. Complexity hurdle #1. Since the Node module system is locked, this cannot be changed, and is enough to kill this request. But I'll continue, in deference to your deity's delicate feelings regarding spelling.

    Many of the others are in dependencies that npm uses, which are not necessarily tightly coupled to npm (in the sense that they do not read npm's configuration files, etc.) Each of these would have to be configured to take the name of the node_modules folder as a parameter. Complexity hurdle #2.

    Furthermore, npm has the ability to "bundle" dependencies by adding the dep names to the "bundledDependencies" list in package.json, which causes the folder to be included in the package tarball. What if the author of a module bundles its dependencies, and they use a different spelling for node_modules? npm would have to rename the folder at publish time, and then be smart enough to unpack it using your locally configured name. Complexity hurdle #3.

    Furthermore, what happens when you change this name? Fine, it's easy enough the first time, just rename the node_modules folders to ./blergyblerp/ or whatever name you choose. But what about when you change it again? npm doesn't currently track any state about past configuration settings, so this would be rather difficult to do properly. It would have to track every previous value for this config, and always accept any of them, or else yesterday's install may be broken tomorrow. Complexity hurdle #4.

    Never going to happen. The folder is named node_modules. It is written indelibly in the Node Way, handed down from the ancient times of Node 0.3.

    How do I install node with npm?

    You don't. Try one of these node version managers:

    Unix:

    Windows:

    How can I use npm for development?

    See npm-developers(7) and package.json(5).

    You'll most likely want to npm link your development folder. That's awesomely handy.

    To set up your own private registry, check out npm-registry(7).

    Can I list a url as a dependency?

    Yes. It should be a url to a gzipped tarball containing a single folder that has a package.json in its root, or a git url. (See "what is a package?" above.)
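
    For illustration, a dependencies entry using a tarball url and a git url (both hypothetical):

    "dependencies": {
      "foo": "https://example.com/foo-1.2.3.tgz",
      "bar": "git://github.com/user/project.git#commit-ish"
    }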

    See npm-link(1)

    The package registry website. What is that exactly?

    See npm-registry(7).

    I forgot my password, and can't publish. How do I reset it?

    Go to https://npmjs.com/forgot.

    I get ECONNREFUSED a lot. What's up?

    Either the registry is down, or node's DNS isn't able to reach out.

    To check if the registry is down, open up https://registry.npmjs.org/ in a web browser. This will also tell you if you are just unable to access the internet for some reason.

    If the registry IS down, let us know by emailing support@npmjs.com or posting an issue at https://github.com/npm/npm/issues. If it's down for the world (and not just on your local network) then we're probably already being pinged about it.

    You can also often get a faster response by visiting the #npm channel on Freenode IRC.

    Why no namespaces?

    npm has only one global namespace. If you want to namespace your own packages, you may simply use the - character to separate the names. npm is a mostly anarchic system. There is not sufficient need to impose namespace rules on everyone.

    As of 2.0, npm supports scoped packages, which allow you to publish a group of related modules without worrying about name collisions.

    Every npm user owns the scope associated with their username. For example, the user named npm owns the scope @npm. Scoped packages are published inside a scope by naming them as if they were files under the scope directory, e.g., by setting name in package.json to @npm/npm.

    Scoped packages can coexist with public npm packages in a private npm registry. At present (2014-11-04) scoped packages may NOT be published to the public npm registry.

    Unscoped packages can only depend on other unscoped packages. Scoped packages can depend on packages from their own scope, a different scope, or the public registry (unscoped).

    For the current documentation of scoped packages, see https://docs.npmjs.com/misc/scope

    References:

    1. For the reasoning behind the "one global namespace", please see this discussion: https://github.com/npm/npm/issues/798 (TL;DR: It doesn't actually make things better, and can make them worse.)

    2. For the pre-implementation discussion of the scoped package feature, see this discussion: https://github.com/npm/npm/issues/5239

    Who does npm?

    npm was originally written by Isaac Z. Schlueter, and many others have contributed to it, some of them quite substantially.

    The npm open source project, The npm Registry, and the community website are maintained and operated by the good folks at npm, Inc.

    I have a question or request not addressed here. Where should I put it?

    Post an issue on the github project:

    Why does npm hate me?

    npm is not capable of hatred. It loves everyone, especially you.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-index.html

    npm-index

    Index of all npm documentation

    README

    a JavaScript package manager

    Command Line Documentation

    Using npm on the command line

    npm(1)

    node package manager

    npm-adduser(1)

    Add a registry user account

    npm-bin(1)

    Display npm bin folder

    npm-bugs(1)

    Bugs for a package in a web browser maybe

    npm-build(1)

    Build a package

    npm-bundle(1)

    REMOVED

    npm-cache(1)

    Manipulates packages cache

    npm-completion(1)

    Tab Completion for npm

    npm-config(1)

    Manage the npm configuration files

    npm-dedupe(1)

    Reduce duplication

    npm-deprecate(1)

    Deprecate a version of a package

    npm-docs(1)

    Docs for a package in a web browser maybe

    npm-edit(1)

    Edit an installed package

    npm-explore(1)

    Browse an installed package

    npm-help-search(1)

    Search npm help documentation

    npm-help(1)

    Get help on npm

    npm-init(1)

    Interactively create a package.json file

    npm-install(1)

    Install a package

    npm-link(1)

    Symlink a package folder

    npm-ls(1)

    List installed packages

    npm-outdated(1)

    Check for outdated packages

    npm-owner(1)

    Manage package owners

    npm-pack(1)

    Create a tarball from a package

    npm-prefix(1)

    Display prefix

    npm-prune(1)

    Remove extraneous packages

    npm-publish(1)

    Publish a package

    npm-rebuild(1)

    Rebuild a package

    npm-repo(1)

    Open package repository page in the browser

    npm-restart(1)

    Restart a package

    npm-rm(1)

    Remove a package

    npm-root(1)

    Display npm root

    npm-run-script(1)

    Run arbitrary package scripts

    npm-search(1)

    Search for packages

    npm-shrinkwrap(1)

    Lock down dependency versions

    npm-star(1)

    Mark your favorite packages

    npm-stars(1)

    View packages marked as favorites

    npm-start(1)

    Start a package

    npm-stop(1)

    Stop a package

    npm-tag(1)

    Tag a published version

    npm-test(1)

    Test a package

    npm-uninstall(1)

    Remove a package

    npm-unpublish(1)

    Remove a package from the registry

    npm-update(1)

    Update a package

    npm-version(1)

    Bump a package version

    npm-view(1)

    View registry info

    npm-whoami(1)

    Display npm username

    API Documentation

    Using npm in your Node programs

    npm(3)

    node package manager

    npm-bin(3)

    Display npm bin folder

    npm-bugs(3)

    Bugs for a package in a web browser maybe

    npm-cache(3)

    manage the npm cache programmatically

    npm-commands(3)

    npm commands

    npm-config(3)

    Manage the npm configuration files

    npm-deprecate(3)

    Deprecate a version of a package

    npm-docs(3)

    Docs for a package in a web browser maybe

    npm-edit(3)

    Edit an installed package

    npm-explore(3)

    Browse an installed package

    npm-help-search(3)

    Search the help pages

    npm-init(3)

    Interactively create a package.json file

    npm-install(3)

    install a package programmatically

    npm-link(3)

    Symlink a package folder

    npm-load(3)

    Load config settings

    npm-ls(3)

    List installed packages

    npm-outdated(3)

    Check for outdated packages

    npm-owner(3)

    Manage package owners

    npm-pack(3)

    Create a tarball from a package

    npm-prefix(3)

    Display prefix

    npm-prune(3)

    Remove extraneous packages

    npm-publish(3)

    Publish a package

    npm-rebuild(3)

    Rebuild a package

    npm-repo(3)

    Open package repository page in the browser

    npm-restart(3)

    Restart a package

    npm-root(3)

    Display npm root

    npm-run-script(3)

    Run arbitrary package scripts

    npm-search(3)

    Search for packages

    npm-shrinkwrap(3)

    programmatically generate package shrinkwrap file

    npm-start(3)

    Start a package

    npm-stop(3)

    Stop a package

    npm-tag(3)

    Tag a published version

    npm-test(3)

    Test a package

    npm-uninstall(3)

    uninstall a package programmatically

    npm-unpublish(3)

    Remove a package from the registry

    npm-update(3)

    Update a package

    npm-version(3)

    Bump a package version

    npm-view(3)

    View registry info

    npm-whoami(3)

    Display npm username

    Files

    File system structures npm uses

    npm-folders(5)

    Folder Structures Used by npm

    npmrc(5)

    The npm config files

    package.json(5)

    Specifics of npm's package.json handling

    Misc

    Various other bits and bobs

    npm-coding-style(7)

    npm's "funny" coding style

    npm-config(7)

    More than you probably want to know about npm configuration

    npm-developers(7)

    Developer Guide

    npm-disputes(7)

    Handling Module Name Disputes

    npm-faq(7)

    Frequently Asked Questions

    npm-index(7)

    Index of all npm documentation

    npm-registry(7)

    The JavaScript Package Registry

    npm-scope(7)

    Scoped packages

    npm-scripts(7)

    How npm handles the "scripts" field

    removing-npm(7)

    Cleaning the Slate

    semver(7)

    The semantic versioner for npm

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-registry.html

    npm-registry

    The JavaScript Package Registry

    DESCRIPTION

    To resolve packages by name and version, npm talks to a registry website that implements the CommonJS Package Registry specification for reading package info.

    Additionally, npm's package registry implementation supports several write APIs as well, to allow for publishing packages and managing user account information.

    The official public npm registry is at http://registry.npmjs.org/. It is powered by a CouchDB database, of which there is a public mirror at http://skimdb.npmjs.com/registry. The code for the couchapp is available at http://github.com/npm/npm-registry-couchapp.

    The registry URL used is determined by the scope of the package (see npm-scope(7)). If no scope is specified, the default registry is used, which is supplied by the registry config parameter. See npm-config(1), npmrc(5), and npm-config(7) for more on managing npm's configuration.

    Can I run my own private registry?

    Yes!

    The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs.

    If you set up continuous replication from the official CouchDB, and then set your internal CouchDB as the registry config, then you'll be able to read any published packages, in addition to your private ones, and by default will only publish internally. If you then want to publish a package for the whole world to see, you can simply override the --registry config for that command.

    I don't want my package published in the official registry. It's private.

    Set "private": true in your package.json to prevent it from being published at all, or "publishConfig":{"registry":"http://my-internal-registry.local"} to force it to be published only to your internal registry.

    See package.json(5) for more info on what goes in the package.json file.

    Will you replicate from my registry into the public one?

    No. If you want things to be public, then publish them into the public registry using npm. What little security there is would be for nought otherwise.

    Do I have to use couchdb to build a registry that npm can talk to?

    No, but it's way easier. Basically, yes, you do, or you have to effectively implement the entire CouchDB API anyway.

    Is there a website or something to see package docs and such?

    Yes, head over to https://npmjs.com/

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-scope.html

    npm-scope

    Scoped packages

    DESCRIPTION

    All npm packages have a name. Some package names also have a scope. A scope follows the usual rules for package names (url-safe characters, no leading dots or underscores). When used in package names, scopes are preceded by an @-symbol and followed by a slash, e.g.

    @somescope/somepackagename
    

    Scopes are a way of grouping related packages together, and also affect a few things about the way npm treats the package.

    As of 2014-09-03, scoped packages are not supported by the public npm registry. However, the npm client is backwards-compatible with un-scoped registries, so it can be used to work with scoped and un-scoped registries at the same time.

    Installing scoped packages

    Scoped packages are installed to a sub-folder of the regular installation folder, e.g. if your other packages are installed in node_modules/packagename, scoped modules will be in node_modules/@myorg/packagename. The scope folder (@myorg) is simply the name of the scope preceded by an @-symbol, and can contain any number of scoped packages.

    A scoped package is installed by referencing it by name, preceded by an @-symbol, in npm install:

    npm install @myorg/mypackage
    

    Or in package.json:

    "dependencies": {
      "@myorg/mypackage": "^1.3.0"
    }
    

    Note that if the @-symbol is omitted in either case npm will instead attempt to install from GitHub; see npm-install(1).

    Requiring scoped packages

    Because scoped packages are installed into a scope folder, you have to include the name of the scope when requiring them in your code, e.g.

    require('@myorg/mypackage')
    

    There is nothing special about the way Node treats scope folders; this is simply requiring the module mypackage from the folder named @myorg.

    Publishing scoped packages

    Scoped packages can be published to any registry that supports them. As of 2014-09-03, the public npm registry does not support scoped packages, so attempting to publish a scoped package to the registry will fail unless you have associated that scope with a different registry, see below.

    Associating a scope with a registry

    Scopes can be associated with a separate registry. This allows you to seamlessly use a mix of packages from the public npm registry and one or more private registries, such as npm Enterprise.

    You can associate a scope with a registry at login, e.g.

    npm login --registry=http://reg.example.com --scope=@myco
    

    Scopes have a many-to-one relationship with registries: one registry can host multiple scopes, but a scope only ever points to one registry.

    You can also associate a scope with a registry using npm config:

    npm config set @myco:registry http://reg.example.com
    

    Once a scope is associated with a registry, any npm install for a package with that scope will request packages from that registry instead. Any npm publish for a package name that contains the scope will be published to that registry instead.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/npm-scripts.html

    npm-scripts

    How npm handles the "scripts" field

    DESCRIPTION

    npm supports the "scripts" property of the package.json script, for the following scripts:

    • prepublish: Run BEFORE the package is published. (Also run on local npm install without any arguments.)
    • publish, postpublish: Run AFTER the package is published.
    • preinstall: Run BEFORE the package is installed
    • install, postinstall: Run AFTER the package is installed.
    • preuninstall, uninstall: Run BEFORE the package is uninstalled.
    • postuninstall: Run AFTER the package is uninstalled.
    • pretest, test, posttest: Run by the npm test command.
    • prestop, stop, poststop: Run by the npm stop command.
    • prestart, start, poststart: Run by the npm start command.
    • prerestart, restart, postrestart: Run by the npm restart command. Note: npm restart will run the stop and start scripts if no restart script is provided.

    Additionally, arbitrary scripts can be executed by running npm run-script <pkg> <stage>. Pre and post commands with matching names will be run for those as well (e.g. premyscript, myscript, postmyscript).

    NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN

    tl;dr Don't use install. Use a .gyp file for compilation, and prepublish for anything else.

    You should almost never have to explicitly set a preinstall or install script. If you are doing this, please consider if there is another option.

    The only valid use of install or preinstall scripts is for compilation which must be done on the target architecture. In early versions of node, this was often done using the node-waf scripts, or a standalone Makefile, and early versions of npm required that it be explicitly set in package.json. This was not portable, and harder to do properly.

    In the current version of node, the standard way to do this is using a .gyp file. If you have a file with a .gyp extension in the root of your package, then npm will run the appropriate node-gyp commands automatically at install time. This is the only officially supported method for compiling binary addons, and does not require that you add anything to your package.json file.

    If you have to do other things before your package is used, in a way that is not dependent on the operating system or architecture of the target system, then use a prepublish script instead. This includes tasks such as:

    • Compile CoffeeScript source code into JavaScript.
    • Create minified versions of JavaScript source code.
    • Fetch remote resources that your package will use.

    The advantage of doing these things at prepublish time instead of preinstall or install time is that they can be done once, in a single place, and thus greatly reduce complexity and variability. Additionally, this means that:

    • You can depend on coffee-script as a devDependency, and thus your users don't need to have it installed.
    • You don't need to include the minifiers in your package, reducing the size for your users.
    • You don't need to rely on your users having curl or wget or other system tools on the target machines.

    DEFAULT VALUES

    npm will default some script values based on package contents.

    • "start": "node server.js":

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "preinstall": "node-waf clean || true; node-waf configure build":

      If there is a wscript file in the root of your package, npm will default the preinstall command to compile using node-waf.

    USER

    If npm was invoked with root privileges, then it will change the uid to the user account or uid specified by the user config, which defaults to nobody. Set the unsafe-perm flag to run scripts with root privileges.

    ENVIRONMENT

    Package scripts run in an environment where many pieces of information are made available regarding the setup of npm and the current state of the process.

    path

    If you depend on modules that define executable scripts, like test suites, then those executables will be added to the PATH for executing the scripts. So, if your package.json has this:

    { "name" : "foo"
    , "dependencies" : { "bar" : "0.1.x" }
    , "scripts": { "start" : "bar ./test" } }
    

    then you could run npm start to execute the bar script, which is exported into the node_modules/.bin directory on npm install.

    package.json vars

    The package.json fields are tacked onto the npm_package_ prefix. So, for instance, if you had {"name":"foo", "version":"1.2.5"} in your package.json file, then your package scripts would have the npm_package_name environment variable set to "foo", and the npm_package_version set to "1.2.5".
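
    For example, a package script could read those values like this (a small sketch, assuming the {"name":"foo", "version":"1.2.5"} package above):

    console.log(process.env.npm_package_name)    // "foo"
    console.log(process.env.npm_package_version) // "1.2.5"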

    configuration

    Configuration parameters are put in the environment with the npm_config_ prefix. For instance, you can view the effective root config by checking the npm_config_root environment variable.

    Special: package.json "config" object

    The package.json "config" keys are overwritten in the environment if there is a config param of <name>[@<version>]:<key>. For example, if the package.json has this:

    { "name" : "foo"
    , "config" : { "port" : "8080" }
    , "scripts" : { "start" : "node server.js" } }
    

    and the server.js is this:

    http.createServer(...).listen(process.env.npm_package_config_port)
    

    then the user could change the behavior by doing:

    npm config set foo:port 80
    

    current lifecycle event

    Lastly, the npm_lifecycle_event environment variable is set to whichever stage of the cycle is being executed. So, you could have a single script used for different parts of the process which switches based on what's currently happening.

    Objects are flattened following this format, so if you had {"scripts":{"install":"foo.js"}} in your package.json, then you'd see this in the script:

    process.env.npm_package_scripts_install === "foo.js"
    

    EXAMPLES

    For example, if your package.json contains this:

    { "scripts" :
      { "install" : "scripts/install.js"
      , "postinstall" : "scripts/install.js"
      , "uninstall" : "scripts/uninstall.js"
      }
    }
    

    then the scripts/install.js will be called for the install and post-install stages of the lifecycle, and the scripts/uninstall.js would be called when the package is uninstalled. Since scripts/install.js is running for two different phases, it would be wise in this case to look at the npm_lifecycle_event environment variable.

    If you want to run a make command, you can do so. This works just fine:

    { "scripts" :
      { "preinstall" : "./configure"
      , "install" : "make && make install"
      , "test" : "make test"
      }
    }
    

    EXITING

    Scripts are run by passing the line as a script argument to sh.

    If the script exits with a code other than 0, then this will abort the process.

    Note that these script files don't have to be nodejs or even javascript programs. They just have to be some kind of executable file.

    HOOK SCRIPTS

    If you want to run a specific script at a specific lifecycle event for ALL packages, then you can use a hook script.

    Place an executable file at node_modules/.hooks/{eventname}, and it'll get run for all packages when they are going through that point in the package lifecycle for any packages installed in that root.

    Hook scripts are run exactly the same way as package.json scripts. That is, they are in a separate child process, with the env described above.

    BEST PRACTICES

    • Don't exit with a non-zero error code unless you really mean it. Except for uninstall scripts, this will cause the npm action to fail, and potentially be rolled back. If the failure is minor or only will prevent some optional features, then it's better to just print a warning and exit successfully.
    • Try not to use scripts to do what npm can do for you. Read through package.json(5) to see all the things that you can specify and enable by simply describing your package appropriately. In general, this will lead to a more robust and consistent state.
    • Inspect the env to determine where to put things. For instance, if the npm_config_binroot environment variable is set to /home/user/bin, then don't try to install executables into /usr/local/bin. The user probably set it up that way for a reason.
    • Don't prefix your script commands with "sudo". If root permissions are required for some reason, then it'll fail with that error, and the user will sudo the npm command in question.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/removing-npm.html

    npm-removal

    Cleaning the Slate

    SYNOPSIS

    So sad to see you go.

    sudo npm uninstall npm -g
    

    Or, if that fails, get the npm source code, and do:

    sudo make uninstall
    

    More Severe Uninstalling

    Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed.

    If that doesn't work, or if you require more drastic measures, continue reading.

    Note that this is only necessary for globally-installed packages. Local installs are completely contained within a project's node_modules folder. Delete that folder, and everything is gone (unless a package's install script is particularly ill-behaved).

    This assumes that you installed node and npm in the default place. If you configured node with a different --prefix, or installed npm with a different prefix setting, then adjust the paths accordingly, replacing /usr/local with your install prefix.

    To remove everything npm-related manually:

    rm -rf /usr/local/{lib/node{,/.npm,_modules},bin,share/man}/npm*
    

    If you installed things with npm, then your best bet is to uninstall them with npm first, and then install them again once you have a proper install. This can help find any symlinks that are lying around:

    ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm
    

    Prior to version 0.3, npm used shim files for executables and node modules. To track those down, you can do the following:

    find /usr/local/{lib/node,bin} -exec grep -l npm \{\} \; ;
    

    (This is also in the README file.)

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/misc/semver.html

    semver

    The semantic versioner for npm

    Usage

    $ npm install semver
    
    semver.valid('1.2.3') // '1.2.3'
    semver.valid('a.b.c') // null
    semver.clean('  =v1.2.3   ') // '1.2.3'
    semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true
    semver.gt('1.2.3', '9.8.7') // false
    semver.lt('1.2.3', '9.8.7') // true
    

    As a command-line utility:

    $ semver -h
    
    Usage: semver <version> [<version> [...]] [-r <range> | -i <inc> | --preid <identifier> | -l | -rv]
    Test if version(s) satisfy the supplied range(s), and sort them.
    
    Multiple versions or ranges may be supplied, unless increment
    option is specified.  In that case, only a single version may
    be used, and it is incremented by the specified level
    
    Program exits successfully if any valid version satisfies
    all supplied ranges, and prints all satisfying versions.
    
    If no versions are valid, or ranges are not satisfied,
    then exits failure.
    
    Versions are printed in ascending order, so supplying
    multiple versions to the utility will just sort them.
    

    Versions

    A "version" is described by the v2.0.0 specification found at http://semver.org/.

    A leading "=" or "v" character is stripped off and ignored.

    Ranges

    A version range is a set of comparators which specify versions that satisfy the range.

    A comparator is composed of an operator and a version. The set of primitive operators is:

    • < Less than
    • <= Less than or equal to
    • > Greater than
    • >= Greater than or equal to
    • = Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included.

    For example, the comparator >=1.2.7 would match the versions 1.2.7, 1.2.8, 2.5.3, and 1.3.9, but not the versions 1.2.6 or 1.1.0.

    Comparators can be joined by whitespace to form a comparator set, which is satisfied by the intersection of all of the comparators it includes.

    A range is composed of one or more comparator sets, joined by ||. A version matches a range if and only if every comparator in at least one of the ||-separated comparator sets is satisfied by the version.

    For example, the range >=1.2.7 <1.3.0 would match the versions 1.2.7, 1.2.8, and 1.2.99, but not the versions 1.2.6, 1.3.0, or 1.1.0.

    The range 1.2.7 || >=1.2.9 <2.0.0 would match the versions 1.2.7, 1.2.9, and 1.4.6, but not the versions 1.2.8 or 2.0.0.
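
    These range examples can be checked directly with the library (a small illustrative sketch):

    var semver = require('semver')
    semver.satisfies('1.2.8', '>=1.2.7 <1.3.0')          // true
    semver.satisfies('1.3.0', '>=1.2.7 <1.3.0')          // false
    semver.satisfies('1.4.6', '1.2.7 || >=1.2.9 <2.0.0') // true
    semver.satisfies('1.2.8', '1.2.7 || >=1.2.9 <2.0.0') // false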

    Prerelease Tags

    If a version has a prerelease tag (for example, 1.2.3-alpha.3) then it will only be allowed to satisfy comparator sets if at least one comparator with the same [major, minor, patch] tuple also has a prerelease tag.

    For example, the range >1.2.3-alpha.3 would be allowed to match the version 1.2.3-alpha.7, but it would not be satisfied by 3.4.5-alpha.9, even though 3.4.5-alpha.9 is technically "greater than" 1.2.3-alpha.3 according to the SemVer sort rules. The version range only accepts prerelease tags on the 1.2.3 version. The version 3.4.5 would satisfy the range, because it does not have a prerelease flag, and 3.4.5 is greater than 1.2.3-alpha.7.

    The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics.

    Second, a user who has opted into using a prerelease version has clearly indicated the intent to use that specific set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the next set of prerelease versions.
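
    A short sketch of the same rule, again using the semver module from the Usage section above (expected results in comments):

    semver.satisfies('1.2.3-alpha.7', '>1.2.3-alpha.3')   // true
    semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3')   // false (prerelease on a different tuple)
    semver.satisfies('3.4.5', '>1.2.3-alpha.3')           // true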

    Prerelease Identifiers

    The method .inc takes an additional identifier string argument that will append the value of the string as a prerelease identifier:

    > semver.inc('1.2.3', 'pre', 'beta')
    '1.2.4-beta.0'
    

    command-line example:

    $ semver 1.2.3 -i prerelease --preid beta
    1.2.4-beta.0
    

    Which then can be used to increment further:

    $ semver 1.2.4-beta.0 -i prerelease
    1.2.4-beta.1
    

    Advanced Range Syntax

    Advanced range syntax desugars to primitive comparators in deterministic ways.

    Advanced ranges may be combined in the same way as primitive comparators using white space or ||.

    Hyphen Ranges X.Y.Z - A.B.C

    Specifies an inclusive set.

    • 1.2.3 - 2.3.4 := >=1.2.3 <=2.3.4

    If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes.

    • 1.2 - 2.3.4 := >=1.2.0 <=2.3.4

    If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts.

    • 1.2.3 - 2.3 := >=1.2.3 <2.4.0
    • 1.2.3 - 2 := >=1.2.3 <3.0.0
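
    One way to check these desugarings yourself is validRange (documented under Ranges, below), which is expected to return the primitive form:

    semver.validRange('1.2.3 - 2.3.4')   // '>=1.2.3 <=2.3.4'
    semver.validRange('1.2 - 2.3.4')     // '>=1.2.0 <=2.3.4'
    semver.validRange('1.2.3 - 2')       // '>=1.2.3 <3.0.0'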

    X-Ranges 1.2.x 1.X 1.2.* *

    Any of X, x, or * may be used to "stand in" for one of the numeric values in the [major, minor, patch] tuple.

    • * := >=0.0.0 (Any version satisfies)
    • 1.x := >=1.0.0 <2.0.0 (Matching major version)
    • 1.2.x := >=1.2.0 <1.3.0 (Matching major and minor versions)

    A partial version range is treated as an X-Range, so the special character is in fact optional.

    • "" (empty string) := * := >=0.0.0
    • 1 := 1.x.x := >=1.0.0 <2.0.0
    • 1.2 := 1.2.x := >=1.2.0 <1.3.0
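
    The same check works for X-Ranges and partial versions (expected results in comments):

    semver.validRange('1.x')   // '>=1.0.0 <2.0.0'
    semver.validRange('1.2')   // '>=1.2.0 <1.3.0'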

    Tilde Ranges ~1.2.3 ~1.2 ~1

    Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not.

    • ~1.2.3 := >=1.2.3 <1.(2+1).0 := >=1.2.3 <1.3.0
    • ~1.2 := >=1.2.0 <1.(2+1).0 := >=1.2.0 <1.3.0 (Same as 1.2.x)
    • ~1 := >=1.0.0 <(1+1).0.0 := >=1.0.0 <2.0.0 (Same as 1.x)
    • ~0.2.3 := >=0.2.3 <0.(2+1).0 := >=0.2.3 <0.3.0
    • ~0.2 := >=0.2.0 <0.(2+1).0 := >=0.2.0 <0.3.0 (Same as 0.2.x)
    • ~0 := >=0.0.0 <(0+1).0.0 := >=0.0.0 <1.0.0 (Same as 0.x)
    • ~1.2.3-beta.2 := >=1.2.3-beta.2 <1.3.0 Note that prereleases in the 1.2.3 version will be allowed, if they are greater than or equal to beta.2. So, 1.2.3-beta.4 would be allowed, but 1.2.4-beta.2 would not, because it is a prerelease of a different [major, minor, patch] tuple.
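
    Likewise, validRange is expected to report tilde ranges in their desugared form:

    semver.validRange('~1.2.3')   // '>=1.2.3 <1.3.0'
    semver.validRange('~1')       // '>=1.0.0 <2.0.0'
    semver.validRange('~0.2')     // '>=0.2.0 <0.3.0'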

    Caret Ranges ^1.2.3 ^0.2.5 ^0.0.4

    Allows changes that do not modify the left-most non-zero digit in the [major, minor, patch] tuple. In other words, this allows patch and minor updates for versions 1.0.0 and above, patch updates for versions 0.X >=0.1.0, and no updates for versions 0.0.X.

    Many authors treat a 0.x version as if the x were the major "breaking-change" indicator.

    Caret ranges are ideal when an author may make breaking changes between 0.2.4 and 0.3.0 releases, which is a common practice. However, it presumes that there will not be breaking changes between 0.2.4 and 0.2.5. It allows for changes that are presumed to be additive (but non-breaking), according to commonly observed practices.

    • ^1.2.3 := >=1.2.3 <2.0.0
    • ^0.2.3 := >=0.2.3 <0.3.0
    • ^0.0.3 := >=0.0.3 <0.0.4
    • ^1.2.3-beta.2 := >=1.2.3-beta.2 <2.0.0 Note that prereleases in the 1.2.3 version will be allowed, if they are greater than or equal to beta.2. So, 1.2.3-beta.4 would be allowed, but 1.2.4-beta.2 would not, because it is a prerelease of a different [major, minor, patch] tuple.
    • ^0.0.3-beta := >=0.0.3-beta <0.0.4 Note that prereleases in the 0.0.3 version only will be allowed, if they are greater than or equal to beta. So, 0.0.3-pr.2 would be allowed.

    When parsing caret ranges, a missing patch value desugars to the number 0, but will allow flexibility within that value, even if the major and minor versions are both 0.

    • ^1.2.x := >=1.2.0 <2.0.0
    • ^0.0.x := >=0.0.0 <0.1.0
    • ^0.0 := >=0.0.0 <0.1.0

    Missing minor and patch values will desugar to zero, but also allow flexibility within those values, even if the major version is zero.

    • ^1.x := >=1.0.0 <2.0.0
    • ^0.x := >=0.0.0 <1.0.0
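
    And the same for caret ranges:

    semver.validRange('^1.2.3')   // '>=1.2.3 <2.0.0'
    semver.validRange('^0.2.3')   // '>=0.2.3 <0.3.0'
    semver.validRange('^0.0.x')   // '>=0.0.0 <0.1.0'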

    Functions

    All methods and classes take a final loose boolean argument that, if true, will be more forgiving about not-quite-valid semver strings. The resulting output will always be 100% strict, of course.

    Strict-mode Comparators and Ranges will be strict about the SemVer strings that they parse.

    • valid(v): Return the parsed version, or null if it's not valid.
    • inc(v, release): Return the version incremented by the release type (major, premajor, minor, preminor, patch, prepatch, or prerelease), or null if it's not valid
      • premajor in one call will bump the version up to the next major version and down to a prerelease of that major version. preminor and prepatch work the same way.
      • If called from a non-prerelease version, prerelease works the same as prepatch: it increments the patch version, then makes a prerelease. If the input version is already a prerelease, it simply increments it.
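
    A quick sketch of inc with the pre* release types (expected results in comments):

    semver.inc('1.2.3', 'premajor')      // '2.0.0-0'
    semver.inc('1.2.3', 'prepatch')      // '1.2.4-0'
    semver.inc('1.2.3', 'prerelease')    // '1.2.4-0' (same as prepatch here)
    semver.inc('1.2.4-0', 'prerelease')  // '1.2.4-1'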

    Comparison

    • gt(v1, v2): v1 > v2
    • gte(v1, v2): v1 >= v2
    • lt(v1, v2): v1 < v2
    • lte(v1, v2): v1 <= v2
    • eq(v1, v2): v1 == v2 This is true if they're logically equivalent, even if they're not the exact same string. You already know how to compare strings.
    • neq(v1, v2): v1 != v2 The opposite of eq.
    • cmp(v1, comparator, v2): Pass in a comparison string, and it'll call the corresponding function above. "===" and "!==" do simple string comparison, but are included for completeness. Throws if an invalid comparison string is provided.
    • compare(v1, v2): Return 0 if v1 == v2, or 1 if v1 is greater, or -1 if v2 is greater. Sorts in ascending order if passed to Array.sort().
    • rcompare(v1, v2): The reverse of compare. Sorts an array of versions in descending order when passed to Array.sort().
    • diff(v1, v2): Returns difference between two versions by the release type (major, premajor, minor, preminor, patch, prepatch, or prerelease), or null if the versions are the same.
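
    For example, compare and rcompare can be passed straight to Array.sort(); the comments show the expected results:

    var versions = ['1.10.1', '0.1.0', '1.2.3']
    versions.sort(semver.compare)    // ['0.1.0', '1.2.3', '1.10.1']
    versions.sort(semver.rcompare)   // ['1.10.1', '1.2.3', '0.1.0']
    semver.diff('1.2.3', '1.3.0')    // 'minor'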

    Ranges

    • validRange(range): Return the valid range or null if it's not valid
    • satisfies(version, range): Return true if the version satisfies the range.
    • maxSatisfying(versions, range): Return the highest version in the list that satisfies the range, or null if none of them do.
    • gtr(version, range): Return true if version is greater than all the versions possible in the range.
    • ltr(version, range): Return true if version is less than all the versions possible in the range.
    • outside(version, range, hilo): Return true if the version is outside the bounds of the range in either the high or low direction. The hilo argument must be either the string '>' or '<'. (This is the function called by gtr and ltr.)

    Note that, since ranges may be non-contiguous, a version might not be greater than a range, less than a range, or satisfy a range! For example, the range 1.2 <1.2.9 || >2.0.0 would have a hole from 1.2.9 until 2.0.0, so the version 1.2.10 would not be greater than the range (because 2.0.1 satisfies, which is higher), nor less than the range (since 1.2.8 satisfies, which is lower), and it also does not satisfy the range.

    If you want to know if a version satisfies or does not satisfy a range, use the satisfies(version, range) function.
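
    Putting that together, a small sketch using the non-contiguous range from the example above (expected results in comments):

    var range = '1.2 <1.2.9 || >2.0.0'
    semver.satisfies('1.2.10', range)   // false
    semver.gtr('1.2.10', range)         // false
    semver.ltr('1.2.10', range)         // false
    semver.maxSatisfying(['1.2.8', '1.2.10', '2.0.1'], range)   // '2.0.1'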

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/npm-folders.html000644 000766 000024 00000031014 12455173731 026630 0ustar00iojsstaff000000 000000 npm-folders

    npm-folders

    Folder Structures Used by npm

    DESCRIPTION

    npm puts various things on your computer. That's its job.

    This document will tell you what it puts where.

    tl;dr

    • Local install (default): puts stuff in ./node_modules of the current package root.
    • Global install (with -g): puts stuff in /usr/local or wherever node is installed.
    • Install it locally if you're going to require() it.
    • Install it globally if you're going to run it on the command line.
    • If you need both, then install it in both places, or use npm link.

    prefix Configuration

    The prefix config defaults to the location where node is installed. On most systems, this is /usr/local, and most of the time it is the same as node's process.installPrefix.

    On Windows, this is the exact location of the node.exe binary. On Unix systems, it's one level up, since node is typically installed at {prefix}/bin/node rather than {prefix}/node.exe.

    When the global flag is set, npm installs things into this prefix. When it is not set, it uses the root of the current package, or the current working directory if not in a package already.

    Node Modules

    Packages are dropped into the node_modules folder under the prefix. When installing locally, this means that you can require("packagename") to load its main module, or require("packagename/lib/path/to/sub/module") to load other modules.

    Global installs on Unix systems go to {prefix}/lib/node_modules. Global installs on Windows go to {prefix}/node_modules (that is, no lib folder.)

    Scoped packages are installed the same way, except they are grouped together in a sub-folder of the relevant node_modules folder with the name of that scope prefixed by the @ symbol, e.g. npm install @myorg/package would place the package in {prefix}/node_modules/@myorg/package. See scopes(7) for more details.

    If you wish to require() a package, then install it locally.

    Executables

    When in global mode, executables are linked into {prefix}/bin on Unix, or directly into {prefix} on Windows.

    When in local mode, executables are linked into ./node_modules/.bin so that they can be made available to scripts run through npm. (For example, so that a test runner will be in the path when you run npm test.)

    Man Pages

    When in global mode, man pages are linked into {prefix}/share/man.

    When in local mode, man pages are not installed.

    Man pages are not installed on Windows systems.

    Cache

    See npm-cache(1). Cache files are stored in ~/.npm on Posix, or ~/npm-cache on Windows.

    This is controlled by the cache configuration param.

    Temp Files

    Temporary files are stored by default in the folder specified by the tmp config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or /tmp on Unix and c:\windows\temp on Windows.

    Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit.

    More Information

    When installing locally, npm first tries to find an appropriate prefix folder. This is so that npm install foo@1.2.3 will install to the sensible root of your package, even if you happen to have cd'ed into some other folder.

    Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a package.json file, or a node_modules folder. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands. (This behavior is inspired by and similar to git's .git-folder seeking logic when running git commands in a working dir.)

    If no package root is found, then the current folder is used.

    When you run npm install foo@1.2.3, then the package is loaded into the cache, and then unpacked into ./node_modules/foo. Then, any of foo's dependencies are similarly unpacked into ./node_modules/foo/node_modules/....

    Any bin files are symlinked to ./node_modules/.bin/, so that they may be found by npm scripts when necessary.

    Global Installation

    If the global configuration is set to true, then npm will install packages "globally".

    For global installation, packages are installed roughly the same way, but using the folders described above.

    Cycles, Conflicts, and Folder Parsimony

    Cycles are handled using the property of node's module system that it walks up the directories looking for node_modules folders. So, at every stage, if a package is already installed in an ancestor node_modules folder, then it is not installed at the current location.

    Consider the case above, where foo -> bar -> baz. Imagine if, in addition to that, baz depended on bar, so you'd have: foo -> bar -> baz -> bar -> baz .... However, since the folder structure is: foo/node_modules/bar/node_modules/baz, there's no need to put another copy of bar into .../baz/node_modules, since when it calls require("bar"), it will get the copy that is installed in foo/node_modules/bar.

    This shortcut is only used if the exact same version would be installed in multiple nested node_modules folders. It is still possible to have a/node_modules/b/node_modules/a if the two "a" packages are different versions. However, because the exact same package and version is never repeated down the tree, an infinite regress is always prevented.

    Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder.

    Example

    Consider this dependency graph:

    foo
    +-- blerg@1.2.5
    +-- bar@1.2.3
    |   +-- blerg@1.x (latest=1.3.7)
    |   +-- baz@2.x
    |   |   `-- quux@3.x
    |   |       `-- bar@1.2.3 (cycle)
    |   `-- asdf@*
    `-- baz@1.2.3
        `-- quux@3.x
            `-- bar
    

    In this case, we might expect a folder structure like this:

    foo
    +-- node_modules
        +-- blerg (1.2.5) <---[A]
        +-- bar (1.2.3) <---[B]
        |   `-- node_modules
        |       +-- baz (2.0.2) <---[C]
        |       |   `-- node_modules
        |       |       `-- quux (3.2.0)
        |       `-- asdf (2.3.4)
        `-- baz (1.2.3) <---[D]
            `-- node_modules
                `-- quux (3.2.0) <---[E]
    

    Since foo depends directly on bar@1.2.3 and baz@1.2.3, those are installed in foo's node_modules folder.

    Even though the latest copy of blerg is 1.3.7, foo has a specific dependency on version 1.2.5. So, that gets installed at [A]. Since the parent installation of blerg satisfies bar's dependency on blerg@1.x, it does not install another copy under [B].

    Bar [B] also has dependencies on baz and asdf, so those are installed in bar's node_modules folder. Because it depends on baz@2.x, it cannot re-use the baz@1.2.3 installed in the parent node_modules folder [D], and must install its own copy [C].

    Underneath bar, the baz -> quux -> bar dependency creates a cycle. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder.

    Underneath foo -> baz [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B].

    For a graphical breakdown of what is installed where, use npm ls.

    Publishing

    Upon publishing, npm will look in the node_modules folder. If any of the items there are not in the bundledDependencies array, then they will not be included in the package tarball.

    This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re-publish those items that cannot be found elsewhere. See package.json(5) for more information.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/npm-global.html000644 000766 000024 00000031011 12455173731 026427 0ustar00iojsstaff000000 000000 npm-global

    npm-folders

    Folder Structures Used by npm

    DESCRIPTION

    npm puts various things on your computer. That's its job.

    This document will tell you what it puts where.

    tl;dr

    • Local install (default): puts stuff in ./node_modules of the current package root.
    • Global install (with -g): puts stuff in /usr/local or wherever node is installed.
    • Install it locally if you're going to require() it.
    • Install it globally if you're going to run it on the command line.
    • If you need both, then install it in both places, or use npm link.

    prefix Configuration

    The prefix config defaults to the location where node is installed. On most systems, this is /usr/local, and most of the time it is the same as node's process.installPrefix.

    On Windows, this is the exact location of the node.exe binary. On Unix systems, it's one level up, since node is typically installed at {prefix}/bin/node rather than {prefix}/node.exe.

    When the global flag is set, npm installs things into this prefix. When it is not set, it uses the root of the current package, or the current working directory if not in a package already.

    Node Modules

    Packages are dropped into the node_modules folder under the prefix. When installing locally, this means that you can require("packagename") to load its main module, or require("packagename/lib/path/to/sub/module") to load other modules.

    Global installs on Unix systems go to {prefix}/lib/node_modules. Global installs on Windows go to {prefix}/node_modules (that is, no lib folder.)

    Scoped packages are installed the same way, except they are grouped together in a sub-folder of the relevant node_modules folder with the name of that scope prefixed by the @ symbol, e.g. npm install @myorg/package would place the package in {prefix}/node_modules/@myorg/package. See scopes(7) for more details.

    If you wish to require() a package, then install it locally.

    Executables

    When in global mode, executables are linked into {prefix}/bin on Unix, or directly into {prefix} on Windows.

    When in local mode, executables are linked into ./node_modules/.bin so that they can be made available to scripts run through npm. (For example, so that a test runner will be in the path when you run npm test.)

    Man Pages

    When in global mode, man pages are linked into {prefix}/share/man.

    When in local mode, man pages are not installed.

    Man pages are not installed on Windows systems.

    Cache

    See npm-cache(1). Cache files are stored in ~/.npm on Posix, or ~/npm-cache on Windows.

    This is controlled by the cache configuration param.

    Temp Files

    Temporary files are stored by default in the folder specified by the tmp config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or /tmp on Unix and c:\windows\temp on Windows.

    Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit.

    More Information

    When installing locally, npm first tries to find an appropriate prefix folder. This is so that npm install foo@1.2.3 will install to the sensible root of your package, even if you happen to have cd'ed into some other folder.

    Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a package.json file, or a node_modules folder. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands. (This behavior is inspired by and similar to git's .git-folder seeking logic when running git commands in a working dir.)

    If no package root is found, then the current folder is used.

    When you run npm install foo@1.2.3, then the package is loaded into the cache, and then unpacked into ./node_modules/foo. Then, any of foo's dependencies are similarly unpacked into ./node_modules/foo/node_modules/....

    Any bin files are symlinked to ./node_modules/.bin/, so that they may be found by npm scripts when necessary.

    Global Installation

    If the global configuration is set to true, then npm will install packages "globally".

    For global installation, packages are installed roughly the same way, but using the folders described above.

    Cycles, Conflicts, and Folder Parsimony

    Cycles are handled using the property of node's module system that it walks up the directories looking for node_modules folders. So, at every stage, if a package is already installed in an ancestor node_modules folder, then it is not installed at the current location.

    Consider the case above, where foo -> bar -> baz. Imagine if, in addition to that, baz depended on bar, so you'd have: foo -> bar -> baz -> bar -> baz .... However, since the folder structure is: foo/node_modules/bar/node_modules/baz, there's no need to put another copy of bar into .../baz/node_modules, since when it calls require("bar"), it will get the copy that is installed in foo/node_modules/bar.

    This shortcut is only used if the exact same version would be installed in multiple nested node_modules folders. It is still possible to have a/node_modules/b/node_modules/a if the two "a" packages are different versions. However, because the exact same package and version is never repeated down the tree, an infinite regress is always prevented.

    Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder.

    Example

    Consider this dependency graph:

    foo
    +-- blerg@1.2.5
    +-- bar@1.2.3
    |   +-- blerg@1.x (latest=1.3.7)
    |   +-- baz@2.x
    |   |   `-- quux@3.x
    |   |       `-- bar@1.2.3 (cycle)
    |   `-- asdf@*
    `-- baz@1.2.3
        `-- quux@3.x
            `-- bar
    

    In this case, we might expect a folder structure like this:

    foo
    +-- node_modules
        +-- blerg (1.2.5) <---[A]
        +-- bar (1.2.3) <---[B]
        |   `-- node_modules
        |       +-- baz (2.0.2) <---[C]
        |       |   `-- node_modules
        |       |       `-- quux (3.2.0)
        |       `-- asdf (2.3.4)
        `-- baz (1.2.3) <---[D]
            `-- node_modules
                `-- quux (3.2.0) <---[E]
    

    Since foo depends directly on bar@1.2.3 and baz@1.2.3, those are installed in foo's node_modules folder.

    Even though the latest copy of blerg is 1.3.7, foo has a specific dependency on version 1.2.5. So, that gets installed at [A]. Since the parent installation of blerg satisfies bar's dependency on blerg@1.x, it does not install another copy under [B].

    Bar [B] also has dependencies on baz and asdf, so those are installed in bar's node_modules folder. Because it depends on baz@2.x, it cannot re-use the baz@1.2.3 installed in the parent node_modules folder [D], and must install its own copy [C].

    Underneath bar, the baz -> quux -> bar dependency creates a cycle. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder.

    Underneath foo -> baz [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B].

    For a graphical breakdown of what is installed where, use npm ls.

    Publishing

    Upon publishing, npm will look in the node_modules folder. If any of the items there are not in the bundledDependencies array, then they will not be included in the package tarball.

    This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re-publish those items that cannot be found elsewhere. See package.json(5) for more information.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/npm-json.html000644 000766 000024 00000073746 12455173731 026165 0ustar00iojsstaff000000 000000 npm-json

    package.json

    Specifics of npm's package.json handling

    DESCRIPTION

    This document is all you need to know about what's required in your package.json file. It must be actual JSON, not just a JavaScript object literal.

    A lot of the behavior described in this document is affected by the config settings described in npm-config(7).

    name

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    The name is what your thing is called. Some tips:

    • Don't put "js" or "node" in the name. It's assumed that it's js, since you're writing a package.json file, and you can specify the engine using the "engines" field. (See below.)
    • The name ends up being part of a URL, an argument on the command line, and a folder name. Any name with non-url-safe characters will be rejected. Also, it can't start with a dot or an underscore.
    • The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive.
    • You may want to check the npm registry to see if there's something by that name already, before you get too attached to it. http://registry.npmjs.org/

    A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See npm-scope(7) for more detail.

    version

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    Version must be parseable by node-semver, which is bundled with npm as a dependency. (npm install semver to use it yourself.)

    More on version numbers and ranges at semver(7).

    description

    Put a description in it. It's a string. This helps people discover your package, as it's listed in npm search.

    keywords

    Put keywords in it. It's an array of strings. This helps people discover your package as it's listed in npm search.

    homepage

    The url to the project homepage.

    NOTE: This is not the same as "url". If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you.

    Literally. Spit. I'm so not kidding.

    bugs

    The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package.

    It should look like this:

    { "url" : "http://github.com/owner/project/issues"
    , "email" : "project@hostname.com"
    }
    

    You can specify either one or both values. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object.

    If a url is provided, it will be used by the npm bugs command.

    license

    You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it.

    The simplest way, assuming you're using a common license such as BSD-3-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this:

    { "license" : "BSD-3-Clause" }
    

    You can check the full list of SPDX license IDs. Ideally you should pick one that is OSI approved.

    It's also a good idea to include a LICENSE file at the top level in your package.

    people fields: author, contributors

    The "author" is one person. "contributors" is an array of people. A "person" is an object with a "name" field and optionally "url" and "email", like this:

    { "name" : "Barney Rubble"
    , "email" : "b@rubble.com"
    , "url" : "http://barnyrubble.tumblr.com/"
    }
    

    Or you can shorten that all into a single string, and npm will parse it for you:

    "Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
    

    Both email and url are optional either way.

    npm also sets a top-level "maintainers" field with your npm user info.

    files

    The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)

    You can also provide a ".npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array. The ".npmignore" file works just like a ".gitignore".

    main

    The main field is a module ID that is the primary entry point to your program. That is, if your package is named foo, and a user installs it, and then does require("foo"), then your main module's exports object will be returned.

    This should be a module ID relative to the root of your package folder.

    For most modules, it makes the most sense to have a main script and often not much else.

    bin

    A lot of packages have one or more executable files that they'd like to install into the PATH. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable.)

    To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.

    For example, npm has this:

    { "bin" : { "npm" : "./cli.js" } }
    

    So, when you install npm, it'll create a symlink from the cli.js script to /usr/local/bin/npm.

    If you have a single executable, and its name should be the name of the package, then you can just supply it as a string. For example:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin": "./path/to/program" }
    

    would be the same as this:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin" : { "my-program" : "./path/to/program" } }
    

    man

    Specify either a single file or an array of filenames to put in place for the man program to find.

    If only a single file is provided, then it's installed such that it is the result from man <pkgname>, regardless of its actual filename. For example:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : "./man/doc.1"
    }
    

    would link the ./man/doc.1 file in such that it is the target for man foo

    If the filename doesn't start with the package name, then it's prefixed. So, this:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/bar.1" ]
    }
    

    will create files to do man foo and man foo-bar.

    Man files must end with a number, and optionally a .gz suffix if they are compressed. The number dictates which man section the file is installed into.

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/foo.2" ]
    }
    

    will create entries for man foo and man 2 foo

    directories

    The CommonJS Packages spec details a few ways that you can indicate the structure of your package using a directories object. If you look at npm's package.json, you'll see that it has directories for doc, lib, and man.

    In the future, this information may be used in other creative ways.

    directories.lib

    Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info.

    directories.bin

    If you specify a bin directory, then all the files in that folder will be added as children of the bin path.

    If you have a bin path already, then this has no effect.

    directories.man

    A folder that is full of man pages. Sugar to generate a "man" array by walking the folder.

    directories.doc

    Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday.

    directories.example

    Put example scripts in here. Someday, it might be exposed in some clever way.

    repository

    Specify the place where your code lives. This is helpful for people who want to contribute. If the git repo is on GitHub, then the npm docs command will be able to find you.

    Do it like this:

    "repository" :
      { "type" : "git"
      , "url" : "http://github.com/npm/npm.git"
      }
    
    "repository" :
      { "type" : "svn"
      , "url" : "http://v8.googlecode.com/svn/trunk/"
      }
    

    The URL should be a publicly available (perhaps read-only) url that can be handed directly to a VCS program without any modification. It should not be a url to an html project page that you put in your browser. It's for computers.

    scripts

    The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point.

    See npm-scripts(7) to find out more about writing package scripts.

    config

    A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades. For instance, if a package had the following:

    { "name" : "foo"
    , "config" : { "port" : "8080" } }
    

    and then had a "start" command that then referenced the npm_package_config_port environment variable, then the user could override that by doing npm config set foo:port 8001.
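
    As a rough sketch, a hypothetical server.js that such a "start" command might run could read the value like this (npm exposes it to package scripts as an environment variable):

    // hypothetical server.js for the "foo" package above
    var http = require('http')
    var port = process.env.npm_package_config_port || 8080

    http.createServer(function (req, res) {
      res.end('hello from foo on port ' + port + '\n')
    }).listen(port)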

    See npm-config(7) and npm-scripts(7) for more on package configs.

    dependencies

    Dependencies are specified in a simple object that maps a package name to a version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL.

    Please do not put test harnesses or transpilers in your dependencies object. See devDependencies, below.

    See semver(7) for more details about specifying version ranges.

    • version Must match version exactly
    • >version Must be greater than version
    • >=version etc
    • <version
    • <=version
    • ~version "Approximately equivalent to version" See semver(7)
    • ^version "Compatible with version" See semver(7)
    • 1.2.x 1.2.0, 1.2.1, etc., but not 1.3.0
    • http://... See 'URLs as Dependencies' below
    • * Matches any version
    • "" (just an empty string) Same as *
    • version1 - version2 Same as >=version1 <=version2.
    • range1 || range2 Passes if either range1 or range2 are satisfied.
    • git... See 'Git URLs as Dependencies' below
    • user/repo See 'GitHub URLs' below
    • tag A specific version tagged and published as tag See npm-tag(1)
    • path/path/path See Local Paths below

    For example, these are all valid:

    { "dependencies" :
      { "foo" : "1.0.0 - 2.9999.9999"
      , "bar" : ">=1.0.2 <2.1.2"
      , "baz" : ">1.0.2 <=2.3.4"
      , "boo" : "2.0.1"
      , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0"
      , "asd" : "http://asdf.com/asdf.tar.gz"
      , "til" : "~1.2"
      , "elf" : "~1.2.3"
      , "two" : "2.x"
      , "thr" : "3.3.x"
      , "lat" : "latest"
      , "dyl" : "file:../dyl"
      }
    }
    

    URLs as Dependencies

    You may specify a tarball URL in place of a version range.

    This tarball will be downloaded and installed locally to your package at install time.

    Git URLs as Dependencies

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+ssh://user@hostname/project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.

    GitHub URLs

    As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". Just as with git URLs, a commit-ish suffix can be included. For example:

    {
      "name": "foo",
      "version": "0.0.0",
      "dependencies": {
        "express": "visionmedia/express",
        "mocha": "visionmedia/mocha#4727d357ea"
      }
    }
    

    Local Paths

    As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be saved using npm install --save, using any of these forms:

    ../foo/bar
    ~/foo/bar
    ./foo/bar
    /foo/bar
    

    in which case they will be normalized to a relative path and added to your package.json. For example:

    {
      "name": "baz",
      "dependencies": {
        "bar": "file:../foo/bar"
      }
    }
    

    This feature is helpful for local offline development and for creating tests that require npm installs without hitting an external server, but it should not be used when publishing packages to the public registry.

    devDependencies

    If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use.

    In this case, it's best to map these additional items in a devDependencies object.

    These things will be installed when doing npm link or npm install from the root of a package, and can be managed like any other npm configuration param. See npm-config(7) for more on the topic.

    For build steps that are not platform-specific, such as compiling CoffeeScript or other languages to JavaScript, use the prepublish script to do this, and make the required package a devDependency.

    For example:

    { "name": "ethopia-waza",
      "description": "a delightfully fruity coffee varietal",
      "version": "1.2.3",
      "devDependencies": {
        "coffee-script": "~1.6.3"
      },
      "scripts": {
        "prepublish": "coffee -o lib/ -c src/waza.coffee"
      },
      "main": "lib/waza.js"
    }
    

    The prepublish script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves. In dev mode (ie, locally running npm install), it'll run this script as well, so that you can test it easily.

    peerDependencies

    In some cases, you want to express the compatibility of your package with a host tool or library, while not necessarily doing a require of this host. This is usually referred to as a plugin. Notably, your module may be exposing a specific interface, expected and specified by the host documentation.

    For example:

    {
      "name": "tea-latte",
      "version": "1.3.5"
      "peerDependencies": {
        "tea": "2.x"
      }
    }
    

    This ensures your package tea-latte can be installed along with the second major version of the host package tea only. The host package is automatically installed if needed. npm install tea-latte could possibly yield the following dependency graph:

    ├── tea-latte@1.3.5
    └── tea@2.2.0
    

    Trying to install another plugin with a conflicting requirement will cause an error. For this reason, make sure your plugin requirement is as broad as possible, and do not lock it down to specific patch versions.

    Assuming the host complies with semver, only changes in the host package's major version will break your plugin. Thus, if you've worked with every 1.x version of the host package, use "^1.0" or "1.x" to express this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".

    bundledDependencies

    Array of package names that will be bundled when publishing the package.

    If this is spelled "bundleDependencies", then that is also honorable.

    optionalDependencies

    If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the optionalDependencies object. This is a map of package name to version or url, just like the dependencies object. The difference is that build failures do not cause installation to fail.

    It is still your program's responsibility to handle the lack of the dependency. For example, something like this:

    try {
      var foo = require('foo')
      var fooVersion = require('foo/package.json').version
    } catch (er) {
      foo = null
    }
    if ( notGoodFooVersion(fooVersion) ) {
      foo = null
    }
    
    // .. then later in your program ..
    
    if (foo) {
      foo.doFooThings()
    }
    

    Entries in optionalDependencies will override entries of the same name in dependencies, so it's usually best to only put it in one place.

    engines

    You can specify the version of node that your stuff works on:

    { "engines" : { "node" : ">=0.10.3 <0.12" } }
    

    And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do.

    If you specify an "engines" field, then npm will require that "node" be somewhere on that list. If "engines" is omitted, then npm will just assume that it works on node.

    You can also use the "engines" field to specify which versions of npm are capable of properly installing your program. For example:

    { "engines" : { "npm" : "~1.0.20" } }
    

    Note that, unless the user has set the engine-strict config flag, this field is advisory only.

    engineStrict

    If you are sure that your module will definitely not run properly on versions of Node/npm other than those specified in the engines object, then you can set "engineStrict": true in your package.json file. This will override the user's engine-strict config setting.

    Please do not do this unless you are really very very sure. If your engines object is something overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node. Consider this choice carefully. If people abuse it, it will be removed in a future version of npm.

    os

    You can specify which operating systems your module will run on:

    "os" : [ "darwin", "linux" ]
    

    You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!':

    "os" : [ "!win32" ]
    

    The host operating system is determined by process.platform

    It is allowed to both blacklist and whitelist, although there isn't any good reason to do this.

    cpu

    If your code only runs on certain cpu architectures, you can specify which ones.

    "cpu" : [ "x64", "ia32" ]
    

    Like the os option, you can also blacklist architectures:

    "cpu" : [ "!arm", "!mips" ]
    

    The host architecture is determined by process.arch

    preferGlobal

    If your package is primarily a command-line application that should be installed globally, then set this value to true to provide a warning if it is installed locally.

    It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected.

    private

    If you set "private": true in your package.json, then npm will refuse to publish it.

    This is a way to prevent accidental publication of private repositories. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the publishConfig dictionary described below to override the registry config param at publish-time.

    publishConfig

    This is a set of config values that will be used at publish-time. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default.

    Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing.

    See npm-config(7) to see the list of config options that can be overridden.

    DEFAULT VALUES

    npm will default some values based on package contents.

    • "scripts": {"start": "node server.js"}

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "scripts":{"preinstall": "node-gyp rebuild"}

      If there is a binding.gyp file in the root of your package, npm will default the preinstall command to compile using node-gyp.

    • "contributors": [...]

      If there is an AUTHORS file in the root of your package, npm will treat each line as a Name <email> (url) format, where email and url are optional. Lines which start with a # or are blank will be ignored.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/npmrc.html000644 000766 000024 00000013255 12455173731 025530 0ustar00iojsstaff000000 000000 npmrc

    npmrc

    The npm config files

    DESCRIPTION

    npm gets its config settings from the command line, environment variables, and npmrc files.

    The npm config command can be used to update and edit the contents of the user and global npmrc files.

    For a list of available configuration options, see npm-config(7).

    FILES

    The four relevant files are:

    • per-project config file (/path/to/my/project/.npmrc)
    • per-user config file (~/.npmrc)
    • global config file ($PREFIX/etc/npmrc)
    • npm builtin config file (/path/to/npm/npmrc)

    All npm config files are an ini-formatted list of key = value parameters. Environment variables can be replaced using ${VARIABLE_NAME}. For example:

    prefix = ${HOME}/.npm-packages
    

    Each of these files is loaded, and config options are resolved in priority order. For example, a setting in the userconfig file would override the setting in the globalconfig file.

    Array values are specified by adding "[]" after the key name. For example:

    key[] = "first value"
    key[] = "second value"
    

    Per-project config file

    When working locally in a project, a .npmrc file in the root of the project (ie, a sibling of node_modules and package.json) will set config values specific to this project.

    Note that this only applies to the root of the project that you're running npm in. It has no effect when your module is published. For example, you can't publish a module that forces itself to install globally, or in a different location.

    Per-user config file

    $HOME/.npmrc (or the userconfig param, if set in the environment or on the command line)

    Global config file

    $PREFIX/etc/npmrc (or the globalconfig param, if set above): This file is an ini-file formatted list of key = value parameters. Environment variables can be replaced as above.

    Built-in config file

    path/to/npm/itself/npmrc

    This is an unchangeable "builtin" configuration file that npm keeps consistent across updates. Set fields in here using the ./configure script that comes with npm. This is primarily for distribution maintainers to override default configs in a standard and consistent manner.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/files/package.json.html000644 000766 000024 00000073762 12455173731 026765 0ustar00iojsstaff000000 000000 package.json

    package.json

    Specifics of npm's package.json handling

    DESCRIPTION

    This document is all you need to know about what's required in your package.json file. It must be actual JSON, not just a JavaScript object literal.

    A lot of the behavior described in this document is affected by the config settings described in npm-config(7).

    name

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    The name is what your thing is called. Some tips:

    • Don't put "js" or "node" in the name. It's assumed that it's js, since you're writing a package.json file, and you can specify the engine using the "engines" field. (See below.)
    • The name ends up being part of a URL, an argument on the command line, and a folder name. Any name with non-url-safe characters will be rejected. Also, it can't start with a dot or an underscore.
    • The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive.
    • You may want to check the npm registry to see if there's something by that name already, before you get too attached to it. http://registry.npmjs.org/

    A name can be optionally prefixed by a scope, e.g. @myorg/mypackage. See npm-scope(7) for more detail.

    version

    The most important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version.

    Version must be parseable by node-semver, which is bundled with npm as a dependency. (npm install semver to use it yourself.)

    More on version numbers and ranges at semver(7).

    description

    Put a description in it. It's a string. This helps people discover your package, as it's listed in npm search.

    keywords

    Put keywords in it. It's an array of strings. This helps people discover your package as it's listed in npm search.

    homepage

    The url to the project homepage.

    NOTE: This is not the same as "url". If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you.

    Literally. Spit. I'm so not kidding.

    bugs

    The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package.

    It should look like this:

    { "url" : "http://github.com/owner/project/issues"
    , "email" : "project@hostname.com"
    }
    

    You can specify either one or both values. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object.

    If a url is provided, it will be used by the npm bugs command.

    license

    You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it.

    The simplest way, assuming you're using a common license such as BSD-3-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this:

    { "license" : "BSD-3-Clause" }
    

    You can check the full list of SPDX license IDs. Ideally you should pick one that is OSI approved.

    It's also a good idea to include a LICENSE file at the top level in your package.

    people fields: author, contributors

    The "author" is one person. "contributors" is an array of people. A "person" is an object with a "name" field and optionally "url" and "email", like this:

    { "name" : "Barney Rubble"
    , "email" : "b@rubble.com"
    , "url" : "http://barnyrubble.tumblr.com/"
    }
    

    Or you can shorten that all into a single string, and npm will parse it for you:

    "Barney Rubble <b@rubble.com> (http://barnyrubble.tumblr.com/)
    

    Both email and url are optional either way.

    npm also sets a top-level "maintainers" field with your npm user info.

    files

    The "files" field is an array of files to include in your project. If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.)

    You can also provide a ".npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array. The ".npmignore" file works just like a ".gitignore".

    main

    The main field is a module ID that is the primary entry point to your program. That is, if your package is named foo, and a user installs it, and then does require("foo"), then your main module's exports object will be returned.

    This should be a module ID relative to the root of your package folder.

    For most modules, it makes the most sense to have a main script and often not much else.

    bin

    A lot of packages have one or more executable files that they'd like to install into the PATH. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable.)

    To use this, supply a bin field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into prefix/bin for global installs, or ./node_modules/.bin/ for local installs.

    For example, npm has this:

    { "bin" : { "npm" : "./cli.js" } }
    

    So, when you install npm, it'll create a symlink from the cli.js script to /usr/local/bin/npm.

    If you have a single executable, and its name should be the name of the package, then you can just supply it as a string. For example:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin": "./path/to/program" }
    

    would be the same as this:

    { "name": "my-program"
    , "version": "1.2.5"
    , "bin" : { "my-program" : "./path/to/program" } }
    

    man

    Specify either a single file or an array of filenames to put in place for the man program to find.

    If only a single file is provided, then it's installed such that it is the result from man <pkgname>, regardless of its actual filename. For example:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : "./man/doc.1"
    }
    

    would link the ./man/doc.1 file in such that it is the target for man foo

    If the filename doesn't start with the package name, then it's prefixed. So, this:

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/bar.1" ]
    }
    

    will create files to do man foo and man foo-bar.

    Man files must end with a number, and optionally a .gz suffix if they are compressed. The number dictates which man section the file is installed into.

    { "name" : "foo"
    , "version" : "1.2.3"
    , "description" : "A packaged foo fooer for fooing foos"
    , "main" : "foo.js"
    , "man" : [ "./man/foo.1", "./man/foo.2" ]
    }
    

    will create entries for man foo and man 2 foo

    directories

    The CommonJS Packages spec details a few ways that you can indicate the structure of your package using a directories object. If you look at npm's package.json, you'll see that it has directories for doc, lib, and man.

    In the future, this information may be used in other creative ways.

    directories.lib

    Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info.

    directories.bin

    If you specify a bin directory, then all the files in that folder will be added as children of the bin path.

    If you have a bin path already, then this has no effect.

    directories.man

    A folder that is full of man pages. Sugar to generate a "man" array by walking the folder.

    directories.doc

    Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday.

    directories.example

    Put example scripts in here. Someday, it might be exposed in some clever way.

    repository

    Specify the place where your code lives. This is helpful for people who want to contribute. If the git repo is on GitHub, then the npm docs command will be able to find you.

    Do it like this:

    "repository" :
      { "type" : "git"
      , "url" : "http://github.com/npm/npm.git"
      }
    
    "repository" :
      { "type" : "svn"
      , "url" : "http://v8.googlecode.com/svn/trunk/"
      }
    

    The URL should be a publicly available (perhaps read-only) url that can be handed directly to a VCS program without any modification. It should not be a url to an html project page that you put in your browser. It's for computers.

    scripts

    The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point.

    See npm-scripts(7) to find out more about writing package scripts.
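
    As a minimal sketch, a "scripts" object might look like this (the commands themselves are hypothetical; the lifecycle names come from npm-scripts(7)):

    { "scripts" :
      { "start" : "node server.js"
      , "test" : "node test.js" } }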

    config

    A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades. For instance, if a package had the following:

    { "name" : "foo"
    , "config" : { "port" : "8080" } }
    

    and then had a "start" command that then referenced the npm_package_config_port environment variable, then the user could override that by doing npm config set foo:port 8001.

    See npm-config(7) and npm-scripts(7) for more on package configs.

    dependencies

    Dependencies are specified in a simple object that maps a package name to a version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL.

    Please do not put test harnesses or transpilers in your dependencies object. See devDependencies, below.

    See semver(7) for more details about specifying version ranges.

    • version Must match version exactly
    • >version Must be greater than version
    • >=version etc
    • <version
    • <=version
    • ~version "Approximately equivalent to version" See semver(7)
    • ^version "Compatible with version" See semver(7)
    • 1.2.x 1.2.0, 1.2.1, etc., but not 1.3.0
    • http://... See 'URLs as Dependencies' below
    • * Matches any version
    • "" (just an empty string) Same as *
    • version1 - version2 Same as >=version1 <=version2.
    • range1 || range2 Passes if either range1 or range2 are satisfied.
    • git... See 'Git URLs as Dependencies' below
    • user/repo See 'GitHub URLs' below
    • tag A specific version tagged and published as tag See npm-tag(1)
    • path/path/path See Local Paths below

    For example, these are all valid:

    { "dependencies" :
      { "foo" : "1.0.0 - 2.9999.9999"
      , "bar" : ">=1.0.2 <2.1.2"
      , "baz" : ">1.0.2 <=2.3.4"
      , "boo" : "2.0.1"
      , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0"
      , "asd" : "http://asdf.com/asdf.tar.gz"
      , "til" : "~1.2"
      , "elf" : "~1.2.3"
      , "two" : "2.x"
      , "thr" : "3.3.x"
      , "lat" : "latest"
      , "dyl" : "file:../dyl"
      }
    }
    

    URLs as Dependencies

    You may specify a tarball URL in place of a version range.

    This tarball will be downloaded and installed locally to your package at install time.
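
    For example, reusing the tarball URL from the list above:

    { "dependencies" :
      { "asd" : "http://asdf.com/asdf.tar.gz" } }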

    Git URLs as Dependencies

    Git urls can be of the form:

    git://github.com/user/project.git#commit-ish
    git+ssh://user@hostname:project.git#commit-ish
    git+ssh://user@hostname/project.git#commit-ish
    git+http://user@hostname/project/blah.git#commit-ish
    git+https://user@hostname/project/blah.git#commit-ish
    

    The commit-ish can be any tag, sha, or branch which can be supplied as an argument to git checkout. The default is master.
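
    For example, a dependency pinned to a tag via a git URL might look like this (the package name is chosen purely for illustration):

    { "dependencies" :
      { "npm" : "git://github.com/npm/npm.git#v1.0.27" } }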

    GitHub URLs

    As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". Just as with git URLs, a commit-ish suffix can be included. For example:

    {
      "name": "foo",
      "version": "0.0.0",
      "dependencies": {
        "express": "visionmedia/express",
        "mocha": "visionmedia/mocha#4727d357ea"
      }
    }
    

    Local Paths

    As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be saved using npm install --save, using any of these forms:

    ../foo/bar
    ~/foo/bar
    ./foo/bar
    /foo/bar
    

    in which case they will be normalized to a relative path and added to your package.json. For example:

    {
      "name": "baz",
      "dependencies": {
        "bar": "file:../foo/bar"
      }
    }
    

    This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry.

    devDependencies

    If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use.

    In this case, it's best to map these additional items in a devDependencies object.

    These things will be installed when doing npm link or npm install from the root of a package, and can be managed like any other npm configuration param. See npm-config(7) for more on the topic.

    For build steps that are not platform-specific, such as compiling CoffeeScript or other languages to JavaScript, use the prepublish script to do this, and make the required package a devDependency.

    For example:

    { "name": "ethopia-waza",
      "description": "a delightfully fruity coffee varietal",
      "version": "1.2.3",
      "devDependencies": {
        "coffee-script": "~1.6.3"
      },
      "scripts": {
        "prepublish": "coffee -o lib/ -c src/waza.coffee"
      },
      "main": "lib/waza.js"
    }
    

    The prepublish script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves. In dev mode (ie, locally running npm install), it'll run this script as well, so that you can test it easily.

    peerDependencies

    In some cases, you want to express the compatibility of your package with a host tool or library, while not necessarily doing a require of this host. This is usually referred to as a plugin. Notably, your module may be exposing a specific interface, expected and specified by the host documentation.

    For example:

    {
      "name": "tea-latte",
      "version": "1.3.5"
      "peerDependencies": {
        "tea": "2.x"
      }
    }
    

    This ensures your package tea-latte can be installed along with the second major version of the host package tea only. The host package is automatically installed if needed. npm install tea-latte could possibly yield the following dependency graph:

    ├── tea-latte@1.3.5
    └── tea@2.2.0
    

    Trying to install another plugin with a conflicting requirement will cause an error. For this reason, make sure your plugin requirement is as broad as possible, and do not lock it down to specific patch versions.

    Assuming the host complies with semver, only changes in the host package's major version will break your plugin. Thus, if you've worked with every 1.x version of the host package, use "^1.0" or "1.x" to express this. If you depend on features introduced in 1.5.2, use ">= 1.5.2 < 2".

    bundledDependencies

    Array of package names that will be bundled when publishing the package.

    If this is spelled "bundleDependencies", then that is also honored.
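
    A minimal sketch (the package names are hypothetical):

    { "name" : "my-cli"
    , "version" : "1.0.0"
    , "bundledDependencies" : [ "renderized", "super-streams" ] }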

    optionalDependencies

    If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the optionalDependencies object. This is a map of package name to version or url, just like the dependencies object. The difference is that build failures do not cause installation to fail.

    It is still your program's responsibility to handle the lack of the dependency. For example, something like this:

    try {
      var foo = require('foo')
      var fooVersion = require('foo/package.json').version
    } catch (er) {
      foo = null
    }
    if ( notGoodFooVersion(fooVersion) ) {
      foo = null
    }
    
    // .. then later in your program ..
    
    if (foo) {
      foo.doFooThings()
    }
    

    Entries in optionalDependencies will override entries of the same name in dependencies, so it's usually best to put it in only one place.

    engines

    You can specify the version of node that your stuff works on:

    { "engines" : { "node" : ">=0.10.3 <0.12" } }
    

    And, like with dependencies, if you don't specify the version (or if you specify "*" as the version), then any version of node will do.

    If you specify an "engines" field, then npm will require that "node" be somewhere on that list. If "engines" is omitted, then npm will just assume that it works on node.

    You can also use the "engines" field to specify which versions of npm are capable of properly installing your program. For example:

    { "engines" : { "npm" : "~1.0.20" } }
    

    Note that, unless the user has set the engine-strict config flag, this field is advisory only.

    engineStrict

    If you are sure that your module will definitely not run properly on versions of Node/npm other than those specified in the engines object, then you can set "engineStrict": true in your package.json file. This will override the user's engine-strict config setting.

    Please do not do this unless you are really very very sure. If your engines object is something overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node. Consider this choice carefully. If people abuse it, it will be removed in a future version of npm.
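
    For example, combined with the engines field shown above:

    { "engines" : { "node" : ">=0.10.3 <0.12" }
    , "engineStrict" : true }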

    os

    You can specify which operating systems your module will run on:

    "os" : [ "darwin", "linux" ]
    

    You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!':

    "os" : [ "!win32" ]
    

    The host operating system is determined by process.platform

    It is allowed to both blacklist and whitelist, although there isn't any good reason to do this.

    cpu

    If your code only runs on certain cpu architectures, you can specify which ones.

    "cpu" : [ "x64", "ia32" ]
    

    Like the os option, you can also blacklist architectures:

    "cpu" : [ "!arm", "!mips" ]
    

    The host architecture is determined by process.arch

    preferGlobal

    If your package is primarily a command-line application that should be installed globally, then set this value to true to provide a warning if it is installed locally.

    It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected.

    private

    If you set "private": true in your package.json, then npm will refuse to publish it.

    This is a way to prevent accidental publication of private repositories. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the publishConfig dictionary described below to override the registry config param at publish-time.

    publishConfig

    This is a set of config values that will be used at publish-time. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default.

    Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing.
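
    For example (the registry URL here is hypothetical):

    "publishConfig" :
      { "registry" : "http://my-internal-registry.local"
      , "tag" : "beta"
      }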

    See npm-config(7) to see the list of config options that can be overridden.

    DEFAULT VALUES

    npm will default some values based on package contents.

    • "scripts": {"start": "node server.js"}

      If there is a server.js file in the root of your package, then npm will default the start command to node server.js.

    • "scripts":{"preinstall": "node-gyp rebuild"}

      If there is a binding.gyp file in the root of your package, npm will default the preinstall command to compile using node-gyp.

    • "contributors": [...]

      If there is an AUTHORS file in the root of your package, npm will treat each line as a Name <email> (url) format, where email and url are optional. Lines which start with a # or are blank will be ignored.
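
      For example, an AUTHORS file in that format might look like this (names and addresses are made up):

      # contributors to this package
      Jane Doe <jane@example.com> (http://example.com/jane)
      John Roe <john@example.com>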

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-adduser.html000644 000766 000024 00000013636 12455173731 026300 0ustar00iojsstaff000000 000000 npm-adduser

    npm-adduser

    Add a registry user account

    SYNOPSIS

    npm adduser [--registry=url] [--scope=@orgname] [--always-auth]
    

    DESCRIPTION

    Create or verify a user named <username> in the specified registry, and save the credentials to the .npmrc file. If no registry is specified, the default registry will be used (see npm-config(7)).

    The username, password, and email are read in from prompts.

    To reset your password, go to https://www.npmjs.com/forgot

    To change your email address, go to https://www.npmjs.com/email-edit

    You may use this command multiple times with the same user account to authorize on a new machine. When authenticating on a new machine, the username, password and email address must all match with your existing record.

    npm login is an alias to adduser and behaves exactly the same way.

    CONFIGURATION

    registry

    Default: http://registry.npmjs.org/

    The base URL of the npm package registry. If scope is also specified, this registry will only be used for packages with that scope. See npm-scope(7).

    scope

    Default: none

    If specified, the user and login credentials given will be associated with the specified scope. See npm-scope(7). You can use both at the same time, e.g.

    npm adduser --registry=http://myregistry.example.com --scope=@myco
    

    This will set a registry for the given scope and login or create a user for that registry at the same time.

    always-auth

    Default: false

    If specified, save configuration indicating that all requests to the given registry should include authorization information. Useful for private registries. Can be used with --registry and / or --scope, e.g.

    npm adduser --registry=http://private-registry.example.com --always-auth
    

    This will ensure that all requests to that registry (including for tarballs) include an authorization header. See always-auth in npm-config(7) for more details on always-auth. Registry-specific configuration of always-auth takes precedence over any global configuration.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-bin.html000644 000766 000024 00000006443 12455173731 025417 0ustar00iojsstaff000000 000000 npm-bin

    npm-bin

    Display npm bin folder

    SYNOPSIS

    npm bin
    

    DESCRIPTION

    Print the folder where npm will install executables.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-bugs.html000644 000766 000024 00000010512 12455173731 025577 0ustar00iojsstaff000000 000000 npm-bugs

    npm-bugs

    Bugs for a package in a web browser maybe

    SYNOPSIS

    npm bugs <pkgname>
    npm bugs (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the --browser config param. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm bugs command to open websites.

    registry

    The base URL of the npm package registry.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-build.html000644 000766 000024 00000006545 12455173731 025751 0ustar00iojsstaff000000 000000 npm-build

    npm-build

    Build a package

    SYNOPSIS

    npm build <package-folder>
    
    • <package-folder>: A folder containing a package.json file in its root.

    DESCRIPTION

    This is the plumbing command called by npm link and npm install.

    It should generally not be called directly.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-bundle.html000644 000766 000024 00000005754 12455173731 026124 0ustar00iojsstaff000000 000000 npm-bundle

    npm-bundle

    REMOVED

    DESCRIPTION

    The npm bundle command has been removed in 1.0, for the simple reason that it is no longer necessary, as the default behavior is now to install packages into the local space.

    Just use npm install now to do what npm bundle used to do.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-cache.html000644 000766 000024 00000012441 12455173731 025705 0ustar00iojsstaff000000 000000 npm-cache

    npm-cache

    Manipulates packages cache

    SYNOPSIS

    npm cache add <tarball file>
    npm cache add <folder>
    npm cache add <tarball url>
    npm cache add <name>@<version>
    
    npm cache ls [<path>]
    
    npm cache clean [<path>]
    

    DESCRIPTION

    Used to add, list, or clear the npm cache folder.

    • add: Add the specified package to the local cache. This command is primarily intended to be used internally by npm, but it can provide a way to add data to the local installation cache explicitly.

    • ls: Show the data in the cache. Argument is a path to show in the cache folder. Works a bit like the find program, but limited by the depth config.

    • clean: Delete data out of the cache folder. If an argument is provided, then it specifies a subpath to delete. If no argument is provided, then the entire cache is cleared.

    DETAILS

    npm stores cache data in the directory specified in npm config get cache. For each package that is added to the cache, the following information is stored in {cache}/{name}/{version}:

    • .../package/package.json: The package.json file, as npm sees it.
    • .../package.tgz: The tarball for that version.

    Additionally, whenever a registry request is made, a .cache.json file is placed at the corresponding URI, to store the ETag and the requested data. This is stored in {cache}/{hostname}/{path}/.cache.json.

    Commands that make non-essential registry requests (such as search and view, or the completion scripts) generally specify a minimum timeout. If the .cache.json file is younger than the specified timeout, then they do not make an HTTP request to the registry.

    CONFIGURATION

    cache

    Default: ~/.npm on Posix, or %AppData%/npm-cache on Windows.

    The root cache folder.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-completion.html000644 000766 000024 00000007150 12455173731 027014 0ustar00iojsstaff000000 000000 npm-completion

    npm-completion

    Tab Completion for npm

    SYNOPSIS

    . <(npm completion)
    

    DESCRIPTION

    Enables tab-completion in all npm commands.

    The synopsis above loads the completions into your current shell. Adding it to your ~/.bashrc or ~/.zshrc will make the completions available everywhere.
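
    For example, one way to persist the completions (a sketch; pick the file for your shell):

    npm completion >> ~/.bashrc
    npm completion >> ~/.zshrc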

    You may of course also pipe the output of npm completion to a file such as /usr/local/etc/bash_completion.d/npm if you have a system that will read that file for you.

    When COMP_CWORD, COMP_LINE, and COMP_POINT are defined in the environment, npm completion acts in "plumbing mode", and outputs completions based on the arguments.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-config.html000644 000766 000024 00000011320 12455173731 026102 0ustar00iojsstaff000000 000000 npm-config

    npm-config

    Manage the npm configuration files

    SYNOPSIS

    npm config set <key> <value> [--global]
    npm config get <key>
    npm config delete <key>
    npm config list
    npm config edit
    npm c [set|get|delete|list]
    npm get <key>
    npm set <key> <value> [--global]
    

    DESCRIPTION

    npm gets its config settings from the command line, environment variables, npmrc files, and in some cases, the package.json file.

    See npmrc(5) for more information about the npmrc files.

    See npm-config(7) for a more thorough discussion of the mechanisms involved.

    The npm config command can be used to update and edit the contents of the user and global npmrc files.

    Sub-commands

    Config supports the following sub-commands:

    set

    npm config set key value
    

    Sets the config key to the value.

    If value is omitted, then it sets it to "true".

    get

    npm config get key
    

    Echo the config value to stdout.

    list

    npm config list
    

    Show all the config settings.

    delete

    npm config delete key
    

    Deletes the key from all configuration files.

    edit

    npm config edit
    

    Opens the config file in an editor. Use the --global flag to edit the global config.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-dedupe.html000644 000766 000024 00000010732 12455173731 026111 0ustar00iojsstaff000000 000000 npm-dedupe

    npm-dedupe

    Reduce duplication

    SYNOPSIS

    npm dedupe [package names...]
    npm ddp [package names...]
    

    DESCRIPTION

    Searches the local package tree and attempts to simplify the overall structure by moving dependencies further up the tree, where they can be more effectively shared by multiple dependent packages.

    For example, consider this dependency graph:

    a
    +-- b <-- depends on c@1.0.x
    |   `-- c@1.0.3
    `-- d <-- depends on c@~1.0.9
        `-- c@1.0.10
    

    In this case, npm-dedupe(1) will transform the tree to:

    a
    +-- b
    +-- d
    `-- c@1.0.10
    

    Because of the hierarchical nature of node's module lookup, b and d will both get their dependency met by the single c package at the root level of the tree.

    If a suitable version exists at the target location in the tree already, then it will be left untouched, but the other duplicates will be deleted.

    If no suitable version can be found, then a warning is printed, and nothing is done.

    If any arguments are supplied, then they are filters, and only the named packages will be touched.

    Note that this operation transforms the dependency tree, and may result in packages getting updated versions, perhaps from the npm registry.

    This feature is experimental, and may change in future versions.

    The --tag argument will apply to all of the affected dependencies. If a tag with the given name exists, the tagged version is preferred over newer versions.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-deprecate.html000644 000766 000024 00000007063 12455173731 026602 0ustar00iojsstaff000000 000000 npm-deprecate

    npm-deprecate

    Deprecate a version of a package

    SYNOPSIS

    npm deprecate <name>[@<version>] <message>
    

    DESCRIPTION

    This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it.

    It works on version ranges as well as specific versions, so you can do something like this:

    npm deprecate my-thing@"< 0.2.3" "critical bug fixed in v0.2.3"
    

    Note that you must be the package owner to deprecate something. See the owner and adduser help topics.

    To un-deprecate a package, specify an empty string ("") for the message argument.
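
    For example, to un-deprecate the range deprecated above:

    npm deprecate my-thing@"< 0.2.3" ""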

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-docs.html000644 000766 000024 00000010620 12455173731 025567 0ustar00iojsstaff000000 000000 npm-docs

    npm-docs

    Docs for a package in a web browser maybe

    SYNOPSIS

    npm docs [<pkgname> [<pkgname> ...]]
    npm docs (with no args in a package dir)
    npm home [<pkgname> [<pkgname> ...]]
    npm home (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the --browser config param. You can pass multiple package names at once. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm docs command to open websites.

    registry

    The base URL of the npm package registry.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-edit.html000644 000766 000024 00000010217 12455173731 025566 0ustar00iojsstaff000000 000000 npm-edit

    npm-edit

    Edit an installed package

    SYNOPSIS

    npm edit <name>[@<version>]
    

    DESCRIPTION

    Opens the package folder in the default editor (or whatever you've configured as the npm editor config -- see npm-config(7).)

    After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages.

    For instance, you can do npm install connect to install connect into your package, and then npm edit connect to make a few changes to your locally installed copy.
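
    That workflow, spelled out as commands:

    npm install connect
    npm edit connect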

    CONFIGURATION

    editor

    • Default: EDITOR environment variable if set, or "vi" on Posix, or "notepad" on Windows.
    • Type: path

    The command to run for npm edit or npm config edit.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-explore.html000644 000766 000024 00000007745 12455173731 026333 0ustar00iojsstaff000000 000000 npm-explore

    npm-explore

    Browse an installed package

    SYNOPSIS

    npm explore <name> [ -- <cmd>]
    

    DESCRIPTION

    Spawn a subshell in the directory of the installed package specified.

    If a command is specified, then it is run in the subshell, which then immediately terminates.

    This is particularly handy in the case of git submodules in the node_modules folder:

    npm explore some-dependency -- git pull origin master
    

    Note that the package is not automatically rebuilt afterwards, so be sure to use npm rebuild <pkg> if you make any changes.

    CONFIGURATION

    shell

    • Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows
    • Type: path

    The shell to run for the npm explore command.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-help-search.html000644 000766 000024 00000007317 12455173731 027043 0ustar00iojsstaff000000 000000 npm-help-search

    npm-help-search

    Search npm help documentation

    SYNOPSIS

    npm help-search some search terms
    

    DESCRIPTION

    This command will search the npm markdown documentation files for the terms provided, and then list the results, sorted by relevance.

    If only one result is found, then it will show that help topic.

    If the argument to npm help is not a known help topic, then it will call help-search. It is rarely if ever necessary to call this command directly.

    CONFIGURATION

    long

    • Type: Boolean
    • Default: false

    If true, the "long" flag will cause help-search to output context around where the terms were found in the documentation.

    If false, then help-search will just list out the help topics found.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-help.html000644 000766 000024 00000010520 12455173731 025566 0ustar00iojsstaff000000 000000 npm-help

    npm-help

    Get help on npm

    SYNOPSIS

    npm help <topic>
    npm help some search terms
    

    DESCRIPTION

    If supplied a topic, then show the appropriate documentation page.

    If the topic does not exist, or if multiple terms are provided, then run the help-search command to find a match. Note that, if help-search finds a single subject, then it will run help on that topic, so unique matches are equivalent to specifying a topic name.

    CONFIGURATION

    viewer

    • Default: "man" on Posix, "browser" on Windows
    • Type: path

    The program to use to view help content.

    Set to "browser" to view html help content in the default web browser.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-init.html000644 000766 000024 00000007242 12455173731 025610 0ustar00iojsstaff000000 000000 npm-init

    npm-init

    Interactively create a package.json file

    SYNOPSIS

    npm init [-f|--force|-y|--yes]
    

    DESCRIPTION

    This will ask you a bunch of questions, and then write a package.json for you.

    It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected.

    If you already have a package.json file, it'll read that first, and default to the options in there.

    It is strictly additive, so it does not delete options from your package.json without a really good reason to do so.

    If you invoke it with -f, --force, -y, or --yes, it will use only defaults and not prompt you for any options.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-install.html000644 000766 000024 00000036227 12455173731 026320 0ustar00iojsstaff000000 000000 npm-install

    npm-install

    Install a package

    SYNOPSIS

    npm install (with no args in a package dir)
    npm install <tarball file>
    npm install <tarball url>
    npm install <folder>
    npm install [@<scope>/]<name> [--save|--save-dev|--save-optional] [--save-exact]
    npm install [@<scope>/]<name>@<tag>
    npm install [@<scope>/]<name>@<version>
    npm install [@<scope>/]<name>@<version range>
    npm i (with any of the previous argument usage)
    

    DESCRIPTION

    This command installs a package, and any packages that it depends on. If the package has a shrinkwrap file, the installation of dependencies will be driven by that. See npm-shrinkwrap(1).

    A package is:

    • a) a folder containing a program described by a package.json file
    • b) a gzipped tarball containing (a)
    • c) a url that resolves to (b)
    • d) a <name>@<version> that is published on the registry (see npm-registry(7)) with (c)
    • e) a <name>@<tag> that points to (d)
    • f) a <name> that has a "latest" tag satisfying (e)
    • g) a <git remote url> that resolves to (b)

    Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b).

    • npm install (in package directory, no arguments):

      Install the dependencies in the local node_modules folder.

      In global mode (ie, with -g or --global appended to the command), it installs the current package context (ie, the current working directory) as a global package.

      By default, npm install will install all modules listed as dependencies. With the --production flag, npm will not install modules listed in devDependencies.

    • npm install <folder>:

      Install a package that is sitting in a folder on the filesystem.

    • npm install <tarball file>:

      Install a package that is sitting on the filesystem. Note: if you just want to link a dev directory into your npm root, you can do this more easily by using npm link.

      Example:

          npm install ./package.tgz
      
    • npm install <tarball url>:

      Fetch the tarball url, and then install it. In order to distinguish between this and other options, the argument must start with "http://" or "https://".

      Example:

          npm install https://github.com/indexzero/forever/tarball/v0.5.6
      
    • npm install [@<scope>/]<name> [--save|--save-dev|--save-optional]:

      Do a <name>@<tag> install, where <tag> is the "tag" config. (See npm-config(7).)

      In most cases, this will install the latest version of the module published on npm.

      Example:

          npm install sax
      

      npm install takes 3 exclusive, optional flags which save or update the package version in your main package.json:

      • --save: Package will appear in your dependencies.

      • --save-dev: Package will appear in your devDependencies.

      • --save-optional: Package will appear in your optionalDependencies.

        When using any of the above options to save dependencies to your package.json, there is an additional, optional flag:

      • --save-exact: Saved dependencies will be configured with an exact version rather than using npm's default semver range operator.

        <scope> is optional. The package will be downloaded from the registry associated with the specified scope. If no registry is associated with the given scope the default registry is assumed. See npm-scope(7).

        Note: if you do not include the @-symbol on your scope name, npm will interpret this as a GitHub repository instead, see below. Scope names must also be followed by a slash.

        Examples:

        npm install sax --save
        npm install githubname/reponame
        npm install @myorg/privatepackage
        npm install node-tap --save-dev
        npm install dtrace-provider --save-optional
        npm install readable-stream --save --save-exact
        
      Note: If there is a file or folder named <name> in the current working directory, then it will try to install that, and only try to fetch the package by name if it is not valid.
    
    • npm install [@<scope>/]<name>@<tag>:

      Install the version of the package that is referenced by the specified tag. If the tag does not exist in the registry data for that package, then this will fail.

      Example:

          npm install sax@latest
          npm install @myorg/mypackage@latest
      
    • npm install [@<scope>/]<name>@<version>:

      Install the specified version of the package. This will fail if the version has not been published to the registry.

      Example:

          npm install sax@0.1.1
          npm install @myorg/privatepackage@1.5.0
      
    • npm install [@<scope>/]<name>@<version range>:

      Install a version of the package matching the specified version range. This will follow the same rules for resolving dependencies described in package.json(5).

      Note that most version ranges must be put in quotes so that your shell will treat it as a single argument.

      Example:

          npm install sax@">=0.1.0 <0.2.0"
          npm install @myorg/privatepackage@">=0.1.0 <0.2.0"
      
    • npm install <githubname>/<githubrepo>:

      Install the package at https://github.com/githubname/githubrepo by attempting to clone it using git.

      Example:

          npm install mygithubuser/myproject
      

      To reference a package in a git repo that is not on GitHub, see git remote urls below.

    • npm install <git remote url>:

      Install a package by cloning a git remote url. The format of the git url is:

          <protocol>://[<user>@]<hostname><separator><path>[#<commit-ish>]
      

      <protocol> is one of git, git+ssh, git+http, or git+https. If no <commit-ish> is specified, then master is used.

      Examples:

          git+ssh://git@github.com:npm/npm.git#v1.0.27
          git+https://isaacs@github.com/npm/npm.git
          git://github.com/npm/npm.git#v1.0.27
      

    You may combine multiple arguments, and even multiple types of arguments. For example:

    npm install sax@">=0.1.0 <0.2.0" bench supervisor
    

    The --tag argument will apply to all of the specified install targets. If a tag with the given name exists, the tagged version is preferred over newer versions.

    The --force argument will force npm to fetch remote resources even if a local copy exists on disk.

    npm install sax --force
    

    The --global argument will cause npm to install the package globally rather than locally. See npm-folders(5).

    The --link argument will cause npm to link global installs into the local space in some cases.

    The --no-bin-links argument will prevent npm from creating symlinks for any binaries the package might contain.

    The --no-optional argument will prevent optional dependencies from being installed.

    The --no-shrinkwrap argument will ignore an available shrinkwrap file and use the package.json instead.

    The --nodedir=/path/to/node/source argument will allow npm to find the node source code so that npm can compile native modules.

    See npm-config(7). Many of the configuration params have some effect on installation, since that's most of what npm does.

    ALGORITHM

    To install a package, npm uses the following algorithm:

    install(where, what, family, ancestors)
    fetch what, unpack to <where>/node_modules/<what>
    for each dep in what.dependencies
      resolve dep to precise version
    for each dep@version in what.dependencies
        not in <where>/node_modules/<what>/node_modules/*
        and not in <family>
      add precise version deps to <family>
      install(<where>/node_modules/<what>, dep, family)
    

    For this package{dep} structure: A{B,C}, B{C}, C{D}, this algorithm produces:

    A
    +-- B
    `-- C
        `-- D
    

    That is, the dependency from B to C is satisfied by the fact that A already caused C to be installed at a higher level.

    See npm-folders(5) for a more detailed description of the specific folder structures that npm creates.
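
    As an illustration only, here is a loose JavaScript sketch of that algorithm using an in-memory stand-in for the registry (this is not npm's actual code; running it prints the unpack targets for the A{B,C}, B{C}, C{D} example above):

    // hypothetical registry: each name maps to a precise version and its deps
    var registry = {
      A: { version: '0.1.0', dependencies: ['B', 'C'] },
      B: { version: '0.1.0', dependencies: ['C'] },
      C: { version: '0.1.0', dependencies: ['D'] },
      D: { version: '0.1.0', dependencies: [] }
    }

    function install (where, what, family) {
      var target = where + '/node_modules/' + what
      console.log('unpack ' + what + '@' + registry[what].version + ' -> ' + target)
      // record a precise version for every dep that is not already in the family
      var fresh = registry[what].dependencies.filter(function (dep) {
        if (family[dep]) return false          // already satisfied higher in the tree
        family[dep] = registry[dep].version
        return true
      })
      // recurse only for the deps that were newly added to the family
      fresh.forEach(function (dep) {
        install(target, dep, family)
      })
    }

    install('.', 'A', { A: registry.A.version })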

    Limitations of npm's Install Algorithm

    There are some very rare and pathological edge-cases where a cycle can cause npm to try to install a never-ending tree of packages. Here is the simplest case:

    A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ...
    

    where A is some version of a package, and A' is a different version of the same package. Because B depends on a different version of A than the one that is already in the tree, it must install a separate copy. The same is true of A', which must install B'. Because B' depends on the original version of A, which has been overridden, the cycle falls into infinite regress.

    To avoid this situation, npm flat-out refuses to install any name@version that is already present anywhere in the tree of package folder ancestors. A more correct, but more complex, solution would be to symlink the existing version into the new location. If this ever affects a real use-case, it will be investigated.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-link.html000644 000766 000024 00000013540 12455173731 025600 0ustar00iojsstaff000000 000000 npm-link

    npm-link

    Symlink a package folder

    SYNOPSIS

    npm link (in package folder)
    npm link [@<scope>/]<pkgname>
    npm ln (with any of the previous argument usage)
    

    DESCRIPTION

    Package linking is a two-step process.

    First, npm link in a package folder will create a globally-installed symbolic link from prefix/package-name to the current folder (see npm-config(7) for the value of prefix).

    Next, in some other location, npm link package-name will create a symlink from the local node_modules folder to the global symlink.

    Note that package-name is taken from package.json, not from directory name.

    The package name can be optionally prefixed with a scope. See npm-scope(7). The scope must be preceded by an @-symbol and followed by a slash.

    When creating tarballs for npm publish, the linked packages are "snapshotted" to their current state by resolving the symbolic links.

    This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.

    For example:

    cd ~/projects/node-redis    # go into the package directory
    npm link                    # creates global link
    cd ~/projects/node-bloggy   # go into some other package directory.
    npm link redis              # link-install the package
    

    Now, any changes to ~/projects/node-redis will be reflected in ~/projects/node-bloggy/node_modules/redis/

    You may also shortcut the two steps in one. For example, to do the above use-case in a shorter way:

    cd ~/projects/node-bloggy  # go into the dir of your main project
    npm link ../node-redis     # link the dir of your dependency
    

    The second line is the equivalent of doing:

    (cd ../node-redis; npm link)
    npm link redis
    

    That is, it first creates a global link, and then links the global installation target into your project's node_modules folder.

    If your linked package is scoped (see npm-scope(7)) your link command must include that scope, e.g.

    npm link @myorg/privatepackage
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-ls.html000644 000766 000024 00000012377 12455173731 025270 0ustar00iojsstaff000000 000000 npm-ls

    npm-ls

    List installed packages

    SYNOPSIS

    npm list [[@<scope>/]<pkg> ...]
    npm ls [[@<scope>/]<pkg> ...]
    npm la [[@<scope>/]<pkg> ...]
    npm ll [[@<scope>/]<pkg> ...]
    

    DESCRIPTION

    This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure.

    Positional arguments are name@version-range identifiers, which will limit the results to only the paths to the packages named. Note that nested packages will also show the paths to the specified packages. For example, running npm ls promzard in npm's source tree will show:

    npm@2.1.18 /path/to/npm
    └─┬ init-package-json@0.0.4
      └── promzard@0.1.5
    

    It will print out extraneous, missing, and invalid packages.

    If a project specifies git urls for dependencies these are shown in parentheses after the name@version to make it easier for users to recognize potential forks of a project.

    When run as ll or la, it shows extended information by default.

    CONFIGURATION

    json

    • Default: false
    • Type: Boolean

    Show information in JSON format.

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    List packages in the global install prefix instead of in the current project.

    depth

    • Type: Int

    Max display depth of the dependency tree.
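
    For example, to show only top-level packages (a common use of this option):

    npm ls --depth=0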

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-outdated.html000644 000766 000024 00000010016 12455173731 026447 0ustar00iojsstaff000000 000000 npm-outdated

    npm-outdated

    Check for outdated packages

    SYNOPSIS

    npm outdated [<name> [<name> ...]]
    

    DESCRIPTION

    This command will check the registry to see if any (or, specific) installed packages are currently outdated.

    The resulting field 'wanted' shows the latest version according to the version range specified in the package.json, while the field 'latest' shows the very latest version of the package.

    CONFIGURATION

    json

    • Default: false
    • Type: Boolean

    Show information in JSON format.

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    Check packages in the global install prefix instead of in the current project.

    depth

    • Type: Int

    Max depth for checking dependency tree.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-owner.html000644 000766 000024 00000007520 12455173731 025776 0ustar00iojsstaff000000 000000 npm-owner

    npm-owner

    Manage package owners

    SYNOPSIS

    npm owner ls <package name>
    npm owner add <user> <package name>
    npm owner rm <user> <package name>
    

    DESCRIPTION

    Manage ownership of published packages.

    • ls: List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help.
    • add: Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners.
    • rm: Remove a user from the package owner list. This immediately revokes their privileges.

    Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-pack.html000644 000766 000024 00000007260 12455173731 025563 0ustar00iojsstaff000000 000000 npm-pack

    npm-pack

    Create a tarball from a package

    SYNOPSIS

    npm pack [<pkg> [<pkg> ...]]
    

    DESCRIPTION

    For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as <name>-<version>.tgz, and then write the filenames out to stdout.

    If the same package is specified multiple times, then the file will be overwritten the second time.

    If no arguments are supplied, then npm packs the current package folder.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-prefix.html000644 000766 000024 00000007133 12455173731 026141 0ustar00iojsstaff000000 000000 npm-prefix

    npm-prefix

    Display prefix

    SYNOPSIS

    npm prefix [-g]
    

    DESCRIPTION

    Print the local prefix to standard out. This is the closest parent directory to contain a package.json file unless -g is also specified.

    If -g is specified, this will be the value of the global prefix. See npm-config(7) for more detail.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-prune.html000644 000766 000024 00000006674 12455173731 026006 0ustar00iojsstaff000000 000000 npm-prune

    npm-prune

    Remove extraneous packages

    SYNOPSIS

    npm prune [<name> [<name> ...]]
    npm prune [<name> [<name> ...]] [--production]
    

    DESCRIPTION

    This command removes "extraneous" packages. If a package name is provided, then only packages matching one of the supplied names are removed.

    Extraneous packages are packages that are not listed on the parent package's dependencies list.

    If the --production flag is specified, this command will remove the packages specified in your devDependencies.
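
    For example, to prune devDependencies out of a production deployment:

    npm prune --production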

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-publish.html000644 000766 000024 00000011452 12455173731 026311 0ustar00iojsstaff000000 000000 npm-publish

    npm-publish

    Publish a package

    SYNOPSIS

    npm publish <tarball> [--tag <tag>]
    npm publish <folder> [--tag <tag>]
    

    DESCRIPTION

    Publishes a package to the registry so that it can be installed by name. See npm-developers(7) for details on what's included in the published package, as well as details on how the package is built.

    By default npm will publish to the public registry. This can be overridden by specifying a different default registry or using a npm-scope(7) in the name (see package.json(5)).

    • <folder>: A folder containing a package.json file

    • <tarball>: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside.

    • [--tag <tag>] Registers the published package with the given tag, such that npm install <name>@<tag> will install this version. By default, npm publish updates and npm install installs the latest tag.

    Fails if the package name and version combination already exists in the specified registry.

    Once a package is published with a given name and version, that specific name and version combination can never be used again, even if it is removed with npm-unpublish(1).

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-rebuild.html000644 000766 000024 00000006327 12455173731 026276 0ustar00iojsstaff000000 000000 npm-rebuild

    npm-rebuild

    Rebuild a package

    SYNOPSIS

    npm rebuild [<name> [<name> ...]]
    npm rb [<name> [<name> ...]]
    
    • <name>: The package to rebuild

    DESCRIPTION

    This command runs the npm build command on the matched folders. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-repo.html000644 000766 000024 00000007070 12455173731 025611 0ustar00iojsstaff000000 000000 npm-repo

    npm-repo

    Open package repository page in the browser

    SYNOPSIS

    npm repo <pkgname>
    npm repo (with no args in a package dir)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the --browser config param. If no package name is provided, it will search for a package.json in the current folder and use the name property.

    CONFIGURATION

    browser

    • Default: OS X: "open", Windows: "start", Others: "xdg-open"
    • Type: String

    The browser that is called by the npm repo command to open websites.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-restart.html000644 000766 000024 00000007735 12455173731 026340 0ustar00iojsstaff000000 000000 npm-restart

    npm-restart

    Restart a package

    SYNOPSIS

    npm restart [-- <args>]
    

    DESCRIPTION

    This restarts a package.

    This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below:

    1. prerestart
    2. prestop
    3. stop
    4. poststop
    5. restart
    6. prestart
    7. start
    8. poststart
    9. postrestart

    NOTE

    Note that the "restart" script is run in addition to the "stop" and "start" scripts, not instead of them.

    This is the behavior as of npm major version 2. A change in this behavior will be accompanied by an increase in major version number.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-rm.html000644 000766 000024 00000006616 12455173731 025267 0ustar00iojsstaff000000 000000 npm-rm

    npm-rm

    Remove a package

    SYNOPSIS

    npm rm <name>
    npm r <name>
    npm uninstall <name>
    npm un <name>
    

    DESCRIPTION

    This uninstalls a package, completely removing everything npm installed on its behalf.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-root.html000644 000766 000024 00000006461 12455173731 025632 0ustar00iojsstaff000000 000000 npm-root

    npm-root

    Display npm root

    SYNOPSIS

    npm root
    

    DESCRIPTION

    Print the effective node_modules folder to standard out.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-run-script.html000644 000766 000024 00000010350 12455173731 026745 0ustar00iojsstaff000000 000000 npm-run-script

    npm-run-script

    Run arbitrary package scripts

    SYNOPSIS

    npm run-script [command] [-- <args>]
    npm run [command] [-- <args>]
    

    DESCRIPTION

    This runs an arbitrary command from a package's "scripts" object. If no package name is provided, it will search for a package.json in the current folder and use its "scripts" object. If no "command" is provided, it will list the available top level scripts.

    It is used by the test, start, restart, and stop commands, but can be called directly, as well.

    As of npm@2.0.0, you can use custom arguments when executing scripts. The special option -- is used by getopt to delimit the end of the options. npm will pass all the arguments after the -- directly to your script:

    npm run test -- --grep="pattern"
    

    The arguments will only be passed to the script specified after npm run and not to any pre or post script.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-search.html000644 000766 000024 00000007576 12455173731 026124 0ustar00iojsstaff000000 000000 npm-search

    npm-search

    Search for packages

    SYNOPSIS

    npm search [--long] [search terms ...]
    npm s [search terms ...]
    npm se [search terms ...]
    

    DESCRIPTION

    Search the registry for packages matching the search terms.

    If a term starts with /, then it's interpreted as a regular expression. A trailing / will be ignored in this case. (Note that many regular expression characters must be escaped or quoted in most shells.)

    CONFIGURATION

    long

    • Default: false
    • Type: Boolean

    Display full package descriptions and other long text across multiple lines. When disabled (default) search results are truncated to fit neatly on a single line. Modules with extremely long names will fall on multiple lines.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-shrinkwrap.html000644 000766 000024 00000023025 12455173731 027032 0ustar00iojsstaff000000 000000 npm-shrinkwrap

    npm-shrinkwrap

    Lock down dependency versions

    SYNOPSIS

    npm shrinkwrap
    

    DESCRIPTION

    This command locks down the versions of a package's dependencies so that you can control exactly which versions of each dependency will be used when your package is installed. The "package.json" file is still required if you want to use "npm install".

    By default, "npm install" recursively installs the target's dependencies (as specified in package.json), choosing the latest available version that satisfies the dependency's semver pattern. In some situations, particularly when shipping software where each change is tightly managed, it's desirable to fully specify each version of each dependency recursively so that subsequent builds and deploys do not inadvertently pick up newer versions of a dependency that satisfy the semver pattern. Specifying specific semver patterns in each dependency's package.json would facilitate this, but that's not always possible or desirable, as when another author owns the npm package. It's also possible to check dependencies directly into source control, but that may be undesirable for other reasons.

    As an example, consider package A:

    {
      "name": "A",
      "version": "0.1.0",
      "dependencies": {
        "B": "<0.1.0"
      }
    }
    

    package B:

    {
      "name": "B",
      "version": "0.0.1",
      "dependencies": {
        "C": "<0.1.0"
      }
    }
    

    and package C:

    {
      "name": "C,
      "version": "0.0.1"
    }
    

    If these are the only versions of A, B, and C available in the registry, then a normal "npm install A" will install:

    A@0.1.0
    `-- B@0.0.1
        `-- C@0.0.1
    

    However, if B@0.0.2 is published, then a fresh "npm install A" will install:

    A@0.1.0
    `-- B@0.0.2
        `-- C@0.0.1
    

    assuming the new version did not modify B's dependencies. Of course, the new version of B could include a new version of C and any number of new dependencies. If such changes are undesirable, the author of A could specify a dependency on B@0.0.1. However, if A's author and B's author are not the same person, there's no way for A's author to say that he or she does not want to pull in newly published versions of C when B hasn't changed at all.

    In this case, A's author can run

    npm shrinkwrap
    

    This generates npm-shrinkwrap.json, which will look something like this:

    {
      "name": "A",
      "version": "0.1.0",
      "dependencies": {
        "B": {
          "version": "0.0.1",
          "dependencies": {
            "C": {
              "version": "0.1.0"
            }
          }
        }
      }
    }
    

    The shrinkwrap command has locked down the dependencies based on what's currently installed in node_modules. When "npm install" installs a package with a npm-shrinkwrap.json file in the package root, the shrinkwrap file (rather than package.json files) completely drives the installation of that package and all of its dependencies (recursively). So now the author publishes A@0.1.0, and subsequent installs of this package will use B@0.0.1 and C@0.0.1, regardless of the dependencies and versions listed in A's, B's, and C's package.json files.

    Using shrinkwrapped packages

    Using a shrinkwrapped package is no different than using any other package: you can "npm install" it by hand, or add a dependency to your package.json file and "npm install" it.

    Building shrinkwrapped packages

    To shrinkwrap an existing package:

    1. Run "npm install" in the package root to install the current versions of all dependencies.
    2. Validate that the package works as expected with these versions.
    3. Run "npm shrinkwrap", add npm-shrinkwrap.json to git, and publish your package.

    To add or update a dependency in a shrinkwrapped package:

    1. Run "npm install" in the package root to install the current versions of all dependencies.
    2. Add or update dependencies. "npm install" each new or updated package individually and then update package.json. Note that they must be explicitly named in order to be installed: running npm install with no arguments will merely reproduce the existing shrinkwrap.
    3. Validate that the package works as expected with the new dependencies.
    4. Run "npm shrinkwrap", commit the new npm-shrinkwrap.json, and publish your package.

    You can use npm-outdated(1) to view dependencies with newer versions available.

    Other Notes

    A shrinkwrap file must be consistent with the package's package.json file. "npm shrinkwrap" will fail if required dependencies are not already installed, since that would result in a shrinkwrap that wouldn't actually work. Similarly, the command will fail if there are extraneous packages (not referenced by package.json), since that would indicate that package.json is not correct.

    Since "npm shrinkwrap" is intended to lock down your dependencies for production use, devDependencies will not be included unless you explicitly set the --dev flag when you run npm shrinkwrap. If installed devDependencies are excluded, then npm will print a warning. If you want them to be installed with your module by default, please consider adding them to dependencies instead.

    If shrinkwrapped package A depends on shrinkwrapped package B, B's shrinkwrap will not be used as part of the installation of A. However, because A's shrinkwrap is constructed from a valid installation of B and recursively specifies all dependencies, the contents of B's shrinkwrap will implicitly be included in A's shrinkwrap.

    Caveats

    If you wish to lock down the specific bytes included in a package, for example to have 100% confidence in being able to reproduce a deployment or build, then you ought to check your dependencies into source control, or pursue some other mechanism that can verify contents rather than versions.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-star.html000644 000766 000024 00000006475 12455173731 025625 0ustar00iojsstaff000000 000000 npm-star

    npm-star

    Mark your favorite packages

    SYNOPSIS

    npm star <pkgname> [<pkg>, ...]
    npm unstar <pkgname> [<pkg>, ...]
    

    DESCRIPTION

    "Starring" a package means that you have some interest in it. It's a vaguely positive way to show that you care.

    "Unstarring" is the same thing, but in reverse.

    It's a boolean thing. Starring repeatedly has no additional effect.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-stars.html000644 000766 000024 00000006506 12455173731 026003 0ustar00iojsstaff000000 000000 npm-stars

    npm-stars

    View packages marked as favorites

    SYNOPSIS

    npm stars
    npm stars [username]
    

    DESCRIPTION

    If you have starred a lot of neat things and want to find them again quickly this command lets you do just that.

    You may also want to see your friend's favorite packages; in that case, you will most certainly enjoy this command.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-start.html000644 000766 000024 00000006375 12455173731 026010 0ustar00iojsstaff000000 000000 npm-start

    npm-start

    Start a package

    SYNOPSIS

    npm start [-- <args>]
    

    DESCRIPTION

    This runs a package's "start" script, if one was provided.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-stop.html000644 000766 000024 00000006370 12455173731 025633 0ustar00iojsstaff000000 000000 npm-stop

    npm-stop

    Stop a package

    SYNOPSIS

    npm stop [-- <args>]
    

    DESCRIPTION

    This runs a package's "stop" script, if one was provided.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-submodule.html000644 000766 000024 00000007121 12455173731 026640 0ustar00iojsstaff000000 000000 npm-submodule

    npm-submodule

    Add a package as a git submodule

    SYNOPSIS

    npm submodule <pkg>
    

    DESCRIPTION

    If the specified package has a git repository url in its package.json description, then this command will add it as a git submodule at node_modules/<pkg name>.

    This is a convenience only. From then on, it's up to you to manage updates by using the appropriate git commands. npm will stubbornly refuse to update, modify, or remove anything with a .git subfolder in it.

    This command also does not install missing dependencies, if the package does not include them in its git repository. If npm ls reports that things are missing, you can either install, link, or submodule them yourself, or you can do npm explore <pkgname> -- npm install to install the dependencies into the submodule folder.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-tag.html000644 000766 000024 00000007716 12455173731 025426 0ustar00iojsstaff000000 000000 npm-tag

    npm-tag

    Tag a published version

    SYNOPSIS

    npm tag <name>@<version> [<tag>]
    

    DESCRIPTION

    Tags the specified version of the package with the specified tag, or the --tag config if not specified.

    A tag can be used when installing packages as a reference to a version instead of using a specific version number:

    npm install <name>@<tag>
    

    When installing dependencies, a preferred tagged version may be specified:

    npm install --tag <tag>
    

    This also applies to npm dedupe.

    Publishing a package always sets the "latest" tag to the published version.
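
    For example, to point a "beta" tag at a specific published version of a package you own (the package name here is hypothetical):

    npm tag mypkg@1.0.1 beta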

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-test.html000644 000766 000024 00000006564 12455173731 025632 0ustar00iojsstaff000000 000000 npm-test

    npm-test

    Test a package

    SYNOPSIS

      npm test [-- <args>]
      npm tst [-- <args>]
    

    DESCRIPTION

    This runs a package's "test" script, if one was provided.

    To run tests as a condition of installation, set the npat config to true.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-uninstall.html000644 000766 000024 00000010734 12455173731 026656 0ustar00iojsstaff000000 000000 npm-uninstall

    npm-rm

    Remove a package

    SYNOPSIS

    npm uninstall [@<scope>/]<package> [--save|--save-dev|--save-optional]
    npm rm (with any of the previous argument usage)
    

    DESCRIPTION

    This uninstalls a package, completely removing everything npm installed on its behalf.

    Example:

    npm uninstall sax
    

    In global mode (i.e., with -g or --global appended to the command), it uninstalls the current package context as a global package.

    npm uninstall takes 3 exclusive, optional flags which save or update the package version in your main package.json:

    • --save: Package will be removed from your dependencies.

    • --save-dev: Package will be removed from your devDependencies.

    • --save-optional: Package will be removed from your optionalDependencies.

    Scope is optional and follows the usual rules for npm-scope(7).

    Examples:

    npm uninstall sax --save
    npm uninstall @myorg/privatepackage --save
    npm uninstall node-tap --save-dev
    npm uninstall dtrace-provider --save-optional
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-unpublish.html000644 000766 000024 00000010212 12455173731 026645 0ustar00iojsstaff000000 000000 npm-unpublish

    npm-unpublish

    Remove a package from the registry

    SYNOPSIS

    npm unpublish [@<scope>/]<name>[@<version>]
    

    WARNING

    It is generally considered bad behavior to remove versions of a library that others are depending on!

    Consider using the deprecate command instead, if your intent is to encourage users to upgrade.

    There is plenty of room on the registry.

    DESCRIPTION

    This removes a package version from the registry, deleting its entry and removing the tarball.

    If no version is specified, or if all versions are removed, then the root package entry is removed from the registry entirely.

    Even if a package version is unpublished, that specific name and version combination can never be reused. In order to publish the package again, a new version number must be used.

    The scope is optional and follows the usual rules for npm-scope(7).

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-update.html000644 000766 000024 00000007116 12455173731 026127 0ustar00iojsstaff000000 000000 npm-update

    npm-update

    Update a package

    SYNOPSIS

    npm update [-g] [<name> [<name> ...]]
    

    DESCRIPTION

    This command will update all the packages listed to the latest version (specified by the tag config).

    It will also install missing packages.

    If the -g flag is specified, this command will update globally installed packages.

    If no package name is specified, all packages in the specified location (global or local) will be updated.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-version.html000644 000766 000024 00000011056 12455173731 026330 0ustar00iojsstaff000000 000000 npm-version

    npm-version

    Bump a package version

    SYNOPSIS

    npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease]
    

    DESCRIPTION

    Run this in a package directory to bump the version and write the new data back to package.json and, if present, npm-shrinkwrap.json.

    The newversion argument should be a valid semver string, or a valid second argument to semver.inc (one of "patch", "minor", "major", "prepatch", "preminor", "premajor", "prerelease"). In the second case, the existing version will be incremented by 1 in the specified field.

    If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean.

    If supplied with the --message (shorthand: -m) config option, npm will use it as a commit message when creating a version commit. If the message config contains %s, then that will be replaced with the resulting version number. For example:

    npm version patch -m "Upgrade to %s for reasons"
    

    If the sign-git-tag config is set, then the tag will be signed using the -s flag to git. Note that you must have a default GPG key set up in your git config for this to work properly. For example:

    $ npm config set sign-git-tag true
    $ npm version patch
    
    You need a passphrase to unlock the secret key for
    user: "isaacs (http://blog.izs.me/) <i@izs.me>"
    2048-bit RSA key, ID 6C481CF6, created 2010-08-31
    
    Enter passphrase:
    

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-view.html000644 000766 000024 00000014602 12455173731 025615 0ustar00iojsstaff000000 000000 npm-view

    npm-view

    View registry info

    SYNOPSIS

    npm view [@<scope>/]<name>[@<version>] [<field>[.<subfield>]...]
    npm v [@<scope>/]<name>[@<version>] [<field>[.<subfield>]...]
    

    DESCRIPTION

    This command shows data about a package and prints it to the stream referenced by the outfd config, which defaults to stdout.

    To show the package registry entry for the connect package, you can do this:

    npm view connect
    

    The default version is "latest" if unspecified.

    Field names can be specified after the package descriptor. For example, to show the dependencies of the ronn package at version 0.3.5, you could do the following:

    npm view ronn@0.3.5 dependencies
    

    You can view child field by separating them with a period. To view the git repository URL for the latest version of npm, you could do this:

    npm view npm repository.url
    

    This makes it easy to view information about a dependency with a bit of shell scripting. For example, to view all the data about the version of opts that ronn depends on, you can do this:

    npm view opts@$(npm view ronn dependencies.opts)
    

    For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor names for the "express" project, you can do this:

    npm view express contributors.email
    

    You may also use numeric indices in square braces to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this:

    npm view express contributors[0].email
    

    Multiple fields may be specified, and will be printed one after another. For example, to get all the contributor names and email addresses, you can do this:

    npm view express contributors.name contributors.email
    

    "Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See package.json(5) for more on this.)

    npm view npm contributors
    

    If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3:

    npm view yui3@'>0.5.4' dependencies.jsdom
    

    OUTPUT

    If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command. If the field is an object, it will be output as a JavaScript object literal.

    If the --json flag is given, the outputted fields will be JSON.

    If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to.

    If multiple fields are requested, then each of them is prefixed with the field name.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm-whoami.html000644 000766 000024 00000006171 12455173731 026131 0ustar00iojsstaff000000 000000 npm-whoami

    npm-whoami

    Display npm username

    SYNOPSIS

    npm whoami
    

    DESCRIPTION

    Print the username config to standard output.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/cli/npm.html000644 000766 000024 00000025111 12455173731 024642 0ustar00iojsstaff000000 000000 npm

    npm

    node package manager

    SYNOPSIS

    npm <command> [args]
    

    VERSION

    2.1.18

    DESCRIPTION

    npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently.

    It is extremely configurable to support a wide variety of use cases. Most commonly, it is used to publish, discover, install, and develop node programs.

    Run npm help to get a list of available commands.

    INTRODUCTION

    You probably got npm because you want to install stuff.

    Use npm install blerg to install the latest version of "blerg". Check out npm-install(1) for more info. It can do a lot of stuff.

    Use the npm search command to show everything that's available. Use npm ls to show everything you've installed.

    DEPENDENCIES

    If a package references another package with a git URL, npm depends on a preinstalled git.

    If one of the packages npm tries to install is a native node module and requires compiling C++ code, npm will use node-gyp for that task. On a Unix system, node-gyp needs Python, make, and a build toolchain such as GCC. On Windows, Python and Microsoft Visual Studio C++ are needed. Python 3 is not supported by node-gyp. For more information, visit the node-gyp repository and the node-gyp Wiki.

    DIRECTORIES

    See npm-folders(5) to learn about where npm puts stuff.

    In particular, npm has two modes of operation:

    • global mode:
      npm installs packages into the install prefix at prefix/lib/node_modules and bins are installed in prefix/bin.
    • local mode:
      npm installs packages into the current project directory, which defaults to the current working directory. Packages are installed to ./node_modules, and bins are installed to ./node_modules/.bin.

    Local mode is the default. Use --global or -g on any command to operate in global mode instead.

    DEVELOPER USAGE

    If you're using npm to develop and publish your code, check out the following help topics:

    • json: Make a package.json file. See package.json(5).
    • link: For linking your current working code into Node's path, so that you don't have to reinstall every time you make a change. Use npm link to do this.
    • install: It's a good idea to install things if you don't need the symbolic link. In particular, installing other people's code from the registry is done via npm install.
    • adduser: Create an account or log in. Credentials are stored in the user config file.
    • publish: Use the npm publish command to upload your code to the registry.

    CONFIGURATION

    npm is extremely configurable. It reads its configuration options from 5 places.

    • Command line switches:
      Set a config with --key val. All keys take a value, even if they are booleans (the config parser doesn't know what the options are at the time of parsing.) If no value is provided, then the option is set to boolean true.
    • Environment Variables:
      Set any config by prefixing the name in an environment variable with npm_config_. For example, export npm_config_key=val.
    • User Configs:
      The file at $HOME/.npmrc is an ini-formatted list of configs. If present, it is parsed. If the userconfig option is set in the cli or env, then that will be used instead.
    • Global Configs:
      The file found at ../etc/npmrc (from the node executable, by default this resolves to /usr/local/etc/npmrc) will be parsed if it is found. If the globalconfig option is set in the cli, env, or user config, then that file is parsed instead.
    • Defaults:
      npm's default configuration options are defined in lib/utils/config-defs.js. These must not be changed.
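
    For example, the same setting can be supplied as a command line switch or through the environment (illustrative only; loglevel is just one of many config keys):

    npm install sax --loglevel verbose
    export npm_config_loglevel=verbose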

    See npm-config(7) for much much more information.

    CONTRIBUTIONS

    Patches welcome!

    • code: Read through npm-coding-style(7) if you plan to submit code. You don't have to agree with it, but you do have to follow it.
    • docs: If you find an error in the documentation, edit the appropriate markdown file in the "doc" folder. (Don't worry about generating the man page.)

    Contributors are listed in npm's package.json file. You can view them easily by doing npm view npm contributors.

    If you would like to contribute, but don't know what to work on, check the issues list or ask on the mailing list.

    BUGS

    When you find issues, please report them:

    Be sure to include all of the output from the npm command that didn't work as expected. The npm-debug.log file is also helpful to provide.

    You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email.

    AUTHOR

    Isaac Z. Schlueter :: isaacs :: @izs :: i@izs.me

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-bin.html000644 000766 000024 00000005502 12455173731 025414 0ustar00iojsstaff000000 000000 npm-bin

    npm-bin

    Display npm bin folder

    SYNOPSIS

    npm.commands.bin(args, cb)
    

    DESCRIPTION

    Print the folder where npm will install executables.

    This function should not be used programmatically. Instead, just refer to the npm.bin property.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-bugs.html000644 000766 000024 00000006167 12455173731 025614 0ustar00iojsstaff000000 000000 npm-bugs

    npm-bugs

    Bugs for a package in a web browser maybe

    SYNOPSIS

    npm.commands.bugs(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-cache.html000644 000766 000024 00000007112 12455173731 025706 0ustar00iojsstaff000000 000000 npm-cache

    npm-cache

    manage the npm cache programmatically

    SYNOPSIS

    npm.commands.cache([args], callback)
    
    // helpers
    npm.commands.cache.clean([args], callback)
    npm.commands.cache.add([args], callback)
    npm.commands.cache.read(name, version, forceBypass, callback)
    

    DESCRIPTION

    This acts much the same way as the npm-cache(1) command line functionality.

    The callback is called with the package.json data of the thing that is eventually added to or read from the cache.

    The top level npm.commands.cache(...) functionality is a public interface, and like all commands on the npm.commands object, it will match the command line behavior exactly.

    However, the cache folder structure and the cache helper functions are considered internal API surface, and as such, may change in future releases of npm, potentially without warning or significant version incrementation.

    Use at your own risk.
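
    A minimal sketch of the public, command-line-equivalent usage (mirroring "npm cache clean"):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // Equivalent to running `npm cache clean` from the command line.
      npm.commands.cache(["clean"], function (er) {
        if (er) console.error(er)
      })
    })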

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-commands.html000644 000766 000024 00000006513 12455173731 026450 0ustar00iojsstaff000000 000000 npm-commands

    npm-commands

    npm commands

    SYNOPSIS

    npm.commands[<command>](args, callback)
    

    DESCRIPTION

    npm comes with a full set of commands, and each of the commands takes a similar set of arguments.

    In general, all commands on the command object take an array of positional argument strings. The last argument to any function is a callback. Some commands are special and take other optional arguments.

    All commands have their own man page. See man npm-<command> for command-line usage, or man 3 npm-<command> for programmatic usage.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-config.html000644 000766 000024 00000010001 12455173731 026077 0ustar00iojsstaff000000 000000 npm-config

    npm-config

    Manage the npm configuration files

    SYNOPSIS

    npm.commands.config(args, callback)
    var val = npm.config.get(key)
    npm.config.set(key, val)
    

    DESCRIPTION

    This function acts much the same way as the command-line version. The first element in the array tells config what to do. Possible values are:

    • set

      Sets a config parameter. The second element in args is interpreted as the key, and the third element is interpreted as the value.

    • get

      Gets the value of a config parameter. The second element in args is the key to get the value of.

    • delete (rm or del)

      Deletes a parameter from the config. The second element in args is the key to delete.

    • list (ls)

      Show all configs that aren't secret. No parameters necessary.

    • edit:

      Opens the config file in the default editor. This command isn't very useful programmatically, but it is made available.

    To programmatically access npm configuration settings, or set them for the duration of a program, use the npm.config.set and npm.config.get functions instead.
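
    A minimal sketch of the programmatic pattern described above (loglevel is just an example key):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      var original = npm.config.get("loglevel")
      // Override the value for the remainder of this process only.
      npm.config.set("loglevel", "silent")
      console.log("loglevel changed from %s to %s", original, npm.config.get("loglevel"))
    })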

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-deprecate.html000644 000766 000024 00000007464 12455173731 026611 0ustar00iojsstaff000000 000000 npm-deprecate

    npm-deprecate

    Deprecate a version of a package

    SYNOPSIS

    npm.commands.deprecate(args, callback)
    

    DESCRIPTION

    This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it.

    The 'args' parameter must have exactly two elements:

    • package[@version]

      The version portion is optional, and may be either a range, or a specific version, or a tag.

    • message

      The warning message that will be printed whenever a user attempts to install the package.

    Note that you must be the package owner to deprecate something. See the owner and adduser help topics.

    To un-deprecate a package, specify an empty string ("") for the message argument.
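
    A minimal sketch, assuming you own the package (the package name and version range here are hypothetical):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // Warn users of every 0.1.x release of a hypothetical package.
      npm.commands.deprecate(["mypkg@0.1.x", "0.1.x is no longer supported"], function (er) {
        if (er) console.error(er)
      })
    })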

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-docs.html000644 000766 000024 00000006171 12455173731 025577 0ustar00iojsstaff000000 000000 npm-docs

    npm-docs

    Docs for a package in a web browser maybe

    SYNOPSIS

    npm.commands.docs(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-edit.html000644 000766 000024 00000006626 12455173731 025601 0ustar00iojsstaff000000 000000 npm-edit

    npm-edit

    Edit an installed package

    SYNOPSIS

    npm.commands.edit(package, callback)
    

    DESCRIPTION

    Opens the package folder in the default editor (or whatever you've configured as the npm editor config -- see npm help config.)

    After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages.

    For instance, you can do npm install connect to install connect into your package, and then npm.commands.edit(["connect"], callback) to make a few changes to your locally installed copy.

    The first parameter is a string array with a single element, the package to open. The package can optionally have a version number attached.

    Since this command opens an editor in a new process, be careful about where and how this is used.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-explore.html000644 000766 000024 00000006346 12455173731 026331 0ustar00iojsstaff000000 000000 npm-explore

    npm-explore

    Browse an installed package

    SYNOPSIS

    npm.commands.explore(args, callback)
    

    DESCRIPTION

    Spawn a subshell in the directory of the installed package specified.

    If a command is specified, then it is run in the subshell, which then immediately terminates.

    Note that the package is not automatically rebuilt afterwards, so be sure to use npm rebuild <pkg> if you make any changes.

    The first element in the 'args' parameter must be a package name. After that is the optional command, which can be any number of strings. All of the strings will be combined into one, space-delimited command.
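
    A minimal sketch based on the parameter layout described above (the package and command are only examples):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // First element is the package; the rest become one space-delimited command.
      npm.commands.explore(["connect", "ls", "-l"], function (er) {
        if (er) console.error(er)
      })
    })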

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-help-search.html000644 000766 000024 00000007023 12455173731 027037 0ustar00iojsstaff000000 000000 npm-help-search

    npm-help-search

    Search the help pages

    SYNOPSIS

    npm.commands.helpSearch(args, [silent,] callback)
    

    DESCRIPTION

    This command is rarely useful, but it exists in the rare case that it is.

    This command takes an array of search terms and returns the help pages that match in order of best match.

    If there is only one match, then npm displays that help section. If there are multiple results, the results are printed to the screen formatted and the array of results is returned. Each result is an object with these properties:

    • hits: A map of args to number of hits on that arg. For example, {"npm": 3}
    • found: Total number of unique args that matched.
    • totalHits: Total number of hits.
    • lines: An array of all matching lines (and some adjacent lines).
    • file: Name of the file that matched

    The silent parameter is not currently used, but it may be in the future.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-init.html000644 000766 000024 00000007202 12455173731 025606 0ustar00iojsstaff000000 000000 npm-init

    npm init

    Interactively create a package.json file

    SYNOPSIS

    npm.commands.init(args, callback)
    

    DESCRIPTION

    This will ask you a bunch of questions, and then write a package.json for you.

    It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected.

    If you already have a package.json file, it'll read that first, and default to the options in there.

    It is strictly additive, so it does not delete options from your package.json without a really good reason to do so.

    Since this function expects to be run on the command-line, it doesn't work very well when called programmatically. The best option is to roll your own, and since JavaScript makes it stupid simple to output formatted JSON, that is the preferred method. If you're sure you want to handle command-line prompting, then go ahead and use this programmatically.

    SEE ALSO

    package.json(5)

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-install.html000644 000766 000024 00000006254 12455173731 026317 0ustar00iojsstaff000000 000000 npm-install

    npm-install

    install a package programmatically

    SYNOPSIS

    npm.commands.install([where,] packages, callback)
    

    DESCRIPTION

    This acts much the same way as installing on the command-line.

    The 'where' parameter is optional and only used internally, and it specifies where the packages should be installed to.

    The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be installed.

    Finally, 'callback' is a function that will be called when all packages have been installed or when an error has been encountered.
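
    A minimal sketch of the call described above (the package names are only examples):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // Install two packages into the current project's node_modules.
      npm.commands.install(["sax", "connect"], function (er) {
        if (er) return console.error(er)
        console.log("install finished")
      })
    })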

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-link.html000644 000766 000024 00000007176 12455173731 025612 0ustar00iojsstaff000000 000000 npm-link

    npm-link

    Symlink a package folder

    SYNOPSIS

    npm.commands.link(callback)
    npm.commands.link(packages, callback)
    

    DESCRIPTION

    Package linking is a two-step process.

    Without parameters, link will create a globally-installed symbolic link from prefix/package-name to the current folder.

    With parameters, link will create a symlink from the local node_modules folder to the global symlink.

    When creating tarballs for npm publish, the linked packages are "snapshotted" to their current state by resolving the symbolic links.

    This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild.

    For example:

    npm.commands.link(cb)           # creates global link from the cwd
                                    # (say redis package)
    npm.commands.link('redis', cb)  # link-install the package
    

    Now, any changes to the redis package will be reflected in the package in the current working directory.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-load.html000644 000766 000024 00000006402 12455173731 025563 0ustar00iojsstaff000000 000000 npm-load

    npm-load

    Load config settings

    SYNOPSIS

    npm.load(conf, cb)
    

    DESCRIPTION

    npm.load() must be called before any other function call. Both parameters are optional, but the second is recommended.

    The first parameter is an object containing command-line config params, and the second parameter is a callback that will be called when npm is loaded and ready to serve.

    The first parameter should follow a similar structure as the package.json config object.

    For example, to emulate the --dev flag, pass an object that looks like this:

    {
      "dev": true
    }
    

    For a list of all the available command-line configs, see npm help config
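
    Putting it together, a typical load call looks something like this (a sketch; dev is just one possible config key):

    var npm = require("npm")

    npm.load({ dev: true }, function (er, npm) {
      if (er) return console.error(er)
      // npm is now ready; commands can be invoked via npm.commands.<cmd>.
      console.log("npm loaded, prefix = %s", npm.prefix)
    })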

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-ls.html000644 000766 000024 00000010505 12455173731 025261 0ustar00iojsstaff000000 000000 npm-ls

    npm-ls

    List installed packages

    SYNOPSIS

    npm.commands.ls(args, [silent,] callback)
    

    DESCRIPTION

    This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure. It will also return that data using the callback.

    This command does not take any arguments, but args must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments, though you may set config flags like with any other command, such as global to list global packages.

    It will print out extraneous, missing, and invalid packages.

    If the silent parameter is set to true, nothing will be output to the screen, but the data will still be returned.

    Callback is provided an error if one occurred, the full data about which packages are installed and which dependencies they will receive, and a "lite" data object which just shows which versions are installed where. Note that the full data object is a circular structure, so care must be taken if it is serialized to JSON.
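
    A minimal sketch of the callback shape described above:

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // args must be an array (empty here); silent=true suppresses the printed tree.
      npm.commands.ls([], true, function (er, data, lite) {
        if (er) return console.error(er)
        console.log(Object.keys(lite.dependencies || {}))
      })
    })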

    CONFIGURATION

    long

    • Default: false
    • Type: Boolean

    Show extended information.

    parseable

    • Default: false
    • Type: Boolean

    Show parseable output instead of tree view.

    global

    • Default: false
    • Type: Boolean

    List packages in the global install prefix instead of in the current project.

    Note, if parseable is set or long isn't set, then duplicates will be trimmed. This means that if a submodule has the same dependency as a parent module, then the dependency will only be output once.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-outdated.html000644 000766 000024 00000005572 12455173731 026464 0ustar00iojsstaff000000 000000 npm-outdated

    npm-outdated

    Check for outdated packages

    SYNOPSIS

    npm.commands.outdated([packages,] callback)
    

    DESCRIPTION

    This command will check the registry to see if the specified packages are currently outdated.

    If the 'packages' parameter is left out, npm will check all packages.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-owner.html000644 000766 000024 00000007362 12455173731 026004 0ustar00iojsstaff000000 000000 npm-owner

    npm-owner

    Manage package owners

    SYNOPSIS

    npm.commands.owner(args, callback)
    

    DESCRIPTION

    The first element of the 'args' parameter defines what to do, and the subsequent elements depend on the action. Possible values for the action are (the order of parameters is given in parentheses):

    • ls (package): List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help.
    • add (user, package): Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners.
    • rm (user, package): Remove a user from the package owner list. This immediately revokes their privileges.

    Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.
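
    A minimal sketch of the "ls" and "add" actions (the user and package names are only examples):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // List current owners, then grant access to another user.
      npm.commands.owner(["ls", "mypkg"], function (er) {
        if (er) return console.error(er)
        npm.commands.owner(["add", "someuser", "mypkg"], function (er) {
          if (er) console.error(er)
        })
      })
    })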

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-pack.html000644 000766 000024 00000006250 12455173731 025563 0ustar00iojsstaff000000 000000 npm-pack

    npm-pack

    Create a tarball from a package

    SYNOPSIS

    npm.commands.pack([packages,] callback)
    

    DESCRIPTION

    For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as <name>-<version>.tgz, and then write the filenames out to stdout.

    If the same package is specified multiple times, then the file will be overwritten the second time.

    If no arguments are supplied, then npm packs the current package folder.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-prefix.html000644 000766 000024 00000005577 12455173731 026155 0ustar00iojsstaff000000 000000 npm-prefix

    npm-prefix

    Display prefix

    SYNOPSIS

    npm.commands.prefix(args, callback)
    

    DESCRIPTION

    Print the prefix to standard out.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-prune.html000644 000766 000024 00000005753 12455173731 026005 0ustar00iojsstaff000000 000000 npm-prune

    npm-prune

    Remove extraneous packages

    SYNOPSIS

    npm.commands.prune([packages,] callback)
    

    DESCRIPTION

    This command removes "extraneous" packages.

    The first parameter is optional, and it specifies packages to be removed.

    If no packages are specified, then all packages will be checked.

    Extraneous packages are packages that are not listed on the parent package's dependencies list.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-publish.html000644 000766 000024 00000007222 12455173731 026313 0ustar00iojsstaff000000 000000 npm-publish

    npm-publish

    Publish a package

    SYNOPSIS

    npm.commands.publish([packages,] callback)
    

    DESCRIPTION

    Publishes a package to the registry so that it can be installed by name. Possible values in the 'packages' array are:

    • <folder>: A folder containing a package.json file

    • <tarball>: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside.

    If the package array is empty, npm will try to publish something in the current working directory.

    This command could fail if one of the packages specified already exists in the registry. It overwrites the existing entry when the "force" environment variable is set.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-rebuild.html000644 000766 000024 00000006040 12455173731 026270 0ustar00iojsstaff000000 000000 npm-rebuild

    npm-rebuild

    Rebuild a package

    SYNOPSIS

    npm.commands.rebuild([packages,] callback)
    

    DESCRIPTION

    This command runs the npm build command on each of the matched packages. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary. If no 'packages' parameter is specified, every package will be rebuilt.

    CONFIGURATION

    See npm help build

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-repo.html000644 000766 000024 00000006170 12455173731 025613 0ustar00iojsstaff000000 000000 npm-repo

    npm-repo

    Open package repository page in the browser

    SYNOPSIS

    npm.commands.repo(package, callback)
    

    DESCRIPTION

    This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the --browser config param.

    Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number.

    This command will launch a browser, so this command may not be the most friendly for programmatic use.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-restart.html000644 000766 000024 00000007462 12455173731 026337 0ustar00iojsstaff000000 000000 npm-restart

    npm-restart

    Restart a package

    SYNOPSIS

    npm.commands.restart(packages, callback)
    

    DESCRIPTION

    This restarts a package (or multiple packages).

    This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below:

    1. prerestart
    2. prestop
    3. stop
    4. poststop
    5. restart
    6. prestart
    7. start
    8. poststart
    9. postrestart

    If no version is specified, then it restarts the "active" version.

    npm can restart multiple packages. Just specify multiple packages in the packages parameter.

    NOTE

    Note that the "restart" script is run in addition to the "stop" and "start" scripts, not instead of them.

    This is the behavior as of npm major version 2. A change in this behavior will be accompanied by an increase in major version number.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-root.html000644 000766 000024 00000005632 12455173731 025633 0ustar00iojsstaff000000 000000 npm-root

    npm-root

    Display npm root

    SYNOPSIS

    npm.commands.root(args, callback)
    

    DESCRIPTION

    Print the effective node_modules folder to standard out.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-run-script.html000644 000766 000024 00000007416 12455173731 026760 0ustar00iojsstaff000000 000000 npm-run-script

    npm-run-script

    Run arbitrary package scripts

    SYNOPSIS

    npm.commands.run-script(args, callback)
    

    DESCRIPTION

    This runs an arbitrary command from a package's "scripts" object.

    It is used by the test, start, restart, and stop commands, but can be called directly, as well.

    The 'args' parameter is an array of strings. Behavior depends on the number of elements. If there is only one element, npm assumes that the element represents a command to be run on the local repository. If there is more than one element, then the first is assumed to be the package and the second is assumed to be the command to run. All other elements are ignored.

    SEE ALSO

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-search.html000644 000766 000024 00000007551 12455173731 026117 0ustar00iojsstaff000000 000000 npm-search

    npm-search

    Search for packages

    SYNOPSIS

    npm.commands.search(searchTerms, [silent,] [staleness,] callback)
    

    DESCRIPTION

    Search the registry for packages matching the search terms. The available parameters are:

    • searchTerms: Array of search terms. These terms are case-insensitive.
    • silent: If true, npm will not log anything to the console.
    • staleness: This is the threshold for stale packages. "Fresh" packages are not refreshed from the registry. This value is measured in seconds.
    • callback: Returns an object where each key is the name of a package, and the value is information about that package along with a 'words' property, which is a space-delimited string of all of the interesting words in that package. The only properties included are those that are searched, which generally include:

      • name
      • description
      • maintainers
      • url
      • keywords

    A search on the registry excludes any result that does not match all of the search terms. It also removes any items from the results that contain an excluded term (the "searchexclude" config). The search is case insensitive and doesn't try to read your mind (it doesn't do any verb tense matching or the like).
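
    A minimal sketch using the optional parameters described above:

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // silent=true, staleness of 900 seconds; both are optional per the synopsis.
      npm.commands.search(["connect"], true, 900, function (er, results) {
        if (er) return console.error(er)
        console.log(Object.keys(results).length + " packages matched")
      })
    })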

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-shrinkwrap.html000644 000766 000024 00000006355 12455173731 027043 0ustar00iojsstaff000000 000000 npm-shrinkwrap

    npm-shrinkwrap

    programmatically generate package shrinkwrap file

    SYNOPSIS

    npm.commands.shrinkwrap(args, [silent,] callback)
    

    DESCRIPTION

    This acts much the same way as shrinkwrapping on the command-line.

    This command does not take any arguments, but 'args' must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments.

    If the 'silent' parameter is set to true, nothing will be output to the screen, but the shrinkwrap file will still be written.

    Finally, 'callback' is a function that will be called when the shrinkwrap has been saved.
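
    A minimal sketch following the synopsis above:

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // args must be defined (empty here); silent=true suppresses console output.
      npm.commands.shrinkwrap([], true, function (er) {
        if (er) return console.error(er)
        console.log("npm-shrinkwrap.json written")
      })
    })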

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-start.html000644 000766 000024 00000005537 12455173731 026011 0ustar00iojsstaff000000 000000 npm-start

    npm-start

    Start a package

    SYNOPSIS

    npm.commands.start(packages, callback)
    

    DESCRIPTION

    This runs a package's "start" script, if one was provided.

    npm can start multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-stop.html000644 000766 000024 00000005535 12455173731 025637 0ustar00iojsstaff000000 000000 npm-stop

    npm-stop

    Stop a package

    SYNOPSIS

    npm.commands.stop(packages, callback)
    

    DESCRIPTION

    This runs a package's "stop" script, if one was provided.

    npm can run stop on multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-submodule.html000644 000766 000024 00000007074 12455173731 026651 0ustar00iojsstaff000000 000000 npm-submodule

    npm-submodule

    Add a package as a git submodule

    SYNOPSIS

    npm.commands.submodule(packages, callback)
    

    DESCRIPTION

    For each package specified, npm will check if it has a git repository url in its package.json description, and then add it as a git submodule at node_modules/<pkg name>.

    This is a convenience only. From then on, it's up to you to manage updates by using the appropriate git commands. npm will stubbornly refuse to update, modify, or remove anything with a .git subfolder in it.

    This command also does not install missing dependencies, if the package does not include them in its git repository. If npm ls reports that things are missing, you can either install, link, or submodule them yourself, or you can do npm explore <pkgname> -- npm install to install the dependencies into the submodule folder.

    SEE ALSO

    • npm help json
    • git help submodule
    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-tag.html000644 000766 000024 00000006604 12455173731 025423 0ustar00iojsstaff000000 000000 npm-tag

    npm-tag

    Tag a published version

    SYNOPSIS

    npm.commands.tag(package@version, tag, callback)
    

    DESCRIPTION

    Tags the specified version of the package with the specified tag, or the --tag config if not specified.

    The 'package@version' is an array of strings, but only the first two elements are currently used.

    The first element must be in the form package@version, where package is the package name and version is the version number (much like installing a specific version).

    The second element is the name of the tag to tag this version with. If this parameter is missing or falsey (empty), the default from the config will be used. For more information about how to set this config, check man 3 npm-config for programmatic usage or man npm-config for cli usage.
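
    A minimal sketch of the array form described above (the package name, version, and tag are only examples):

    var npm = require("npm")

    npm.load(function (er) {
      if (er) return console.error(er)
      // Point the "beta" tag at version 1.0.1 of a hypothetical package you own.
      npm.commands.tag(["mypkg@1.0.1", "beta"], function (er) {
        if (er) console.error(er)
      })
    })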

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-test.html000644 000766 000024 00000005676 12455173731 025637 0ustar00iojsstaff000000 000000 npm-test

    npm-test

    Test a package

    SYNOPSIS

      npm.commands.test(packages, callback)
    

    DESCRIPTION

    This runs a package's "test" script, if one was provided.

    To run tests as a condition of installation, set the npat config to true.

    npm can run tests on multiple packages. Just specify multiple packages in the packages parameter.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-uninstall.html000644 000766 000024 00000006060 12455173731 026655 0ustar00iojsstaff000000 000000 npm-uninstall

    npm-uninstall

    uninstall a package programmatically

    SYNOPSIS

    npm.commands.uninstall(packages, callback)
    

    DESCRIPTION

    This acts much the same way as uninstalling on the command-line.

    The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be uninstalled.

    Finally, 'callback' is a function that will be called when all packages have been uninstalled or when an error has been encountered.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-unpublish.html000644 000766 000024 00000006227 12455173731 026662 0ustar00iojsstaff000000 000000 npm-unpublish

    npm-unpublish

    Remove a package from the registry

    SYNOPSIS

    npm.commands.unpublish(package, callback)
    

    DESCRIPTION

    This removes a package version from the registry, deleting its entry and removing the tarball.

    The package parameter must be defined.

    Only the first element in the package parameter is used. If there is no first element, then npm assumes that the package at the current working directory is what is meant.

    If no version is specified, or if all versions are removed, then the root package entry is removed from the registry entirely.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-update.html000644 000766 000024 00000005651 12455173731 026133 0ustar00iojsstaff000000 000000 npm-update

    npm-update

    Update a package

    SYNOPSIS

    npm.commands.update(packages, callback)
    

    DESCRIPTION

    Updates a package, upgrading it to the latest version. It also installs any missing packages.

    The 'packages' argument is an array of packages to update. The 'callback' parameter will be called when done or when an error occurs.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-version.html000644 000766 000024 00000006171 12455173731 026334 0ustar00iojsstaff000000 000000 npm-version

    npm-version

    Bump a package version

    SYNOPSIS

    npm.commands.version(newversion, callback)
    

    DESCRIPTION

    Run this in a package directory to bump the version and write the new data back to the package.json file.

    If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean.

    Like all other commands, this function takes a string array as its first parameter. The difference, however, is that this function will fail if it does not have exactly one element. The only element should be a version number.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-view.html000644 000766 000024 00000014517 12455173731 025624 0ustar00iojsstaff000000 000000 npm-view

    npm-view

    View registry info

    SYNOPSIS

    npm.commands.view(args, [silent,] callback)
    

    DESCRIPTION

    This command shows data about a package and prints it to the stream referenced by the outfd config, which defaults to stdout.

    The "args" parameter is an ordered list that closely resembles the command-line usage. The elements should be ordered such that the first element is the package and version (package@version). The version is optional. After that, the rest of the parameters are fields with optional subfields ("field.subfield") which can be used to get only the information desired from the registry.

    The callback will be passed all of the data returned by the query.

    For example, to get the package registry entry for the connect package, you can do this:

    npm.commands.view(["connect"], callback)
    

    If no version is specified, "latest" is assumed.

    Field names can be specified after the package descriptor. For example, to show the dependencies of the ronn package at version 0.3.5, you could do the following:

    npm.commands.view(["ronn@0.3.5", "dependencies"], callback)
    

    You can view child field by separating them with a period. To view the git repository URL for the latest version of npm, you could do this:

    npm.commands.view(["npm", "repository.url"], callback)
    

    For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor names for the "express" project, you can do this:

    npm.commands.view(["express", "contributors.email"], callback)
    

    You may also use numeric indices in square braces to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this:

    npm.commands.view(["express", "contributors[0].email"], callback)
    

    Multiple fields may be specified, and will be printed one after another. For example, to get all the contributor names and email addresses, you can do this:

    npm.commands.view(["express", "contributors.name", "contributors.email"], callback)
    

    "Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See npm help json for more on this.)

    npm.commands.view(["npm", "contributors"], callback)
    

    If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3:

    npm.commands.view(["yui3@'>0.5.4'", "dependencies.jsdom"], callback)
    

    OUTPUT

    If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command.

    If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to.

    If multiple fields are requested, then each of them is prefixed with the field name.

    Console output can be disabled by setting the 'silent' parameter to true.

    RETURN VALUE

    The data returned will be an object in this format:

    { <version>:
      { <field>: <value>
      , ... }
    , ... }
    

    corresponding to the list of fields selected.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm-whoami.html000644 000766 000024 00000005636 12455173731 026140 0ustar00iojsstaff000000 000000 npm-whoami

    npm-whoami

    Display npm username

    SYNOPSIS

    npm.commands.whoami(args, callback)
    

    DESCRIPTION

    Print the username config to standard output.

    'args' is never used and callback is never called with data. 'args' must be present or things will break.

    This function is not useful programmatically.

    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/html/doc/api/npm.html000644 000766 000024 00000015531 12455173731 024651 0ustar00iojsstaff000000 000000 npm

    npm

    node package manager

    SYNOPSIS

    var npm = require("npm")
    npm.load([configObject, ]function (er, npm) {
      // use the npm object, now that it's loaded.
    
      npm.config.set(key, val)
      val = npm.config.get(key)
    
      console.log("prefix = %s", npm.prefix)
    
      npm.commands.install(["package"], cb)
    })
    

    VERSION

    2.1.18

    DESCRIPTION

    This is the API documentation for npm. To find documentation of the command line client, see npm(1).

    Prior to using npm's commands, npm.load() must be called. If you provide configObject as an object map of top-level configs, they override the values stored in the various config locations. In the npm command line client, this set of configs is parsed from the command line options. Additional configuration params are loaded from two configuration files. See npm-config(1), npm-config(7), and npmrc(5) for more information.

    After that, each of the functions are accessible in the commands object: npm.commands.<cmd>. See npm-index(7) for a list of all possible commands.

    All commands on the command object take an array of positional argument strings. The last argument to any function is a callback. Some commands take other optional arguments.

    Configs cannot currently be set on a per function basis, as each call to npm.config.set will change the value for all npm commands in that process.

    To find API documentation for a specific command, run the npm apihelp command.

    METHODS AND PROPERTIES

    • npm.load(configs, cb)

      Load the configuration params, and call the cb function once the globalconfig and userconfig files have been loaded as well, or on nextTick if they've already been loaded.

    • npm.config

      An object for accessing npm configuration parameters.

      • npm.config.get(key)
      • npm.config.set(key, val)
      • npm.config.del(key)
    • npm.dir or npm.root

      The node_modules directory where npm will operate.

    • npm.prefix

      The prefix where npm is operating. (Most often the current working directory.)

    • npm.cache

      The place where npm keeps JSON and tarballs it fetches from the registry (or uploads to the registry).

    • npm.tmp

      npm's temporary working directory.

    • npm.deref

      Get the "real" name for a command that has either an alias or abbreviation.

    MAGIC

    For each of the methods in the npm.commands object, a method is added to the npm object, which takes its positional string arguments directly rather than as an array, followed by a callback.

    If the last argument is a callback, then it will use the supplied callback. However, if no callback is provided, then it will print out the error or results.

    For example, this would work in a node repl:

    > npm = require("npm")
    > npm.load()  // wait a sec...
    > npm.install("dnode", "express")
    

    Note that this won't work in a node program, since the install method will be called before the configuration load has completed.
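
    In a node program, a sketch of the working pattern is to wait for the load to finish and use the array-taking form on npm.commands (the package names are the same illustrative ones as above):

    var npm = require("npm")
    npm.load(function (er) {
      if (er) throw er
      npm.commands.install(["dnode", "express"], function (er, data) {
        if (er) throw er
        // installation finished; any results are in data
      })
    })
    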

    ABBREVS

    In order to support npm ins foo instead of npm install foo, the npm.commands object has a set of abbreviations as well as the full method names. Use the npm.deref method to find the real name.

    For example:

    var cmd = npm.deref("unp") // cmd === "unpublish"
    
    iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/000755 000766 000024 00000000000 12456115117 022213 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/000755 000766 000024 00000000000 12456115117 022211 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/files/000755 000766 000024 00000000000 12456115117 022544 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/000755 000766 000024 00000000000 12456115117 022375 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-coding-style.md000644 000766 000024 00000012101 12455173731 026110 0ustar00iojsstaff000000 000000 npm-coding-style(7) -- npm's "funny" coding style ================================================= ## DESCRIPTION npm's coding style is a bit unconventional. It is not different for difference's sake, but rather a carefully crafted style that is designed to reduce visual clutter and make bugs more apparent. If you want to contribute to npm (which is very encouraged), you should make your code conform to npm's style. Note: this concerns npm's code not the specific packages that you can download from the npm registry. ## Line Length Keep lines shorter than 80 characters. It's better for lines to be too short than to be too long. Break up long lists, objects, and other statements onto multiple lines. ## Indentation Two-spaces. Tabs are better, but they look like hell in web browsers (and on GitHub), and node uses 2 spaces, so that's that. Configure your editor appropriately. ## Curly braces Curly braces belong on the same line as the thing that necessitates them. Bad: function () { Good: function () { If a block needs to wrap to the next line, use a curly brace. Don't use it if it doesn't. Bad: if (foo) { bar() } while (foo) bar() Good: if (foo) bar() while (foo) { bar() } ## Semicolons Don't use them except in four situations: * `for (;;)` loops. They're actually required. * null loops like: `while (something) ;` (But you'd better have a good reason for doing that.) * `case "foo": doSomething(); break` * In front of a leading `(` or `[` at the start of the line. This prevents the expression from being interpreted as a function call or property access, respectively. Some examples of good semicolon usage: ;(x || y).doSomething() ;[a, b, c].forEach(doSomething) for (var i = 0; i < 10; i ++) { switch (state) { case "begin": start(); continue case "end": finish(); break default: throw new Error("unknown state") } end() } Note that starting lines with `-` and `+` also should be prefixed with a semicolon, but this is much less common. ## Comma First If there is a list of things separated by commas, and it wraps across multiple lines, put the comma at the start of the next line, directly below the token that starts the list. Put the final token in the list on a line by itself. For example: var magicWords = [ "abracadabra" , "gesundheit" , "ventrilo" ] , spells = { "fireball" : function () { setOnFire() } , "water" : function () { putOut() } } , a = 1 , b = "abc" , etc , somethingElse ## Whitespace Put a single space in front of ( for anything other than a function call. Also use a single space wherever it makes things more readable. Don't leave trailing whitespace at the end of lines. Don't indent empty lines. Don't use more spaces than are helpful. ## Functions Use named functions. They make stack traces a lot easier to read. 
## Callbacks, Sync/async Style Use the asynchronous/non-blocking versions of things as much as possible. It might make more sense for npm to use the synchronous fs APIs, but this way, the fs and http and child process stuff all uses the same callback-passing methodology. The callback should always be the last argument in the list. Its first argument is the Error or null. Be very careful never to ever ever throw anything. It's worse than useless. Just send the error message back as the first argument to the callback. ## Errors Always create a new Error object with your message. Don't just return a string message to the callback. Stack traces are handy. ## Logging Logging is done using the [npmlog](https://github.com/npm/npmlog) utility. Please clean up logs when they are no longer helpful. In particular, logging the same object over and over again is not helpful. Logs should report what's happening so that it's easier to track down where a fault occurs. Use appropriate log levels. See `npm-config(7)` and search for "loglevel". ## Case, naming, etc. Use `lowerCamelCase` for multiword identifiers when they refer to objects, functions, methods, properties, or anything not specified in this section. Use `UpperCamelCase` for class names (things that you'd pass to "new"). Use `all-lower-hyphen-css-case` for multiword filenames and config keys. Use named functions. They make stack traces easier to follow. Use `CAPS_SNAKE_CASE` for constants, things that should never change and are rarely used. Use a single uppercase letter for function names where the function would normally be anonymous, but needs to call itself recursively. It makes it clear that it's a "throwaway" function. ## null, undefined, false, 0 Boolean variables and functions should always be either `true` or `false`. Don't set it to 0 unless it's supposed to be a number. When something is intentionally missing or removed, set it to `null`. Don't set things to `undefined`. Reserve that value to mean "not yet set to anything." Boolean objects are verboten. ## SEE ALSO * npm-developers(7) * npm-faq(7) * npm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-config.md000644 000766 000024 00000047252 12455173731 024773 0ustar00iojsstaff000000 000000 npm-config(7) -- More than you probably want to know about npm configuration ============================================================================ ## DESCRIPTION npm gets its configuration values from 6 sources, in this priority: ### Command Line Flags Putting `--foo bar` on the command line sets the `foo` configuration parameter to `"bar"`. A `--` argument tells the cli parser to stop reading flags. A `--flag` parameter that is at the *end* of the command will be given the value of `true`. ### Environment Variables Any environment variables that start with `npm_config_` will be interpreted as a configuration parameter. For example, putting `npm_config_foo=bar` in your environment will set the `foo` configuration parameter to `bar`. Any environment configurations that are not given a value will be given the value of `true`. Config values are case-insensitive, so `NPM_CONFIG_FOO=bar` will work the same. ### npmrc Files The four relevant files are: * per-project config file (/path/to/my/project/.npmrc) * per-user config file (~/.npmrc) * global config file ($PREFIX/npmrc) * npm builtin config file (/path/to/npm/npmrc) See npmrc(5) for more details. ### Default Configs A set of configuration parameters that are internal to npm, and are defaults if nothing else is specified. 
## Shorthands and Other CLI Niceties The following shorthands are parsed on the command-line: * `-v`: `--version` * `-h`, `-?`, `--help`, `-H`: `--usage` * `-s`, `--silent`: `--loglevel silent` * `-q`, `--quiet`: `--loglevel warn` * `-d`: `--loglevel info` * `-dd`, `--verbose`: `--loglevel verbose` * `-ddd`: `--loglevel silly` * `-g`: `--global` * `-C`: `--prefix` * `-l`: `--long` * `-m`: `--message` * `-p`, `--porcelain`: `--parseable` * `-reg`: `--registry` * `-v`: `--version` * `-f`: `--force` * `-desc`: `--description` * `-S`: `--save` * `-D`: `--save-dev` * `-O`: `--save-optional` * `-B`: `--save-bundle` * `-E`: `--save-exact` * `-y`: `--yes` * `-n`: `--yes false` * `ll` and `la` commands: `ls --long` If the specified configuration param resolves unambiguously to a known configuration parameter, then it is expanded to that configuration parameter. For example: npm ls --par # same as: npm ls --parseable If multiple single-character shorthands are strung together, and the resulting combination is unambiguously not some other configuration param, then it is expanded to its various component pieces. For example: npm ls -gpld # same as: npm ls --global --parseable --long --loglevel info ## Per-Package Config Settings When running scripts (see `npm-scripts(7)`) the package.json "config" keys are overwritten in the environment if there is a config param of `[@]:`. For example, if the package.json has this: { "name" : "foo" , "config" : { "port" : "8080" } , "scripts" : { "start" : "node server.js" } } and the server.js is this: http.createServer(...).listen(process.env.npm_package_config_port) then the user could change the behavior by doing: npm config set foo:port 80 See package.json(5) for more information. ## Config Settings ### always-auth * Default: false * Type: Boolean Force npm to always require authentication when accessing the registry, even for `GET` requests. ### bin-links * Default: `true` * Type: Boolean Tells npm to create symlinks (or `.cmd` shims on Windows) for package executables. Set to false to have it not do this. This can be used to work around the fact that some file systems don't support symlinks, even on ostensibly Unix systems. ### browser * Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"` * Type: String The browser that is called by the `npm docs` command to open websites. ### ca * Default: The npm CA certificate * Type: String, Array or null The Certificate Authority signing certificate that is trusted for SSL connections to the registry. Values should be in PEM format with newlines replaced by the string "\n". For example: ca="-----BEGIN CERTIFICATE-----\nXXXX\nXXXX\n-----END CERTIFICATE-----" Set to `null` to only allow "known" registrars, or to a specific CA cert to trust only that specific signing authority. Multiple CAs can be trusted by specifying an array of certificates: ca[]="..." ca[]="..." See also the `strict-ssl` config. ### cafile * Default: `null` * Type: path A path to a file containing one or multiple Certificate Authority signing certificates. Similar to the `ca` setting, but allows for multiple CA's, as well as for the CA information to be stored in a file on disk. ### cache * Default: Windows: `%AppData%\npm-cache`, Posix: `~/.npm` * Type: path The location of npm's cache directory. See `npm-cache(1)` ### cache-lock-stale * Default: 60000 (1 minute) * Type: Number The number of ms before cache folder lockfiles are considered stale. 
### cache-lock-retries * Default: 10 * Type: Number Number of times to retry to acquire a lock on cache folder lockfiles. ### cache-lock-wait * Default: 10000 (10 seconds) * Type: Number Number of ms to wait for cache lock files to expire. ### cache-max * Default: Infinity * Type: Number The maximum time (in seconds) to keep items in the registry cache before re-checking against the registry. Note that no purging is done unless the `npm cache clean` command is explicitly used, and that only GET requests use the cache. ### cache-min * Default: 10 * Type: Number The minimum time (in seconds) to keep items in the registry cache before re-checking against the registry. Note that no purging is done unless the `npm cache clean` command is explicitly used, and that only GET requests use the cache. ### cert * Default: `null` * Type: String A client certificate to pass when accessing the registry. ### color * Default: true on Posix, false on Windows * Type: Boolean or `"always"` If false, never shows colors. If `"always"` then always shows colors. If true, then only prints color codes for tty file descriptors. ### depth * Default: Infinity * Type: Number The depth to go when recursing directories for `npm ls` and `npm cache ls`. ### description * Default: true * Type: Boolean Show the description in `npm search` ### dev * Default: false * Type: Boolean Install `dev-dependencies` along with packages. Note that `dev-dependencies` are also installed if the `npat` flag is set. ### editor * Default: `EDITOR` environment variable if set, or `"vi"` on Posix, or `"notepad"` on Windows. * Type: path The command to run for `npm edit` or `npm config edit`. ### engine-strict * Default: false * Type: Boolean If set to true, then npm will stubbornly refuse to install (or even consider installing) any package that claims to not be compatible with the current Node.js version. ### force * Default: false * Type: Boolean Makes various commands more forceful. * lifecycle script failure does not block progress. * publishing clobbers previously published versions. * skips cache when requesting from the registry. * prevents checks against clobbering non-npm files. ### fetch-retries * Default: 2 * Type: Number The "retries" config for the `retry` module to use when fetching packages from the registry. ### fetch-retry-factor * Default: 10 * Type: Number The "factor" config for the `retry` module to use when fetching packages. ### fetch-retry-mintimeout * Default: 10000 (10 seconds) * Type: Number The "minTimeout" config for the `retry` module to use when fetching packages. ### fetch-retry-maxtimeout * Default: 60000 (1 minute) * Type: Number The "maxTimeout" config for the `retry` module to use when fetching packages. ### git * Default: `"git"` * Type: String The command to use for git commands. If git is installed on the computer, but is not in the `PATH`, then set this to the full path to the git binary. ### git-tag-version * Default: `true` * Type: Boolean Tag the commit when using the `npm version` command. ### global * Default: false * Type: Boolean Operates in "global" mode, so that packages are installed into the `prefix` folder instead of the current working directory. See `npm-folders(5)` for more on the differences in behavior. * packages are installed into the `{prefix}/lib/node_modules` folder, instead of the current working directory. 
* bin files are linked to `{prefix}/bin` * man pages are linked to `{prefix}/share/man` ### globalconfig * Default: {prefix}/etc/npmrc * Type: path The config file to read for global config options. ### group * Default: GID of the current process * Type: String or Number The group to use when running package scripts in global mode as the root user. ### heading * Default: `"npm"` * Type: String The string that starts all the debugging log output. ### https-proxy * Default: null * Type: url A proxy to use for outgoing https requests. If the `HTTPS_PROXY` or `https_proxy` or `HTTP_PROXY` or `http_proxy` environment variables are set, proxy settings will be honored by the underlying `request` library. ### ignore-scripts * Default: false * Type: Boolean If true, npm does not run scripts specified in package.json files. ### init-module * Default: ~/.npm-init.js * Type: path A module that will be loaded by the `npm init` command. See the documentation for the [init-package-json](https://github.com/isaacs/init-package-json) module for more information, or npm-init(1). ### init-author-name * Default: "" * Type: String The value `npm init` should use by default for the package author's name. ### init-author-email * Default: "" * Type: String The value `npm init` should use by default for the package author's email. ### init-author-url * Default: "" * Type: String The value `npm init` should use by default for the package author's homepage. ### init-license * Default: "ISC" * Type: String The value `npm init` should use by default for the package license. ### init-version * Default: "0.0.0" * Type: semver The value that `npm init` should use by default for the package version number, if not already set in package.json. ### json * Default: false * Type: Boolean Whether or not to output JSON data, rather than the normal output. This feature is currently experimental, and the output data structures for many commands is either not implemented in JSON yet, or subject to change. Only the output from `npm ls --json` is currently valid. ### key * Default: `null` * Type: String A client key to pass when accessing the registry. ### link * Default: false * Type: Boolean If true, then local installs will link if there is a suitable globally installed package. Note that this means that local installs can cause things to be installed into the global space at the same time. The link is only done if one of the two conditions are met: * The package is not already installed globally, or * the globally installed version is identical to the version that is being installed locally. ### local-address * Default: undefined * Type: IP Address The IP address of the local interface to use when making connections to the npm registry. Must be IPv4 in versions of Node prior to 0.12. ### loglevel * Default: "warn" * Type: String * Values: "silent", "error", "warn", "http", "info", "verbose", "silly" What level of logs to report. On failure, *all* logs are written to `npm-debug.log` in the current working directory. Any logs of a higher level than the setting are shown. The default is "warn", which shows warn and error output. ### logstream * Default: process.stderr * Type: Stream This is the stream that is passed to the [npmlog](https://github.com/npm/npmlog) module at run time. It cannot be set from the command line, but if you are using npm programmatically, you may wish to send logs to somewhere other than stderr. If the `color` config is set to true, then this stream will receive colored output if it is a TTY. 
### long * Default: false * Type: Boolean Show extended information in `npm ls` and `npm search`. ### message * Default: "%s" * Type: String Commit message which is used by `npm version` when creating version commit. Any "%s" in the message will be replaced with the version number. ### node-version * Default: process.version * Type: semver or false The node version to use when checking a package's `engines` map. ### npat * Default: false * Type: Boolean Run tests on installation. ### onload-script * Default: false * Type: path A node module to `require()` when npm loads. Useful for programmatic usage. ### optional * Default: true * Type: Boolean Attempt to install packages in the `optionalDependencies` object. Note that if these packages fail to install, the overall installation process is not aborted. ### parseable * Default: false * Type: Boolean Output parseable results from commands that write to standard output. ### prefix * Default: see npm-folders(5) * Type: path The location to install global items. If set on the command line, then it forces non-global commands to run in the specified folder. ### production * Default: false * Type: Boolean Set to true to run in "production" mode. 1. devDependencies are not installed at the topmost level when running local `npm install` without any arguments. 2. Set the NODE_ENV="production" for lifecycle scripts. ### proprietary-attribs * Default: true * Type: Boolean Whether or not to include proprietary extended attributes in the tarballs created by npm. Unless you are expecting to unpack package tarballs with something other than npm -- particularly a very outdated tar implementation -- leave this as true. ### proxy * Default: null * Type: url A proxy to use for outgoing http requests. If the `HTTP_PROXY` or `http_proxy` environment variables are set, proxy settings will be honored by the underlying `request` library. ### rebuild-bundle * Default: true * Type: Boolean Rebuild bundled dependencies after installation. ### registry * Default: https://registry.npmjs.org/ * Type: url The base URL of the npm package registry. ### rollback * Default: true * Type: Boolean Remove failed installs. ### save * Default: false * Type: Boolean Save installed packages to a package.json file as dependencies. When used with the `npm rm` command, it removes it from the `dependencies` object. Only works if there is already a package.json file present. ### save-bundle * Default: false * Type: Boolean If a package would be saved at install time by the use of `--save`, `--save-dev`, or `--save-optional`, then also put it in the `bundleDependencies` list. When used with the `npm rm` command, it removes it from the bundledDependencies list. ### save-dev * Default: false * Type: Boolean Save installed packages to a package.json file as `devDependencies`. When used with the `npm rm` command, it removes it from the `devDependencies` object. Only works if there is already a package.json file present. ### save-exact * Default: false * Type: Boolean Dependencies saved to package.json using `--save`, `--save-dev` or `--save-optional` will be configured with an exact version rather than using npm's default semver range operator. ### save-optional * Default: false * Type: Boolean Save installed packages to a package.json file as optionalDependencies. When used with the `npm rm` command, it removes it from the `devDependencies` object. Only works if there is already a package.json file present. 
### save-prefix * Default: '^' * Type: String Configure how versions of packages installed to a package.json file via `--save` or `--save-dev` get prefixed. For example if a package has version `1.2.3`, by default it's version is set to `^1.2.3` which allows minor upgrades for that package, but after `npm config set save-prefix='~'` it would be set to `~1.2.3` which only allows patch upgrades. ### scope * Default: "" * Type: String Associate an operation with a scope for a scoped registry. Useful when logging in to a private registry for the first time: `npm login --scope=@organization --registry=registry.organization.com`, which will cause `@organization` to be mapped to the registry for future installation of packages specified according to the pattern `@organization/package`. ### searchopts * Default: "" * Type: String Space-separated options that are always passed to search. ### searchexclude * Default: "" * Type: String Space-separated options that limit the results from search. ### searchsort * Default: "name" * Type: String * Values: "name", "-name", "date", "-date", "description", "-description", "keywords", "-keywords" Indication of which field to sort search results by. Prefix with a `-` character to indicate reverse sort. ### shell * Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows * Type: path The shell to run for the `npm explore` command. ### shrinkwrap * Default: true * Type: Boolean If set to false, then ignore `npm-shrinkwrap.json` files when installing. ### sign-git-tag * Default: false * Type: Boolean If set to true, then the `npm version` command will tag the version using `-s` to add a signature. Note that git requires you to have set up GPG keys in your git configs for this to work properly. ### spin * Default: true * Type: Boolean or `"always"` When set to `true`, npm will display an ascii spinner while it is doing things, if `process.stderr` is a TTY. Set to `false` to suppress the spinner, or set to `always` to output the spinner even for non-TTY outputs. ### strict-ssl * Default: true * Type: Boolean Whether or not to do SSL key validation when making requests to the registry via https. See also the `ca` config. ### tag * Default: latest * Type: String If you ask npm to install a package and don't tell it a specific version, then it will install the specified tag. Also the tag that is added to the package@version specified by the `npm tag` command, if no explicit tag is given. ### tmp * Default: TMPDIR environment variable, or "/tmp" * Type: path Where to store temporary files and folders. All temp files are deleted on success, but left behind on failure for forensic purposes. ### unicode * Default: true * Type: Boolean When set to true, npm uses unicode characters in the tree output. When false, it uses ascii characters to draw trees. ### unsafe-perm * Default: false if running as root, true otherwise * Type: Boolean Set to true to suppress the UID/GID switching when running package scripts. If set explicitly to false, then installing as a non-root user will fail. ### usage * Default: false * Type: Boolean Set to show short usage output (like the -H output) instead of complete help when doing `npm-help(1)`. ### user * Default: "nobody" * Type: String or Number The UID to set to when running package scripts as root. ### userconfig * Default: ~/.npmrc * Type: path The location of user-level configuration settings. 
### umask * Default: 022 * Type: Octal numeric string The "umask" value to use when setting the file creation mode on files and folders. Folders and executables are given a mode which is `0777` masked against this value. Other files are given a mode which is `0666` masked against this value. Thus, the defaults are `0755` and `0644` respectively. ### user-agent * Default: node/{process.version} {process.platform} {process.arch} * Type: String Sets a User-Agent to the request header ### version * Default: false * Type: boolean If true, output the npm version and exit successfully. Only relevant when specified explicitly on the command line. ### versions * Default: false * Type: boolean If true, output the npm version as well as node's `process.versions` map, and exit successfully. Only relevant when specified explicitly on the command line. ### viewer * Default: "man" on Posix, "browser" on Windows * Type: path The program to use to view help content. Set to `"browser"` to view html help content in the default web browser. ## SEE ALSO * npm-config(1) * npm-config(7) * npmrc(5) * npm-scripts(7) * npm-folders(5) * npm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-developers.md000644 000766 000024 00000014722 12455173731 025672 0ustar00iojsstaff000000 000000 npm-developers(7) -- Developer Guide ==================================== ## DESCRIPTION So, you've decided to use npm to develop (and maybe publish/deploy) your project. Fantastic! There are a few things that you need to do above the simple steps that your users will do to install your program. ## About These Documents These are man pages. If you install npm, you should be able to then do `man npm-thing` to get the documentation on a particular topic, or `npm help thing` to see the same information. ## What is a `package` A package is: * a) a folder containing a program described by a package.json file * b) a gzipped tarball containing (a) * c) a url that resolves to (b) * d) a `@` that is published on the registry with (c) * e) a `@` that points to (d) * f) a `` that has a "latest" tag satisfying (e) * g) a `git` url that, when cloned, results in (a). Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b). Git urls can be of the form: git://github.com/user/project.git#commit-ish git+ssh://user@hostname:project.git#commit-ish git+http://user@hostname/project/blah.git#commit-ish git+https://user@hostname/project/blah.git#commit-ish The `commit-ish` can be any tag, sha, or branch which can be supplied as an argument to `git checkout`. The default is `master`. ## The package.json File You need to have a `package.json` file in the root of your project to do much of anything with npm. That is basically the whole interface. See `package.json(5)` for details about what goes in that file. At the very least, you need: * name: This should be a string that identifies your project. Please do not use the name to specify that it runs on node, or is in JavaScript. You can use the "engines" field to explicitly state the versions of node (or whatever else) that your program requires, and it's pretty well assumed that it's javascript. It does not necessarily need to match your github repository name. So, `node-foo` and `bar-js` are bad names. `foo` or `bar` are better. * version: A semver-compatible version. 
* engines: Specify the versions of node (or whatever else) that your program runs on. The node API changes a lot, and there may be bugs or new functionality that you depend on. Be explicit. * author: Take some credit. * scripts: If you have a special compilation or installation script, then you should put it in the `scripts` object. You should definitely have at least a basic smoke-test command as the "scripts.test" field. See npm-scripts(7). * main: If you have a single module that serves as the entry point to your program (like what the "foo" package gives you at require("foo")), then you need to specify that in the "main" field. * directories: This is an object mapping names to folders. The best ones to include are "lib" and "doc", but if you use "man" to specify a folder full of man pages, they'll get installed just like these ones. You can use `npm init` in the root of your package in order to get you started with a pretty basic package.json file. See `npm-init(1)` for more info. ## Keeping files *out* of your package Use a `.npmignore` file to keep stuff out of your package. If there's no `.npmignore` file, but there *is* a `.gitignore` file, then npm will ignore the stuff matched by the `.gitignore` file. If you *want* to include something that is excluded by your `.gitignore` file, you can create an empty `.npmignore` file to override it. `.npmignore` files follow the [same pattern rules](http://git-scm.com/book/en/v2/Git-Basics-Recording-Changes-to-the-Repository#Ignoring-Files) as `.gitignore` files: * Blank lines or lines starting with `#` are ignored. * Standard glob patterns work. * You can end patterns with a forward slash `/` to specify a directory. * You can negate a pattern by starting it with an exclamation point `!`. By default, the following paths and files are ignored, so there's no need to add them to `.npmignore` explicitly: * `.*.swp` * `._*` * `.DS_Store` * `.git` * `.hg` * `.lock-wscript` * `.svn` * `.wafpickle-*` * `CVS` * `npm-debug.log` Additionally, everything in `node_modules` is ignored, except for bundled dependencies. npm automatically handles this for you, so don't bother adding `node_modules` to `.npmignore`. The following paths and files are never ignored, so adding them to `.npmignore` is pointless: * `package.json` * `README.*` ## Link Packages `npm link` is designed to install a development package and see the changes in real time without having to keep re-installing it. (You do need to either re-link or `npm rebuild -g` to update compiled packages, of course.) More info at `npm-link(1)`. ## Before Publishing: Make Sure Your Package Installs and Works **This is important.** If you can not install it locally, you'll have problems trying to publish it. Or, worse yet, you'll be able to publish it, but you'll be publishing a broken or pointless package. So don't do that. In the root of your package, do this: npm install . -g That'll show you that it's working. If you'd rather just create a symlink package that points to your working directory, then do this: npm link Use `npm ls -g` to see if it's there. To test a local install, go into some other folder, and then do: cd ../some-other-folder npm install ../my-package to install it locally into the node_modules folder in that other place. Then go into the node-repl, and try using require("my-thing") to bring in your module's main module. ## Create a User Account Create a user with the adduser command. It works like this: npm adduser and then follow the prompts. This is documented better in npm-adduser(1). 
## Publish your package This part's easy. IN the root of your folder, do this: npm publish You can give publish a url to a tarball, or a filename of a tarball, or a path to a folder. Note that pretty much **everything in that folder will be exposed** by default. So, if you have secret stuff in there, use a `.npmignore` file to list out the globs to ignore, or publish from a fresh checkout. ## Brag about it Send emails, write blogs, blab in IRC. Tell the world how easy it is to install your program! ## SEE ALSO * npm-faq(7) * npm(1) * npm-init(1) * package.json(5) * npm-scripts(7) * npm-publish(1) * npm-adduser(1) * npm-registry(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-disputes.md000644 000766 000024 00000011002 12455173731 025346 0ustar00iojsstaff000000 000000 npm-disputes(7) -- Handling Module Name Disputes ================================================ ## SYNOPSIS 1. Get the author email with `npm owner ls ` 2. Email the author, CC 3. After a few weeks, if there's no resolution, we'll sort it out. Don't squat on package names. Publish code or move out of the way. ## DESCRIPTION There sometimes arise cases where a user publishes a module, and then later, some other user wants to use that name. Here are some common ways that happens (each of these is based on actual events.) 1. Joe writes a JavaScript module `foo`, which is not node-specific. Joe doesn't use node at all. Bob wants to use `foo` in node, so he wraps it in an npm module. Some time later, Joe starts using node, and wants to take over management of his program. 2. Bob writes an npm module `foo`, and publishes it. Perhaps much later, Joe finds a bug in `foo`, and fixes it. He sends a pull request to Bob, but Bob doesn't have the time to deal with it, because he has a new job and a new baby and is focused on his new erlang project, and kind of not involved with node any more. Joe would like to publish a new `foo`, but can't, because the name is taken. 3. Bob writes a 10-line flow-control library, and calls it `foo`, and publishes it to the npm registry. Being a simple little thing, it never really has to be updated. Joe works for Foo Inc, the makers of the critically acclaimed and widely-marketed `foo` JavaScript toolkit framework. They publish it to npm as `foojs`, but people are routinely confused when `npm install foo` is some different thing. 4. Bob writes a parser for the widely-known `foo` file format, because he needs it for work. Then, he gets a new job, and never updates the prototype. Later on, Joe writes a much more complete `foo` parser, but can't publish, because Bob's `foo` is in the way. The validity of Joe's claim in each situation can be debated. However, Joe's appropriate course of action in each case is the same. 1. `npm owner ls foo`. This will tell Joe the email address of the owner (Bob). 2. Joe emails Bob, explaining the situation **as respectfully as possible**, and what he would like to do with the module name. He adds the npm support staff to the CC list of the email. Mention in the email that Bob can run `npm owner add joe foo` to add Joe as an owner of the `foo` package. 3. After a reasonable amount of time, if Bob has not responded, or if Bob and Joe can't come to any sort of resolution, email support and we'll sort it out. ("Reasonable" is usually at least 4 weeks, but extra time is allowed around common holidays.) ## REASONING In almost every case so far, the parties involved have been able to reach an amicable resolution without any major intervention. 
Most people really do want to be reasonable, and are probably not even aware that they're in your way. Module ecosystems are most vibrant and powerful when they are as self-directed as possible. If an admin one day deletes something you had worked on, then that is going to make most people quite upset, regardless of the justification. When humans solve their problems by talking to other humans with respect, everyone has the chance to end up feeling good about the interaction. ## EXCEPTIONS Some things are not allowed, and will be removed without discussion if they are brought to the attention of the npm registry admins, including but not limited to: 1. Malware (that is, a package designed to exploit or harm the machine on which it is installed). 2. Violations of copyright or licenses (for example, cloning an MIT-licensed program, and then removing or changing the copyright and license statement). 3. Illegal content. 4. "Squatting" on a package name that you *plan* to use, but aren't actually using. Sorry, I don't care how great the name is, or how perfect a fit it is for the thing that someday might happen. If someone wants to use it today, and you're just taking up space with an empty tarball, you're going to be evicted. 5. Putting empty packages in the registry. Packages must have SOME functionality. It can be silly, but it can't be *nothing*. (See also: squatting.) 6. Doing weird things with the registry, like using it as your own personal application database or otherwise putting non-packagey things into it. If you see bad behavior like this, please report it right away. ## SEE ALSO * npm-registry(7) * npm-owner(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-faq.md000644 000766 000024 00000034266 12455173731 024276 0ustar00iojsstaff000000 000000 npm-faq(7) -- Frequently Asked Questions ======================================== ## Where can I find these docs in HTML? , or run: npm config set viewer browser to open these documents in your default web browser rather than `man`. ## It didn't work. That's not really a question. ## Why didn't it work? I don't know yet. Read the error output, and if you can't figure out what it means, do what it says and post a bug with all the information it asks for. ## Where does npm put stuff? See `npm-folders(5)` tl;dr: * Use the `npm root` command to see where modules go, and the `npm bin` command to see where executables go * Global installs are different from local installs. If you install something with the `-g` flag, then its executables go in `npm bin -g` and its modules go in `npm root -g`. ## How do I install something on my computer in a central location? Install it globally by tacking `-g` or `--global` to the command. (This is especially important for command line utilities that need to add their bins to the global system `PATH`.) ## I installed something globally, but I can't `require()` it Install it locally. The global install location is a place for command-line utilities to put their bins in the system `PATH`. It's not for use with `require()`. If you `require()` a module in your code, then that means it's a dependency, and a part of your program. You need to install it locally in your program. ## Why can't npm just put everything in one place, like other package managers? Not every change is an improvement, but every improvement is a change. This would be like asking git to do network IO for every commit. It's not going to happen, because it's a terrible idea that causes more problems than it solves. 
It is much harder to avoid dependency conflicts without nesting dependencies. This is fundamental to the way that npm works, and has proven to be an extremely successful approach. See `npm-folders(5)` for more details. If you want a package to be installed in one place, and have all your programs reference the same copy of it, then use the `npm link` command. That's what it's for. Install it globally, then link it into each program that uses it. ## Whatever, I really want the old style 'everything global' style. Write your own package manager. You could probably even wrap up `npm` in a shell script if you really wanted to. npm will not help you do something that is known to be a bad idea. ## Should I check my `node_modules` folder into git? Usually, no. Allow npm to resolve dependencies for your packages. For packages you **deploy**, such as websites and apps, you should use npm shrinkwrap to lock down your full dependency tree: If you are paranoid about depending on the npm ecosystem, you should run a private npm mirror or a private cache. If you want 100% confidence in being able to reproduce the specific bytes included in a deployment, you should use an additional mechanism that can verify contents rather than versions. For example, Amazon machine images, DigitalOcean snapshots, Heroku slugs, or simple tarballs. ## Is it 'npm' or 'NPM' or 'Npm'? npm should never be capitalized unless it is being displayed in a location that is customarily all-caps (such as the title of man pages.) ## If 'npm' is an acronym, why is it never capitalized? Contrary to the belief of many, "npm" is not in fact an abbreviation for "Node Package Manager". It is a recursive bacronymic abbreviation for "npm is not an acronym". (If it was "ninaa", then it would be an acronym, and thus incorrectly named.) "NPM", however, *is* an acronym (more precisely, a capitonym) for the National Association of Pastoral Musicians. You can learn more about them at . In software, "NPM" is a Non-Parametric Mapping utility written by Chris Rorden. You can analyze pictures of brains with it. Learn more about the (capitalized) NPM program at . The first seed that eventually grew into this flower was a bash utility named "pm", which was a shortened descendent of "pkgmakeinst", a bash function that was used to install various different things on different platforms, most often using Yahoo's `yinst`. If `npm` was ever an acronym for anything, it was `node pm` or maybe `new pm`. So, in all seriousness, the "npm" project is named after its command-line utility, which was organically selected to be easily typed by a right-handed programmer using a US QWERTY keyboard layout, ending with the right-ring-finger in a postition to type the `-` key for flags and other command-line arguments. That command-line utility is always lower-case, though it starts most sentences it is a part of. ## How do I list installed packages? `npm ls` ## How do I search for packages? `npm search` Arguments are greps. `npm search jsdom` shows jsdom packages. ## How do I update npm? npm install npm -g You can also update all outdated local packages by doing `npm update` without any arguments, or global packages by doing `npm update -g`. Occasionally, the version of npm will progress such that the current version cannot be properly installed with the version that you have installed already. (Consider, if there is ever a bug in the `update` command.) In those cases, you can do this: curl https://www.npmjs.com/install.sh | sh ## What is a `package`? 
A package is: * a) a folder containing a program described by a package.json file * b) a gzipped tarball containing (a) * c) a url that resolves to (b) * d) a `@` that is published on the registry with (c) * e) a `@` that points to (d) * f) a `` that has a "latest" tag satisfying (e) * g) a `git` url that, when cloned, results in (a). Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b). Git urls can be of the form: git://github.com/user/project.git#commit-ish git+ssh://user@hostname:project.git#commit-ish git+http://user@hostname/project/blah.git#commit-ish git+https://user@hostname/project/blah.git#commit-ish The `commit-ish` can be any tag, sha, or branch which can be supplied as an argument to `git checkout`. The default is `master`. ## What is a `module`? A module is anything that can be loaded with `require()` in a Node.js program. The following things are all examples of things that can be loaded as modules: * A folder with a `package.json` file containing a `main` field. * A folder with an `index.js` file in it. * A JavaScript file. Most npm packages are modules, because they are libraries that you load with `require`. However, there's no requirement that an npm package be a module! Some only contain an executable command-line interface, and don't provide a `main` field for use in Node programs. Almost all npm packages (at least, those that are Node programs) *contain* many modules within them (because every file they load with `require()` is a module). In the context of a Node program, the `module` is also the thing that was loaded *from* a file. For example, in the following program: var req = require('request') we might say that "The variable `req` refers to the `request` module". ## So, why is it the "`node_modules`" folder, but "`package.json`" file? Why not `node_packages` or `module.json`? The `package.json` file defines the package. (See "What is a package?" above.) The `node_modules` folder is the place Node.js looks for modules. (See "What is a module?" above.) For example, if you create a file at `node_modules/foo.js` and then had a program that did `var f = require('foo.js')` then it would load the module. However, `foo.js` is not a "package" in this case, because it does not have a package.json. Alternatively, if you create a package which does not have an `index.js` or a `"main"` field in the `package.json` file, then it is not a module. Even if it's installed in `node_modules`, it can't be an argument to `require()`. ## `"node_modules"` is the name of my deity's arch-rival, and a Forbidden Word in my religion. Can I configure npm to use a different folder? No. This will never happen. This question comes up sometimes, because it seems silly from the outside that npm couldn't just be configured to put stuff somewhere else, and then npm could load them from there. It's an arbitrary spelling choice, right? What's the big deal? At the time of this writing, the string `'node_modules'` appears 151 times in 53 separate files in npm and node core (excluding tests and documentation). Some of these references are in node's built-in module loader. Since npm is not involved **at all** at run-time, node itself would have to be configured to know where you've decided to stick stuff. Complexity hurdle #1. 
Since the Node module system is locked, this cannot be changed, and is enough to kill this request. But I'll continue, in deference to your deity's delicate feelings regarding spelling. Many of the others are in dependencies that npm uses, which are not necessarily tightly coupled to npm (in the sense that they do not read npm's configuration files, etc.) Each of these would have to be configured to take the name of the `node_modules` folder as a parameter. Complexity hurdle #2. Furthermore, npm has the ability to "bundle" dependencies by adding the dep names to the `"bundledDependencies"` list in package.json, which causes the folder to be included in the package tarball. What if the author of a module bundles its dependencies, and they use a different spelling for `node_modules`? npm would have to rename the folder at publish time, and then be smart enough to unpack it using your locally configured name. Complexity hurdle #3. Furthermore, what happens when you *change* this name? Fine, it's easy enough the first time, just rename the `node_modules` folders to `./blergyblerp/` or whatever name you choose. But what about when you change it again? npm doesn't currently track any state about past configuration settings, so this would be rather difficult to do properly. It would have to track every previous value for this config, and always accept any of them, or else yesterday's install may be broken tomorrow. Complexity hurdle #4. Never going to happen. The folder is named `node_modules`. It is written indelibly in the Node Way, handed down from the ancient times of Node 0.3. ## How do I install node with npm? You don't. Try one of these node version managers: Unix: * * * Windows: * * * ## How can I use npm for development? See `npm-developers(7)` and `package.json(5)`. You'll most likely want to `npm link` your development folder. That's awesomely handy. To set up your own private registry, check out `npm-registry(7)`. ## Can I list a url as a dependency? Yes. It should be a url to a gzipped tarball containing a single folder that has a package.json in its root, or a git url. (See "what is a package?" above.) ## How do I symlink to a dev folder so I don't have to keep re-installing? See `npm-link(1)` ## The package registry website. What is that exactly? See `npm-registry(7)`. ## I forgot my password, and can't publish. How do I reset it? Go to . ## I get ECONNREFUSED a lot. What's up? Either the registry is down, or node's DNS isn't able to reach out. To check if the registry is down, open up in a web browser. This will also tell you if you are just unable to access the internet for some reason. If the registry IS down, let us know by emailing or posting an issue at . If it's down for the world (and not just on your local network) then we're probably already being pinged about it. You can also often get a faster response by visiting the #npm channel on Freenode IRC. ## Why no namespaces? npm has only one global namespace. If you want to namespace your own packages, you may: simply use the `-` character to separate the names. npm is a mostly anarchic system. There is not sufficient need to impose namespace rules on everyone. As of 2.0, npm supports scoped packages, which allow you to publish a group of related modules without worrying about name collisions. Every npm user owns the scope associated with their username. For example, the user named `npm` owns the scope `@npm`. 
Scoped packages are published inside a scope by naming them as if they were files under the scope directory, e.g., by setting `name` in `package.json` to `@npm/npm`. Scoped packages can coexist with public npm packages in a private npm registry. At present (2014-11-04) scoped packages may NOT be published to the public npm registry. Unscoped packages can only depend on other unscoped packages. Scoped packages can depend on packages from their own scope, a different scope, or the public registry (unscoped). For the current documentation of scoped packages, see References: 1. For the reasoning behind the "one global namespace", please see this discussion: (TL;DR: It doesn't actually make things better, and can make them worse.) 2. For the pre-implementation discussion of the scoped package feature, see this discussion: ## Who does npm? npm was originally written by Isaac Z. Schlueter, and many others have contributed to it, some of them quite substantially. The npm open source project, The npm Registry, and [the community website](https://www.npmjs.com) are maintained and operated by the good folks at [npm, Inc.](http://www.npmjs.com) ## I have a question or request not addressed here. Where should I put it? Post an issue on the github project: * ## Why does npm hate me? npm is not capable of hatred. It loves everyone, especially you. ## SEE ALSO * npm(1) * npm-developers(7) * package.json(5) * npm-config(1) * npm-config(7) * npmrc(5) * npm-config(7) * npm-folders(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-index.md000644 000766 000024 00000011401 12455173731 024620 0ustar00iojsstaff000000 000000 npm-index(7) -- Index of all npm documentation ============================================== ### README(1) a JavaScript package manager ## Command Line Documentation Using npm on the command line ### npm(1) node package manager ### npm-adduser(1) Add a registry user account ### npm-bin(1) Display npm bin folder ### npm-bugs(1) Bugs for a package in a web browser maybe ### npm-build(1) Build a package ### npm-bundle(1) REMOVED ### npm-cache(1) Manipulates packages cache ### npm-completion(1) Tab Completion for npm ### npm-config(1) Manage the npm configuration files ### npm-dedupe(1) Reduce duplication ### npm-deprecate(1) Deprecate a version of a package ### npm-docs(1) Docs for a package in a web browser maybe ### npm-edit(1) Edit an installed package ### npm-explore(1) Browse an installed package ### npm-help-search(1) Search npm help documentation ### npm-help(1) Get help on npm ### npm-init(1) Interactively create a package.json file ### npm-install(1) Install a package ### npm-link(1) Symlink a package folder ### npm-ls(1) List installed packages ### npm-outdated(1) Check for outdated packages ### npm-owner(1) Manage package owners ### npm-pack(1) Create a tarball from a package ### npm-prefix(1) Display prefix ### npm-prune(1) Remove extraneous packages ### npm-publish(1) Publish a package ### npm-rebuild(1) Rebuild a package ### npm-repo(1) Open package repository page in the browser ### npm-restart(1) Restart a package ### npm-rm(1) Remove a package ### npm-root(1) Display npm root ### npm-run-script(1) Run arbitrary package scripts ### npm-search(1) Search for packages ### npm-shrinkwrap(1) Lock down dependency versions ### npm-star(1) Mark your favorite packages ### npm-stars(1) View packages marked as favorites ### npm-start(1) Start a package ### npm-stop(1) Stop a package ### npm-tag(1) Tag a published version ### npm-test(1) Test a package ### npm-uninstall(1) Remove a 
package ### npm-unpublish(1) Remove a package from the registry ### npm-update(1) Update a package ### npm-version(1) Bump a package version ### npm-view(1) View registry info ### npm-whoami(1) Display npm username ## API Documentation Using npm in your Node programs ### npm(3) node package manager ### npm-bin(3) Display npm bin folder ### npm-bugs(3) Bugs for a package in a web browser maybe ### npm-cache(3) manage the npm cache programmatically ### npm-commands(3) npm commands ### npm-config(3) Manage the npm configuration files ### npm-deprecate(3) Deprecate a version of a package ### npm-docs(3) Docs for a package in a web browser maybe ### npm-edit(3) Edit an installed package ### npm-explore(3) Browse an installed package ### npm-help-search(3) Search the help pages ### npm-init(3) Interactively create a package.json file ### npm-install(3) install a package programmatically ### npm-link(3) Symlink a package folder ### npm-load(3) Load config settings ### npm-ls(3) List installed packages ### npm-outdated(3) Check for outdated packages ### npm-owner(3) Manage package owners ### npm-pack(3) Create a tarball from a package ### npm-prefix(3) Display prefix ### npm-prune(3) Remove extraneous packages ### npm-publish(3) Publish a package ### npm-rebuild(3) Rebuild a package ### npm-repo(3) Open package repository page in the browser ### npm-restart(3) Restart a package ### npm-root(3) Display npm root ### npm-run-script(3) Run arbitrary package scripts ### npm-search(3) Search for packages ### npm-shrinkwrap(3) programmatically generate package shrinkwrap file ### npm-start(3) Start a package ### npm-stop(3) Stop a package ### npm-tag(3) Tag a published version ### npm-test(3) Test a package ### npm-uninstall(3) uninstall a package programmatically ### npm-unpublish(3) Remove a package from the registry ### npm-update(3) Update a package ### npm-version(3) Bump a package version ### npm-view(3) View registry info ### npm-whoami(3) Display npm username ## Files File system structures npm uses ### npm-folders(5) Folder Structures Used by npm ### npmrc(5) The npm config files ### package.json(5) Specifics of npm's package.json handling ## Misc Various other bits and bobs ### npm-coding-style(7) npm's "funny" coding style ### npm-config(7) More than you probably want to know about npm configuration ### npm-developers(7) Developer Guide ### npm-disputes(7) Handling Module Name Disputes ### npm-faq(7) Frequently Asked Questions ### npm-index(7) Index of all npm documentation ### npm-registry(7) The JavaScript Package Registry ### npm-scope(7) Scoped packages ### npm-scripts(7) How npm handles the "scripts" field ### removing-npm(7) Cleaning the Slate ### semver(7) The semantic versioner for npm iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-registry.md000644 000766 000024 00000004744 12455173731 025375 0ustar00iojsstaff000000 000000 npm-registry(7) -- The JavaScript Package Registry ================================================== ## DESCRIPTION To resolve packages by name and version, npm talks to a registry website that implements the CommonJS Package Registry specification for reading package info. Additionally, npm's package registry implementation supports several write APIs as well, to allow for publishing packages and managing user account information. The official public npm registry is at . It is powered by a CouchDB database, of which there is a public mirror at . The code for the couchapp is available at . 
The registry URL used is determined by the scope of the package (see `npm-scope(7)`). If no scope is specified, the default registry is used, which is supplied by the `registry` config parameter. See `npm-config(1)`, `npmrc(5)`, and `npm-config(7)` for more on managing npm's configuration. ## Can I run my own private registry? Yes! The easiest way is to replicate the couch database, and use the same (or similar) design doc to implement the APIs. If you set up continuous replication from the official CouchDB, and then set your internal CouchDB as the registry config, then you'll be able to read any published packages, in addition to your private ones, and by default will only publish internally. If you then want to publish a package for the whole world to see, you can simply override the `--registry` config for that command. ## I don't want my package published in the official registry. It's private. Set `"private": true` in your package.json to prevent it from being published at all, or `"publishConfig":{"registry":"http://my-internal-registry.local"}` to force it to be published only to your internal registry. See `package.json(5)` for more info on what goes in the package.json file. ## Will you replicate from my registry into the public one? No. If you want things to be public, then publish them into the public registry using npm. What little security there is would be for nought otherwise. ## Do I have to use couchdb to build a registry that npm can talk to? No, but it's way easier. Basically, yes, you do, or you have to effectively implement the entire CouchDB API anyway. ## Is there a website or something to see package docs and such? Yes, head over to ## SEE ALSO * npm-config(1) * npm-config(7) * npmrc(5) * npm-developers(7) * npm-disputes(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-scope.md000644 000766 000024 00000005763 12455173731 024640 0ustar00iojsstaff000000 000000 npm-scope(7) -- Scoped packages =============================== ## DESCRIPTION All npm packages have a name. Some package names also have a scope. A scope follows the usual rules for package names (url-safe characters, no leading dots or underscores). When used in package names, preceded by an @-symbol and followed by a slash, e.g. @somescope/somepackagename Scopes are a way of grouping related packages together, and also affect a few things about the way npm treats the package. **As of 2014-09-03, scoped packages are not supported by the public npm registry**. However, the npm client is backwards-compatible with un-scoped registries, so it can be used to work with scoped and un-scoped registries at the same time. ## Installing scoped packages Scoped packages are installed to a sub-folder of the regular installation folder, e.g. if your other packages are installed in `node_modules/packagename`, scoped modules will be in `node_modules/@myorg/packagename`. The scope folder (`@myorg`) is simply the name of the scope preceded by an @-symbol, and can contain any number of scoped packages. A scoped package is installed by referencing it by name, preceded by an @-symbol, in `npm install`: npm install @myorg/mypackage Or in `package.json`: "dependencies": { "@myorg/mypackage": "^1.3.0" } Note that if the @-symbol is omitted in either case npm will instead attempt to install from GitHub; see `npm-install(1)`. ## Requiring scoped packages Because scoped packages are installed into a scope folder, you have to include the name of the scope when requiring them in your code, e.g. 
require('@myorg/mypackage') There is nothing special about the way Node treats scope folders, this is just specifying to require the module `mypackage` in the folder called `@myorg`. ## Publishing scoped packages Scoped packages can be published to any registry that supports them. *As of 2014-09-03, the public npm registry does not support scoped packages*, so attempting to publish a scoped package to the registry will fail unless you have associated that scope with a different registry, see below. ## Associating a scope with a registry Scopes can be associated with a separate registry. This allows you to seamlessly use a mix of packages from the public npm registry and one or more private registries, such as npm Enterprise. You can associate a scope with a registry at login, e.g. npm login --registry=http://reg.example.com --scope=@myco Scopes have a many-to-one relationship with registries: one registry can host multiple scopes, but a scope only ever points to one registry. You can also associate a scope with a registry using `npm config`: npm config set @myco:registry http://reg.example.com Once a scope is associated with a registry, any `npm install` for a package with that scope will request packages from that registry instead. Any `npm publish` for a package name that contains the scope will be published to that registry instead. ## SEE ALSO * npm-install(1) * npm-publish(1)iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/npm-scripts.md000644 000766 000024 00000020761 12455173731 025211 0ustar00iojsstaff000000 000000 npm-scripts(7) -- How npm handles the "scripts" field ===================================================== ## DESCRIPTION npm supports the "scripts" property of the package.json script, for the following scripts: * prepublish: Run BEFORE the package is published. (Also run on local `npm install` without any arguments.) * publish, postpublish: Run AFTER the package is published. * preinstall: Run BEFORE the package is installed * install, postinstall: Run AFTER the package is installed. * preuninstall, uninstall: Run BEFORE the package is uninstalled. * postuninstall: Run AFTER the package is uninstalled. * pretest, test, posttest: Run by the `npm test` command. * prestop, stop, poststop: Run by the `npm stop` command. * prestart, start, poststart: Run by the `npm start` command. * prerestart, restart, postrestart: Run by the `npm restart` command. Note: `npm restart` will run the stop and start scripts if no `restart` script is provided. Additionally, arbitrary scripts can be executed by running `npm run-script `. *Pre* and *post* commands with matching names will be run for those as well (e.g. `premyscript`, `myscript`, `postmyscript`). ## NOTE: INSTALL SCRIPTS ARE AN ANTIPATTERN **tl;dr** Don't use `install`. Use a `.gyp` file for compilation, and `prepublish` for anything else. You should almost never have to explicitly set a `preinstall` or `install` script. If you are doing this, please consider if there is another option. The only valid use of `install` or `preinstall` scripts is for compilation which must be done on the target architecture. In early versions of node, this was often done using the `node-waf` scripts, or a standalone `Makefile`, and early versions of npm required that it be explicitly set in package.json. This was not portable, and harder to do properly. In the current version of node, the standard way to do this is using a `.gyp` file. 
If you have a file with a `.gyp` extension in the root of your package, then npm will run the appropriate `node-gyp` commands automatically at install time. This is the only officially supported method for compiling binary addons, and does not require that you add anything to your package.json file. If you have to do other things before your package is used, in a way that is not dependent on the operating system or architecture of the target system, then use a `prepublish` script instead. This includes tasks such as: * Compile CoffeeScript source code into JavaScript. * Create minified versions of JavaScript source code. * Fetching remote resources that your package will use. The advantage of doing these things at `prepublish` time instead of `preinstall` or `install` time is that they can be done once, in a single place, and thus greatly reduce complexity and variability. Additionally, this means that: * You can depend on `coffee-script` as a `devDependency`, and thus your users don't need to have it installed. * You don't need to include the minifiers in your package, reducing the size for your users. * You don't need to rely on your users having `curl` or `wget` or other system tools on the target machines. ## DEFAULT VALUES npm will default some script values based on package contents. * `"start": "node server.js"`: If there is a `server.js` file in the root of your package, then npm will default the `start` command to `node server.js`. * `"preinstall": "node-waf clean || true; node-waf configure build"`: If there is a `wscript` file in the root of your package, npm will default the `preinstall` command to compile using node-waf. ## USER If npm was invoked with root privileges, then it will change the uid to the user account or uid specified by the `user` config, which defaults to `nobody`. Set the `unsafe-perm` flag to run scripts with root privileges. ## ENVIRONMENT Package scripts run in an environment where many pieces of information are made available regarding the setup of npm and the current state of the process. ### path If you depend on modules that define executable scripts, like test suites, then those executables will be added to the `PATH` for executing the scripts. So, if your package.json has this: { "name" : "foo" , "dependencies" : { "bar" : "0.1.x" } , "scripts": { "start" : "bar ./test" } } then you could run `npm start` to execute the `bar` script, which is exported into the `node_modules/.bin` directory on `npm install`. ### package.json vars The package.json fields are tacked onto the `npm_package_` prefix. So, for instance, if you had `{"name":"foo", "version":"1.2.5"}` in your package.json file, then your package scripts would have the `npm_package_name` environment variable set to "foo", and the `npm_package_version` set to "1.2.5" ### configuration Configuration parameters are put in the environment with the `npm_config_` prefix. For instance, you can view the effective `root` config by checking the `npm_config_root` environment variable. ### Special: package.json "config" object The package.json "config" keys are overwritten in the environment if there is a config param of `[@]:`. 
For example, if the package.json has this: { "name" : "foo" , "config" : { "port" : "8080" } , "scripts" : { "start" : "node server.js" } } and the server.js is this: http.createServer(...).listen(process.env.npm_package_config_port) then the user could change the behavior by doing: npm config set foo:port 80 ### current lifecycle event Lastly, the `npm_lifecycle_event` environment variable is set to whichever stage of the cycle is being executed. So, you could have a single script used for different parts of the process which switches based on what's currently happening. Objects are flattened following this format, so if you had `{"scripts":{"install":"foo.js"}}` in your package.json, then you'd see this in the script: process.env.npm_package_scripts_install === "foo.js" ## EXAMPLES For example, if your package.json contains this: { "scripts" : { "install" : "scripts/install.js" , "postinstall" : "scripts/install.js" , "uninstall" : "scripts/uninstall.js" } } then the `scripts/install.js` will be called for the install, post-install, stages of the lifecycle, and the `scripts/uninstall.js` would be called when the package is uninstalled. Since `scripts/install.js` is running for three different phases, it would be wise in this case to look at the `npm_lifecycle_event` environment variable. If you want to run a make command, you can do so. This works just fine: { "scripts" : { "preinstall" : "./configure" , "install" : "make && make install" , "test" : "make test" } } ## EXITING Scripts are run by passing the line as a script argument to `sh`. If the script exits with a code other than 0, then this will abort the process. Note that these script files don't have to be nodejs or even javascript programs. They just have to be some kind of executable file. ## HOOK SCRIPTS If you want to run a specific script at a specific lifecycle event for ALL packages, then you can use a hook script. Place an executable file at `node_modules/.hooks/{eventname}`, and it'll get run for all packages when they are going through that point in the package lifecycle for any packages installed in that root. Hook scripts are run exactly the same way as package.json scripts. That is, they are in a separate child process, with the env described above. ## BEST PRACTICES * Don't exit with a non-zero error code unless you *really* mean it. Except for uninstall scripts, this will cause the npm action to fail, and potentially be rolled back. If the failure is minor or only will prevent some optional features, then it's better to just print a warning and exit successfully. * Try not to use scripts to do what npm can do for you. Read through `package.json(5)` to see all the things that you can specify and enable by simply describing your package appropriately. In general, this will lead to a more robust and consistent state. * Inspect the env to determine where to put things. For instance, if the `npm_config_binroot` environ is set to `/home/user/bin`, then don't try to install executables into `/usr/local/bin`. The user probably set it up that way for a reason. * Don't prefix your script commands with "sudo". If root permissions are required for some reason, then it'll fail with that error, and the user will sudo the npm command in question. 
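Following up on the EXAMPLES section above, a single script file that is wired to several lifecycle phases can branch on `npm_lifecycle_event`. This is only a minimal sketch; the phase handling shown here is a placeholder, not a recommendation for what your script should actually do:

    // scripts/install.js -- lifecycle-aware script (sketch)
    // npm exports the current phase in npm_lifecycle_event.
    var phase = process.env.npm_lifecycle_event

    if (phase === 'install') {
      console.log('doing the install-time work')
    } else if (phase === 'postinstall') {
      console.log('verifying the result of the install step')
    } else {
      console.log('running in unexpected phase: ' + phase)
    }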
## SEE ALSO * npm-run-script(1) * package.json(5) * npm-developers(7) * npm-install(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/removing-npm.md000644 000766 000024 00000003132 12455173731 025341 0ustar00iojsstaff000000 000000 npm-removal(1) -- Cleaning the Slate ==================================== ## SYNOPSIS So sad to see you go. sudo npm uninstall npm -g Or, if that fails, get the npm source code, and do: sudo make uninstall ## More Severe Uninstalling Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed. If that doesn't work, or if you require more drastic measures, continue reading. Note that this is only necessary for globally-installed packages. Local installs are completely contained within a project's `node_modules` folder. Delete that folder, and everything is gone (unless a package's install script is particularly ill-behaved). This assumes that you installed node and npm in the default place. If you configured node with a different `--prefix`, or installed npm with a different prefix setting, then adjust the paths accordingly, replacing `/usr/local` with your install prefix. To remove everything npm-related manually: rm -rf /usr/local/{lib/node{,/.npm,_modules},bin,share/man}/npm* If you installed things *with* npm, then your best bet is to uninstall them with npm first, and then install them again once you have a proper install. This can help find any symlinks that are lying around: ls -laF /usr/local/{lib/node{,/.npm},bin,share/man} | grep npm Prior to version 0.3, npm used shim files for executables and node modules. To track those down, you can do the following: find /usr/local/{lib/node,bin} -exec grep -l npm \{\} \; ; (This is also in the README file.) ## SEE ALSO * README * npm-rm(1) * npm-prune(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/misc/semver.md000644 000766 000024 00000026635 12455173731 024241 0ustar00iojsstaff000000 000000 semver(7) -- The semantic versioner for npm =========================================== ## Usage $ npm install semver semver.valid('1.2.3') // '1.2.3' semver.valid('a.b.c') // null semver.clean(' =v1.2.3 ') // '1.2.3' semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true semver.gt('1.2.3', '9.8.7') // false semver.lt('1.2.3', '9.8.7') // true As a command-line utility: $ semver -h Usage: semver [ [...]] [-r | -i | --preid | -l | -rv] Test if version(s) satisfy the supplied range(s), and sort them. Multiple versions or ranges may be supplied, unless increment option is specified. In that case, only a single version may be used, and it is incremented by the specified level Program exits successfully if any valid version satisfies all supplied ranges, and prints all satisfying versions. If no versions are valid, or ranges are not satisfied, then exits failure. Versions are printed in ascending order, so supplying multiple versions to the utility will just sort them. ## Versions A "version" is described by the `v2.0.0` specification found at . A leading `"="` or `"v"` character is stripped off and ignored. ## Ranges A `version range` is a set of `comparators` which specify versions that satisfy the range. A `comparator` is composed of an `operator` and a `version`. The set of primitive `operators` is: * `<` Less than * `<=` Less than or equal to * `>` Greater than * `>=` Greater than or equal to * `=` Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included. 
For example, the comparator `>=1.2.7` would match the versions `1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` or `1.1.0`. Comparators can be joined by whitespace to form a `comparator set`, which is satisfied by the **intersection** of all of the comparators it includes. A range is composed of one or more comparator sets, joined by `||`. A version matches a range if and only if every comparator in at least one of the `||`-separated comparator sets is satisfied by the version. For example, the range `>=1.2.7 <1.3.0` would match the versions `1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, or `1.1.0`. The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, `1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. ### Prerelease Tags If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then it will only be allowed to satisfy comparator sets if at least one comparator with the same `[major, minor, patch]` tuple also has a prerelease tag. For example, the range `>1.2.3-alpha.3` would be allowed to match the version `1.2.3-alpha.7`, but it would *not* be satisfied by `3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater than" `1.2.3-alpha.3` according to the SemVer sort rules. The version range only accepts prerelease tags on the `1.2.3` version. The version `3.4.5` *would* satisfy the range, because it does not have a prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics. Second, a user who has opted into using a prerelease version has clearly indicated the intent to use *that specific* set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the *next* set of prerelease versions. #### Prerelease Identifiers The method `.inc` takes an additional `identifier` string argument that will append the value of the string as a prerelease identifier: ````javascript > semver.inc('1.2.3', 'pre', 'beta') '1.2.4-beta.0' ``` command-line example: ```shell $ semver 1.2.3 -i prerelease --preid beta 1.2.4-beta.0 ``` Which then can be used to increment further: ```shell $ semver 1.2.4-beta.0 -i prerelease 1.2.4-beta.1 ``` ### Advanced Range Syntax Advanced range syntax desugars to primitive comparators in deterministic ways. Advanced ranges may be combined in the same way as primitive comparators using white space or `||`. #### Hyphen Ranges `X.Y.Z - A.B.C` Specifies an inclusive set. * `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes. * `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts. * `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` * `1.2.3 - 2` := `>=1.2.3 <3.0.0` #### X-Ranges `1.2.x` `1.X` `1.2.*` `*` Any of `X`, `x`, or `*` may be used to "stand in" for one of the numeric values in the `[major, minor, patch]` tuple. 
* `*` := `>=0.0.0` (Any version satisfies) * `1.x` := `>=1.0.0 <2.0.0` (Matching major version) * `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) A partial version range is treated as an X-Range, so the special character is in fact optional. * `""` (empty string) := `*` := `>=0.0.0` * `1` := `1.x.x` := `>=1.0.0 <2.0.0` * `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` #### Tilde Ranges `~1.2.3` `~1.2` `~1` Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not. * `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` * `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) * `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) * `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` * `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) * `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) * `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. #### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` Allows changes that do not modify the left-most non-zero digit in the `[major, minor, patch]` tuple. In other words, this allows patch and minor updates for versions `1.0.0` and above, patch updates for versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. Many authors treat a `0.x` version as if the `x` were the major "breaking-change" indicator. Caret ranges are ideal when an author may make breaking changes between `0.2.4` and `0.3.0` releases, which is a common practice. However, it presumes that there will *not* be breaking changes between `0.2.4` and `0.2.5`. It allows for changes that are presumed to be additive (but non-breaking), according to commonly observed practices. * `^1.2.3` := `>=1.2.3 <2.0.0` * `^0.2.3` := `>=0.2.3 <0.3.0` * `^0.0.3` := `>=0.0.3 <0.0.4` * `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. * `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the `0.0.3` version *only* will be allowed, if they are greater than or equal to `beta`. So, `0.0.3-pr.2` would be allowed. When parsing caret ranges, a missing `patch` value desugars to the number `0`, but will allow flexibility within that value, even if the major and minor versions are both `0`. * `^1.2.x` := `>=1.2.0 <2.0.0` * `^0.0.x` := `>=0.0.0 <0.1.0` * `^0.0` := `>=0.0.0 <0.1.0` A missing `minor` and `patch` values will desugar to zero, but also allow flexibility within those values, even if the major version is zero. * `^1.x` := `>=1.0.0 <2.0.0` * `^0.x` := `>=0.0.0 <1.0.0` ## Functions All methods and classes take a final `loose` boolean argument that, if true, will be more forgiving about not-quite-valid semver strings. The resulting output will always be 100% strict, of course. Strict-mode Comparators and Ranges will be strict about the SemVer strings that they parse. * `valid(v)`: Return the parsed version, or null if it's not valid. 
* `inc(v, release)`: Return the version incremented by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if it's not valid * `premajor` in one call will bump the version up to the next major version and down to a prerelease of that major version. `preminor`, and `prepatch` work the same way. * If called from a non-prerelease version, the `prerelease` will work the same as `prepatch`. It increments the patch version, then makes a prerelease. If the input version is already a prerelease it simply increments it. ### Comparison * `gt(v1, v2)`: `v1 > v2` * `gte(v1, v2)`: `v1 >= v2` * `lt(v1, v2)`: `v1 < v2` * `lte(v1, v2)`: `v1 <= v2` * `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, even if they're not the exact same string. You already know how to compare strings. * `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. * `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call the corresponding function above. `"==="` and `"!=="` do simple string comparison, but are included for completeness. Throws if an invalid comparison string is provided. * `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. * `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions in descending order when passed to `Array.sort()`. * `diff(v1, v2)`: Returns difference between two versions by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if the versions are the same. ### Ranges * `validRange(range)`: Return the valid range or null if it's not valid * `satisfies(version, range)`: Return true if the version satisfies the range. * `maxSatisfying(versions, range)`: Return the highest version in the list that satisfies the range, or `null` if none of them do. * `gtr(version, range)`: Return `true` if version is greater than all the versions possible in the range. * `ltr(version, range)`: Return `true` if version is less than all the versions possible in the range. * `outside(version, range, hilo)`: Return true if the version is outside the bounds of the range in either the high or low direction. The `hilo` argument must be either the string `'>'` or `'<'`. (This is the function called by `gtr` and `ltr`.) Note that, since ranges may be non-contiguous, a version might not be greater than a range, less than a range, *or* satisfy a range! For example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` until `2.0.0`, so the version `1.2.10` would not be greater than the range (because `2.0.1` satisfies, which is higher), nor less than the range (since `1.2.8` satisfies, which is lower), and it also does not satisfy the range. If you want to know if a version satisfies or does not satisfy a range, use the `satisfies(version, range)` function. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/files/npm-folders.md000644 000766 000024 00000017250 12455173731 025326 0ustar00iojsstaff000000 000000 npm-folders(5) -- Folder Structures Used by npm =============================================== ## DESCRIPTION npm puts various things on your computer. That's its job. This document will tell you what it puts where. ### tl;dr * Local install (default): puts stuff in `./node_modules` of the current package root. * Global install (with `-g`): puts stuff in /usr/local or wherever node is installed. * Install it **locally** if you're going to `require()` it. 
* Install it **globally** if you're going to run it on the command line. * If you need both, then install it in both places, or use `npm link`. ### prefix Configuration The `prefix` config defaults to the location where node is installed. On most systems, this is `/usr/local`, and most of the time is the same as node's `process.installPrefix`. On windows, this is the exact location of the node.exe binary. On Unix systems, it's one level up, since node is typically installed at `{prefix}/bin/node` rather than `{prefix}/node.exe`. When the `global` flag is set, npm installs things into this prefix. When it is not set, it uses the root of the current package, or the current working directory if not in a package already. ### Node Modules Packages are dropped into the `node_modules` folder under the `prefix`. When installing locally, this means that you can `require("packagename")` to load its main module, or `require("packagename/lib/path/to/sub/module")` to load other modules. Global installs on Unix systems go to `{prefix}/lib/node_modules`. Global installs on Windows go to `{prefix}/node_modules` (that is, no `lib` folder.) Scoped packages are installed the same way, except they are grouped together in a sub-folder of the relevant `node_modules` folder with the name of that scope prefix by the @ symbol, e.g. `npm install @myorg/package` would place the package in `{prefix}/node_modules/@myorg/package`. See `scopes(7)` for more details. If you wish to `require()` a package, then install it locally. ### Executables When in global mode, executables are linked into `{prefix}/bin` on Unix, or directly into `{prefix}` on Windows. When in local mode, executables are linked into `./node_modules/.bin` so that they can be made available to scripts run through npm. (For example, so that a test runner will be in the path when you run `npm test`.) ### Man Pages When in global mode, man pages are linked into `{prefix}/share/man`. When in local mode, man pages are not installed. Man pages are not installed on Windows systems. ### Cache See `npm-cache(1)`. Cache files are stored in `~/.npm` on Posix, or `~/npm-cache` on Windows. This is controlled by the `cache` configuration param. ### Temp Files Temporary files are stored by default in the folder specified by the `tmp` config, which defaults to the TMPDIR, TMP, or TEMP environment variables, or `/tmp` on Unix and `c:\windows\temp` on Windows. Temp files are given a unique folder under this root for each run of the program, and are deleted upon successful exit. ## More Information When installing locally, npm first tries to find an appropriate `prefix` folder. This is so that `npm install foo@1.2.3` will install to the sensible root of your package, even if you happen to have `cd`ed into some other folder. Starting at the $PWD, npm will walk up the folder tree checking for a folder that contains either a `package.json` file, or a `node_modules` folder. If such a thing is found, then that is treated as the effective "current directory" for the purpose of running npm commands. (This behavior is inspired by and similar to git's .git-folder seeking logic when running git commands in a working dir.) If no package root is found, then the current folder is used. When you run `npm install foo@1.2.3`, then the package is loaded into the cache, and then unpacked into `./node_modules/foo`. Then, any of foo's dependencies are similarly unpacked into `./node_modules/foo/node_modules/...`. 
Any bin files are symlinked to `./node_modules/.bin/`, so that they may be found by npm scripts when necessary. ### Global Installation If the `global` configuration is set to true, then npm will install packages "globally". For global installation, packages are installed roughly the same way, but using the folders described above. ### Cycles, Conflicts, and Folder Parsimony Cycles are handled using the property of node's module system that it walks up the directories looking for `node_modules` folders. So, at every stage, if a package is already installed in an ancestor `node_modules` folder, then it is not installed at the current location. Consider the case above, where `foo -> bar -> baz`. Imagine if, in addition to that, baz depended on bar, so you'd have: `foo -> bar -> baz -> bar -> baz ...`. However, since the folder structure is: `foo/node_modules/bar/node_modules/baz`, there's no need to put another copy of bar into `.../baz/node_modules`, since when it calls require("bar"), it will get the copy that is installed in `foo/node_modules/bar`. This shortcut is only used if the exact same version would be installed in multiple nested `node_modules` folders. It is still possible to have `a/node_modules/b/node_modules/a` if the two "a" packages are different versions. However, without repeating the exact same package multiple times, an infinite regress will always be prevented. Another optimization can be made by installing dependencies at the highest level possible, below the localized "target" folder. #### Example Consider this dependency graph: foo +-- blerg@1.2.5 +-- bar@1.2.3 | +-- blerg@1.x (latest=1.3.7) | +-- baz@2.x | | `-- quux@3.x | | `-- bar@1.2.3 (cycle) | `-- asdf@* `-- baz@1.2.3 `-- quux@3.x `-- bar In this case, we might expect a folder structure like this: foo +-- node_modules +-- blerg (1.2.5) <---[A] +-- bar (1.2.3) <---[B] | `-- node_modules | +-- baz (2.0.2) <---[C] | | `-- node_modules | | `-- quux (3.2.0) | `-- asdf (2.3.4) `-- baz (1.2.3) <---[D] `-- node_modules `-- quux (3.2.0) <---[E] Since foo depends directly on `bar@1.2.3` and `baz@1.2.3`, those are installed in foo's `node_modules` folder. Even though the latest copy of blerg is 1.3.7, foo has a specific dependency on version 1.2.5. So, that gets installed at [A]. Since the parent installation of blerg satisfies bar's dependency on `blerg@1.x`, it does not install another copy under [B]. Bar [B] also has dependencies on baz and asdf, so those are installed in bar's `node_modules` folder. Because it depends on `baz@2.x`, it cannot re-use the `baz@1.2.3` installed in the parent `node_modules` folder [D], and must install its own copy [C]. Underneath bar, the `baz -> quux -> bar` dependency creates a cycle. However, because bar is already in quux's ancestry [B], it does not unpack another copy of bar into that folder. Underneath `foo -> baz` [D], quux's [E] folder tree is empty, because its dependency on bar is satisfied by the parent folder copy installed at [B]. For a graphical breakdown of what is installed where, use `npm ls`. ### Publishing Upon publishing, npm will look in the `node_modules` folder. If any of the items there are not in the `bundledDependencies` array, then they will not be included in the package tarball. This allows a package maintainer to install all of their dependencies (and dev dependencies) locally, but only re-publish those items that cannot be found elsewhere. See `package.json(5)` for more information. 
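The walk-up lookup described under "Cycles, Conflicts, and Folder Parsimony" can be observed from a script with `require.resolve()`. The package names below are the hypothetical `foo` and `bar` from the example above, not real packages:

    // Run from a file deep inside foo's dependency tree, e.g.
    // foo/node_modules/bar/node_modules/baz/index.js.
    // node checks each enclosing node_modules folder in turn, so
    // "bar" resolves to the single copy under foo/node_modules.
    console.log(require.resolve('bar'))
    // => .../foo/node_modules/bar/index.js (path depends on your layout)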
## SEE ALSO * npm-faq(7) * package.json(5) * npm-install(1) * npm-pack(1) * npm-cache(1) * npm-config(1) * npmrc(5) * npm-config(7) * npm-publish(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/files/npmrc.md000644 000766 000024 00000004340 12455173731 024213 0ustar00iojsstaff000000 000000 npmrc(5) -- The npm config files ================================ ## DESCRIPTION npm gets its config settings from the command line, environment variables, and `npmrc` files. The `npm config` command can be used to update and edit the contents of the user and global npmrc files. For a list of available configuration options, see npm-config(7). ## FILES The four relevant files are: * per-project config file (/path/to/my/project/.npmrc) * per-user config file (~/.npmrc) * global config file ($PREFIX/npmrc) * npm builtin config file (/path/to/npm/npmrc) All npm config files are an ini-formatted list of `key = value` parameters. Environment variables can be replaced using `${VARIABLE_NAME}`. For example: prefix = ${HOME}/.npm-packages Each of these files is loaded, and config options are resolved in priority order. For example, a setting in the userconfig file would override the setting in the globalconfig file. Array values are specified by adding "[]" after the key name. For example: key[] = "first value" key[] = "second value" ### Per-project config file When working locally in a project, a `.npmrc` file in the root of the project (ie, a sibling of `node_modules` and `package.json`) will set config values specific to this project. Note that this only applies to the root of the project that you're running npm in. It has no effect when your module is published. For example, you can't publish a module that forces itself to install globally, or in a different location. ### Per-user config file `$HOME/.npmrc` (or the `userconfig` param, if set in the environment or on the command line) ### Global config file `$PREFIX/etc/npmrc` (or the `globalconfig` param, if set above): This file is an ini-file formatted list of `key = value` parameters. Environment variables can be replaced as above. ### Built-in config file `path/to/npm/itself/npmrc` This is an unchangeable "builtin" configuration file that npm keeps consistent across updates. Set fields in here using the `./configure` script that comes with npm. This is primarily for distribution maintainers to override default configs in a standard and consistent manner. ## SEE ALSO * npm-folders(5) * npm-config(1) * npm-config(7) * package.json(5) * npm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/files/package.json.md000644 000766 000024 00000050765 12455173731 025453 0ustar00iojsstaff000000 000000 package.json(5) -- Specifics of npm's package.json handling =========================================================== ## DESCRIPTION This document is all you need to know about what's required in your package.json file. It must be actual JSON, not just a JavaScript object literal. A lot of the behavior described in this document is affected by the config settings described in `npm-config(7)`. ## name The *most* important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version. The name is what your thing is called. Some tips: * Don't put "js" or "node" in the name. 
It's assumed that it's js, since you're writing a package.json file, and you can specify the engine using the "engines" field. (See below.) * The name ends up being part of a URL, an argument on the command line, and a folder name. Any name with non-url-safe characters will be rejected. Also, it can't start with a dot or an underscore. * The name will probably be passed as an argument to require(), so it should be something short, but also reasonably descriptive. * You may want to check the npm registry to see if there's something by that name already, before you get too attached to it. http://registry.npmjs.org/ A name can be optionally prefixed by a scope, e.g. `@myorg/mypackage`. See `npm-scope(7)` for more detail. ## version The *most* important things in your package.json are the name and version fields. Those are actually required, and your package won't install without them. The name and version together form an identifier that is assumed to be completely unique. Changes to the package should come along with changes to the version. Version must be parseable by [node-semver](https://github.com/isaacs/node-semver), which is bundled with npm as a dependency. (`npm install semver` to use it yourself.) More on version numbers and ranges at semver(7). ## description Put a description in it. It's a string. This helps people discover your package, as it's listed in `npm search`. ## keywords Put keywords in it. It's an array of strings. This helps people discover your package as it's listed in `npm search`. ## homepage The url to the project homepage. **NOTE**: This is *not* the same as "url". If you put a "url" field, then the registry will think it's a redirection to your package that has been published somewhere else, and spit at you. Literally. Spit. I'm so not kidding. ## bugs The url to your project's issue tracker and / or the email address to which issues should be reported. These are helpful for people who encounter issues with your package. It should look like this: { "url" : "http://github.com/owner/project/issues" , "email" : "project@hostname.com" } You can specify either one or both values. If you want to provide only a url, you can specify the value for "bugs" as a simple string instead of an object. If a url is provided, it will be used by the `npm bugs` command. ## license You should specify a license for your package so that people know how they are permitted to use it, and any restrictions you're placing on it. The simplest way, assuming you're using a common license such as BSD-3-Clause or MIT, is to just specify the standard SPDX ID of the license you're using, like this: { "license" : "BSD-3-Clause" } You can check [the full list of SPDX license IDs](https://spdx.org/licenses/). Ideally you should pick one that is [OSI](http://opensource.org/licenses/alphabetical) approved. It's also a good idea to include a LICENSE file at the top level in your package. ## people fields: author, contributors The "author" is one person. "contributors" is an array of people. A "person" is an object with a "name" field and optionally "url" and "email", like this: { "name" : "Barney Rubble" , "email" : "b@rubble.com" , "url" : "http://barnyrubble.tumblr.com/" } Or you can shorten that all into a single string, and npm will parse it for you: "Barney Rubble (http://barnyrubble.tumblr.com/) Both email and url are optional either way. npm also sets a top-level "maintainers" field with your npm user info. ## files The "files" field is an array of files to include in your project. 
If you name a folder in the array, then it will also include the files inside that folder. (Unless they would be ignored by another rule.) You can also provide a ".npmignore" file in the root of your package, which will keep files from being included, even if they would be picked up by the files array. The ".npmignore" file works just like a ".gitignore". ## main The main field is a module ID that is the primary entry point to your program. That is, if your package is named `foo`, and a user installs it, and then does `require("foo")`, then your main module's exports object will be returned. This should be a module ID relative to the root of your package folder. For most modules, it makes the most sense to have a main script and often not much else. ## bin A lot of packages have one or more executable files that they'd like to install into the PATH. npm makes this pretty easy (in fact, it uses this feature to install the "npm" executable.) To use this, supply a `bin` field in your package.json which is a map of command name to local file name. On install, npm will symlink that file into `prefix/bin` for global installs, or `./node_modules/.bin/` for local installs. For example, npm has this: { "bin" : { "npm" : "./cli.js" } } So, when you install npm, it'll create a symlink from the `cli.js` script to `/usr/local/bin/npm`. If you have a single executable, and its name should be the name of the package, then you can just supply it as a string. For example: { "name": "my-program" , "version": "1.2.5" , "bin": "./path/to/program" } would be the same as this: { "name": "my-program" , "version": "1.2.5" , "bin" : { "my-program" : "./path/to/program" } } ## man Specify either a single file or an array of filenames to put in place for the `man` program to find. If only a single file is provided, then it's installed such that it is the result from `man `, regardless of its actual filename. For example: { "name" : "foo" , "version" : "1.2.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo.js" , "man" : "./man/doc.1" } would link the `./man/doc.1` file in such that it is the target for `man foo` If the filename doesn't start with the package name, then it's prefixed. So, this: { "name" : "foo" , "version" : "1.2.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo.js" , "man" : [ "./man/foo.1", "./man/bar.1" ] } will create files to do `man foo` and `man foo-bar`. Man files must end with a number, and optionally a `.gz` suffix if they are compressed. The number dictates which man section the file is installed into. { "name" : "foo" , "version" : "1.2.3" , "description" : "A packaged foo fooer for fooing foos" , "main" : "foo.js" , "man" : [ "./man/foo.1", "./man/foo.2" ] } will create entries for `man foo` and `man 2 foo` ## directories The CommonJS [Packages](http://wiki.commonjs.org/wiki/Packages/1.0) spec details a few ways that you can indicate the structure of your package using a `directories` object. If you look at [npm's package.json](http://registry.npmjs.org/npm/latest), you'll see that it has directories for doc, lib, and man. In the future, this information may be used in other creative ways. ### directories.lib Tell people where the bulk of your library is. Nothing special is done with the lib folder in any way, but it's useful meta info. ### directories.bin If you specify a `bin` directory, then all the files in that folder will be added as children of the `bin` path. If you have a `bin` path already, then this has no effect. 
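For instance, a package whose executables all live in one folder could use `directories.bin` instead of listing each command in `bin` (the folder name here is just an assumption):

    { "name": "foo"
    , "version": "1.2.3"
    , "directories": { "bin": "./bin" }
    }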
### directories.man A folder that is full of man pages. Sugar to generate a "man" array by walking the folder. ### directories.doc Put markdown files in here. Eventually, these will be displayed nicely, maybe, someday. ### directories.example Put example scripts in here. Someday, it might be exposed in some clever way. ## repository Specify the place where your code lives. This is helpful for people who want to contribute. If the git repo is on GitHub, then the `npm docs` command will be able to find you. Do it like this: "repository" : { "type" : "git" , "url" : "http://github.com/npm/npm.git" } "repository" : { "type" : "svn" , "url" : "http://v8.googlecode.com/svn/trunk/" } The URL should be a publicly available (perhaps read-only) url that can be handed directly to a VCS program without any modification. It should not be a url to an html project page that you put in your browser. It's for computers. ## scripts The "scripts" property is a dictionary containing script commands that are run at various times in the lifecycle of your package. The key is the lifecycle event, and the value is the command to run at that point. See `npm-scripts(7)` to find out more about writing package scripts. ## config A "config" object can be used to set configuration parameters used in package scripts that persist across upgrades. For instance, if a package had the following: { "name" : "foo" , "config" : { "port" : "8080" } } and then had a "start" command that then referenced the `npm_package_config_port` environment variable, then the user could override that by doing `npm config set foo:port 8001`. See `npm-config(7)` and `npm-scripts(7)` for more on package configs. ## dependencies Dependencies are specified in a simple object that maps a package name to a version range. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or git URL. **Please do not put test harnesses or transpilers in your `dependencies` object.** See `devDependencies`, below. See semver(7) for more details about specifying version ranges. * `version` Must match `version` exactly * `>version` Must be greater than `version` * `>=version` etc * `=version1 <=version2`. * `range1 || range2` Passes if either range1 or range2 are satisfied. * `git...` See 'Git URLs as Dependencies' below * `user/repo` See 'GitHub URLs' below * `tag` A specific version tagged and published as `tag` See `npm-tag(1)` * `path/path/path` See Local Paths below For example, these are all valid: { "dependencies" : { "foo" : "1.0.0 - 2.9999.9999" , "bar" : ">=1.0.2 <2.1.2" , "baz" : ">1.0.2 <=2.3.4" , "boo" : "2.0.1" , "qux" : "<1.0.0 || >=2.3.1 <2.4.5 || >=2.5.2 <3.0.0" , "asd" : "http://asdf.com/asdf.tar.gz" , "til" : "~1.2" , "elf" : "~1.2.3" , "two" : "2.x" , "thr" : "3.3.x" , "lat" : "latest" , "dyl" : "file:../dyl" } } ### URLs as Dependencies You may specify a tarball URL in place of a version range. This tarball will be downloaded and installed locally to your package at install time. ### Git URLs as Dependencies Git urls can be of the form: git://github.com/user/project.git#commit-ish git+ssh://user@hostname:project.git#commit-ish git+ssh://user@hostname/project.git#commit-ish git+http://user@hostname/project/blah.git#commit-ish git+https://user@hostname/project/blah.git#commit-ish The `commit-ish` can be any tag, sha, or branch which can be supplied as an argument to `git checkout`. The default is `master`. 
## GitHub URLs As of version 1.1.65, you can refer to GitHub urls as just "foo": "user/foo-project". Just as with git URLs, a `commit-ish` suffix can be included. For example: { "name": "foo", "version": "0.0.0", "dependencies": { "express": "visionmedia/express", "mocha": "visionmedia/mocha#4727d357ea" } } ## Local Paths As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be saved using `npm install --save`, using any of these forms: ../foo/bar ~/foo/bar ./foo/bar /foo/bar in which case they will be normalized to a relative path and added to your `package.json`. For example: { "name": "baz", "dependencies": { "bar": "file:../foo/bar" } } This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry. ## devDependencies If someone is planning on downloading and using your module in their program, then they probably don't want or need to download and build the external test or documentation framework that you use. In this case, it's best to map these additional items in a `devDependencies` object. These things will be installed when doing `npm link` or `npm install` from the root of a package, and can be managed like any other npm configuration param. See `npm-config(7)` for more on the topic. For build steps that are not platform-specific, such as compiling CoffeeScript or other languages to JavaScript, use the `prepublish` script to do this, and make the required package a devDependency. For example: { "name": "ethopia-waza", "description": "a delightfully fruity coffee varietal", "version": "1.2.3", "devDependencies": { "coffee-script": "~1.6.3" }, "scripts": { "prepublish": "coffee -o lib/ -c src/waza.coffee" }, "main": "lib/waza.js" } The `prepublish` script will be run before publishing, so that users can consume the functionality without requiring them to compile it themselves. In dev mode (ie, locally running `npm install`), it'll run this script as well, so that you can test it easily. ## peerDependencies In some cases, you want to express the compatibility of your package with an host tool or library, while not necessarily doing a `require` of this host. This is usually referred to as a *plugin*. Notably, your module may be exposing a specific interface, expected and specified by the host documentation. For example: { "name": "tea-latte", "version": "1.3.5" "peerDependencies": { "tea": "2.x" } } This ensures your package `tea-latte` can be installed *along* with the second major version of the host package `tea` only. The host package is automatically installed if needed. `npm install tea-latte` could possibly yield the following dependency graph: ├── tea-latte@1.3.5 └── tea@2.2.0 Trying to install another plugin with a conflicting requirement will cause an error. For this reason, make sure your plugin requirement is as broad as possible, and not to lock it down to specific patch versions. Assuming the host complies with [semver](http://semver.org/), only changes in the host package's major version will break your plugin. Thus, if you've worked with every 1.x version of the host package, use `"^1.0"` or `"1.x"` to express this. If you depend on features introduced in 1.5.2, use `">= 1.5.2 < 2"`. ## bundledDependencies Array of package names that will be bundled when publishing the package. If this is spelled `"bundleDependencies"`, then that is also honorable. 
## optionalDependencies If a dependency can be used, but you would like npm to proceed if it cannot be found or fails to install, then you may put it in the `optionalDependencies` object. This is a map of package name to version or url, just like the `dependencies` object. The difference is that build failures do not cause installation to fail. It is still your program's responsibility to handle the lack of the dependency. For example, something like this: try { var foo = require('foo') var fooVersion = require('foo/package.json').version } catch (er) { foo = null } if ( notGoodFooVersion(fooVersion) ) { foo = null } // .. then later in your program .. if (foo) { foo.doFooThings() } Entries in `optionalDependencies` will override entries of the same name in `dependencies`, so it's usually best to only put in one place. ## engines You can specify the version of node that your stuff works on: { "engines" : { "node" : ">=0.10.3 <0.12" } } And, like with dependencies, if you don't specify the version (or if you specify "\*" as the version), then any version of node will do. If you specify an "engines" field, then npm will require that "node" be somewhere on that list. If "engines" is omitted, then npm will just assume that it works on node. You can also use the "engines" field to specify which versions of npm are capable of properly installing your program. For example: { "engines" : { "npm" : "~1.0.20" } } Note that, unless the user has set the `engine-strict` config flag, this field is advisory only. ## engineStrict If you are sure that your module will *definitely not* run properly on versions of Node/npm other than those specified in the `engines` object, then you can set `"engineStrict": true` in your package.json file. This will override the user's `engine-strict` config setting. Please do not do this unless you are really very very sure. If your engines object is something overly restrictive, you can quite easily and inadvertently lock yourself into obscurity and prevent your users from updating to new versions of Node. Consider this choice carefully. If people abuse it, it will be removed in a future version of npm. ## os You can specify which operating systems your module will run on: "os" : [ "darwin", "linux" ] You can also blacklist instead of whitelist operating systems, just prepend the blacklisted os with a '!': "os" : [ "!win32" ] The host operating system is determined by `process.platform` It is allowed to both blacklist, and whitelist, although there isn't any good reason to do this. ## cpu If your code only runs on certain cpu architectures, you can specify which ones. "cpu" : [ "x64", "ia32" ] Like the `os` option, you can also blacklist architectures: "cpu" : [ "!arm", "!mips" ] The host architecture is determined by `process.arch` ## preferGlobal If your package is primarily a command-line application that should be installed globally, then set this value to `true` to provide a warning if it is installed locally. It doesn't actually prevent users from installing it locally, but it does help prevent some confusion if it doesn't work as expected. ## private If you set `"private": true` in your package.json, then npm will refuse to publish it. This is a way to prevent accidental publication of private repositories. If you would like to ensure that a given package is only ever published to a specific registry (for example, an internal registry), then use the `publishConfig` dictionary described below to override the `registry` config param at publish-time. 
## publishConfig This is a set of config values that will be used at publish-time. It's especially handy if you want to set the tag or registry, so that you can ensure that a given package is not tagged with "latest" or published to the global public registry by default. Any config values can be overridden, but of course only "tag" and "registry" probably matter for the purposes of publishing. See `npm-config(7)` to see the list of config options that can be overridden. ## DEFAULT VALUES npm will default some values based on package contents. * `"scripts": {"start": "node server.js"}` If there is a `server.js` file in the root of your package, then npm will default the `start` command to `node server.js`. * `"scripts":{"preinstall": "node-gyp rebuild"}` If there is a `binding.gyp` file in the root of your package, npm will default the `preinstall` command to compile using node-gyp. * `"contributors": [...]` If there is an `AUTHORS` file in the root of your package, npm will treat each line as a `Name (url)` format, where email and url are optional. Lines which start with a `#` or are blank, will be ignored. ## SEE ALSO * semver(7) * npm-init(1) * npm-version(1) * npm-config(1) * npm-config(7) * npm-help(1) * npm-faq(7) * npm-install(1) * npm-publish(1) * npm-rm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-adduser.md000644 000766 000024 00000004202 12455173731 024755 0ustar00iojsstaff000000 000000 npm-adduser(1) -- Add a registry user account ============================================= ## SYNOPSIS npm adduser [--registry=url] [--scope=@orgname] [--always-auth] ## DESCRIPTION Create or verify a user named `` in the specified registry, and save the credentials to the `.npmrc` file. If no registry is specified, the default registry will be used (see `npm-config(7)`). The username, password, and email are read in from prompts. To reset your password, go to To change your email address, go to You may use this command multiple times with the same user account to authorize on a new machine. When authenticating on a new machine, the username, password and email address must all match with your existing record. `npm login` is an alias to `adduser` and behaves exactly the same way. ## CONFIGURATION ### registry Default: http://registry.npmjs.org/ The base URL of the npm package registry. If `scope` is also specified, this registry will only be used for packages with that scope. See `npm-scope(7)`. ### scope Default: none If specified, the user and login credentials given will be associated with the specified scope. See `npm-scope(7)`. You can use both at the same time, e.g. npm adduser --registry=http://myregistry.example.com --scope=@myco This will set a registry for the given scope and login or create a user for that registry at the same time. ### always-auth Default: false If specified, save configuration indicating that all requests to the given registry should include authorization information. Useful for private registries. Can be used with `--registry` and / or `--scope`, e.g. npm adduser --registry=http://private-registry.example.com --always-auth This will ensure that all requests to that registry (including for tarballs) include an authorization header. See `always-auth` in `npm-config(7)` for more details on always-auth. Registry-specific configuration of `always-auth` takes precedence over any global configuration. 
## SEE ALSO * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) * npm-owner(1) * npm-whoami(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-bin.md000644 000766 000024 00000000422 12455173731 024076 0ustar00iojsstaff000000 000000 npm-bin(1) -- Display npm bin folder ==================================== ## SYNOPSIS npm bin ## DESCRIPTION Print the folder where npm will install executables. ## SEE ALSO * npm-prefix(1) * npm-root(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-bugs.md000644 000766 000024 00000001620 12455173731 024267 0ustar00iojsstaff000000 000000 npm-bugs(1) -- Bugs for a package in a web browser maybe ======================================================== ## SYNOPSIS npm bugs npm bugs (with no args in a package dir) ## DESCRIPTION This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the `--browser` config param. If no package name is provided, it will search for a `package.json` in the current folder and use the `name` property. ## CONFIGURATION ### browser * Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"` * Type: String The browser that is called by the `npm bugs` command to open websites. ### registry * Default: https://registry.npmjs.org/ * Type: url The base URL of the npm package registry. ## SEE ALSO * npm-docs(1) * npm-view(1) * npm-publish(1) * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) * package.json(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-build.md000644 000766 000024 00000000620 12455173731 024425 0ustar00iojsstaff000000 000000 npm-build(1) -- Build a package =============================== ## SYNOPSIS npm build * ``: A folder containing a `package.json` file in its root. ## DESCRIPTION This is the plumbing command called by `npm link` and `npm install`. It should generally not be called directly. ## SEE ALSO * npm-install(1) * npm-link(1) * npm-scripts(7) * package.json(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-bundle.md000644 000766 000024 00000000523 12455173731 024601 0ustar00iojsstaff000000 000000 npm-bundle(1) -- REMOVED ======================== ## DESCRIPTION The `npm bundle` command has been removed in 1.0, for the simple reason that it is no longer necessary, as the default behavior is now to install packages into the local space. Just use `npm install` now to do what `npm bundle` used to do. ## SEE ALSO * npm-install(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-cache.md000644 000766 000024 00000003634 12455173731 024401 0ustar00iojsstaff000000 000000 npm-cache(1) -- Manipulates packages cache ========================================== ## SYNOPSIS npm cache add npm cache add npm cache add npm cache add @ npm cache ls [] npm cache clean [] ## DESCRIPTION Used to add, list, or clear the npm cache folder. * add: Add the specified package to the local cache. This command is primarily intended to be used internally by npm, but it can provide a way to add data to the local installation cache explicitly. * ls: Show the data in the cache. Argument is a path to show in the cache folder. Works a bit like the `find` program, but limited by the `depth` config. * clean: Delete data out of the cache folder. If an argument is provided, then it specifies a subpath to delete. If no argument is provided, then the entire cache is cleared. ## DETAILS npm stores cache data in the directory specified in `npm config get cache`. 
For each package that is added to the cache, three pieces of information are stored in `{cache}/{name}/{version}`: * .../package/package.json: The package.json file, as npm sees it. * .../package.tgz: The tarball for that version. Additionally, whenever a registry request is made, a `.cache.json` file is placed at the corresponding URI, to store the ETag and the requested data. This is stored in `{cache}/{hostname}/{path}/.cache.json`. Commands that make non-essential registry requests (such as `search` and `view`, or the completion scripts) generally specify a minimum timeout. If the `.cache.json` file is younger than the specified timeout, then they do not make an HTTP request to the registry. ## CONFIGURATION ### cache Default: `~/.npm` on Posix, or `%AppData%/npm-cache` on Windows. The root cache folder. ## SEE ALSO * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) * npm-install(1) * npm-publish(1) * npm-pack(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-completion.md000644 000766 000024 00000001345 12455173731 025504 0ustar00iojsstaff000000 000000 npm-completion(1) -- Tab Completion for npm =========================================== ## SYNOPSIS . <(npm completion) ## DESCRIPTION Enables tab-completion in all npm commands. The synopsis above loads the completions into your current shell. Adding it to your ~/.bashrc or ~/.zshrc will make the completions available everywhere. You may of course also pipe the output of npm completion to a file such as `/usr/local/etc/bash_completion.d/npm` if you have a system that will read that file for you. When `COMP_CWORD`, `COMP_LINE`, and `COMP_POINT` are defined in the environment, `npm completion` acts in "plumbing mode", and outputs completions based on the arguments. ## SEE ALSO * npm-developers(7) * npm-faq(7) * npm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-config.md000644 000766 000024 00000002476 12455173731 024606 0ustar00iojsstaff000000 000000 npm-config(1) -- Manage the npm configuration files =================================================== ## SYNOPSIS npm config set [--global] npm config get npm config delete npm config list npm config edit npm c [set|get|delete|list] npm get npm set [--global] ## DESCRIPTION npm gets its config settings from the command line, environment variables, `npmrc` files, and in some cases, the `package.json` file. See npmrc(5) for more information about the npmrc files. See `npm-config(7)` for a more thorough discussion of the mechanisms involved. The `npm config` command can be used to update and edit the contents of the user and global npmrc files. ## Sub-commands Config supports the following sub-commands: ### set npm config set key value Sets the config key to the value. If value is omitted, then it sets it to "true". ### get npm config get key Echo the config value to stdout. ### list npm config list Show all the config settings. ### delete npm config delete key Deletes the key from all configuration files. ### edit npm config edit Opens the config file in an editor. Use the `--global` flag to edit the global config. ## SEE ALSO * npm-folders(5) * npm-config(7) * package.json(5) * npmrc(5) * npm(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-dedupe.md000644 000766 000024 00000003015 12455173731 024575 0ustar00iojsstaff000000 000000 npm-dedupe(1) -- Reduce duplication =================================== ## SYNOPSIS npm dedupe [package names...] npm ddp [package names...] 
## DESCRIPTION Searches the local package tree and attempts to simplify the overall structure by moving dependencies further up the tree, where they can be more effectively shared by multiple dependent packages. For example, consider this dependency graph: a +-- b <-- depends on c@1.0.x | `-- c@1.0.3 `-- d <-- depends on c@~1.0.9 `-- c@1.0.10 In this case, `npm-dedupe(1)` will transform the tree to: a +-- b +-- d `-- c@1.0.10 Because of the hierarchical nature of node's module lookup, b and d will both get their dependency met by the single c package at the root level of the tree. If a suitable version exists at the target location in the tree already, then it will be left untouched, but the other duplicates will be deleted. If no suitable version can be found, then a warning is printed, and nothing is done. If any arguments are supplied, then they are filters, and only the named packages will be touched. Note that this operation transforms the dependency tree, and may result in packages getting updated versions, perhaps from the npm registry. This feature is experimental, and may change in future versions. The `--tag` argument will apply to all of the affected dependencies. If a tag with the given name exists, the tagged version is preferred over newer versions. ## SEE ALSO * npm-ls(1) * npm-update(1) * npm-install(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-deprecate.md000644 000766 000024 00000001320 12455173731 025260 0ustar00iojsstaff000000 000000 npm-deprecate(1) -- Deprecate a version of a package ==================================================== ## SYNOPSIS npm deprecate [@] ## DESCRIPTION This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it. It works on version ranges as well as specific versions, so you can do something like this: npm deprecate my-thing@"< 0.2.3" "critical bug fixed in v0.2.3" Note that you must be the package owner to deprecate something. See the `owner` and `adduser` help topics. To un-deprecate a package, specify an empty string (`""`) for the `message` argument. ## SEE ALSO * npm-publish(1) * npm-registry(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-docs.md000644 000766 000024 00000002031 12455173731 024254 0ustar00iojsstaff000000 000000 npm-docs(1) -- Docs for a package in a web browser maybe ======================================================== ## SYNOPSIS npm docs [ [ ...]] npm docs (with no args in a package dir) npm home [ [ ...]] npm home (with no args in a package dir) ## DESCRIPTION This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the `--browser` config param. You can pass multiple package names at once. If no package name is provided, it will search for a `package.json` in the current folder and use the `name` property. ## CONFIGURATION ### browser * Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"` * Type: String The browser that is called by the `npm docs` command to open websites. ### registry * Default: https://registry.npmjs.org/ * Type: url The base URL of the npm package registry. 
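For example, with the defaults above, either of the following will open the
documentation pages in your browser (package names are illustrative):

    npm docs connect
    npm docs npm express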
## SEE ALSO * npm-view(1) * npm-publish(1) * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) * package.json(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-edit.md000644 000766 000024 00000001523 12455173731 024256 0ustar00iojsstaff000000 000000 npm-edit(1) -- Edit an installed package ======================================== ## SYNOPSIS npm edit [@] ## DESCRIPTION Opens the package folder in the default editor (or whatever you've configured as the npm `editor` config -- see `npm-config(7)`.) After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages. For instance, you can do `npm install connect` to install connect into your package, and then `npm edit connect` to make a few changes to your locally installed copy. ## CONFIGURATION ### editor * Default: `EDITOR` environment variable if set, or `"vi"` on Posix, or `"notepad"` on Windows. * Type: path The command to run for `npm edit` or `npm config edit`. ## SEE ALSO * npm-folders(5) * npm-explore(1) * npm-install(1) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-explore.md000644 000766 000024 00000001542 12455173731 025010 0ustar00iojsstaff000000 000000 npm-explore(1) -- Browse an installed package ============================================= ## SYNOPSIS npm explore [ -- ] ## DESCRIPTION Spawn a subshell in the directory of the installed package specified. If a command is specified, then it is run in the subshell, which then immediately terminates. This is particularly handy in the case of git submodules in the `node_modules` folder: npm explore some-dependency -- git pull origin master Note that the package is *not* automatically rebuilt afterwards, so be sure to use `npm rebuild ` if you make any changes. ## CONFIGURATION ### shell * Default: SHELL environment variable, or "bash" on Posix, or "cmd" on Windows * Type: path The shell to run for the `npm explore` command. ## SEE ALSO * npm-folders(5) * npm-edit(1) * npm-rebuild(1) * npm-build(1) * npm-install(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-help-search.md000644 000766 000024 00000001476 12455173731 025533 0ustar00iojsstaff000000 000000 npm-help-search(1) -- Search npm help documentation =================================================== ## SYNOPSIS npm help-search some search terms ## DESCRIPTION This command will search the npm markdown documentation files for the terms provided, and then list the results, sorted by relevance. If only one result is found, then it will show that help topic. If the argument to `npm help` is not a known help topic, then it will call `help-search`. It is rarely if ever necessary to call this command directly. ## CONFIGURATION ### long * Type: Boolean * Default false If true, the "long" flag will cause help-search to output context around where the terms were found in the documentation. If false, then help-search will just list out the help topics found. ## SEE ALSO * npm(1) * npm-faq(7) * npm-help(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-help.md000644 000766 000024 00000001521 12455173731 024257 0ustar00iojsstaff000000 000000 npm-help(1) -- Get help on npm ============================== ## SYNOPSIS npm help npm help some search terms ## DESCRIPTION If supplied a topic, then show the appropriate documentation page. If the topic does not exist, or if multiple terms are provided, then run the `help-search` command to find a match. 
Note that, if `help-search` finds a single subject, then it will run `help` on that topic, so unique matches are equivalent to specifying a topic name. ## CONFIGURATION ### viewer * Default: "man" on Posix, "browser" on Windows * Type: path The program to use to view help content. Set to `"browser"` to view html help content in the default web browser. ## SEE ALSO * npm(1) * README * npm-faq(7) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) * package.json(5) * npm-help-search(1) * npm-index(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-init.md000644 000766 000024 00000001515 12455173731 024275 0ustar00iojsstaff000000 000000 npm-init(1) -- Interactively create a package.json file ======================================================= ## SYNOPSIS npm init [-f|--force|-y|--yes] ## DESCRIPTION This will ask you a bunch of questions, and then write a package.json for you. It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected. If you already have a package.json file, it'll read that first, and default to the options in there. It is strictly additive, so it does not delete options from your package.json without a really good reason to do so. If you invoke it with `-f`, `--force`, `-y`, or `--yes`, it will use only defaults and not prompt you for any options. ## SEE ALSO * * package.json(5) * npm-version(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-install.md000644 000766 000024 00000022236 12455173731 025003 0ustar00iojsstaff000000 000000 npm-install(1) -- Install a package =================================== ## SYNOPSIS npm install (with no args in a package dir) npm install npm install npm install npm install [@/] [--save|--save-dev|--save-optional] [--save-exact] npm install [@/]@ npm install [@/]@ npm install [@/]@ npm i (with any of the previous argument usage) ## DESCRIPTION This command installs a package, and any packages that it depends on. If the package has a shrinkwrap file, the installation of dependencies will be driven by that. See npm-shrinkwrap(1). A `package` is: * a) a folder containing a program described by a package.json file * b) a gzipped tarball containing (a) * c) a url that resolves to (b) * d) a `@` that is published on the registry (see `npm-registry(7)`) with (c) * e) a `@` that points to (d) * f) a `` that has a "latest" tag satisfying (e) * g) a `` that resolves to (b) Even if you never publish your package, you can still get a lot of benefits of using npm if you just want to write a node program (a), and perhaps if you also want to be able to easily install it elsewhere after packing it up into a tarball (b). * `npm install` (in package directory, no arguments): Install the dependencies in the local node_modules folder. In global mode (ie, with `-g` or `--global` appended to the command), it installs the current package context (ie, the current working directory) as a global package. By default, `npm install` will install all modules listed as dependencies. With the `--production` flag, npm will not install modules listed in `devDependencies`. * `npm install `: Install a package that is sitting in a folder on the filesystem. * `npm install `: Install a package that is sitting on the filesystem. Note: if you just want to link a dev directory into your npm root, you can do this more easily by using `npm link`. Example: npm install ./package.tgz * `npm install `: Fetch the tarball url, and then install it. 
In order to distinguish between this and other options, the argument must start with "http://" or "https://" Example: npm install https://github.com/indexzero/forever/tarball/v0.5.6 * `npm install [@/] [--save|--save-dev|--save-optional]`: Do a `@` install, where `` is the "tag" config. (See `npm-config(7)`.) In most cases, this will install the latest version of the module published on npm. Example: npm install sax `npm install` takes 3 exclusive, optional flags which save or update the package version in your main package.json: * `--save`: Package will appear in your `dependencies`. * `--save-dev`: Package will appear in your `devDependencies`. * `--save-optional`: Package will appear in your `optionalDependencies`. When using any of the above options to save dependencies to your package.json, there is an additional, optional flag: * `--save-exact`: Saved dependencies will be configured with an exact version rather than using npm's default semver range operator. `` is optional. The package will be downloaded from the registry associated with the specified scope. If no registry is associated with the given scope the default registry is assumed. See `npm-scope(7)`. Note: if you do not include the @-symbol on your scope name, npm will interpret this as a GitHub repository instead, see below. Scopes names must also be followed by a slash. Examples: npm install sax --save npm install githubname/reponame npm install @myorg/privatepackage npm install node-tap --save-dev npm install dtrace-provider --save-optional npm install readable-stream --save --save-exact **Note**: If there is a file or folder named `` in the current working directory, then it will try to install that, and only try to fetch the package by name if it is not valid. * `npm install [@/]@`: Install the version of the package that is referenced by the specified tag. If the tag does not exist in the registry data for that package, then this will fail. Example: npm install sax@latest npm install @myorg/mypackage@latest * `npm install [@/]@`: Install the specified version of the package. This will fail if the version has not been published to the registry. Example: npm install sax@0.1.1 npm install @myorg/privatepackage@1.5.0 * `npm install [@/]@`: Install a version of the package matching the specified version range. This will follow the same rules for resolving dependencies described in `package.json(5)`. Note that most version ranges must be put in quotes so that your shell will treat it as a single argument. Example: npm install sax@">=0.1.0 <0.2.0" npm install @myorg/privatepackage@">=0.1.0 <0.2.0" * `npm install /`: Install the package at `https://github.com/githubname/githubrepo" by attempting to clone it using `git`. Example: npm install mygithubuser/myproject To reference a package in a git repo that is not on GitHub, see git remote urls below. * `npm install `: Install a package by cloning a git remote url. The format of the git url is: ://[@][#] `` is one of `git`, `git+ssh`, `git+http`, or `git+https`. If no `` is specified, then `master` is used. Examples: git+ssh://git@github.com:npm/npm.git#v1.0.27 git+https://isaacs@github.com/npm/npm.git git://github.com/npm/npm.git#v1.0.27 You may combine multiple arguments, and even multiple types of arguments. For example: npm install sax@">=0.1.0 <0.2.0" bench supervisor The `--tag` argument will apply to all of the specified install targets. If a tag with the given name exists, the tagged version is preferred over newer versions. 
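For instance (package and tag names here are only illustrative):

    npm install sax bench supervisor --tag beta

Each named package is installed at whichever version carries the `beta` tag;
packages without such a tag are resolved as usual.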
The `--force` argument will force npm to fetch remote resources even if a local copy exists on disk. npm install sax --force The `--global` argument will cause npm to install the package globally rather than locally. See `npm-folders(5)`. The `--link` argument will cause npm to link global installs into the local space in some cases. The `--no-bin-links` argument will prevent npm from creating symlinks for any binaries the package might contain. The `--no-optional` argument will prevent optional dependencies from being installed. The `--no-shrinkwrap` argument, which will ignore an available shrinkwrap file and use the package.json instead. The `--nodedir=/path/to/node/source` argument will allow npm to find the node source code so that npm can compile native modules. See `npm-config(7)`. Many of the configuration params have some effect on installation, since that's most of what npm does. ## ALGORITHM To install a package, npm uses the following algorithm: install(where, what, family, ancestors) fetch what, unpack to /node_modules/ for each dep in what.dependencies resolve dep to precise version for each dep@version in what.dependencies not in /node_modules//node_modules/* and not in add precise version deps to install(/node_modules/, dep, family) For this `package{dep}` structure: `A{B,C}, B{C}, C{D}`, this algorithm produces: A +-- B `-- C `-- D That is, the dependency from B to C is satisfied by the fact that A already caused C to be installed at a higher level. See npm-folders(5) for a more detailed description of the specific folder structures that npm creates. ### Limitations of npm's Install Algorithm There are some very rare and pathological edge-cases where a cycle can cause npm to try to install a never-ending tree of packages. Here is the simplest case: A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ... where `A` is some version of a package, and `A'` is a different version of the same package. Because `B` depends on a different version of `A` than the one that is already in the tree, it must install a separate copy. The same is true of `A'`, which must install `B'`. Because `B'` depends on the original version of `A`, which has been overridden, the cycle falls into infinite regress. To avoid this situation, npm flat-out refuses to install any `name@version` that is already present anywhere in the tree of package folder ancestors. A more correct, but more complex, solution would be to symlink the existing version into the new location. If this ever affects a real use-case, it will be investigated. ## SEE ALSO * npm-folders(5) * npm-update(1) * npm-link(1) * npm-rebuild(1) * npm-scripts(7) * npm-build(1) * npm-config(1) * npm-config(7) * npmrc(5) * npm-registry(7) * npm-tag(1) * npm-rm(1) * npm-shrinkwrap(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-link.md000644 000766 000024 00000004242 12455173731 024267 0ustar00iojsstaff000000 000000 npm-link(1) -- Symlink a package folder ======================================= ## SYNOPSIS npm link (in package folder) npm link [@/] npm ln (with any of the previous argument usage) ## DESCRIPTION Package linking is a two-step process. First, `npm link` in a package folder will create a globally-installed symbolic link from `prefix/package-name` to the current folder (see `npm-config(7)` for the value of `prefix`). Next, in some other location, `npm link package-name` will create a symlink from the local `node_modules` folder to the global symlink. Note that `package-name` is taken from `package.json`, not from directory name. 
The package name can be optionally prefixed with a scope. See `npm-scope(7)`. The scope must be preceded by an @-symbol and followed by a slash. When creating tarballs for `npm publish`, the linked packages are "snapshotted" to their current state by resolving the symbolic links. This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild. For example: cd ~/projects/node-redis # go into the package directory npm link # creates global link cd ~/projects/node-bloggy # go into some other package directory. npm link redis # link-install the package Now, any changes to ~/projects/node-redis will be reflected in ~/projects/node-bloggy/node_modules/redis/ You may also shortcut the two steps in one. For example, to do the above use-case in a shorter way: cd ~/projects/node-bloggy # go into the dir of your main project npm link ../node-redis # link the dir of your dependency The second line is the equivalent of doing: (cd ../node-redis; npm link) npm link redis That is, it first creates a global link, and then links the global installation target into your project's `node_modules` folder. If your linked package is scoped (see `npm-scope(7)`) your link command must include that scope, e.g. npm link @myorg/privatepackage ## SEE ALSO * npm-developers(7) * npm-faq(7) * package.json(5) * npm-install(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-ls.md000644 000766 000024 00000003215 12455173731 023747 0ustar00iojsstaff000000 000000 npm-ls(1) -- List installed packages ====================================== ## SYNOPSIS npm list [[@/] ...] npm ls [[@/] ...] npm la [[@/] ...] npm ll [[@/] ...] ## DESCRIPTION This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure. Positional arguments are `name@version-range` identifiers, which will limit the results to only the paths to the packages named. Note that nested packages will *also* show the paths to the specified packages. For example, running `npm ls promzard` in npm's source tree will show: npm@@VERSION@ /path/to/npm └─┬ init-package-json@0.0.4 └── promzard@0.1.5 It will print out extraneous, missing, and invalid packages. If a project specifies git urls for dependencies these are shown in parentheses after the name@version to make it easier for users to recognize potential forks of a project. When run as `ll` or `la`, it shows extended information by default. ## CONFIGURATION ### json * Default: false * Type: Boolean Show information in JSON format. ### long * Default: false * Type: Boolean Show extended information. ### parseable * Default: false * Type: Boolean Show parseable output instead of tree view. ### global * Default: false * Type: Boolean List packages in the global install prefix instead of in the current project. ### depth * Type: Int Max display depth of the dependency tree. ## SEE ALSO * npm-config(1) * npm-config(7) * npmrc(5) * npm-folders(5) * npm-install(1) * npm-link(1) * npm-prune(1) * npm-outdated(1) * npm-update(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-outdated.md000644 000766 000024 00000001701 12455173731 025140 0ustar00iojsstaff000000 000000 npm-outdated(1) -- Check for outdated packages ============================================== ## SYNOPSIS npm outdated [ [ ...]] ## DESCRIPTION This command will check the registry to see if any (or, specific) installed packages are currently outdated. 
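A sketch of a typical report follows; the exact column layout is an
assumption here and the versions are made up:

    npm outdated
    # Package  Current  Wanted  Latest  Location
    # glob     3.2.11   3.2.11  4.3.5   glob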
The resulting field 'wanted' shows the latest version according to the version specified in the package.json, the field 'latest' the very latest version of the package. ## CONFIGURATION ### json * Default: false * Type: Boolean Show information in JSON format. ### long * Default: false * Type: Boolean Show extended information. ### parseable * Default: false * Type: Boolean Show parseable output instead of tree view. ### global * Default: false * Type: Boolean Check packages in the global install prefix instead of in the current project. ### depth * Type: Int Max depth for checking dependency tree. ## SEE ALSO * npm-update(1) * npm-registry(7) * npm-folders(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-owner.md000644 000766 000024 00000001625 12455173731 024466 0ustar00iojsstaff000000 000000 npm-owner(1) -- Manage package owners ===================================== ## SYNOPSIS npm owner ls npm owner add npm owner rm ## DESCRIPTION Manage ownership of published packages. * ls: List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help. * add: Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners. * rm: Remove a user from the package owner list. This immediately revokes their privileges. Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time. ## SEE ALSO * npm-publish(1) * npm-registry(7) * npm-adduser(1) * npm-disputes(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-pack.md000644 000766 000024 00000001305 12455173731 024245 0ustar00iojsstaff000000 000000 npm-pack(1) -- Create a tarball from a package ============================================== ## SYNOPSIS npm pack [ [ ...]] ## DESCRIPTION For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, and then copy the tarball to the current working directory as `-.tgz`, and then write the filenames out to stdout. If the same package is specified multiple times, then the file will be overwritten the second time. If no arguments are supplied, then npm packs the current package folder. ## SEE ALSO * npm-cache(1) * npm-publish(1) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-prefix.md000644 000766 000024 00000000714 12455173731 024627 0ustar00iojsstaff000000 000000 npm-prefix(1) -- Display prefix =============================== ## SYNOPSIS npm prefix [-g] ## DESCRIPTION Print the local prefix to standard out. This is the closest parent directory to contain a package.json file unless `-g` is also specified. If `-g` is specified, this will be the value of the global prefix. See `npm-config(7)` for more detail. ## SEE ALSO * npm-root(1) * npm-bin(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-prune.md000644 000766 000024 00000001146 12455173731 024463 0ustar00iojsstaff000000 000000 npm-prune(1) -- Remove extraneous packages ========================================== ## SYNOPSIS npm prune [ [ [ [--tag ] npm publish [--tag ] ## DESCRIPTION Publishes a package to the registry so that it can be installed by name. 
See `npm-developers(7)` for details on what's included in the published package, as well as details on how the package is built. By default npm will publish to the public registry. This can be overridden by specifying a different default registry or using a `npm-scope(7)` in the name (see `package.json(5)`). * ``: A folder containing a package.json file * ``: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside. * `[--tag ]` Registers the published package with the given tag, such that `npm install @` will install this version. By default, `npm publish` updates and `npm install` installs the `latest` tag. Fails if the package name and version combination already exists in the specified registry. Once a package is published with a given name and version, that specific name and version combination can never be used again, even if it is removed with npm-unpublish(1). ## SEE ALSO * npm-registry(7) * npm-adduser(1) * npm-owner(1) * npm-deprecate(1) * npm-tag(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-rebuild.md000644 000766 000024 00000000670 12455173731 024761 0ustar00iojsstaff000000 000000 npm-rebuild(1) -- Rebuild a package =================================== ## SYNOPSIS npm rebuild [ [ ...]] npm rb [ [ ...]] * ``: The package to rebuild ## DESCRIPTION This command runs the `npm build` command on the matched folders. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary. ## SEE ALSO * npm-build(1) * npm-install(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-repo.md000644 000766 000024 00000001305 12455173731 024274 0ustar00iojsstaff000000 000000 npm-repo(1) -- Open package repository page in the browser ======================================================== ## SYNOPSIS npm repo npm repo (with no args in a package dir) ## DESCRIPTION This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the `--browser` config param. If no package name is provided, it will search for a `package.json` in the current folder and use the `name` property. ## CONFIGURATION ### browser * Default: OS X: `"open"`, Windows: `"start"`, Others: `"xdg-open"` * Type: String The browser that is called by the `npm repo` command to open websites. ## SEE ALSO * npm-docs(1) * npm-config(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-restart.md000644 000766 000024 00000001365 12455173731 025021 0ustar00iojsstaff000000 000000 npm-restart(1) -- Restart a package =================================== ## SYNOPSIS npm restart [-- ] ## DESCRIPTION This restarts a package. This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below: 1. prerestart 2. prestop 3. stop 4. poststop 5. restart 6. prestart 7. start 8. poststart 9. postrestart ## NOTE Note that the "restart" script is run **in addition to** the "stop" and "start" scripts, not instead of them. This is the behavior as of `npm` major version 2. 
A change in this behavior will be accompanied by an increase in major version number ## SEE ALSO * npm-run-script(1) * npm-scripts(7) * npm-test(1) * npm-start(1) * npm-stop(1) * npm-restart(3)iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-rm.md000644 000766 000024 00000000552 12455173731 023750 0ustar00iojsstaff000000 000000 npm-rm(1) -- Remove a package ============================= ## SYNOPSIS npm rm npm r npm uninstall npm un ## DESCRIPTION This uninstalls a package, completely removing everything npm installed on its behalf. ## SEE ALSO * npm-prune(1) * npm-install(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-root.md000644 000766 000024 00000000416 12455173731 024314 0ustar00iojsstaff000000 000000 npm-root(1) -- Display npm root =============================== ## SYNOPSIS npm root ## DESCRIPTION Print the effective `node_modules` folder to standard out. ## SEE ALSO * npm-prefix(1) * npm-bin(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-run-script.md000644 000766 000024 00000002131 12455173731 025433 0ustar00iojsstaff000000 000000 npm-run-script(1) -- Run arbitrary package scripts ================================================== ## SYNOPSIS npm run-script [command] [-- ] npm run [command] [-- ] ## DESCRIPTION This runs an arbitrary command from a package's `"scripts"` object. If no package name is provided, it will search for a `package.json` in the current folder and use its `"scripts"` object. If no `"command"` is provided, it will list the available top level scripts. It is used by the test, start, restart, and stop commands, but can be called directly, as well. As of [`npm@2.0.0`](http://blog.npmjs.org/post/98131109725/npm-2-0-0), you can use custom arguments when executing scripts. The special option `--` is used by [getopt](http://goo.gl/KxMmtG) to delimit the end of the options. npm will pass all the arguments after the `--` directly to your script: npm run test -- --grep="pattern" The arguments will only be passed to the script specified after ```npm run``` and not to any pre or post script. ## SEE ALSO * npm-scripts(7) * npm-test(1) * npm-start(1) * npm-restart(1) * npm-stop(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-search.md000644 000766 000024 00000001526 12455173731 024601 0ustar00iojsstaff000000 000000 npm-search(1) -- Search for packages ==================================== ## SYNOPSIS npm search [--long] [search terms ...] npm s [search terms ...] npm se [search terms ...] ## DESCRIPTION Search the registry for packages matching the search terms. If a term starts with `/`, then it's interpreted as a regular expression. A trailing `/` will be ignored in this case. (Note that many regular expression characters must be escaped or quoted in most shells.) ## CONFIGURATION ### long * Default: false * Type: Boolean Display full package descriptions and other long text across multiple lines. When disabled (default) search results are truncated to fit neatly on a single line. Modules with extremely long names will fall on multiple lines. 
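For example, the first invocation below opts into the multi-line output
described above, and the second uses the regular-expression form (search
terms are illustrative):

    npm search --long connect middleware
    npm search /^express$/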
## SEE ALSO * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) * npm-view(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-shrinkwrap.md000644 000766 000024 00000013455 12455173731 025530 0ustar00iojsstaff000000 000000 npm-shrinkwrap(1) -- Lock down dependency versions ===================================================== ## SYNOPSIS npm shrinkwrap ## DESCRIPTION This command locks down the versions of a package's dependencies so that you can control exactly which versions of each dependency will be used when your package is installed. The "package.json" file is still required if you want to use "npm install". By default, "npm install" recursively installs the target's dependencies (as specified in package.json), choosing the latest available version that satisfies the dependency's semver pattern. In some situations, particularly when shipping software where each change is tightly managed, it's desirable to fully specify each version of each dependency recursively so that subsequent builds and deploys do not inadvertently pick up newer versions of a dependency that satisfy the semver pattern. Specifying specific semver patterns in each dependency's package.json would facilitate this, but that's not always possible or desirable, as when another author owns the npm package. It's also possible to check dependencies directly into source control, but that may be undesirable for other reasons. As an example, consider package A: { "name": "A", "version": "0.1.0", "dependencies": { "B": "<0.1.0" } } package B: { "name": "B", "version": "0.0.1", "dependencies": { "C": "<0.1.0" } } and package C: { "name": "C, "version": "0.0.1" } If these are the only versions of A, B, and C available in the registry, then a normal "npm install A" will install: A@0.1.0 `-- B@0.0.1 `-- C@0.0.1 However, if B@0.0.2 is published, then a fresh "npm install A" will install: A@0.1.0 `-- B@0.0.2 `-- C@0.0.1 assuming the new version did not modify B's dependencies. Of course, the new version of B could include a new version of C and any number of new dependencies. If such changes are undesirable, the author of A could specify a dependency on B@0.0.1. However, if A's author and B's author are not the same person, there's no way for A's author to say that he or she does not want to pull in newly published versions of C when B hasn't changed at all. In this case, A's author can run npm shrinkwrap This generates npm-shrinkwrap.json, which will look something like this: { "name": "A", "version": "0.1.0", "dependencies": { "B": { "version": "0.0.1", "dependencies": { "C": { "version": "0.1.0" } } } } } The shrinkwrap command has locked down the dependencies based on what's currently installed in node_modules. When "npm install" installs a package with a npm-shrinkwrap.json file in the package root, the shrinkwrap file (rather than package.json files) completely drives the installation of that package and all of its dependencies (recursively). So now the author publishes A@0.1.0, and subsequent installs of this package will use B@0.0.1 and C@0.1.0, regardless the dependencies and versions listed in A's, B's, and C's package.json files. ### Using shrinkwrapped packages Using a shrinkwrapped package is no different than using any other package: you can "npm install" it by hand, or add a dependency to your package.json file and "npm install" it. ### Building shrinkwrapped packages To shrinkwrap an existing package: 1. Run "npm install" in the package root to install the current versions of all dependencies. 2. 
Validate that the package works as expected with these versions. 3. Run "npm shrinkwrap", add npm-shrinkwrap.json to git, and publish your package. To add or update a dependency in a shrinkwrapped package: 1. Run "npm install" in the package root to install the current versions of all dependencies. 2. Add or update dependencies. "npm install" each new or updated package individually and then update package.json. Note that they must be explicitly named in order to be installed: running `npm install` with no arguments will merely reproduce the existing shrinkwrap. 3. Validate that the package works as expected with the new dependencies. 4. Run "npm shrinkwrap", commit the new npm-shrinkwrap.json, and publish your package. You can use npm-outdated(1) to view dependencies with newer versions available. ### Other Notes A shrinkwrap file must be consistent with the package's package.json file. "npm shrinkwrap" will fail if required dependencies are not already installed, since that would result in a shrinkwrap that wouldn't actually work. Similarly, the command will fail if there are extraneous packages (not referenced by package.json), since that would indicate that package.json is not correct. Since "npm shrinkwrap" is intended to lock down your dependencies for production use, `devDependencies` will not be included unless you explicitly set the `--dev` flag when you run `npm shrinkwrap`. If installed `devDependencies` are excluded, then npm will print a warning. If you want them to be installed with your module by default, please consider adding them to `dependencies` instead. If shrinkwrapped package A depends on shrinkwrapped package B, B's shrinkwrap will not be used as part of the installation of A. However, because A's shrinkwrap is constructed from a valid installation of B and recursively specifies all dependencies, the contents of B's shrinkwrap will implicitly be included in A's shrinkwrap. ### Caveats If you wish to lock down the specific bytes included in a package, for example to have 100% confidence in being able to reproduce a deployment or build, then you ought to check your dependencies into source control, or pursue some other mechanism that can verify contents rather than versions. ## SEE ALSO * npm-install(1) * package.json(5) * npm-ls(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-star.md000644 000766 000024 00000000745 12455173731 024307 0ustar00iojsstaff000000 000000 npm-star(1) -- Mark your favorite packages ========================================== ## SYNOPSIS npm star [, ...] npm unstar [, ...] ## DESCRIPTION "Starring" a package means that you have some interest in it. It's a vaguely positive way to show that you care. "Unstarring" is the same thing, but in reverse. It's a boolean thing. Starring repeatedly has no additional effect. ## SEE ALSO * npm-view(1) * npm-whoami(1) * npm-adduser(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-stars.md000644 000766 000024 00000000732 12455173731 024466 0ustar00iojsstaff000000 000000 npm-stars(1) -- View packages marked as favorites ================================================= ## SYNOPSIS npm stars npm stars [username] ## DESCRIPTION If you have starred a lot of neat things and want to find them again quickly this command lets you do just that. You may also want to see your friend's favorite packages, in this case you will most certainly enjoy this command. 
## SEE ALSO * npm-star(1) * npm-view(1) * npm-whoami(1) * npm-adduser(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-start.md000644 000766 000024 00000000424 12455173731 024465 0ustar00iojsstaff000000 000000 npm-start(1) -- Start a package =============================== ## SYNOPSIS npm start [-- ] ## DESCRIPTION This runs a package's "start" script, if one was provided. ## SEE ALSO * npm-run-script(1) * npm-scripts(7) * npm-test(1) * npm-restart(1) * npm-stop(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-stop.md000644 000766 000024 00000000417 12455173731 024317 0ustar00iojsstaff000000 000000 npm-stop(1) -- Stop a package ============================= ## SYNOPSIS npm stop [-- ] ## DESCRIPTION This runs a package's "stop" script, if one was provided. ## SEE ALSO * npm-run-script(1) * npm-scripts(7) * npm-test(1) * npm-start(1) * npm-restart(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-tag.md000644 000766 000024 00000001343 12455173731 024104 0ustar00iojsstaff000000 000000 npm-tag(1) -- Tag a published version ===================================== ## SYNOPSIS npm tag @ [] ## DESCRIPTION Tags the specified version of the package with the specified tag, or the `--tag` config if not specified. A tag can be used when installing packages as a reference to a version instead of using a specific version number: npm install @ When installing dependencies, a preferred tagged version may be specified: npm install --tag This also applies to `npm dedupe`. Publishing a package always sets the "latest" tag to the published version. ## SEE ALSO * npm-publish(1) * npm-install(1) * npm-dedupe(1) * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-test.md000644 000766 000024 00000000570 12455173731 024311 0ustar00iojsstaff000000 000000 npm-test(1) -- Test a package ============================= ## SYNOPSIS npm test [-- ] npm tst [-- ] ## DESCRIPTION This runs a package's "test" script, if one was provided. To run tests as a condition of installation, set the `npat` config to true. ## SEE ALSO * npm-run-script(1) * npm-scripts(7) * npm-start(1) * npm-restart(1) * npm-stop(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-uninstall.md000644 000766 000024 00000002202 12455173731 025335 0ustar00iojsstaff000000 000000 npm-rm(1) -- Remove a package ============================= ## SYNOPSIS npm uninstall [@/] [--save|--save-dev|--save-optional] npm rm (with any of the previous argument usage) ## DESCRIPTION This uninstalls a package, completely removing everything npm installed on its behalf. Example: npm uninstall sax In global mode (ie, with `-g` or `--global` appended to the command), it uninstalls the current package context as a global package. `npm uninstall` takes 3 exclusive, optional flags which save or update the package version in your main package.json: * `--save`: Package will be removed from your `dependencies`. * `--save-dev`: Package will be removed from your `devDependencies`. * `--save-optional`: Package will be removed from your `optionalDependencies`. Scope is optional and follows the usual rules for `npm-scope(7)`. 
Examples: npm uninstall sax --save npm uninstall @myorg/privatepackage --save npm uninstall node-tap --save-dev npm uninstall dtrace-provider --save-optional ## SEE ALSO * npm-prune(1) * npm-install(1) * npm-folders(5) * npm-config(1) * npm-config(7) * npmrc(5) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-unpublish.md000644 000766 000024 00000001777 12455173731 025355 0ustar00iojsstaff000000 000000 npm-unpublish(1) -- Remove a package from the registry ====================================================== ## SYNOPSIS npm unpublish [@/][@] ## WARNING **It is generally considered bad behavior to remove versions of a library that others are depending on!** Consider using the `deprecate` command instead, if your intent is to encourage users to upgrade. There is plenty of room on the registry. ## DESCRIPTION This removes a package version from the registry, deleting its entry and removing the tarball. If no version is specified, or if all versions are removed then the root package entry is removed from the registry entirely. Even if a package version is unpublished, that specific name and version combination can never be reused. In order to publish the package again, a new version number must be used. The scope is optional and follows the usual rules for `npm-scope(7)`. ## SEE ALSO * npm-deprecate(1) * npm-publish(1) * npm-registry(7) * npm-adduser(1) * npm-owner(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-update.md000644 000766 000024 00000001077 12455173731 024617 0ustar00iojsstaff000000 000000 npm-update(1) -- Update a package ================================= ## SYNOPSIS npm update [-g] [ [ ...]] ## DESCRIPTION This command will update all the packages listed to the latest version (specified by the `tag` config). It will also install missing packages. If the `-g` flag is specified, this command will update globally installed packages. If no package name is specified, all packages in the specified location (global or local) will be updated. ## SEE ALSO * npm-install(1) * npm-outdated(1) * npm-registry(7) * npm-folders(5) * npm-ls(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-version.md000644 000766 000024 00000003013 12455173731 025012 0ustar00iojsstaff000000 000000 npm-version(1) -- Bump a package version ======================================== ## SYNOPSIS npm version [ | major | minor | patch | premajor | preminor | prepatch | prerelease] ## DESCRIPTION Run this in a package directory to bump the version and write the new data back to `package.json` and, if present, `npm-shrinkwrap.json`. The `newversion` argument should be a valid semver string, *or* a valid second argument to semver.inc (one of "patch", "minor", "major", "prepatch", "preminor", "premajor", "prerelease"). In the second case, the existing version will be incremented by 1 in the specified field. If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean. If supplied with `--message` (shorthand: `-m`) config option, npm will use it as a commit message when creating a version commit. If the `message` config contains `%s` then that will be replaced with the resulting version number. For example: npm version patch -m "Upgrade to %s for reasons" If the `sign-git-tag` config is set, then the tag will be signed using the `-s` flag to git. Note that you must have a default GPG key set up in your git config for this to work properly. 
For example: $ npm config set sign-git-tag true $ npm version patch You need a passphrase to unlock the secret key for user: "isaacs (http://blog.izs.me/) " 2048-bit RSA key, ID 6C481CF6, created 2010-08-31 Enter passphrase: ## SEE ALSO * npm-init(1) * package.json(5) * semver(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-view.md000644 000766 000024 00000005573 12455173731 024314 0ustar00iojsstaff000000 000000 npm-view(1) -- View registry info ================================= ## SYNOPSIS npm view [@/][@] [[.]...] npm v [@/][@] [[.]...] ## DESCRIPTION This command shows data about a package and prints it to the stream referenced by the `outfd` config, which defaults to stdout. To show the package registry entry for the `connect` package, you can do this: npm view connect The default version is "latest" if unspecified. Field names can be specified after the package descriptor. For example, to show the dependencies of the `ronn` package at version 0.3.5, you could do the following: npm view ronn@0.3.5 dependencies You can view child field by separating them with a period. To view the git repository URL for the latest version of npm, you could do this: npm view npm repository.url This makes it easy to view information about a dependency with a bit of shell scripting. For example, to view all the data about the version of opts that ronn depends on, you can do this: npm view opts@$(npm view ronn dependencies.opts) For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor names for the "express" project, you can do this: npm view express contributors.email You may also use numeric indices in square braces to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this: npm view express contributors[0].email Multiple fields may be specified, and will be printed one after another. For exampls, to get all the contributor names and email addresses, you can do this: npm view express contributors.name contributors.email "Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See `package.json(5)` for more on this.) npm view npm contributors If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3: npm view yui3@'>0.5.4' dependencies.jsdom ## OUTPUT If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command. If the field is an object, it will be output as a JavaScript object literal. If the --json flag is given, the outputted fields will be JSON. If the version range matches multiple versions, than each printed value will be prefixed with the version it applies to. If multiple fields are requested, than each of them are prefixed with the field name. ## SEE ALSO * npm-search(1) * npm-registry(7) * npm-config(1) * npm-config(7) * npmrc(5) * npm-docs(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm-whoami.md000644 000766 000024 00000000364 12455173731 024617 0ustar00iojsstaff000000 000000 npm-whoami(1) -- Display npm username ===================================== ## SYNOPSIS npm whoami ## DESCRIPTION Print the `username` config to standard output. 
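For example (`someuser` is a placeholder for whatever account you are
logged in as):

    $ npm whoami
    someuser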
## SEE ALSO * npm-config(1) * npm-config(7) * npmrc(5) * npm-adduser(1) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/cli/npm.md000644 000766 000024 00000012327 12455173731 023337 0ustar00iojsstaff000000 000000 npm(1) -- node package manager ============================== ## SYNOPSIS npm [args] ## VERSION @VERSION@ ## DESCRIPTION npm is the package manager for the Node JavaScript platform. It puts modules in place so that node can find them, and manages dependency conflicts intelligently. It is extremely configurable to support a wide variety of use cases. Most commonly, it is used to publish, discover, install, and develop node programs. Run `npm help` to get a list of available commands. ## INTRODUCTION You probably got npm because you want to install stuff. Use `npm install blerg` to install the latest version of "blerg". Check out `npm-install(1)` for more info. It can do a lot of stuff. Use the `npm search` command to show everything that's available. Use `npm ls` to show everything you've installed. ## DEPENDENCIES If a package references to another package with a git URL, npm depends on a preinstalled git. If one of the packages npm tries to install is a native node module and requires compiling of C++ Code, npm will use [node-gyp](https://github.com/TooTallNate/node-gyp) for that task. For a Unix system, [node-gyp](https://github.com/TooTallNate/node-gyp) needs Python, make and a buildchain like GCC. On Windows, Python and Microsoft Visual Studio C++ is needed. Python 3 is not supported by [node-gyp](https://github.com/TooTallNate/node-gyp). For more information visit [the node-gyp repository](https://github.com/TooTallNate/node-gyp) and the [node-gyp Wiki](https://github.com/TooTallNate/node-gyp/wiki). ## DIRECTORIES See `npm-folders(5)` to learn about where npm puts stuff. In particular, npm has two modes of operation: * global mode: npm installs packages into the install prefix at `prefix/lib/node_modules` and bins are installed in `prefix/bin`. * local mode: npm installs packages into the current project directory, which defaults to the current working directory. Packages are installed to `./node_modules`, and bins are installed to `./node_modules/.bin`. Local mode is the default. Use `--global` or `-g` on any command to operate in global mode instead. ## DEVELOPER USAGE If you're using npm to develop and publish your code, check out the following help topics: * json: Make a package.json file. See `package.json(5)`. * link: For linking your current working code into Node's path, so that you don't have to reinstall every time you make a change. Use `npm link` to do this. * install: It's a good idea to install things if you don't need the symbolic link. Especially, installing other peoples code from the registry is done via `npm install` * adduser: Create an account or log in. Credentials are stored in the user config file. * publish: Use the `npm publish` command to upload your code to the registry. ## CONFIGURATION npm is extremely configurable. It reads its configuration options from 5 places. * Command line switches: Set a config with `--key val`. All keys take a value, even if they are booleans (the config parser doesn't know what the options are at the time of parsing.) If no value is provided, then the option is set to boolean `true`. * Environment Variables: Set any config by prefixing the name in an environment variable with `npm_config_`. For example, `export npm_config_key=val`. * User Configs: The file at $HOME/.npmrc is an ini-formatted list of configs. 
If present, it is parsed. If the `userconfig` option is set in the cli or env, then that will be used instead. * Global Configs: The file found at ../etc/npmrc (from the node executable, by default this resolves to /usr/local/etc/npmrc) will be parsed if it is found. If the `globalconfig` option is set in the cli, env, or user config, then that file is parsed instead. * Defaults: npm's default configuration options are defined in lib/utils/config-defs.js. These must not be changed. See `npm-config(7)` for much much more information. ## CONTRIBUTIONS Patches welcome! * code: Read through `npm-coding-style(7)` if you plan to submit code. You don't have to agree with it, but you do have to follow it. * docs: If you find an error in the documentation, edit the appropriate markdown file in the "doc" folder. (Don't worry about generating the man page.) Contributors are listed in npm's `package.json` file. You can view them easily by doing `npm view npm contributors`. If you would like to contribute, but don't know what to work on, check the issues list or ask on the mailing list. * * ## BUGS When you find issues, please report them: * web: * email: Be sure to include *all* of the output from the npm command that didn't work as expected. The `npm-debug.log` file is also helpful to provide. You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email. ## AUTHOR [Isaac Z. Schlueter](http://blog.izs.me/) :: [isaacs](https://github.com/isaacs/) :: [@izs](http://twitter.com/izs) :: ## SEE ALSO * npm-help(1) * npm-faq(7) * README * package.json(5) * npm-install(1) * npm-config(1) * npm-config(7) * npmrc(5) * npm-index(7) * npm(3) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-bin.md000644 000766 000024 00000000441 12455173731 024101 0ustar00iojsstaff000000 000000 npm-bin(3) -- Display npm bin folder ==================================== ## SYNOPSIS npm.commands.bin(args, cb) ## DESCRIPTION Print the folder where npm will install executables. This function should not be used programmatically. Instead, just refer to the `npm.bin` property. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-bugs.md000644 000766 000024 00000001134 12455173731 024271 0ustar00iojsstaff000000 000000 npm-bugs(3) -- Bugs for a package in a web browser maybe ======================================================== ## SYNOPSIS npm.commands.bugs(package, callback) ## DESCRIPTION This command tries to guess at the likely location of a package's bug tracker URL, and then tries to open it using the `--browser` config param. Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number. This command will launch a browser, so this command may not be the most friendly for programmatic use. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-cache.md000644 000766 000024 00000001720 12455173731 024375 0ustar00iojsstaff000000 000000 npm-cache(3) -- manage the npm cache programmatically ===================================================== ## SYNOPSIS npm.commands.cache([args], callback) // helpers npm.commands.cache.clean([args], callback) npm.commands.cache.add([args], callback) npm.commands.cache.read(name, version, forceBypass, callback) ## DESCRIPTION This acts much the same ways as the npm-cache(1) command line functionality. The callback is called with the package.json data of the thing that is eventually added to or read from the cache. 
The top level `npm.commands.cache(...)` functionality is a public interface, and like all commands on the `npm.commands` object, it will match the command line behavior exactly. However, the cache folder structure and the cache helper functions are considered **internal** API surface, and as such, may change in future releases of npm, potentially without warning or significant version incrementation. Use at your own risk. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-commands.md000644 000766 000024 00000001146 12455173731 025135 0ustar00iojsstaff000000 000000 npm-commands(3) -- npm commands =============================== ## SYNOPSIS npm.commands[](args, callback) ## DESCRIPTION npm comes with a full set of commands, and each of the commands takes a similar set of arguments. In general, all commands on the command object take an **array** of positional argument **strings**. The last argument to any function is a callback. Some commands are special and take other optional arguments. All commands have their own man page. See `man npm-` for command-line usage, or `man 3 npm-` for programmatic usage. ## SEE ALSO * npm-index(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-config.md000644 000766 000024 00000002242 12455173731 024577 0ustar00iojsstaff000000 000000 npm-config(3) -- Manage the npm configuration files =================================================== ## SYNOPSIS npm.commands.config(args, callback) var val = npm.config.get(key) npm.config.set(key, val) ## DESCRIPTION This function acts much the same way as the command-line version. The first element in the array tells config what to do. Possible values are: * `set` Sets a config parameter. The second element in `args` is interpreted as the key, and the third element is interpreted as the value. * `get` Gets the value of a config parameter. The second element in `args` is the key to get the value of. * `delete` (`rm` or `del`) Deletes a parameter from the config. The second element in `args` is the key to delete. * `list` (`ls`) Show all configs that aren't secret. No parameters necessary. * `edit`: Opens the config file in the default editor. This command isn't very useful programmatically, but it is made available. To programmatically access npm configuration settings, or set them for the duration of a program, use the `npm.config.set` and `npm.config.get` functions instead. ## SEE ALSO * npm(3) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-deprecate.md000644 000766 000024 00000001542 12455173731 025270 0ustar00iojsstaff000000 000000 npm-deprecate(3) -- Deprecate a version of a package ==================================================== ## SYNOPSIS npm.commands.deprecate(args, callback) ## DESCRIPTION This command will update the npm registry entry for a package, providing a deprecation warning to all who attempt to install it. The 'args' parameter must have exactly two elements: * `package[@version]` The `version` portion is optional, and may be either a range, or a specific version, or a tag. * `message` The warning message that will be printed whenever a user attempts to install the package. Note that you must be the package owner to deprecate something. See the `owner` and `adduser` help topics. To un-deprecate a package, specify an empty string (`""`) for the `message` argument. 
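For illustration, a minimal sketch of deprecating a range programmatically; the package name, version range, and message are hypothetical, and you must be logged in as an owner of the package for the registry to accept the change:

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // hypothetical package and range; the message is shown on install
      npm.commands.deprecate(
        ["my-old-pkg@<1.0.0", "0.x is no longer supported, please upgrade"],
        function (er) {
          if (er) return console.error(er)
          console.log("deprecation message published")
        })
    })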
## SEE ALSO * npm-publish(3) * npm-unpublish(3) * npm-registry(7) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-docs.md000644 000766 000024 00000001136 12455173731 024263 0ustar00iojsstaff000000 000000 npm-docs(3) -- Docs for a package in a web browser maybe ======================================================== ## SYNOPSIS npm.commands.docs(package, callback) ## DESCRIPTION This command tries to guess at the likely location of a package's documentation URL, and then tries to open it using the `--browser` config param. Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number. This command will launch a browser, so this command may not be the most friendly for programmatic use. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-edit.md000644 000766 000024 00000001464 12455173731 024264 0ustar00iojsstaff000000 000000 npm-edit(3) -- Edit an installed package ======================================== ## SYNOPSIS npm.commands.edit(package, callback) ## DESCRIPTION Opens the package folder in the default editor (or whatever you've configured as the npm `editor` config -- see `npm help config`.) After it has been edited, the package is rebuilt so as to pick up any changes in compiled packages. For instance, you can do `npm install connect` to install connect into your package, and then `npm.commands.edit(["connect"], callback)` to make a few changes to your locally installed copy. The first parameter is a string array with a single element, the package to open. The package can optionally have a version number attached. Since this command opens an editor in a new process, be careful about where and how this is used. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-explore.md000644 000766 000024 00000001235 12455173731 025011 0ustar00iojsstaff000000 000000 npm-explore(3) -- Browse an installed package ============================================= ## SYNOPSIS npm.commands.explore(args, callback) ## DESCRIPTION Spawn a subshell in the directory of the installed package specified. If a command is specified, then it is run in the subshell, which then immediately terminates. Note that the package is *not* automatically rebuilt afterwards, so be sure to use `npm rebuild ` if you make any changes. The first element in the 'args' parameter must be a package name. After that is the optional command, which can be any number of strings. All of the strings will be combined into one, space-delimited command. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-help-search.md000644 000766 000024 00000001653 12455173731 025532 0ustar00iojsstaff000000 000000 npm-help-search(3) -- Search the help pages =========================================== ## SYNOPSIS npm.commands.helpSearch(args, [silent,] callback) ## DESCRIPTION This command is rarely useful, but it exists in the rare case that it is. This command takes an array of search terms and returns the help pages that match in order of best match. If there is only one match, then npm displays that help section. If there are multiple results, the results are printed to the screen formatted and the array of results is returned. Each result is an object with these properties: * hits: A map of args to number of hits on that arg. For example, {"npm": 3} * found: Total number of unique args that matched. * totalHits: Total number of hits. * lines: An array of all matching lines (and some adjacent lines). 
* file: Name of the file that matched

The silent parameter is not currently used, but it may be in the future.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-init.md000644 000766 000024 00000001762 12455173731 024303 0ustar00iojsstaff000000 000000
npm init(3) -- Interactively create a package.json file
========================================================

## SYNOPSIS

    npm.commands.init(args, callback)

## DESCRIPTION

This will ask you a bunch of questions, and then write a package.json for you.

It attempts to make reasonable guesses about what you want things to be set to, and then writes a package.json file with the options you've selected.

If you already have a package.json file, it'll read that first, and default to the options in there.

It is strictly additive, so it does not delete options from your package.json without a really good reason to do so.

Since this function expects to be run on the command-line, it doesn't work very well programmatically. The best option is to roll your own, and since JavaScript makes it stupid simple to output formatted JSON, that is the preferred method. If you're sure you want to handle command-line prompting, then go ahead and use this programmatically.

## SEE ALSO

package.json(5)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-install.md000644 000766 000024 00000001162 12455173731 025000 0ustar00iojsstaff000000 000000
npm-install(3) -- install a package programmatically
====================================================

## SYNOPSIS

    npm.commands.install([where,] packages, callback)

## DESCRIPTION

This acts much the same way as installing on the command-line.

The 'where' parameter is optional and only used internally, and it specifies where the packages should be installed to.

The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be installed.

Finally, 'callback' is a function that will be called when all packages have been installed or when an error has been encountered.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-link.md000644 000766 000024 00000002017 12455173731 024267 0ustar00iojsstaff000000 000000
npm-link(3) -- Symlink a package folder
=======================================

## SYNOPSIS

    npm.commands.link(callback)
    npm.commands.link(packages, callback)

## DESCRIPTION

Package linking is a two-step process.

Without parameters, link will create a globally-installed symbolic link from `prefix/package-name` to the current folder.

With a parameter, link will create a symlink from the local `node_modules` folder to the global symlink.

When creating tarballs for `npm publish`, the linked packages are "snapshotted" to their current state by resolving the symbolic links.

This is handy for installing your own stuff, so that you can work on it and test it iteratively without having to continually rebuild. For example:

    npm.commands.link(cb)           # creates global link from the cwd
                                    # (say redis package)
    npm.commands.link('redis', cb)  # link-install the package

Now, any changes to the redis package will be reflected in the package in the current working directory.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-load.md000644 000766 000024 00000001265 12455173731 024255 0ustar00iojsstaff000000 000000
npm-load(3) -- Load config settings
===================================

## SYNOPSIS

    npm.load(conf, cb)

## DESCRIPTION

npm.load() must be called before any other function call. Both parameters are optional, but the second is recommended.
The first parameter is an object containing command-line config params, and the second parameter is a callback that will be called when npm is loaded and ready to serve. The first parameter should follow a similar structure as the package.json config object. For example, to emulate the --dev flag, pass an object that looks like this: { "dev": true } For a list of all the available command-line configs, see `npm help config` iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-ls.md000644 000766 000024 00000003141 12455173731 023747 0ustar00iojsstaff000000 000000 npm-ls(3) -- List installed packages ====================================== ## SYNOPSIS npm.commands.ls(args, [silent,] callback) ## DESCRIPTION This command will print to stdout all the versions of packages that are installed, as well as their dependencies, in a tree-structure. It will also return that data using the callback. This command does not take any arguments, but args must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments, though you may set config flags like with any other command, such as `global` to list global packages. It will print out extraneous, missing, and invalid packages. If the silent parameter is set to true, nothing will be output to the screen, but the data will still be returned. Callback is provided an error if one occurred, the full data about which packages are installed and which dependencies they will receive, and a "lite" data object which just shows which versions are installed where. Note that the full data object is a circular structure, so care must be taken if it is serialized to JSON. ## CONFIGURATION ### long * Default: false * Type: Boolean Show extended information. ### parseable * Default: false * Type: Boolean Show parseable output instead of tree view. ### global * Default: false * Type: Boolean List packages in the global install prefix instead of in the current project. Note, if parseable is set or long isn't set, then duplicates will be trimmed. This means that if a submodule has the same dependency as a parent module, then the dependency will only be output once. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-outdated.md000644 000766 000024 00000000522 12455173731 025142 0ustar00iojsstaff000000 000000 npm-outdated(3) -- Check for outdated packages ============================================== ## SYNOPSIS npm.commands.outdated([packages,] callback) ## DESCRIPTION This command will check the registry to see if the specified packages are currently outdated. If the 'packages' parameter is left out, npm will check all packages. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-owner.md000644 000766 000024 00000001757 12455173731 024476 0ustar00iojsstaff000000 000000 npm-owner(3) -- Manage package owners ===================================== ## SYNOPSIS npm.commands.owner(args, callback) ## DESCRIPTION The first element of the 'args' parameter defines what to do, and the subsequent elements depend on the action. Possible values for the action are (order of parameters are given in parenthesis): * ls (package): List all the users who have access to modify a package and push new versions. Handy when you need to know who to bug for help. * add (user, package): Add a new user as a maintainer of a package. This user is enabled to modify metadata, publish new versions, and add other owners. * rm (user, package): Remove a user from the package owner list. This immediately revokes their privileges. 
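As a sketch of the 'args' shapes described above (the user and package names are hypothetical, and you must already be an owner and logged in for `add`/`rm` to succeed):

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // list the current owners, then add another one
      npm.commands.owner(["ls", "some-package"], function (er) {
        if (er) return console.error(er)
        npm.commands.owner(["add", "some-user", "some-package"], function (er) {
          if (er) return console.error(er)
          console.log("some-user can now publish some-package")
        })
      })
    })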
Note that there is only one level of access. Either you can modify a package, or you can't. Future versions may contain more fine-grained access levels, but that is not implemented at this time.

## SEE ALSO

* npm-publish(3)
* npm-registry(7)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-pack.md000644 000766 000024 00000001167 12455173731 024255 0ustar00iojsstaff000000 000000
npm-pack(3) -- Create a tarball from a package
==============================================

## SYNOPSIS

    npm.commands.pack([packages,] callback)

## DESCRIPTION

For anything that's installable (that is, a package folder, tarball, tarball url, name@tag, name@version, or name), this command will fetch it to the cache, copy the tarball to the current working directory as `-.tgz`, and then write the filenames out to stdout.

If the same package is specified multiple times, then the file will be overwritten the second time.

If no arguments are supplied, then npm packs the current package folder.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-prefix.md000644 000766 000024 00000000502 12455173731 024624 0ustar00iojsstaff000000 000000
npm-prefix(3) -- Display prefix
===============================

## SYNOPSIS

    npm.commands.prefix(args, callback)

## DESCRIPTION

Print the prefix to standard out.

'args' is never used and callback is never called with data. 'args' must be present or things will break.

This function is not useful programmatically.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-prune.md000644 000766 000024 00000000671 12455173731 024467 0ustar00iojsstaff000000 000000
npm-prune(3) -- Remove extraneous packages
==========================================

## SYNOPSIS

    npm.commands.prune([packages,] callback)

## DESCRIPTION

This command removes "extraneous" packages.

The first parameter is optional, and it specifies packages to be removed. If no packages are specified, then all packages will be checked.

Extraneous packages are packages that are not listed on the parent package's dependencies list.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-publish.md000644 000766 000024 00000001372 12455173731 025003 0ustar00iojsstaff000000 000000
npm-publish(3) -- Publish a package
===================================

## SYNOPSIS

    npm.commands.publish([packages,] callback)

## DESCRIPTION

Publishes a package to the registry so that it can be installed by name. Possible values in the 'packages' array are:

* ``: A folder containing a package.json file
* ``: A url or file path to a gzipped tar archive containing a single folder with a package.json file inside.

If the package array is empty, npm will try to publish something in the current working directory.

This command could fail if one of the packages specified already exists in the registry. It overwrites when the "force" environment variable is set.

## SEE ALSO

* npm-registry(7)
* npm-adduser(1)
* npm-owner(3)

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-rebuild.md000644 000766 000024 00000000703 12455173731 024760 0ustar00iojsstaff000000 000000
npm-rebuild(3) -- Rebuild a package
===================================

## SYNOPSIS

    npm.commands.rebuild([packages,] callback)

## DESCRIPTION

This command runs the `npm build` command on each of the matched packages. This is useful when you install a new version of node, and must recompile all your C++ addons with the new binary.

If no 'packages' parameter is specified, every package will be rebuilt.
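For example, a minimal sketch of rebuilding everything after switching runtimes; this assumes that passing an empty array behaves like running `npm rebuild` with no arguments:

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // assumption: an empty list means "rebuild every installed package"
      npm.commands.rebuild([], function (er) {
        if (er) return console.error(er)
        console.log("compiled addons rebuilt against the current binary")
      })
    })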
## CONFIGURATION See `npm help build` iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-repo.md000644 000766 000024 00000001135 12455173731 024277 0ustar00iojsstaff000000 000000 npm-repo(3) -- Open package repository page in the browser ======================================================== ## SYNOPSIS npm.commands.repo(package, callback) ## DESCRIPTION This command tries to guess at the likely location of a package's repository URL, and then tries to open it using the `--browser` config param. Like other commands, the first parameter is an array. This command only uses the first element, which is expected to be a package name with an optional version number. This command will launch a browser, so this command may not be the most friendly for programmatic use. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-restart.md000644 000766 000024 00000001576 12455173731 025027 0ustar00iojsstaff000000 000000 npm-restart(3) -- Restart a package =================================== ## SYNOPSIS npm.commands.restart(packages, callback) ## DESCRIPTION This restarts a package (or multiple packages). This runs a package's "stop", "restart", and "start" scripts, and associated pre- and post- scripts, in the order given below: 1. prerestart 2. prestop 3. stop 4. poststop 5. restart 6. prestart 7. start 8. poststart 9. postrestart If no version is specified, then it restarts the "active" version. npm can restart multiple packages. Just specify multiple packages in the `packages` parameter. ## NOTE Note that the "restart" script is run **in addition to** the "stop" and "start" scripts, not instead of them. This is the behavior as of `npm` major version 2. A change in this behavior will be accompanied by an increase in major version number ## SEE ALSO * npm-start(3) * npm-stop(3) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-root.md000644 000766 000024 00000000532 12455173731 024315 0ustar00iojsstaff000000 000000 npm-root(3) -- Display npm root =============================== ## SYNOPSIS npm.commands.root(args, callback) ## DESCRIPTION Print the effective `node_modules` folder to standard out. 'args' is never used and callback is never called with data. 'args' must be present or things will break. This function is not useful programmatically. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-run-script.md000644 000766 000024 00000001440 12455173731 025437 0ustar00iojsstaff000000 000000 npm-run-script(3) -- Run arbitrary package scripts ================================================== ## SYNOPSIS npm.commands.run-script(args, callback) ## DESCRIPTION This runs an arbitrary command from a package's "scripts" object. It is used by the test, start, restart, and stop commands, but can be called directly, as well. The 'args' parameter is an array of strings. Behavior depends on the number of elements. If there is only one element, npm assumes that the element represents a command to be run on the local repository. If there is more than one element, then the first is assumed to be the package and the second is assumed to be the command to run. All other elements are ignored. 
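A minimal sketch of the one-element form described above; since the command name contains a hyphen, bracket notation is the natural way to reach it from JavaScript:

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // runs the local package's "test" script, like `npm run-script test`
      npm.commands["run-script"](["test"], function (er) {
        if (er) return console.error(er)
        console.log("script finished")
      })
    })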
## SEE ALSO * npm-scripts(7) * npm-test(3) * npm-start(3) * npm-restart(3) * npm-stop(3) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-search.md000644 000766 000024 00000002354 12455173731 024603 0ustar00iojsstaff000000 000000 npm-search(3) -- Search for packages ==================================== ## SYNOPSIS npm.commands.search(searchTerms, [silent,] [staleness,] callback) ## DESCRIPTION Search the registry for packages matching the search terms. The available parameters are: * searchTerms: Array of search terms. These terms are case-insensitive. * silent: If true, npm will not log anything to the console. * staleness: This is the threshold for stale packages. "Fresh" packages are not refreshed from the registry. This value is measured in seconds. * callback: Returns an object where each key is the name of a package, and the value is information about that package along with a 'words' property, which is a space-delimited string of all of the interesting words in that package. The only properties included are those that are searched, which generally include: * name * description * maintainers * url * keywords A search on the registry excludes any result that does not match all of the search terms. It also removes any items from the results that contain an excluded term (the "searchexclude" config). The search is case insensitive and doesn't try to read your mind (it doesn't do any verb tense matching or the like). iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-shrinkwrap.md000644 000766 000024 00000001247 12455173731 025526 0ustar00iojsstaff000000 000000 npm-shrinkwrap(3) -- programmatically generate package shrinkwrap file ==================================================== ## SYNOPSIS npm.commands.shrinkwrap(args, [silent,] callback) ## DESCRIPTION This acts much the same ways as shrinkwrapping on the command-line. This command does not take any arguments, but 'args' must be defined. Beyond that, if any arguments are passed in, npm will politely warn that it does not take positional arguments. If the 'silent' parameter is set to true, nothing will be output to the screen, but the shrinkwrap file will still be written. Finally, 'callback' is a function that will be called when the shrinkwrap has been saved. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-start.md000644 000766 000024 00000000443 12455173731 024470 0ustar00iojsstaff000000 000000 npm-start(3) -- Start a package =============================== ## SYNOPSIS npm.commands.start(packages, callback) ## DESCRIPTION This runs a package's "start" script, if one was provided. npm can start multiple packages. Just specify multiple packages in the `packages` parameter. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-stop.md000644 000766 000024 00000000443 12455173731 024320 0ustar00iojsstaff000000 000000 npm-stop(3) -- Stop a package ============================= ## SYNOPSIS npm.commands.stop(packages, callback) ## DESCRIPTION This runs a package's "stop" script, if one was provided. npm can run stop on multiple packages. Just specify multiple packages in the `packages` parameter. iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-tag.md000644 000766 000024 00000001472 12455173731 024111 0ustar00iojsstaff000000 000000 npm-tag(3) -- Tag a published version ===================================== ## SYNOPSIS npm.commands.tag(package@version, tag, callback) ## DESCRIPTION Tags the specified version of the package with the specified tag, or the `--tag` config if not specified. 
The 'package@version' is an array of strings, but only the first two elements are currently used.

The first element must be in the form package@version, where package is the package name and version is the version number (much like installing a specific version).

The second element is the name of the tag to tag this version with. If this parameter is missing or falsey (empty), the default from the config will be used. For more information about how to set this config, check `man 3 npm-config` for programmatic usage or `man npm-config` for cli usage.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-test.md000644 000766 000024 00000000563 12455173731 024315 0ustar00iojsstaff000000 000000
npm-test(3) -- Test a package
=============================

## SYNOPSIS

    npm.commands.test(packages, callback)

## DESCRIPTION

This runs a package's "test" script, if one was provided.

To run tests as a condition of installation, set the `npat` config to true.

npm can run tests on multiple packages. Just specify multiple packages in the `packages` parameter.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-uninstall.md000644 000766 000024 00000001000 12455173731 025342 0ustar00iojsstaff000000 000000
npm-uninstall(3) -- uninstall a package programmatically
========================================================

## SYNOPSIS

    npm.commands.uninstall(packages, callback)

## DESCRIPTION

This acts much the same way as uninstalling on the command-line.

The 'packages' parameter is an array of strings. Each element in the array is the name of a package to be uninstalled.

Finally, 'callback' is a function that will be called when all packages have been uninstalled or when an error has been encountered.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-unpublish.md000644 000766 000024 00000001157 12455173731 025347 0ustar00iojsstaff000000 000000
npm-unpublish(3) -- Remove a package from the registry
======================================================

## SYNOPSIS

    npm.commands.unpublish(package, callback)

## DESCRIPTION

This removes a package version from the registry, deleting its entry and removing the tarball.

The package parameter must be defined.

Only the first element in the package parameter is used. If there is no first element, then npm assumes that the package at the current working directory is what is meant.

If no version is specified, or if all versions are removed, then the root package entry is removed from the registry entirely.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-update.md000644 000766 000024 00000000562 12455173731 024617 0ustar00iojsstaff000000 000000
npm-update(3) -- Update a package
=================================

## SYNOPSIS

    npm.commands.update(packages, callback)

## DESCRIPTION

Updates a package, upgrading it to the latest version. It also installs any missing packages.

The 'packages' argument is an array of packages to update. The 'callback' parameter will be called when done or when an error occurs.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-version.md000644 000766 000024 00000001121 12455173731 025012 0ustar00iojsstaff000000 000000
npm-version(3) -- Bump a package version
========================================

## SYNOPSIS

    npm.commands.version(newversion, callback)

## DESCRIPTION

Run this in a package directory to bump the version and write the new data back to the package.json file.

If run in a git repo, it will also create a version commit and tag, and fail if the repo is not clean.
Like all other commands, this function takes a string array as its first parameter. The difference, however, is this function will fail if it does not have exactly one element. The only element should be a version number.

iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-view.md000644 000766 000024 00000006350 12455173731 024310 0ustar00iojsstaff000000 000000
npm-view(3) -- View registry info
=================================

## SYNOPSIS

    npm.commands.view(args, [silent,] callback)

## DESCRIPTION

This command shows data about a package and prints it to the stream referenced by the `outfd` config, which defaults to stdout.

The "args" parameter is an ordered list that closely resembles the command-line usage. The elements should be ordered such that the first element is the package and version (package@version). The version is optional. After that, the rest of the parameters are fields with optional subfields ("field.subfield") which can be used to get only the information desired from the registry.

The callback will be passed all of the data returned by the query.

For example, to get the package registry entry for the `connect` package, you can do this:

    npm.commands.view(["connect"], callback)

If no version is specified, "latest" is assumed.

Field names can be specified after the package descriptor. For example, to show the dependencies of the `ronn` package at version 0.3.5, you could do the following:

    npm.commands.view(["ronn@0.3.5", "dependencies"], callback)

You can view child fields by separating them with a period. To view the git repository URL for the latest version of npm, you could do this:

    npm.commands.view(["npm", "repository.url"], callback)

For fields that are arrays, requesting a non-numeric field will return all of the values from the objects in the list. For example, to get all the contributor email addresses for the "express" project, you can do this:

    npm.commands.view(["express", "contributors.email"], callback)

You may also use numeric indices in square braces to specifically select an item in an array field. To just get the email address of the first contributor in the list, you can do this:

    npm.commands.view(["express", "contributors[0].email"], callback)

Multiple fields may be specified, and will be printed one after another. For example, to get all the contributor names and email addresses, you can do this:

    npm.commands.view(["express", "contributors.name", "contributors.email"], callback)

"Person" fields are shown as a string if they would be shown as an object. So, for example, this will show the list of npm contributors in the shortened string format. (See `npm help json` for more on this.)

    npm.commands.view(["npm", "contributors"], callback)

If a version range is provided, then data will be printed for every matching version of the package. This will show which version of jsdom was required by each matching version of yui3:

    npm.commands.view(["yui3@'>0.5.4'", "dependencies.jsdom"], callback)

## OUTPUT

If only a single string field for a single version is output, then it will not be colorized or quoted, so as to enable piping the output to another command.

If the version range matches multiple versions, then each printed value will be prefixed with the version it applies to.

If multiple fields are requested, then each of them is prefixed with the field name.

Console output can be disabled by setting the 'silent' parameter to true.

## RETURN VALUE

The data returned will be an object in this format:

    { : { : , ... } , ... }

corresponding to the list of fields selected.
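To make that return shape concrete, here is a sketch that requests a single field silently and walks the returned object; the field name `dist-tags.latest` is just an example:

    var npm = require("npm")

    npm.load({}, function (er) {
      if (er) return console.error(er)
      // silent = true suppresses console output; we only want the data
      npm.commands.view(["npm", "dist-tags.latest"], true, function (er, data) {
        if (er) return console.error(er)
        // data is keyed by version, then by the requested field
        Object.keys(data).forEach(function (version) {
          console.log(version, "->", data[version]["dist-tags.latest"])
        })
      })
    })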
iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm-whoami.md000644 000766 000024 00000000534 12455173731 024620 0ustar00iojsstaff000000 000000 npm-whoami(3) -- Display npm username ===================================== ## SYNOPSIS npm.commands.whoami(args, callback) ## DESCRIPTION Print the `username` config to standard output. 'args' is never used and callback is never called with data. 'args' must be present or things will break. This function is not useful programmatically iojs-v1.0.2-darwin-x64/lib/node_modules/npm/doc/api/npm.md000644 000766 000024 00000006354 12455173731 023344 0ustar00iojsstaff000000 000000 npm(3) -- node package manager ============================== ## SYNOPSIS var npm = require("npm") npm.load([configObject, ]function (er, npm) { // use the npm object, now that it's loaded. npm.config.set(key, val) val = npm.config.get(key) console.log("prefix = %s", npm.prefix) npm.commands.install(["package"], cb) }) ## VERSION @VERSION@ ## DESCRIPTION This is the API documentation for npm. To find documentation of the command line client, see `npm(1)`. Prior to using npm's commands, `npm.load()` must be called. If you provide `configObject` as an object map of top-level configs, they override the values stored in the various config locations. In the npm command line client, this set of configs is parsed from the command line options. Additional configuration params are loaded from two configuration files. See `npm-config(1)`, `npm-config(7)`, and `npmrc(5)` for more information. After that, each of the functions are accessible in the commands object: `npm.commands.`. See `npm-index(7)` for a list of all possible commands. All commands on the command object take an **array** of positional argument **strings**. The last argument to any function is a callback. Some commands take other optional arguments. Configs cannot currently be set on a per function basis, as each call to npm.config.set will change the value for *all* npm commands in that process. To find API documentation for a specific command, run the `npm apihelp` command. ## METHODS AND PROPERTIES * `npm.load(configs, cb)` Load the configuration params, and call the `cb` function once the globalconfig and userconfig files have been loaded as well, or on nextTick if they've already been loaded. * `npm.config` An object for accessing npm configuration parameters. * `npm.config.get(key)` * `npm.config.set(key, val)` * `npm.config.del(key)` * `npm.dir` or `npm.root` The `node_modules` directory where npm will operate. * `npm.prefix` The prefix where npm is operating. (Most often the current working directory.) * `npm.cache` The place where npm keeps JSON and tarballs it fetches from the registry (or uploads to the registry). * `npm.tmp` npm's temporary working directory. * `npm.deref` Get the "real" name for a command that has either an alias or abbreviation. ## MAGIC For each of the methods in the `npm.commands` object, a method is added to the npm object, which takes a set of positional string arguments rather than an array and a callback. If the last argument is a callback, then it will use the supplied callback. However, if no callback is provided, then it will print out the error or results. For example, this would work in a node repl: > npm = require("npm") > npm.load() // wait a sec... > npm.install("dnode", "express") Note that that *won't* work in a node program, since the `install` method will get called before the configuration load is completed. 
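By contrast, a sketch of the pattern that does work in a program: wait for `npm.load()` to finish, then use the magic method with an explicit callback (the package names are only examples):

    var npm = require("npm")

    npm.load({ loglevel: "silent" }, function (er) {
      if (er) return console.error(er)
      // configs are loaded now, so the magic method is safe to call
      npm.install("dnode", "express", function (er) {
        if (er) return console.error(er)
        console.log("install finished")
      })
    })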
## ABBREVS In order to support `npm ins foo` instead of `npm install foo`, the `npm.commands` object has a set of abbreviations as well as the full method names. Use the `npm.deref` method to find the real name. For example: var cmd = npm.deref("unp") // cmd === "unpublish" iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/node-gyp-bin/000755 000766 000024 00000000000 12456115117 023735 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/npm000755 000766 000024 00000000524 12455173731 022173 0ustar00iojsstaff000000 000000 #!/bin/sh (set -o igncr) 2>/dev/null && set -o igncr; # cygwin encoding fix basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac if [ -x "$basedir/node.exe" ]; then "$basedir/node.exe" "$basedir/node_modules/npm/bin/npm-cli.js" "$@" else node "$basedir/node_modules/npm/bin/npm-cli.js" "$@" fi iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/npm-cli.js000755 000766 000024 00000003610 12456115120 023337 0ustar00iojsstaff000000 000000 #!/bin/sh // 2>/dev/null; exec "`dirname "$0"`/iojs" "$0" "$@" ;(function () { // wrapper in case we're in module_context mode // windows: running "npm blah" in this folder will invoke WSH, not node. if (typeof WScript !== "undefined") { WScript.echo("npm does not work when run\n" +"with the Windows Scripting Host\n\n" +"'cd' to a different directory,\n" +"or type 'npm.cmd ',\n" +"or type 'node npm '.") WScript.quit(1) return } process.title = "npm" var log = require("npmlog") log.pause() // will be unpaused when config is loaded. log.info("it worked if it ends with", "ok") var path = require("path") , npm = require("../lib/npm.js") , npmconf = require("../lib/config/core.js") , errorHandler = require("../lib/utils/error-handler.js") , configDefs = npmconf.defs , shorthands = configDefs.shorthands , types = configDefs.types , nopt = require("nopt") // if npm is called as "npmg" or "npm_g", then // run in global mode. if (path.basename(process.argv[1]).slice(-1) === "g") { process.argv.splice(1, 1, "npm", "-g") } log.verbose("cli", process.argv) var conf = nopt(types, shorthands) npm.argv = conf.argv.remain if (npm.deref(npm.argv[0])) npm.command = npm.argv.shift() else conf.usage = true if (conf.version) { console.log(npm.version) return } if (conf.versions) { npm.command = "version" conf.usage = false npm.argv = [] } log.info("using", "npm@%s", npm.version) log.info("using", "node@%s", process.version) process.on("uncaughtException", errorHandler) if (conf.usage && npm.command !== "help") { npm.argv.unshift(npm.command) npm.command = "help" } // now actually fire up npm and run the command. // this is how to use npm programmatically: conf._exit = true npm.load(conf, function (er) { if (er) return errorHandler(er) npm.commands[npm.command](npm.argv, errorHandler) }) })() iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/npm.cmd000644 000766 000024 00000000321 12455173731 022725 0ustar00iojsstaff000000 000000 :: Created by npm, please don't edit manually. 
@IF EXIST "%~dp0\node.exe" ( "%~dp0\node.exe" "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* ) ELSE ( node "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* ) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/read-package-json.js000755 000766 000024 00000000770 12455173731 025272 0ustar00iojsstaff000000 000000 var argv = process.argv if (argv.length < 3) { console.error("Usage: read-package.json [ ...]") process.exit(1) } var fs = require("fs") , file = argv[2] , readJson = require("read-package-json") readJson(file, function (er, data) { if (er) throw er if (argv.length === 3) console.log(data) else argv.slice(3).forEach(function (field) { field = field.split(".") var val = data field.forEach(function (f) { val = val[f] }) console.log(val) }) }) iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/node-gyp-bin/node-gyp000755 000766 000024 00000000131 12455173731 025405 0ustar00iojsstaff000000 000000 #!/usr/bin/env sh node "`dirname "$0"`/../../node_modules/node-gyp/bin/node-gyp.js" "$@" iojs-v1.0.2-darwin-x64/lib/node_modules/npm/bin/node-gyp-bin/node-gyp.cmd000755 000766 000024 00000000075 12455173731 026156 0ustar00iojsstaff000000 000000 node "%~dp0\..\..\node_modules\node-gyp\bin\node-gyp.js" %* iojs-v1.0.2-darwin-x64/lib/dtrace/node.d000644 000766 000024 00000030044 12456115076 017767 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the * "Software"), to deal in the Software without restriction, including * without limitation the rights to use, copy, modify, merge, publish, * distribute, sublicense, and/or sell copies of the Software, and to permit * persons to whom the Software is furnished to do so, subject to the * following conditions: * * The above copyright notice and this permission notice shall be included * in all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS * OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN * NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, * DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR * OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE * USE OR OTHER DEALINGS IN THE SOFTWARE. */ /* * This is the DTrace library file for the node provider, which includes * the necessary translators to get from the args[] to something useful. * Be warned: the mechanics here are seriously ugly -- and one must always * keep in mind that clean abstractions often require filthy systems. */ #pragma D depends_on library procfs.d typedef struct { int32_t fd; int32_t port; uint32_t remote; uint32_t buffered; } node_dtrace_connection_t; typedef struct { int32_t fd; int32_t port; uint64_t remote; uint32_t buffered; } node_dtrace_connection64_t; typedef struct { int fd; string remoteAddress; int remotePort; int bufferSize; } node_connection_t; translator node_connection_t { fd = *(int32_t *)copyin((uintptr_t)&nc->fd, sizeof (int32_t)); remotePort = *(int32_t *)copyin((uintptr_t)&nc->port, sizeof (int32_t)); remoteAddress = curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? 
copyinstr((uintptr_t)*(uint32_t *)copyin((uintptr_t)&nc->remote, sizeof (int32_t))) : copyinstr((uintptr_t)*(uint64_t *)copyin((uintptr_t) &((node_dtrace_connection64_t *)nc)->remote, sizeof (int64_t))); bufferSize = curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? *(uint32_t *)copyin((uintptr_t)&nc->buffered, sizeof (int32_t)) : *(uint32_t *)copyin((uintptr_t) &((node_dtrace_connection64_t *)nc)->buffered, sizeof (int32_t)); }; /* * 32-bit and 64-bit structures received from node for HTTP client request * probe. */ typedef struct { uint32_t url; uint32_t method; } node_dtrace_http_client_request_t; typedef struct { uint64_t url; uint64_t method; } node_dtrace_http_client_request64_t; /* * The following structures are never used directly, but must exist to bind the * types specified in the provider to the translators defined here. * Ultimately, they always get cast to a more specific type inside the * translator. To add to the confusion, the DTrace compiler does not allow * declaring two translators with the same destination type if the source types * are structures with the same size (because libctf says they're compatible, * so dtrace considers them equivalent). Since we must define translators from * node_dtrace_http_client_request_t (above), node_dtrace_http_request_t, and * node_dtrace_http_server_request_t (both below), each of these three structs * must be declared with a different size. */ typedef struct { uint32_t version; uint64_t dummy1; } node_dtrace_http_request_t; typedef struct { uint32_t version; uint64_t dummy2; uint64_t dummy3; } node_dtrace_http_server_request_t; /* * Actual 32-bit and 64-bit, v0 and v1 structures received from node for the * HTTP server request probe. */ typedef struct { uint32_t url; uint32_t method; } node_dtrace_http_server_request_v0_t; typedef struct { uint32_t version; uint32_t url; uint32_t method; uint32_t forwardedFor; } node_dtrace_http_server_request_v1_t; typedef struct { uint64_t url; uint64_t method; } node_dtrace_http_server_request64_v0_t; typedef struct { uint32_t version; uint32_t pad; uint64_t url; uint64_t method; uint64_t forwardedFor; } node_dtrace_http_server_request64_v1_t; /* * In the end, both client and server request probes from both old and new * binaries translate their arguments to node_http_request_t, which is what the * user's D script ultimately sees. */ typedef struct { string url; string method; string forwardedFor; } node_http_request_t; /* * The following translators are particularly filthy for reasons of backwards * compatibility. Stable versions of node prior to 0.6 used a single * http_request struct with fields for "url" and "method" for both client and * server probes. 0.6 added a "forwardedFor" field intended for the server * probe only, and the http_request struct passed by the application was split * first into client_http_request and server_http_request and the latter was * again split for v0 (the old struct) and v1. * * To distinguish the binary representations of the two versions of these * structs, the new version prepends a "version" member (where the old one has * a "url" pointer). Each field that we're translating below first switches on * the value of this "version" field: if it's larger than 4096, we know we must * be looking at the "url" pointer of the older structure version. Otherwise, * we must be looking at the new version. Besides this, we have the usual * switch based on the userland process data model. 
This would all be simpler * with macros, but those aren't available in D library files since we cannot * rely on cpp being present at runtime. * * In retrospect, the versioning bit might have been unnecessary since the type * of the object passed in should allow DTrace to select which translator to * use. However, DTrace does sometimes use translators whose source types * don't quite match, and since we know this versioning logic works, we just * leave it alone. Each of the translators below is functionally identical * (except that the client -> client translator doesn't bother translating * forwardedFor) and should actually work with any version of any of the client * or server structs transmitted by the application up to this point. */ /* * Translate from node_dtrace_http_server_request_t (received from node 0.6 and * later versions) to node_http_request_t. */ translator node_http_request_t { url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->url, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->url, sizeof (uint64_t)))); method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->method, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->method, sizeof (uint64_t)))); forwardedFor = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? "" : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->forwardedFor, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)-> forwardedFor, sizeof (uint64_t)))); }; /* * Translate from node_dtrace_http_client_request_t (received from node 0.6 and * later versions) to node_http_request_t. */ translator node_http_request_t { url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->url, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->url, sizeof (uint64_t)))); method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? 
copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->method, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->method, sizeof (uint64_t)))); forwardedFor = ""; }; /* * Translate from node_dtrace_http_request_t (received from versions of node * prior to 0.6) to node_http_request_t. This is used for both the server and * client probes since these versions of node didn't distinguish between the * types used in these probes. */ translator node_http_request_t { url = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->url, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->url, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->url, sizeof (uint64_t)))); method = (*(uint32_t *)copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v0_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v0_t *)nd)->method, sizeof (uint64_t)))) : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->method, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)->method, sizeof (uint64_t)))); forwardedFor = (*(uint32_t *) copyin((uintptr_t)(uint32_t *)nd, sizeof (uint32_t))) >= 4096 ? "" : (curpsinfo->pr_dmodel == PR_MODEL_ILP32 ? copyinstr(*(uint32_t *)copyin((uintptr_t) &((node_dtrace_http_server_request_v1_t *)nd)->forwardedFor, sizeof (uint32_t))) : copyinstr(*(uint64_t *)copyin((uintptr_t) &((node_dtrace_http_server_request64_v1_t *)nd)-> forwardedFor, sizeof (uint64_t)))); }; iojs-v1.0.2-darwin-x64/include/node/000755 000766 000024 00000000000 12456115120 017222 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/include/node/android-ifaddrs.h000644 000766 000024 00000003475 12455173732 022452 0ustar00iojsstaff000000 000000 /* * Copyright (c) 1995, 1999 * Berkeley Software Design, Inc. All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * * THIS SOFTWARE IS PROVIDED BY Berkeley Software Design, Inc. ``AS IS'' AND * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE * ARE DISCLAIMED. IN NO EVENT SHALL Berkeley Software Design, Inc. 
BE LIABLE * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF * SUCH DAMAGE. * * BSDI ifaddrs.h,v 2.5 2000/02/23 14:51:59 dab Exp */ #ifndef _IFADDRS_H_ #define _IFADDRS_H_ struct ifaddrs { struct ifaddrs *ifa_next; char *ifa_name; unsigned int ifa_flags; struct sockaddr *ifa_addr; struct sockaddr *ifa_netmask; struct sockaddr *ifa_dstaddr; void *ifa_data; }; /* * This may have been defined in . Note that if is * to be included it must be included before this header file. */ #ifndef ifa_broadaddr #define ifa_broadaddr ifa_dstaddr /* broadcast address interface */ #endif #include __BEGIN_DECLS extern int getifaddrs(struct ifaddrs **ifap); extern void freeifaddrs(struct ifaddrs *ifa); __END_DECLS #endif iojs-v1.0.2-darwin-x64/include/node/ares.h000644 000766 000024 00000052552 12455173731 020351 0ustar00iojsstaff000000 000000 /* Copyright 1998 by the Massachusetts Institute of Technology. * Copyright (C) 2007-2013 by Daniel Stenberg * * Permission to use, copy, modify, and distribute this * software and its documentation for any purpose and without * fee is hereby granted, provided that the above copyright * notice appear in all copies and that both that copyright * notice and this permission notice appear in supporting * documentation, and that the name of M.I.T. not be used in * advertising or publicity pertaining to distribution of the * software without specific, written prior permission. * M.I.T. makes no representations about the suitability of * this software for any purpose. It is provided "as is" * without express or implied warranty. */ #ifndef ARES__H #define ARES__H #include "ares_version.h" /* c-ares version defines */ /* * Define WIN32 when build target is Win32 API */ #if (defined(_WIN32) || defined(__WIN32__)) && \ !defined(WIN32) && !defined(__SYMBIAN32__) # define WIN32 #endif /*************************** libuv patch ***************/ /* * We want to avoid autoconf altogether since there are a finite number of * operating systems and simply build c-ares. Therefore we do not want the * configurations provided by ares_build.h since we are always statically * linking c-ares into libuv. Having a system dependent ares_build.h forces * all users of ares.h to include the correct ares_build.h. We do not care * about the linking checks provided by ares_rules.h. This would complicate * the libuv build process. */ #if defined(WIN32) /* Configure process defines this to 1 when it finds out that system */ /* header file ws2tcpip.h must be included by the external interface. */ /* #undef CARES_PULL_WS2TCPIP_H */ # include # include # include #else /* Not Windows */ # include # include # include #endif #if 0 /* The size of `long', as computed by sizeof. */ #define CARES_SIZEOF_LONG 4 #endif /* Integral data type used for ares_socklen_t. */ #define CARES_TYPEOF_ARES_SOCKLEN_T socklen_t #if 0 /* The size of `ares_socklen_t', as computed by sizeof. */ #define CARES_SIZEOF_ARES_SOCKLEN_T 4 #endif /* Data type definition of ares_socklen_t. 
*/ typedef int ares_socklen_t; #if 0 /* libuv disabled */ #include "ares_rules.h" /* c-ares rules enforcement */ #endif /*********************** end libuv patch ***************/ #include /* HP-UX systems version 9, 10 and 11 lack sys/select.h and so does oldish libc5-based Linux systems. Only include it on system that are known to require it! */ #if defined(_AIX) || defined(__NOVELL_LIBC__) || defined(__NetBSD__) || \ defined(__minix) || defined(__SYMBIAN32__) || defined(__INTEGRITY) || \ defined(ANDROID) || defined(__ANDROID__) #include #endif #if (defined(NETWARE) && !defined(__NOVELL_LIBC__)) #include #endif #if defined(WATT32) # include # include # include #elif defined(_WIN32_WCE) # ifndef WIN32_LEAN_AND_MEAN # define WIN32_LEAN_AND_MEAN # endif # include # include #elif defined(WIN32) # ifndef WIN32_LEAN_AND_MEAN # define WIN32_LEAN_AND_MEAN # endif # include # include # include #else # include # include #endif #ifdef __cplusplus extern "C" { #endif /* ** c-ares external API function linkage decorations. */ #ifdef CARES_STATICLIB # define CARES_EXTERN #elif defined(WIN32) || defined(_WIN32) || defined(__SYMBIAN32__) # if defined(CARES_BUILDING_LIBRARY) # define CARES_EXTERN __declspec(dllexport) # else # define CARES_EXTERN __declspec(dllimport) # endif #elif defined(CARES_BUILDING_LIBRARY) && defined(CARES_SYMBOL_HIDING) # define CARES_EXTERN CARES_SYMBOL_SCOPE_EXTERN #else # define CARES_EXTERN #endif #define ARES_SUCCESS 0 /* Server error codes (ARES_ENODATA indicates no relevant answer) */ #define ARES_ENODATA 1 #define ARES_EFORMERR 2 #define ARES_ESERVFAIL 3 #define ARES_ENOTFOUND 4 #define ARES_ENOTIMP 5 #define ARES_EREFUSED 6 /* Locally generated error codes */ #define ARES_EBADQUERY 7 #define ARES_EBADNAME 8 #define ARES_EBADFAMILY 9 #define ARES_EBADRESP 10 #define ARES_ECONNREFUSED 11 #define ARES_ETIMEOUT 12 #define ARES_EOF 13 #define ARES_EFILE 14 #define ARES_ENOMEM 15 #define ARES_EDESTRUCTION 16 #define ARES_EBADSTR 17 /* ares_getnameinfo error codes */ #define ARES_EBADFLAGS 18 /* ares_getaddrinfo error codes */ #define ARES_ENONAME 19 #define ARES_EBADHINTS 20 /* Uninitialized library error code */ #define ARES_ENOTINITIALIZED 21 /* introduced in 1.7.0 */ /* ares_library_init error codes */ #define ARES_ELOADIPHLPAPI 22 /* introduced in 1.7.0 */ #define ARES_EADDRGETNETWORKPARAMS 23 /* introduced in 1.7.0 */ /* More error codes */ #define ARES_ECANCELLED 24 /* introduced in 1.7.0 */ /* Flag values */ #define ARES_FLAG_USEVC (1 << 0) #define ARES_FLAG_PRIMARY (1 << 1) #define ARES_FLAG_IGNTC (1 << 2) #define ARES_FLAG_NORECURSE (1 << 3) #define ARES_FLAG_STAYOPEN (1 << 4) #define ARES_FLAG_NOSEARCH (1 << 5) #define ARES_FLAG_NOALIASES (1 << 6) #define ARES_FLAG_NOCHECKRESP (1 << 7) #define ARES_FLAG_EDNS (1 << 8) /* Option mask values */ #define ARES_OPT_FLAGS (1 << 0) #define ARES_OPT_TIMEOUT (1 << 1) #define ARES_OPT_TRIES (1 << 2) #define ARES_OPT_NDOTS (1 << 3) #define ARES_OPT_UDP_PORT (1 << 4) #define ARES_OPT_TCP_PORT (1 << 5) #define ARES_OPT_SERVERS (1 << 6) #define ARES_OPT_DOMAINS (1 << 7) #define ARES_OPT_LOOKUPS (1 << 8) #define ARES_OPT_SOCK_STATE_CB (1 << 9) #define ARES_OPT_SORTLIST (1 << 10) #define ARES_OPT_SOCK_SNDBUF (1 << 11) #define ARES_OPT_SOCK_RCVBUF (1 << 12) #define ARES_OPT_TIMEOUTMS (1 << 13) #define ARES_OPT_ROTATE (1 << 14) #define ARES_OPT_EDNSPSZ (1 << 15) /* Nameinfo flag values */ #define ARES_NI_NOFQDN (1 << 0) #define ARES_NI_NUMERICHOST (1 << 1) #define ARES_NI_NAMEREQD (1 << 2) #define ARES_NI_NUMERICSERV (1 << 3) #define 
ARES_NI_DGRAM (1 << 4) #define ARES_NI_TCP 0 #define ARES_NI_UDP ARES_NI_DGRAM #define ARES_NI_SCTP (1 << 5) #define ARES_NI_DCCP (1 << 6) #define ARES_NI_NUMERICSCOPE (1 << 7) #define ARES_NI_LOOKUPHOST (1 << 8) #define ARES_NI_LOOKUPSERVICE (1 << 9) /* Reserved for future use */ #define ARES_NI_IDN (1 << 10) #define ARES_NI_IDN_ALLOW_UNASSIGNED (1 << 11) #define ARES_NI_IDN_USE_STD3_ASCII_RULES (1 << 12) /* Addrinfo flag values */ #define ARES_AI_CANONNAME (1 << 0) #define ARES_AI_NUMERICHOST (1 << 1) #define ARES_AI_PASSIVE (1 << 2) #define ARES_AI_NUMERICSERV (1 << 3) #define ARES_AI_V4MAPPED (1 << 4) #define ARES_AI_ALL (1 << 5) #define ARES_AI_ADDRCONFIG (1 << 6) /* Reserved for future use */ #define ARES_AI_IDN (1 << 10) #define ARES_AI_IDN_ALLOW_UNASSIGNED (1 << 11) #define ARES_AI_IDN_USE_STD3_ASCII_RULES (1 << 12) #define ARES_AI_CANONIDN (1 << 13) #define ARES_AI_MASK (ARES_AI_CANONNAME|ARES_AI_NUMERICHOST|ARES_AI_PASSIVE| \ ARES_AI_NUMERICSERV|ARES_AI_V4MAPPED|ARES_AI_ALL| \ ARES_AI_ADDRCONFIG) #define ARES_GETSOCK_MAXNUM 16 /* ares_getsock() can return info about this many sockets */ #define ARES_GETSOCK_READABLE(bits,num) (bits & (1<< (num))) #define ARES_GETSOCK_WRITABLE(bits,num) (bits & (1 << ((num) + \ ARES_GETSOCK_MAXNUM))) /* c-ares library initialization flag values */ #define ARES_LIB_INIT_NONE (0) #define ARES_LIB_INIT_WIN32 (1 << 0) #define ARES_LIB_INIT_ALL (ARES_LIB_INIT_WIN32) /* * Typedef our socket type */ #ifndef ares_socket_typedef #ifdef WIN32 typedef SOCKET ares_socket_t; #define ARES_SOCKET_BAD INVALID_SOCKET #else typedef int ares_socket_t; #define ARES_SOCKET_BAD -1 #endif #define ares_socket_typedef #endif /* ares_socket_typedef */ typedef void (*ares_sock_state_cb)(void *data, ares_socket_t socket_fd, int readable, int writable); struct apattern; /* NOTE about the ares_options struct to users and developers. This struct will remain looking like this. It will not be extended nor shrunk in future releases, but all new options will be set by ares_set_*() options instead of with the ares_init_options() function. Eventually (in a galaxy far far away), all options will be settable by ares_set_*() options and the ares_init_options() function will become deprecated. When new options are added to c-ares, they are not added to this struct. And they are not "saved" with the ares_save_options() function but instead we encourage the use of the ares_dup() function. Needless to say, if you add config options to c-ares you need to make sure ares_dup() duplicates this new option. 
*/ struct ares_options { int flags; int timeout; /* in seconds or milliseconds, depending on options */ int tries; int ndots; unsigned short udp_port; unsigned short tcp_port; int socket_send_buffer_size; int socket_receive_buffer_size; struct in_addr *servers; int nservers; char **domains; int ndomains; char *lookups; ares_sock_state_cb sock_state_cb; void *sock_state_cb_data; struct apattern *sortlist; int nsort; int ednspsz; }; struct hostent; struct timeval; struct sockaddr; struct ares_channeldata; typedef struct ares_channeldata *ares_channel; typedef void (*ares_callback)(void *arg, int status, int timeouts, unsigned char *abuf, int alen); typedef void (*ares_host_callback)(void *arg, int status, int timeouts, struct hostent *hostent); typedef void (*ares_nameinfo_callback)(void *arg, int status, int timeouts, char *node, char *service); typedef int (*ares_sock_create_callback)(ares_socket_t socket_fd, int type, void *data); CARES_EXTERN int ares_library_init(int flags); CARES_EXTERN void ares_library_cleanup(void); CARES_EXTERN const char *ares_version(int *version); CARES_EXTERN int ares_init(ares_channel *channelptr); CARES_EXTERN int ares_init_options(ares_channel *channelptr, struct ares_options *options, int optmask); CARES_EXTERN int ares_save_options(ares_channel channel, struct ares_options *options, int *optmask); CARES_EXTERN void ares_destroy_options(struct ares_options *options); CARES_EXTERN int ares_dup(ares_channel *dest, ares_channel src); CARES_EXTERN void ares_destroy(ares_channel channel); CARES_EXTERN void ares_cancel(ares_channel channel); /* These next 3 configure local binding for the out-going socket * connection. Use these to specify source IP and/or network device * on multi-homed systems. */ CARES_EXTERN void ares_set_local_ip4(ares_channel channel, unsigned int local_ip); /* local_ip6 should be 16 bytes in length */ CARES_EXTERN void ares_set_local_ip6(ares_channel channel, const unsigned char* local_ip6); /* local_dev_name should be null terminated. 
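// Example (not part of ares.h): a minimal sketch of bringing a channel up
// with ares_init_options(), using only options declared above; the timeout
// and retry values are illustrative. The caller would pair this with
// ares_destroy() and ares_library_cleanup().
#include <string.h>
#include <ares.h>

static int open_channel(ares_channel *out) {
  struct ares_options opts;
  int optmask = ARES_OPT_TIMEOUTMS | ARES_OPT_TRIES;
  memset(&opts, 0, sizeof(opts));
  opts.timeout = 3000;        /* milliseconds, because ARES_OPT_TIMEOUTMS is set */
  opts.tries = 2;
  int rc = ares_library_init(ARES_LIB_INIT_ALL);
  if (rc != ARES_SUCCESS)
    return rc;
  return ares_init_options(out, &opts, optmask);
}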
*/ CARES_EXTERN void ares_set_local_dev(ares_channel channel, const char* local_dev_name); CARES_EXTERN void ares_set_socket_callback(ares_channel channel, ares_sock_create_callback callback, void *user_data); CARES_EXTERN void ares_send(ares_channel channel, const unsigned char *qbuf, int qlen, ares_callback callback, void *arg); CARES_EXTERN void ares_query(ares_channel channel, const char *name, int dnsclass, int type, ares_callback callback, void *arg); CARES_EXTERN void ares_search(ares_channel channel, const char *name, int dnsclass, int type, ares_callback callback, void *arg); CARES_EXTERN void ares_gethostbyname(ares_channel channel, const char *name, int family, ares_host_callback callback, void *arg); CARES_EXTERN int ares_gethostbyname_file(ares_channel channel, const char *name, int family, struct hostent **host); CARES_EXTERN void ares_gethostbyaddr(ares_channel channel, const void *addr, int addrlen, int family, ares_host_callback callback, void *arg); CARES_EXTERN void ares_getnameinfo(ares_channel channel, const struct sockaddr *sa, ares_socklen_t salen, int flags, ares_nameinfo_callback callback, void *arg); CARES_EXTERN int ares_fds(ares_channel channel, fd_set *read_fds, fd_set *write_fds); CARES_EXTERN int ares_getsock(ares_channel channel, ares_socket_t *socks, int numsocks); CARES_EXTERN struct timeval *ares_timeout(ares_channel channel, struct timeval *maxtv, struct timeval *tv); CARES_EXTERN void ares_process(ares_channel channel, fd_set *read_fds, fd_set *write_fds); CARES_EXTERN void ares_process_fd(ares_channel channel, ares_socket_t read_fd, ares_socket_t write_fd); CARES_EXTERN int ares_create_query(const char *name, int dnsclass, int type, unsigned short id, int rd, unsigned char **buf, int *buflen, int max_udp_size); CARES_EXTERN int ares_mkquery(const char *name, int dnsclass, int type, unsigned short id, int rd, unsigned char **buf, int *buflen); CARES_EXTERN int ares_expand_name(const unsigned char *encoded, const unsigned char *abuf, int alen, char **s, long *enclen); CARES_EXTERN int ares_expand_string(const unsigned char *encoded, const unsigned char *abuf, int alen, unsigned char **s, long *enclen); /* * NOTE: before c-ares 1.7.0 we would most often use the system in6_addr * struct below when ares itself was built, but many apps would use this * private version since the header checked a HAVE_* define for it. Starting * with 1.7.0 we always declare and use our own to stop relying on the * system's one. 
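// Example (not part of ares.h): the classic polling loop built from
// ares_fds(), ares_timeout(), select() and ares_process() declared above.
// It runs until no queries remain pending; wait_for_channel is an
// illustrative name.
#include <sys/select.h>
#include <ares.h>

static void wait_for_channel(ares_channel channel) {
  for (;;) {
    fd_set readers, writers;
    FD_ZERO(&readers);
    FD_ZERO(&writers);
    int nfds = ares_fds(channel, &readers, &writers);
    if (nfds == 0)
      break;                                      /* nothing left in flight */
    struct timeval tv;
    struct timeval *tvp = ares_timeout(channel, NULL, &tv);
    select(nfds, &readers, &writers, NULL, tvp);
    ares_process(channel, &readers, &writers);
  }
}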
*/ struct ares_in6_addr { union { unsigned char _S6_u8[16]; } _S6_un; }; struct ares_addrttl { struct in_addr ipaddr; int ttl; }; struct ares_addr6ttl { struct ares_in6_addr ip6addr; int ttl; }; struct ares_srv_reply { struct ares_srv_reply *next; char *host; unsigned short priority; unsigned short weight; unsigned short port; }; struct ares_mx_reply { struct ares_mx_reply *next; char *host; unsigned short priority; }; struct ares_txt_reply { struct ares_txt_reply *next; unsigned char *txt; size_t length; /* length excludes null termination */ unsigned char record_start; /* 1 - if start of new record * 0 - if a chunk in the same record */ }; struct ares_naptr_reply { struct ares_naptr_reply *next; unsigned char *flags; unsigned char *service; unsigned char *regexp; char *replacement; unsigned short order; unsigned short preference; }; struct ares_soa_reply { char *nsname; char *hostmaster; unsigned int serial; unsigned int refresh; unsigned int retry; unsigned int expire; unsigned int minttl; }; /* ** Parse the buffer, starting at *abuf and of length alen bytes, previously ** obtained from an ares_search call. Put the results in *host, if nonnull. ** Also, if addrttls is nonnull, put up to *naddrttls IPv4 addresses along with ** their TTLs in that array, and set *naddrttls to the number of addresses ** so written. */ CARES_EXTERN int ares_parse_a_reply(const unsigned char *abuf, int alen, struct hostent **host, struct ares_addrttl *addrttls, int *naddrttls); CARES_EXTERN int ares_parse_aaaa_reply(const unsigned char *abuf, int alen, struct hostent **host, struct ares_addr6ttl *addrttls, int *naddrttls); CARES_EXTERN int ares_parse_ptr_reply(const unsigned char *abuf, int alen, const void *addr, int addrlen, int family, struct hostent **host); CARES_EXTERN int ares_parse_ns_reply(const unsigned char *abuf, int alen, struct hostent **host); CARES_EXTERN int ares_parse_srv_reply(const unsigned char* abuf, int alen, struct ares_srv_reply** srv_out); CARES_EXTERN int ares_parse_mx_reply(const unsigned char* abuf, int alen, struct ares_mx_reply** mx_out); CARES_EXTERN int ares_parse_txt_reply(const unsigned char* abuf, int alen, struct ares_txt_reply** txt_out); CARES_EXTERN int ares_parse_naptr_reply(const unsigned char* abuf, int alen, struct ares_naptr_reply** naptr_out); CARES_EXTERN int ares_parse_soa_reply(const unsigned char* abuf, int alen, struct ares_soa_reply** soa_out); CARES_EXTERN void ares_free_string(void *str); CARES_EXTERN void ares_free_hostent(struct hostent *host); CARES_EXTERN void ares_free_data(void *dataptr); CARES_EXTERN const char *ares_strerror(int code); /* TODO: Hold port here as well. */ struct ares_addr_node { struct ares_addr_node *next; int family; union { struct in_addr addr4; struct ares_in6_addr addr6; } addr; }; CARES_EXTERN int ares_set_servers(ares_channel channel, struct ares_addr_node *servers); /* Incomming string format: host[:port][,host[:port]]... 
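// Example (not part of ares.h): a hedged sketch of an ares_callback that
// decodes an A-record answer with ares_parse_a_reply() and prints the TTLs.
// The query itself would have been issued earlier with ares_query() or
// ares_search(); on_a_reply is an illustrative name.
#include <stdio.h>
#include <ares.h>

static void on_a_reply(void *arg, int status, int timeouts,
                       unsigned char *abuf, int alen) {
  (void) arg;
  (void) timeouts;
  if (status != ARES_SUCCESS) {
    fprintf(stderr, "lookup failed: %s\n", ares_strerror(status));
    return;
  }
  struct hostent *host = NULL;
  struct ares_addrttl ttls[8];
  int nttls = 8;                        /* in: capacity, out: entries written */
  if (ares_parse_a_reply(abuf, alen, &host, ttls, &nttls) == ARES_SUCCESS) {
    for (int i = 0; i < nttls; i++)
      printf("answer %d: ttl=%d\n", i, ttls[i].ttl);
    ares_free_hostent(host);            /* host was allocated by the parser */
  }
}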
*/ CARES_EXTERN int ares_set_servers_csv(ares_channel channel, const char* servers); CARES_EXTERN int ares_get_servers(ares_channel channel, struct ares_addr_node **servers); CARES_EXTERN const char *ares_inet_ntop(int af, const void *src, char *dst, ares_socklen_t size); CARES_EXTERN int ares_inet_pton(int af, const char *src, void *dst); #ifdef __cplusplus } #endif #endif /* ARES__H */ iojs-v1.0.2-darwin-x64/include/node/ares_version.h000644 000766 000024 00000001214 12455173731 022103 0ustar00iojsstaff000000 000000 #ifndef ARES__VERSION_H #define ARES__VERSION_H /* This is the global package copyright */ #define ARES_COPYRIGHT "2004 - 2013 Daniel Stenberg, ." #define ARES_VERSION_MAJOR 1 #define ARES_VERSION_MINOR 10 #define ARES_VERSION_PATCH 0 #define ARES_VERSION ((ARES_VERSION_MAJOR<<16)|\ (ARES_VERSION_MINOR<<8)|\ (ARES_VERSION_PATCH)) #define ARES_VERSION_STR "1.10.0-DEV" #if (ARES_VERSION >= 0x010700) # define CARES_HAVE_ARES_LIBRARY_INIT 1 # define CARES_HAVE_ARES_LIBRARY_CLEANUP 1 #else # undef CARES_HAVE_ARES_LIBRARY_INIT # undef CARES_HAVE_ARES_LIBRARY_CLEANUP #endif #endif iojs-v1.0.2-darwin-x64/include/node/common.gypi000644 000766 000024 00000022505 12455173731 021423 0ustar00iojsstaff000000 000000 { 'variables': { 'asan%': 0, 'werror': '', # Turn off -Werror in V8 build. 'visibility%': 'hidden', # V8's visibility setting 'target_arch%': 'ia32', # set v8's target architecture 'host_arch%': 'ia32', # set v8's host architecture 'want_separate_host_toolset%': 0, # V8 should not build target and host 'library%': 'static_library', # allow override to 'shared_library' for DLL/.so builds 'component%': 'static_library', # NB. these names match with what V8 expects 'msvs_multi_core_compile': '0', # we do enable multicore compiles, but not using the V8 way 'python%': 'python', 'node_tag%': '', 'uv_library%': 'static_library', # Default to -O0 for debug builds. 'v8_optimized_debug%': 0, # Enable disassembler for `--print-code` v8 options 'v8_enable_disassembler': 1, # Don't bake anything extra into the snapshot. 'v8_use_external_startup_data%': 0, # Disable V8's post-mortem debugging; frequently broken and hardly used. 
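// Example (not part of ares_version.h): ARES_VERSION packs major/minor/patch
// into one integer ((major<<16)|(minor<<8)|patch), and ares_version() reports
// the same information at run time; print_cares_version is illustrative.
#include <stdio.h>
#include <ares.h>

static void print_cares_version(void) {
  int packed = 0;
  const char *str = ares_version(&packed);    /* e.g. "1.10.0-DEV" */
  printf("c-ares %s (%d.%d.%d)\n", str,
         (packed >> 16) & 0xff, (packed >> 8) & 0xff, packed & 0xff);
}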
'v8_postmortem_support%': 'false', 'conditions': [ ['OS == "win"', { 'os_posix': 0, }, { 'os_posix': 1, }], ['GENERATOR == "ninja" or OS== "mac"', { 'OBJ_DIR': '<(PRODUCT_DIR)/obj', 'V8_BASE': '<(PRODUCT_DIR)/libv8_base.a', }, { 'OBJ_DIR': '<(PRODUCT_DIR)/obj.target', 'V8_BASE': '<(PRODUCT_DIR)/obj.target/deps/v8/tools/gyp/libv8_base.a', }], ['OS=="mac"', { 'clang%': 1, }, { 'clang%': 0, }], ], }, 'target_defaults': { 'default_configuration': 'Release', 'configurations': { 'Debug': { 'variables': { 'v8_enable_handle_zapping%': 1, }, 'defines': [ 'DEBUG', '_DEBUG' ], 'cflags': [ '-g', '-O0' ], 'conditions': [ ['target_arch=="x64"', { 'msvs_configuration_platform': 'x64', }], ], 'msvs_settings': { 'VCCLCompilerTool': { 'RuntimeLibrary': 1, # static debug 'Optimization': 0, # /Od, no optimization 'MinimalRebuild': 'false', 'OmitFramePointers': 'false', 'BasicRuntimeChecks': 3, # /RTC1 }, 'VCLinkerTool': { 'LinkIncremental': 2, # enable incremental linking }, }, 'xcode_settings': { 'GCC_OPTIMIZATION_LEVEL': '0', # stop gyp from defaulting to -Os }, }, 'Release': { 'variables': { 'v8_enable_handle_zapping%': 0, }, 'cflags': [ '-O3', '-ffunction-sections', '-fdata-sections' ], 'conditions': [ ['target_arch=="x64"', { 'msvs_configuration_platform': 'x64', }], ['OS=="solaris"', { # pull in V8's postmortem metadata 'ldflags': [ '-Wl,-z,allextract' ] }], ['OS!="mac" and OS!="win"', { 'cflags': [ '-fno-omit-frame-pointer' ], }], ], 'msvs_settings': { 'VCCLCompilerTool': { 'RuntimeLibrary': 0, # static release 'Optimization': 3, # /Ox, full optimization 'FavorSizeOrSpeed': 1, # /Ot, favour speed over size 'InlineFunctionExpansion': 2, # /Ob2, inline anything eligible 'WholeProgramOptimization': 'true', # /GL, whole program optimization, needed for LTCG 'OmitFramePointers': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'RuntimeTypeInfo': 'false', 'AdditionalOptions': [ '/MP', # compile across multiple CPUs ], }, 'VCLibrarianTool': { 'AdditionalOptions': [ '/LTCG', # link time code generation ], }, 'VCLinkerTool': { 'LinkTimeCodeGeneration': 1, # link-time code generation 'OptimizeReferences': 2, # /OPT:REF 'EnableCOMDATFolding': 2, # /OPT:ICF 'LinkIncremental': 1, # disable incremental linking }, }, } }, # Forcibly disable -Werror. We support a wide range of compilers, it's # simply not feasible to squelch all warnings, never mind that the # libraries in deps/ are not under our control. 
'cflags!': ['-Werror'], 'msvs_settings': { 'VCCLCompilerTool': { 'StringPooling': 'true', # pool string literals 'DebugInformationFormat': 3, # Generate a PDB 'WarningLevel': 3, 'BufferSecurityCheck': 'true', 'ExceptionHandling': 0, # /EHsc 'SuppressStartupBanner': 'true', 'WarnAsError': 'false', }, 'VCLibrarianTool': { }, 'VCLinkerTool': { 'conditions': [ ['target_arch=="x64"', { 'TargetMachine' : 17 # /MACHINE:X64 }], ], 'GenerateDebugInformation': 'true', 'RandomizedBaseAddress': 2, # enable ASLR 'DataExecutionPrevention': 2, # enable DEP 'AllowIsolation': 'true', 'SuppressStartupBanner': 'true', 'target_conditions': [ ['_type=="executable"', { 'SubSystem': 1, # console executable }], ], }, }, 'msvs_disabled_warnings': [4351, 4355, 4800], 'conditions': [ ['asan != 0', { 'cflags+': [ '-fno-omit-frame-pointer', '-fsanitize=address', '-w', # http://crbug.com/162783 ], 'cflags_cc+': [ '-gline-tables-only' ], 'cflags!': [ '-fomit-frame-pointer' ], 'ldflags': [ '-fsanitize=address' ], }], ['OS == "win"', { 'msvs_cygwin_shell': 0, # prevent actions from trying to use cygwin 'defines': [ 'WIN32', # we don't really want VC++ warning us about # how dangerous C functions are... '_CRT_SECURE_NO_DEPRECATE', # ... or that C implementations shouldn't use # POSIX names '_CRT_NONSTDC_NO_DEPRECATE', # Make sure the STL doesn't try to use exceptions '_HAS_EXCEPTIONS=0', 'BUILDING_V8_SHARED=1', 'BUILDING_UV_SHARED=1', ], }], [ 'OS in "linux freebsd openbsd solaris"', { 'cflags': [ '-pthread', ], 'ldflags': [ '-pthread' ], }], [ 'OS in "linux freebsd openbsd solaris android"', { 'cflags': [ '-Wall', '-Wextra', '-Wno-unused-parameter', ], 'cflags_cc': [ '-fno-rtti', '-fno-exceptions', '-std=gnu++0x' ], 'ldflags': [ '-rdynamic' ], 'target_conditions': [ ['_type=="static_library"', { 'standalone_static_library': 1, # disable thin archive which needs binutils >= 2.19 }], ], 'conditions': [ [ 'target_arch=="ia32"', { 'cflags': [ '-m32' ], 'ldflags': [ '-m32' ], }], [ 'target_arch=="x32"', { 'cflags': [ '-mx32' ], 'ldflags': [ '-mx32' ], }], [ 'target_arch=="x64"', { 'cflags': [ '-m64' ], 'ldflags': [ '-m64' ], }], [ 'OS=="solaris"', { 'cflags': [ '-pthreads' ], 'ldflags': [ '-pthreads' ], 'cflags!': [ '-pthread' ], 'ldflags!': [ '-pthread' ], }], ], }], [ 'OS=="android"', { 'defines': ['_GLIBCXX_USE_C99_MATH'], 'libraries': [ '-llog' ], }], ['OS=="mac"', { 'defines': ['_DARWIN_USE_64_BIT_INODE=1'], 'xcode_settings': { 'ALWAYS_SEARCH_USER_PATHS': 'NO', 'GCC_CW_ASM_SYNTAX': 'NO', # No -fasm-blocks 'GCC_DYNAMIC_NO_PIC': 'NO', # No -mdynamic-no-pic # (Equivalent to -fPIC) 'GCC_ENABLE_CPP_EXCEPTIONS': 'NO', # -fno-exceptions 'GCC_ENABLE_CPP_RTTI': 'NO', # -fno-rtti 'GCC_ENABLE_PASCAL_STRINGS': 'NO', # No -mpascal-strings 'GCC_THREADSAFE_STATICS': 'NO', # -fno-threadsafe-statics 'PREBINDING': 'NO', # No -Wl,-prebind 'MACOSX_DEPLOYMENT_TARGET': '10.5', # -mmacosx-version-min=10.5 'USE_HEADERMAP': 'NO', 'OTHER_CFLAGS': [ '-fno-strict-aliasing', ], 'WARNING_CFLAGS': [ '-Wall', '-Wendif-labels', '-W', '-Wno-unused-parameter', ], }, 'target_conditions': [ ['_type!="static_library"', { 'xcode_settings': {'OTHER_LDFLAGS': ['-Wl,-search_paths_first']}, }], ], 'conditions': [ ['target_arch=="ia32"', { 'xcode_settings': {'ARCHS': ['i386']}, }], ['target_arch=="x64"', { 'xcode_settings': {'ARCHS': ['x86_64']}, }], ['clang==1', { 'xcode_settings': { 'GCC_VERSION': 'com.apple.compilers.llvm.clang.1_0', 'CLANG_CXX_LANGUAGE_STANDARD': 'gnu++0x', # -std=gnu++0x }, }], ], }], ['OS=="freebsd" and node_use_dtrace=="true"', { 
'libraries': [ '-lelf' ], }], ['OS=="freebsd"', { 'ldflags': [ '-Wl,--export-dynamic', ], }] ], } } iojs-v1.0.2-darwin-x64/include/node/config.gypi000644 000766 000024 00000002716 12456114616 021400 0ustar00iojsstaff000000 000000 # Do not edit. Generated by the configure script. { 'target_defaults': { 'cflags': [], 'default_configuration': 'Release', 'defines': [], 'include_dirs': [], 'libraries': []}, 'variables': { 'host_arch': 'x64', 'icu_small': 'false', 'node_install_npm': 'true', 'node_prefix': '/', 'node_shared_http_parser': 'false', 'node_shared_libuv': 'false', 'node_shared_openssl': 'false', 'node_shared_v8': 'false', 'node_shared_zlib': 'false', 'node_tag': '', 'node_use_dtrace': 'true', 'node_use_etw': 'false', 'node_use_mdb': 'false', 'node_use_openssl': 'true', 'node_use_perfctr': 'false', 'openssl_no_asm': 0, 'python': '/usr/bin/python', 'target_arch': 'x64', 'uv_library': 'static_library', 'uv_parent_path': '/deps/uv/', 'uv_use_dtrace': 'true', 'v8_enable_gdbjit': 0, 'v8_enable_i18n_support': 0, 'v8_no_strict_aliasing': 1, 'v8_optimized_debug': 0, 'v8_random_seed': 0, 'v8_use_snapshot': 'false', 'want_separate_host_toolset': 0}} iojs-v1.0.2-darwin-x64/include/node/libplatform/000755 000766 000024 00000000000 12456115120 021535 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/include/node/nameser.h000644 000766 000024 00000020365 12455173731 021046 0ustar00iojsstaff000000 000000 #ifndef ARES_NAMESER_H #define ARES_NAMESER_H /* header file provided by liren@vivisimo.com */ #ifndef HAVE_ARPA_NAMESER_H #define NS_PACKETSZ 512 /* maximum packet size */ #define NS_MAXDNAME 256 /* maximum domain name */ #define NS_MAXCDNAME 255 /* maximum compressed domain name */ #define NS_MAXLABEL 63 #define NS_HFIXEDSZ 12 /* #/bytes of fixed data in header */ #define NS_QFIXEDSZ 4 /* #/bytes of fixed data in query */ #define NS_RRFIXEDSZ 10 /* #/bytes of fixed data in r record */ #define NS_INT16SZ 2 #define NS_INADDRSZ 4 #define NS_IN6ADDRSZ 16 #define NS_CMPRSFLGS 0xc0 /* Flag bits indicating name compression. */ #define NS_DEFAULTPORT 53 /* For both TCP and UDP. */ typedef enum __ns_class { ns_c_invalid = 0, /* Cookie. */ ns_c_in = 1, /* Internet. */ ns_c_2 = 2, /* unallocated/unsupported. */ ns_c_chaos = 3, /* MIT Chaos-net. */ ns_c_hs = 4, /* MIT Hesiod. */ /* Query class values which do not appear in resource records */ ns_c_none = 254, /* for prereq. sections in update requests */ ns_c_any = 255, /* Wildcard match. */ ns_c_max = 65536 } ns_class; typedef enum __ns_type { ns_t_invalid = 0, /* Cookie. */ ns_t_a = 1, /* Host address. */ ns_t_ns = 2, /* Authoritative server. */ ns_t_md = 3, /* Mail destination. */ ns_t_mf = 4, /* Mail forwarder. */ ns_t_cname = 5, /* Canonical name. */ ns_t_soa = 6, /* Start of authority zone. */ ns_t_mb = 7, /* Mailbox domain name. */ ns_t_mg = 8, /* Mail group member. */ ns_t_mr = 9, /* Mail rename name. */ ns_t_null = 10, /* Null resource record. */ ns_t_wks = 11, /* Well known service. */ ns_t_ptr = 12, /* Domain name pointer. */ ns_t_hinfo = 13, /* Host information. */ ns_t_minfo = 14, /* Mailbox information. */ ns_t_mx = 15, /* Mail routing information. */ ns_t_txt = 16, /* Text strings. */ ns_t_rp = 17, /* Responsible person. */ ns_t_afsdb = 18, /* AFS cell database. */ ns_t_x25 = 19, /* X_25 calling address. */ ns_t_isdn = 20, /* ISDN calling address. */ ns_t_rt = 21, /* Router. */ ns_t_nsap = 22, /* NSAP address. */ ns_t_nsap_ptr = 23, /* Reverse NSAP lookup (deprecated). */ ns_t_sig = 24, /* Security signature. 
*/ ns_t_key = 25, /* Security key. */ ns_t_px = 26, /* X.400 mail mapping. */ ns_t_gpos = 27, /* Geographical position (withdrawn). */ ns_t_aaaa = 28, /* Ip6 Address. */ ns_t_loc = 29, /* Location Information. */ ns_t_nxt = 30, /* Next domain (security). */ ns_t_eid = 31, /* Endpoint identifier. */ ns_t_nimloc = 32, /* Nimrod Locator. */ ns_t_srv = 33, /* Server Selection. */ ns_t_atma = 34, /* ATM Address */ ns_t_naptr = 35, /* Naming Authority PoinTeR */ ns_t_kx = 36, /* Key Exchange */ ns_t_cert = 37, /* Certification record */ ns_t_a6 = 38, /* IPv6 address (deprecates AAAA) */ ns_t_dname = 39, /* Non-terminal DNAME (for IPv6) */ ns_t_sink = 40, /* Kitchen sink (experimentatl) */ ns_t_opt = 41, /* EDNS0 option (meta-RR) */ ns_t_apl = 42, /* Address prefix list (RFC3123) */ ns_t_ds = 43, /* Delegation Signer (RFC4034) */ ns_t_sshfp = 44, /* SSH Key Fingerprint (RFC4255) */ ns_t_rrsig = 46, /* Resource Record Signature (RFC4034) */ ns_t_nsec = 47, /* Next Secure (RFC4034) */ ns_t_dnskey = 48, /* DNS Public Key (RFC4034) */ ns_t_tkey = 249, /* Transaction key */ ns_t_tsig = 250, /* Transaction signature. */ ns_t_ixfr = 251, /* Incremental zone transfer. */ ns_t_axfr = 252, /* Transfer zone of authority. */ ns_t_mailb = 253, /* Transfer mailbox records. */ ns_t_maila = 254, /* Transfer mail agent records. */ ns_t_any = 255, /* Wildcard match. */ ns_t_zxfr = 256, /* BIND-specific, nonstandard. */ ns_t_max = 65536 } ns_type; typedef enum __ns_opcode { ns_o_query = 0, /* Standard query. */ ns_o_iquery = 1, /* Inverse query (deprecated/unsupported). */ ns_o_status = 2, /* Name server status query (unsupported). */ /* Opcode 3 is undefined/reserved. */ ns_o_notify = 4, /* Zone change notification. */ ns_o_update = 5, /* Zone update message. */ ns_o_max = 6 } ns_opcode; typedef enum __ns_rcode { ns_r_noerror = 0, /* No error occurred. */ ns_r_formerr = 1, /* Format error. */ ns_r_servfail = 2, /* Server failure. */ ns_r_nxdomain = 3, /* Name error. */ ns_r_notimpl = 4, /* Unimplemented. */ ns_r_refused = 5, /* Operation refused. 
*/ /* these are for BIND_UPDATE */ ns_r_yxdomain = 6, /* Name exists */ ns_r_yxrrset = 7, /* RRset exists */ ns_r_nxrrset = 8, /* RRset does not exist */ ns_r_notauth = 9, /* Not authoritative for zone */ ns_r_notzone = 10, /* Zone of record different from zone section */ ns_r_max = 11, /* The following are TSIG extended errors */ ns_r_badsig = 16, ns_r_badkey = 17, ns_r_badtime = 18 } ns_rcode; #endif /* HAVE_ARPA_NAMESER_H */ #ifndef HAVE_ARPA_NAMESER_COMPAT_H #define PACKETSZ NS_PACKETSZ #define MAXDNAME NS_MAXDNAME #define MAXCDNAME NS_MAXCDNAME #define MAXLABEL NS_MAXLABEL #define HFIXEDSZ NS_HFIXEDSZ #define QFIXEDSZ NS_QFIXEDSZ #define RRFIXEDSZ NS_RRFIXEDSZ #define INDIR_MASK NS_CMPRSFLGS #define NAMESERVER_PORT NS_DEFAULTPORT #define QUERY ns_o_query #define SERVFAIL ns_r_servfail #define NOTIMP ns_r_notimpl #define REFUSED ns_r_refused #undef NOERROR /* it seems this is already defined in winerror.h */ #define NOERROR ns_r_noerror #define FORMERR ns_r_formerr #define NXDOMAIN ns_r_nxdomain #define C_IN ns_c_in #define C_CHAOS ns_c_chaos #define C_HS ns_c_hs #define C_NONE ns_c_none #define C_ANY ns_c_any #define T_A ns_t_a #define T_NS ns_t_ns #define T_MD ns_t_md #define T_MF ns_t_mf #define T_CNAME ns_t_cname #define T_SOA ns_t_soa #define T_MB ns_t_mb #define T_MG ns_t_mg #define T_MR ns_t_mr #define T_NULL ns_t_null #define T_WKS ns_t_wks #define T_PTR ns_t_ptr #define T_HINFO ns_t_hinfo #define T_MINFO ns_t_minfo #define T_MX ns_t_mx #define T_TXT ns_t_txt #define T_RP ns_t_rp #define T_AFSDB ns_t_afsdb #define T_X25 ns_t_x25 #define T_ISDN ns_t_isdn #define T_RT ns_t_rt #define T_NSAP ns_t_nsap #define T_NSAP_PTR ns_t_nsap_ptr #define T_SIG ns_t_sig #define T_KEY ns_t_key #define T_PX ns_t_px #define T_GPOS ns_t_gpos #define T_AAAA ns_t_aaaa #define T_LOC ns_t_loc #define T_NXT ns_t_nxt #define T_EID ns_t_eid #define T_NIMLOC ns_t_nimloc #define T_SRV ns_t_srv #define T_ATMA ns_t_atma #define T_NAPTR ns_t_naptr #define T_KX ns_t_kx #define T_CERT ns_t_cert #define T_A6 ns_t_a6 #define T_DNAME ns_t_dname #define T_SINK ns_t_sink #define T_OPT ns_t_opt #define T_APL ns_t_apl #define T_DS ns_t_ds #define T_SSHFP ns_t_sshfp #define T_RRSIG ns_t_rrsig #define T_NSEC ns_t_nsec #define T_DNSKEY ns_t_dnskey #define T_TKEY ns_t_tkey #define T_TSIG ns_t_tsig #define T_IXFR ns_t_ixfr #define T_AXFR ns_t_axfr #define T_MAILB ns_t_mailb #define T_MAILA ns_t_maila #define T_ANY ns_t_any #endif /* HAVE_ARPA_NAMESER_COMPAT_H */ #endif /* ARES_NAMESER_H */ iojs-v1.0.2-darwin-x64/include/node/node.h000644 000766 000024 00000041061 12455173734 020340 0ustar00iojsstaff000000 000000 #ifndef SRC_NODE_H_ #define SRC_NODE_H_ #ifdef _WIN32 # ifndef BUILDING_NODE_EXTENSION # define NODE_EXTERN __declspec(dllexport) # else # define NODE_EXTERN __declspec(dllimport) # endif #else # define NODE_EXTERN /* nothing */ #endif #ifdef BUILDING_NODE_EXTENSION # undef BUILDING_V8_SHARED # undef BUILDING_UV_SHARED # define USING_V8_SHARED 1 # define USING_UV_SHARED 1 #endif // This should be defined in make system. 
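// Example (not part of nameser.h): the compatibility macros above (C_IN,
// T_TXT, ...) are the values ares_query() expects for its dnsclass and type
// arguments. query_txt is an illustrative helper and the include path is an
// assumption; the callback has the ordinary ares_callback shape.
#include <ares.h>
#include "nameser.h"    /* this header; the include path is illustrative */

static void query_txt(ares_channel channel, const char *name,
                      ares_callback cb) {
  ares_query(channel, name, C_IN, T_TXT, cb, NULL);
}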
// See issue https://github.com/joyent/node/issues/1236 #if defined(__MINGW32__) || defined(_MSC_VER) #ifndef _WIN32_WINNT # define _WIN32_WINNT 0x0501 #endif #define NOMINMAX #endif #if defined(_MSC_VER) #define PATH_MAX MAX_PATH #endif #ifdef _WIN32 # define SIGKILL 9 #endif #include "v8.h" // NOLINT(build/include_order) #include "node_version.h" // NODE_MODULE_VERSION #define NODE_DEPRECATED(msg, fn) V8_DEPRECATED(msg, fn) // Forward-declare libuv loop struct uv_loop_s; // Forward-declare these functions now to stop MSVS from becoming // terminally confused when it's done in node_internals.h namespace node { NODE_EXTERN v8::Local ErrnoException(v8::Isolate* isolate, int errorno, const char* syscall = NULL, const char* message = NULL, const char* path = NULL); NODE_EXTERN v8::Local UVException(v8::Isolate* isolate, int errorno, const char* syscall = NULL, const char* message = NULL, const char* path = NULL); NODE_DEPRECATED("Use UVException(isolate, ...)", inline v8::Local ErrnoException( int errorno, const char* syscall = NULL, const char* message = NULL, const char* path = NULL) { return ErrnoException(v8::Isolate::GetCurrent(), errorno, syscall, message, path); }) inline v8::Local UVException(int errorno, const char* syscall = NULL, const char* message = NULL, const char* path = NULL) { return UVException(v8::Isolate::GetCurrent(), errorno, syscall, message, path); } /* * MakeCallback doesn't have a HandleScope. That means the callers scope * will retain ownership of created handles from MakeCallback and related. * There is by default a wrapping HandleScope before uv_run, if the caller * doesn't have a HandleScope on the stack the global will take ownership * which won't be reaped until the uv loop exits. * * If a uv callback is fired, and there is no enclosing HandleScope in the * cb, you will appear to leak 4-bytes for every invocation. Take heed. */ NODE_EXTERN v8::Handle MakeCallback( v8::Isolate* isolate, v8::Handle recv, const char* method, int argc, v8::Handle* argv); NODE_EXTERN v8::Handle MakeCallback( v8::Isolate* isolate, v8::Handle recv, v8::Handle symbol, int argc, v8::Handle* argv); NODE_EXTERN v8::Handle MakeCallback( v8::Isolate* isolate, v8::Handle recv, v8::Handle callback, int argc, v8::Handle* argv); } // namespace node #if defined(NODE_WANT_INTERNALS) && NODE_WANT_INTERNALS #include "node_internals.h" #endif #include #include #ifndef NODE_STRINGIFY #define NODE_STRINGIFY(n) NODE_STRINGIFY_HELPER(n) #define NODE_STRINGIFY_HELPER(n) #n #endif #ifdef _WIN32 // TODO(tjfontaine) consider changing the usage of ssize_t to ptrdiff_t #if !defined(_SSIZE_T_) && !defined(_SSIZE_T_DEFINED) typedef intptr_t ssize_t; # define _SSIZE_T_ # define _SSIZE_T_DEFINED #endif #else // !_WIN32 # include // size_t, ssize_t #endif // _WIN32 namespace node { NODE_EXTERN extern bool no_deprecation; NODE_EXTERN int Start(int argc, char *argv[]); NODE_EXTERN void Init(int* argc, const char** argv, int* exec_argc, const char*** exec_argv); class Environment; NODE_EXTERN Environment* CreateEnvironment(v8::Isolate* isolate, struct uv_loop_s* loop, v8::Handle context, int argc, const char* const* argv, int exec_argc, const char* const* exec_argv); NODE_EXTERN void LoadEnvironment(Environment* env); // NOTE: Calling this is the same as calling // CreateEnvironment() + LoadEnvironment() from above. // `uv_default_loop()` will be passed as `loop`. 
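// Example (not part of node.h): a hedged sketch of calling back into JS with
// node::MakeCallback(). The explicit HandleScope follows the comment above:
// without one, handles created in a libuv callback are retained by the outer
// scope until the event loop exits. EmitPing and the "ping" event name are
// illustrative.
#include "node.h"
#include "v8.h"

static void EmitPing(v8::Isolate* isolate, v8::Handle<v8::Object> emitter) {
  v8::HandleScope scope(isolate);
  v8::Handle<v8::Value> argv[] = {
    v8::String::NewFromUtf8(isolate, "ping")
  };
  // Roughly equivalent to emitter.emit('ping') on the JS side.
  node::MakeCallback(isolate, emitter, "emit", 1, argv);
}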
NODE_EXTERN Environment* CreateEnvironment(v8::Isolate* isolate, v8::Handle context, int argc, const char* const* argv, int exec_argc, const char* const* exec_argv); NODE_EXTERN void EmitBeforeExit(Environment* env); NODE_EXTERN int EmitExit(Environment* env); NODE_EXTERN void RunAtExit(Environment* env); /* Converts a unixtime to V8 Date */ #define NODE_UNIXTIME_V8(t) v8::Date::New(v8::Isolate::GetCurrent(), \ 1000 * static_cast(t)) #define NODE_V8_UNIXTIME(v) (static_cast((v)->NumberValue())/1000.0); // Used to be a macro, hence the uppercase name. #define NODE_DEFINE_CONSTANT(target, constant) \ do { \ v8::Isolate* isolate = v8::Isolate::GetCurrent(); \ v8::Local constant_name = \ v8::String::NewFromUtf8(isolate, #constant); \ v8::Local constant_value = \ v8::Number::New(isolate, static_cast(constant)); \ v8::PropertyAttribute constant_attributes = \ static_cast(v8::ReadOnly | v8::DontDelete); \ (target)->ForceSet(constant_name, constant_value, constant_attributes); \ } \ while (0) // Used to be a macro, hence the uppercase name. template inline void NODE_SET_METHOD(const TypeName& recv, const char* name, v8::FunctionCallback callback) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); v8::HandleScope handle_scope(isolate); v8::Local t = v8::FunctionTemplate::New(isolate, callback); v8::Local fn = t->GetFunction(); v8::Local fn_name = v8::String::NewFromUtf8(isolate, name); fn->SetName(fn_name); recv->Set(fn_name, fn); } #define NODE_SET_METHOD node::NODE_SET_METHOD // Used to be a macro, hence the uppercase name. // Not a template because it only makes sense for FunctionTemplates. inline void NODE_SET_PROTOTYPE_METHOD(v8::Handle recv, const char* name, v8::FunctionCallback callback) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); v8::HandleScope handle_scope(isolate); v8::Handle s = v8::Signature::New(isolate, recv); v8::Local t = v8::FunctionTemplate::New(isolate, callback, v8::Handle(), s); v8::Local fn = t->GetFunction(); recv->PrototypeTemplate()->Set(v8::String::NewFromUtf8(isolate, name), fn); v8::Local fn_name = v8::String::NewFromUtf8(isolate, name); fn->SetName(fn_name); } #define NODE_SET_PROTOTYPE_METHOD node::NODE_SET_PROTOTYPE_METHOD enum encoding {ASCII, UTF8, BASE64, UCS2, BINARY, HEX, BUFFER}; enum encoding ParseEncoding(v8::Isolate* isolate, v8::Handle encoding_v, enum encoding _default = BINARY); NODE_DEPRECATED("Use ParseEncoding(isolate, ...)", inline enum encoding ParseEncoding( v8::Handle encoding_v, enum encoding _default = BINARY) { return ParseEncoding(v8::Isolate::GetCurrent(), encoding_v, _default); }) NODE_EXTERN void FatalException(v8::Isolate* isolate, const v8::TryCatch& try_catch); NODE_DEPRECATED("Use FatalException(isolate, ...)", inline void FatalException(const v8::TryCatch& try_catch) { return FatalException(v8::Isolate::GetCurrent(), try_catch); }) // Don't call with encoding=UCS2. NODE_EXTERN v8::Local Encode(v8::Isolate* isolate, const char* buf, size_t len, enum encoding encoding = BINARY); // The input buffer should be in host endianness. 
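// Example (not part of node.h): a hedged sketch of the smallest possible
// native addon entry point using the NODE_SET_METHOD helper defined above;
// the module name "example" and the Method body are illustrative, and
// NODE_MODULE is the registration macro defined further down in this header.
#include "node.h"
#include "v8.h"

static void Method(const v8::FunctionCallbackInfo<v8::Value>& args) {
  v8::Isolate* isolate = args.GetIsolate();
  args.GetReturnValue().Set(v8::String::NewFromUtf8(isolate, "world"));
}

static void InitAll(v8::Handle<v8::Object> exports) {
  NODE_SET_METHOD(exports, "hello", Method);   // callable as exports.hello()
}

NODE_MODULE(example, InitAll)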
NODE_EXTERN v8::Local Encode(v8::Isolate* isolate, const uint16_t* buf, size_t len); NODE_DEPRECATED("Use Encode(isolate, ...)", inline v8::Local Encode( const void* buf, size_t len, enum encoding encoding = BINARY) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); if (encoding == UCS2) { assert(reinterpret_cast(buf) % sizeof(uint16_t) == 0 && "UCS2 buffer must be aligned on two-byte boundary."); const uint16_t* that = static_cast(buf); return Encode(isolate, that, len / sizeof(*that)); } return Encode(isolate, static_cast(buf), len, encoding); }) // Returns -1 if the handle was not valid for decoding NODE_EXTERN ssize_t DecodeBytes(v8::Isolate* isolate, v8::Handle, enum encoding encoding = BINARY); NODE_DEPRECATED("Use DecodeBytes(isolate, ...)", inline ssize_t DecodeBytes( v8::Handle val, enum encoding encoding = BINARY) { return DecodeBytes(v8::Isolate::GetCurrent(), val, encoding); }) // returns bytes written. NODE_EXTERN ssize_t DecodeWrite(v8::Isolate* isolate, char* buf, size_t buflen, v8::Handle, enum encoding encoding = BINARY); NODE_DEPRECATED("Use DecodeWrite(isolate, ...)", inline ssize_t DecodeWrite(char* buf, size_t buflen, v8::Handle val, enum encoding encoding = BINARY) { return DecodeWrite(v8::Isolate::GetCurrent(), buf, buflen, val, encoding); }) #ifdef _WIN32 NODE_EXTERN v8::Local WinapiErrnoException( v8::Isolate* isolate, int errorno, const char *syscall = NULL, const char *msg = "", const char *path = NULL); NODE_DEPRECATED("Use WinapiErrnoException(isolate, ...)", inline v8::Local WinapiErrnoException(int errorno, const char *syscall = NULL, const char *msg = "", const char *path = NULL) { return WinapiErrnoException(v8::Isolate::GetCurrent(), errorno, syscall, msg, path); }) #endif const char *signo_string(int errorno); typedef void (*addon_register_func)( v8::Handle exports, v8::Handle module, void* priv); typedef void (*addon_context_register_func)( v8::Handle exports, v8::Handle module, v8::Handle context, void* priv); #define NM_F_BUILTIN 0x01 #define NM_F_LINKED 0x02 struct node_module { int nm_version; unsigned int nm_flags; void* nm_dso_handle; const char* nm_filename; node::addon_register_func nm_register_func; node::addon_context_register_func nm_context_register_func; const char* nm_modname; void* nm_priv; struct node_module* nm_link; }; node_module* get_builtin_module(const char *name); node_module* get_linked_module(const char *name); extern "C" NODE_EXTERN void node_module_register(void* mod); #ifdef _WIN32 # define NODE_MODULE_EXPORT __declspec(dllexport) #else # define NODE_MODULE_EXPORT __attribute__((visibility("default"))) #endif #if defined(_MSC_VER) #pragma section(".CRT$XCU", read) #define NODE_C_CTOR(fn) \ static void __cdecl fn(void); \ __declspec(dllexport, allocate(".CRT$XCU")) \ void (__cdecl*fn ## _)(void) = fn; \ static void __cdecl fn(void) #else #define NODE_C_CTOR(fn) \ static void fn(void) __attribute__((constructor)); \ static void fn(void) #endif #define NODE_MODULE_X(modname, regfunc, priv, flags) \ extern "C" { \ static node::node_module _module = \ { \ NODE_MODULE_VERSION, \ flags, \ NULL, \ __FILE__, \ (node::addon_register_func) (regfunc), \ NULL, \ NODE_STRINGIFY(modname), \ priv, \ NULL \ }; \ NODE_C_CTOR(_register_ ## modname) { \ node_module_register(&_module); \ } \ } #define NODE_MODULE_CONTEXT_AWARE_X(modname, regfunc, priv, flags) \ extern "C" { \ static node::node_module _module = \ { \ NODE_MODULE_VERSION, \ flags, \ NULL, \ __FILE__, \ NULL, \ (node::addon_context_register_func) (regfunc), \ 
NODE_STRINGIFY(modname), \ priv, \ NULL \ }; \ NODE_C_CTOR(_register_ ## modname) { \ node_module_register(&_module); \ } \ } #define NODE_MODULE(modname, regfunc) \ NODE_MODULE_X(modname, regfunc, NULL, 0) #define NODE_MODULE_CONTEXT_AWARE(modname, regfunc) \ NODE_MODULE_CONTEXT_AWARE_X(modname, regfunc, NULL, 0) #define NODE_MODULE_CONTEXT_AWARE_BUILTIN(modname, regfunc) \ NODE_MODULE_CONTEXT_AWARE_X(modname, regfunc, NULL, NM_F_BUILTIN) \ /* * For backward compatibility in add-on modules. */ #define NODE_MODULE_DECL /* nothing */ /* Called after the event loop exits but before the VM is disposed. * Callbacks are run in reverse order of registration, i.e. newest first. */ NODE_EXTERN void AtExit(void (*cb)(void* arg), void* arg = 0); } // namespace node #endif // SRC_NODE_H_ iojs-v1.0.2-darwin-x64/include/node/node_buffer.h000644 000766 000024 00000010051 12455173734 021664 0ustar00iojsstaff000000 000000 #ifndef SRC_NODE_BUFFER_H_ #define SRC_NODE_BUFFER_H_ #include "node.h" #include "smalloc.h" #include "v8.h" #if defined(NODE_WANT_INTERNALS) #include "env.h" #endif // defined(NODE_WANT_INTERNALS) namespace node { namespace Buffer { static const unsigned int kMaxLength = smalloc::kMaxLength; NODE_EXTERN bool HasInstance(v8::Handle val); NODE_EXTERN bool HasInstance(v8::Handle val); NODE_EXTERN char* Data(v8::Handle val); NODE_EXTERN char* Data(v8::Handle val); NODE_EXTERN size_t Length(v8::Handle val); NODE_EXTERN size_t Length(v8::Handle val); // public constructor NODE_EXTERN v8::Local New(v8::Isolate* isolate, size_t length); NODE_DEPRECATED("Use New(isolate, ...)", inline v8::Local New(size_t length) { return New(v8::Isolate::GetCurrent(), length); }) // public constructor from string NODE_EXTERN v8::Local New(v8::Isolate* isolate, v8::Handle string, enum encoding enc = UTF8); NODE_DEPRECATED("Use New(isolate, ...)", inline v8::Local New(v8::Handle string, enum encoding enc = UTF8) { return New(v8::Isolate::GetCurrent(), string, enc); }) // public constructor - data is copied // TODO(trevnorris): should be something like Copy() NODE_EXTERN v8::Local New(v8::Isolate* isolate, const char* data, size_t len); NODE_DEPRECATED("Use New(isolate, ...)", inline v8::Local New(const char* data, size_t len) { return New(v8::Isolate::GetCurrent(), data, len); }) // public constructor - data is used, callback is passed data on object gc NODE_EXTERN v8::Local New(v8::Isolate* isolate, char* data, size_t length, smalloc::FreeCallback callback, void* hint); NODE_DEPRECATED("Use New(isolate, ...)", inline v8::Local New(char* data, size_t length, smalloc::FreeCallback callback, void* hint) { return New(v8::Isolate::GetCurrent(), data, length, callback, hint); }) // public constructor - data is used. // TODO(trevnorris): should be New() for consistency NODE_EXTERN v8::Local Use(v8::Isolate* isolate, char* data, uint32_t len); NODE_DEPRECATED("Use Use(isolate, ...)", inline v8::Local Use(char* data, uint32_t len) { return Use(v8::Isolate::GetCurrent(), data, len); }) // This is verbose to be explicit with inline commenting static inline bool IsWithinBounds(size_t off, size_t len, size_t max) { // Asking to seek too far into the buffer // check to avoid wrapping in subsequent subtraction if (off > max) return false; // Asking for more than is left over in the buffer if (max - off < len) return false; // Otherwise we're in bounds return true; } // Internal. Not for public consumption. 
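// Example (not part of node_buffer.h): a hedged sketch of handing a C++
// byte range to JS through the copying Buffer::New overload declared above,
// then reading it back with Data()/Length(). The payload and MakeGreeting
// are illustrative.
#include "node.h"
#include "node_buffer.h"
#include "v8.h"

static v8::Local<v8::Object> MakeGreeting(v8::Isolate* isolate) {
  static const char payload[] = "hello from C++";
  // Copying constructor: node keeps its own copy of `payload`.
  v8::Local<v8::Object> buf =
      node::Buffer::New(isolate, payload, sizeof(payload) - 1);
  // Data()/Length() give direct access to the backing store.
  char* p = node::Buffer::Data(buf);
  size_t n = node::Buffer::Length(buf);
  (void) p;
  (void) n;
  return buf;
}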
We can't define these in // src/node_internals.h due to a circular dependency issue with // the smalloc.h and node_internals.h headers. #if defined(NODE_WANT_INTERNALS) v8::Local New(Environment* env, size_t size); v8::Local New(Environment* env, const char* data, size_t len); v8::Local New(Environment* env, char* data, size_t length, smalloc::FreeCallback callback, void* hint); v8::Local Use(Environment* env, char* data, uint32_t length); #endif // defined(NODE_WANT_INTERNALS) } // namespace Buffer } // namespace node #endif // SRC_NODE_BUFFER_H_ iojs-v1.0.2-darwin-x64/include/node/node_internals.h000644 000766 000024 00000016300 12455173734 022415 0ustar00iojsstaff000000 000000 #ifndef SRC_NODE_INTERNALS_H_ #define SRC_NODE_INTERNALS_H_ #include "node.h" #include "util.h" #include "util-inl.h" #include "uv.h" #include "v8.h" #include #include struct sockaddr; namespace node { // Forward declaration class Environment; // If persistent.IsWeak() == false, then do not call persistent.Reset() // while the returned Local is still in scope, it will destroy the // reference to the object. template inline v8::Local PersistentToLocal( v8::Isolate* isolate, const v8::Persistent& persistent); // Call with valid HandleScope and while inside Context scope. v8::Handle MakeCallback(Environment* env, v8::Handle recv, const char* method, int argc = 0, v8::Handle* argv = nullptr); // Call with valid HandleScope and while inside Context scope. v8::Handle MakeCallback(Environment* env, v8::Handle recv, uint32_t index, int argc = 0, v8::Handle* argv = nullptr); // Call with valid HandleScope and while inside Context scope. v8::Handle MakeCallback(Environment* env, v8::Handle recv, v8::Handle symbol, int argc = 0, v8::Handle* argv = nullptr); // Call with valid HandleScope and while inside Context scope. v8::Handle MakeCallback(Environment* env, v8::Handle recv, v8::Handle callback, int argc = 0, v8::Handle* argv = nullptr); // Convert a struct sockaddr to a { address: '1.2.3.4', port: 1234 } JS object. // Sets address and port properties on the info object and returns it. // If |info| is omitted, a new object is returned. v8::Local AddressToJS( Environment* env, const sockaddr* addr, v8::Local info = v8::Handle()); #ifdef _WIN32 // emulate snprintf() on windows, _snprintf() doesn't zero-terminate the buffer // on overflow... #include inline static int snprintf(char* buf, unsigned int len, const char* fmt, ...) { va_list ap; va_start(ap, fmt); int n = _vsprintf_p(buf, len, fmt, ap); if (len) buf[len - 1] = '\0'; va_end(ap); return n; } #endif #if defined(__x86_64__) # define BITS_PER_LONG 64 #else # define BITS_PER_LONG 32 #endif #ifndef ARRAY_SIZE # define ARRAY_SIZE(a) (sizeof((a)) / sizeof((a)[0])) #endif #ifndef ROUND_UP # define ROUND_UP(a, b) ((a) % (b) ? ((a) + (b)) - ((a) % (b)) : (a)) #endif #if defined(__GNUC__) && __GNUC__ >= 4 # define MUST_USE_RESULT __attribute__((warn_unused_result)) # define NO_RETURN __attribute__((noreturn)) #else # define MUST_USE_RESULT # define NO_RETURN #endif void AppendExceptionLine(Environment* env, v8::Handle er, v8::Handle message); NO_RETURN void FatalError(const char* location, const char* message); v8::Local BuildStatsObject(Environment* env, const uv_stat_t* s); enum Endianness { kLittleEndian, // _Not_ LITTLE_ENDIAN, clashes with endian.h. kBigEndian }; inline enum Endianness GetEndianness() { // Constant-folded by the compiler. const union { uint8_t u8[2]; uint16_t u16; } u = { { 1, 0 } }; return u.u16 == 1 ? 
kLittleEndian : kBigEndian; } inline bool IsLittleEndian() { return GetEndianness() == kLittleEndian; } inline bool IsBigEndian() { return GetEndianness() == kBigEndian; } // parse index for external array data inline MUST_USE_RESULT bool ParseArrayIndex(v8::Handle arg, size_t def, size_t* ret) { if (arg->IsUndefined()) { *ret = def; return true; } int32_t tmp_i = arg->Int32Value(); if (tmp_i < 0) return false; *ret = static_cast(tmp_i); return true; } void ThrowError(v8::Isolate* isolate, const char* errmsg); void ThrowTypeError(v8::Isolate* isolate, const char* errmsg); void ThrowRangeError(v8::Isolate* isolate, const char* errmsg); void ThrowErrnoException(v8::Isolate* isolate, int errorno, const char* syscall = nullptr, const char* message = nullptr, const char* path = nullptr); void ThrowUVException(v8::Isolate* isolate, int errorno, const char* syscall = nullptr, const char* message = nullptr, const char* path = nullptr); NODE_DEPRECATED("Use ThrowError(isolate)", inline void ThrowError(const char* errmsg) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); return ThrowError(isolate, errmsg); }) NODE_DEPRECATED("Use ThrowTypeError(isolate)", inline void ThrowTypeError(const char* errmsg) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); return ThrowTypeError(isolate, errmsg); }) NODE_DEPRECATED("Use ThrowRangeError(isolate)", inline void ThrowRangeError(const char* errmsg) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); return ThrowRangeError(isolate, errmsg); }) NODE_DEPRECATED("Use ThrowErrnoException(isolate)", inline void ThrowErrnoException(int errorno, const char* syscall = nullptr, const char* message = nullptr, const char* path = nullptr) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); return ThrowErrnoException(isolate, errorno, syscall, message, path); }) NODE_DEPRECATED("Use ThrowUVException(isolate)", inline void ThrowUVException(int errorno, const char* syscall = nullptr, const char* message = nullptr, const char* path = nullptr) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); return ThrowUVException(isolate, errorno, syscall, message, path); }) inline void NODE_SET_EXTERNAL(v8::Handle target, const char* key, v8::AccessorGetterCallback getter) { v8::Isolate* isolate = v8::Isolate::GetCurrent(); v8::HandleScope handle_scope(isolate); v8::Local prop = v8::String::NewFromUtf8(isolate, key); target->SetAccessor(prop, getter, nullptr, v8::Handle(), v8::DEFAULT, static_cast(v8::ReadOnly | v8::DontDelete)); } } // namespace node #endif // SRC_NODE_INTERNALS_H_ iojs-v1.0.2-darwin-x64/include/node/node_object_wrap.h000644 000766 000024 00000005553 12455173734 022725 0ustar00iojsstaff000000 000000 #ifndef SRC_NODE_OBJECT_WRAP_H_ #define SRC_NODE_OBJECT_WRAP_H_ #include "v8.h" #include namespace node { class ObjectWrap { public: ObjectWrap() { refs_ = 0; } virtual ~ObjectWrap() { if (persistent().IsEmpty()) return; assert(persistent().IsNearDeath()); persistent().ClearWeak(); persistent().Reset(); } template static inline T* Unwrap(v8::Handle handle) { assert(!handle.IsEmpty()); assert(handle->InternalFieldCount() > 0); // Cast to ObjectWrap before casting to T. A direct cast from void // to T won't work right when T has more than one base class. 
void* ptr = handle->GetAlignedPointerFromInternalField(0); ObjectWrap* wrap = static_cast(ptr); return static_cast(wrap); } inline v8::Local handle() { return handle(v8::Isolate::GetCurrent()); } inline v8::Local handle(v8::Isolate* isolate) { return v8::Local::New(isolate, persistent()); } inline v8::Persistent& persistent() { return handle_; } protected: inline void Wrap(v8::Handle handle) { assert(persistent().IsEmpty()); assert(handle->InternalFieldCount() > 0); handle->SetAlignedPointerInInternalField(0, this); persistent().Reset(v8::Isolate::GetCurrent(), handle); MakeWeak(); } inline void MakeWeak(void) { persistent().SetWeak(this, WeakCallback); persistent().MarkIndependent(); } /* Ref() marks the object as being attached to an event loop. * Refed objects will not be garbage collected, even if * all references are lost. */ virtual void Ref() { assert(!persistent().IsEmpty()); persistent().ClearWeak(); refs_++; } /* Unref() marks an object as detached from the event loop. This is its * default state. When an object with a "weak" reference changes from * attached to detached state it will be freed. Be careful not to access * the object after making this call as it might be gone! * (A "weak reference" means an object that only has a * persistant handle.) * * DO NOT CALL THIS FROM DESTRUCTOR */ virtual void Unref() { assert(!persistent().IsEmpty()); assert(!persistent().IsWeak()); assert(refs_ > 0); if (--refs_ == 0) MakeWeak(); } int refs_; // ro private: static void WeakCallback( const v8::WeakCallbackData& data) { v8::Isolate* isolate = data.GetIsolate(); v8::HandleScope scope(isolate); ObjectWrap* wrap = data.GetParameter(); assert(wrap->refs_ == 0); assert(wrap->handle_.IsNearDeath()); assert( data.GetValue() == v8::Local::New(isolate, wrap->handle_)); wrap->handle_.Reset(); delete wrap; } v8::Persistent handle_; }; } // namespace node #endif // SRC_NODE_OBJECT_WRAP_H_ iojs-v1.0.2-darwin-x64/include/node/node_version.h000644 000766 000024 00000003022 12456106751 022074 0ustar00iojsstaff000000 000000 #ifndef SRC_NODE_VERSION_H_ #define SRC_NODE_VERSION_H_ #define NODE_MAJOR_VERSION 1 #define NODE_MINOR_VERSION 0 #define NODE_PATCH_VERSION 2 #define NODE_VERSION_IS_RELEASE 1 #ifndef NODE_STRINGIFY #define NODE_STRINGIFY(n) NODE_STRINGIFY_HELPER(n) #define NODE_STRINGIFY_HELPER(n) #n #endif #if NODE_VERSION_IS_RELEASE # ifndef NODE_TAG # define NODE_TAG "" # endif # define NODE_VERSION_STRING NODE_STRINGIFY(NODE_MAJOR_VERSION) "." \ NODE_STRINGIFY(NODE_MINOR_VERSION) "." \ NODE_STRINGIFY(NODE_PATCH_VERSION) \ NODE_TAG #else # ifndef NODE_TAG # define NODE_TAG "-pre" # endif # define NODE_VERSION_STRING NODE_STRINGIFY(NODE_MAJOR_VERSION) "." \ NODE_STRINGIFY(NODE_MINOR_VERSION) "." \ NODE_STRINGIFY(NODE_PATCH_VERSION) \ NODE_TAG #endif #define NODE_VERSION "v" NODE_VERSION_STRING #define NODE_VERSION_AT_LEAST(major, minor, patch) \ (( (major) < NODE_MAJOR_VERSION) \ || ((major) == NODE_MAJOR_VERSION && (minor) < NODE_MINOR_VERSION) \ || ((major) == NODE_MAJOR_VERSION && \ (minor) == NODE_MINOR_VERSION && (patch) <= NODE_PATCH_VERSION)) /** * When this version number is changed, node.js will refuse * to load older modules. This should be done whenever * an API is broken in the C++ side, including in v8 or * other dependencies. 
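// Example (not part of node_object_wrap.h): a hedged sketch of a wrapped
// native object following the ObjectWrap pattern above. MyObject, counter_
// and the two callbacks are illustrative; the sketch assumes the constructor
// FunctionTemplate set InternalFieldCount(1) so Wrap()'s assertions hold.
#include "node.h"
#include "node_object_wrap.h"

class MyObject : public node::ObjectWrap {
 public:
  static void New(const v8::FunctionCallbackInfo<v8::Value>& args) {
    MyObject* obj = new MyObject();
    obj->Wrap(args.This());           // ties the C++ lifetime to the JS object
    args.GetReturnValue().Set(args.This());
  }

  static void Count(const v8::FunctionCallbackInfo<v8::Value>& args) {
    MyObject* obj = Unwrap<MyObject>(args.Holder());
    args.GetReturnValue().Set(++obj->counter_);
  }

 private:
  int counter_ = 0;
};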
*/ #define NODE_MODULE_VERSION 42 /* io.js v1.0.0 */ #endif /* SRC_NODE_VERSION_H_ */ iojs-v1.0.2-darwin-x64/include/node/openssl/000755 000766 000024 00000000000 12456115120 020705 5ustar00iojsstaff000000 000000 iojs-v1.0.2-darwin-x64/include/node/pthread-fixes.h000644 000766 000024 00000005561 12455173732 022161 0ustar00iojsstaff000000 000000 /* Copyright (c) 2013, Sony Mobile Communications AB * Copyright (c) 2012, Google Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Google Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ #ifndef GOOGLE_BREAKPAD_COMMON_ANDROID_TESTING_PTHREAD_FIXES_H #define GOOGLE_BREAKPAD_COMMON_ANDROID_TESTING_PTHREAD_FIXES_H #include /*Android doesn't provide pthread_barrier_t for now.*/ #ifndef PTHREAD_BARRIER_SERIAL_THREAD /* Anything except 0 will do here.*/ #define PTHREAD_BARRIER_SERIAL_THREAD 0x12345 typedef struct { pthread_mutex_t mutex; pthread_cond_t cond; unsigned count; } pthread_barrier_t; int pthread_barrier_init(pthread_barrier_t* barrier, const void* barrier_attr, unsigned count); int pthread_barrier_wait(pthread_barrier_t* barrier); int pthread_barrier_destroy(pthread_barrier_t *barrier); #endif /* defined(PTHREAD_BARRIER_SERIAL_THREAD) */ int pthread_yield(void); /* Workaround pthread_sigmask() returning EINVAL on versions < 4.1 by * replacing all calls to pthread_sigmask with sigprocmask. See: * https://android.googlesource.com/platform/bionic/+/9bf330b5 * https://code.google.com/p/android/issues/detail?id=15337 */ int uv__pthread_sigmask(int how, const sigset_t* set, sigset_t* oset); #ifdef pthread_sigmask #undef pthread_sigmask #endif #define pthread_sigmask(how, set, oldset) uv__pthread_sigmask(how, set, oldset) #endif /* GOOGLE_BREAKPAD_COMMON_ANDROID_TESTING_PTHREAD_FIXES_H */ iojs-v1.0.2-darwin-x64/include/node/smalloc.h000644 000766 000024 00000006541 12455173734 021051 0ustar00iojsstaff000000 000000 #ifndef SRC_SMALLOC_H_ #define SRC_SMALLOC_H_ #include "node.h" #include "v8.h" namespace node { // Forward declaration class Environment; /** * Simple memory allocator. * * Utilities for external memory allocation management. Is an abstraction for * v8's external array data handling to simplify and centralize how this is * managed. 
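// Example (not part of node_version.h): a hedged sketch of gating addon code
// on the runtime's version with the NODE_VERSION_AT_LEAST macro defined just
// above; the HAVE_IOJS_ABI name is illustrative. NODE_MODULE_VERSION (42 in
// this release) is what actually decides whether a compiled addon will load.
#include "node_version.h"

#if NODE_VERSION_AT_LEAST(1, 0, 0)
#  define HAVE_IOJS_ABI 1
#else
#  define HAVE_IOJS_ABI 0
#endif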
*/ namespace smalloc { // mirrors deps/v8/src/objects.h static const unsigned int kMaxLength = 0x3fffffff; NODE_EXTERN typedef void (*FreeCallback)(char* data, void* hint); /** * Return byte size of external array type. */ NODE_EXTERN size_t ExternalArraySize(enum v8::ExternalArrayType type); /** * Allocate external array data onto obj. * * Passed data transfers ownership, and if no callback is passed then memory * will automatically be free'd using free() (not delete[]). * * length is always the byte size of the data. Not the length of the external * array. This intentionally differs from the JS API so users always know * exactly how much memory is being allocated, regardless of the external array * type. For this reason the helper function ExternalArraySize is provided to * help determine the appropriate byte size to be allocated. * * In the following example we're allocating a Float array and setting the * "length" property on the Object: * * \code * size_t array_length = 8; * size_t byte_length = node::smalloc::ExternalArraySize( * v8::kExternalFloatArray); * v8::Local obj = v8::Object::New(); * char* data = static_cast(malloc(byte_length * array_length)); * node::smalloc::Alloc(obj, data, byte_length, v8::kExternalFloatArray); * obj->Set(v8::String::NewFromUtf8("length"), * v8::Integer::NewFromUnsigned(array_length)); * \code */ NODE_EXTERN void Alloc(Environment* env, v8::Handle obj, size_t length, enum v8::ExternalArrayType type = v8::kExternalUnsignedByteArray); NODE_EXTERN void Alloc(Environment* env, v8::Handle obj, char* data, size_t length, enum v8::ExternalArrayType type = v8::kExternalUnsignedByteArray); NODE_EXTERN void Alloc(Environment* env, v8::Handle obj, size_t length, FreeCallback fn, void* hint, enum v8::ExternalArrayType type = v8::kExternalUnsignedByteArray); NODE_EXTERN void Alloc(Environment* env, v8::Handle obj, char* data, size_t length, FreeCallback fn, void* hint, enum v8::ExternalArrayType type = v8::kExternalUnsignedByteArray); /** * Free memory associated with an externally allocated object. If no external * memory is allocated to the object then nothing will happen. */ NODE_EXTERN void AllocDispose(Environment* env, v8::Handle obj); /** * Check if the Object has externally allocated memory. */ NODE_EXTERN bool HasExternalData(Environment* env, v8::Local obj); } // namespace smalloc } // namespace node #endif // SRC_SMALLOC_H_ iojs-v1.0.2-darwin-x64/include/node/stdint-msvc2008.h000644 000766 000024 00000017060 12455173732 022200 0ustar00iojsstaff000000 000000 // ISO C9x compliant stdint.h for Microsoft Visual Studio // Based on ISO/IEC 9899:TC2 Committee draft (May 6, 2005) WG14/N1124 // // Copyright (c) 2006-2008 Alexander Chemeris // // Redistribution and use in source and binary forms, with or without // modification, are permitted provided that the following conditions are met: // // 1. Redistributions of source code must retain the above copyright notice, // this list of conditions and the following disclaimer. // // 2. Redistributions in binary form must reproduce the above copyright // notice, this list of conditions and the following disclaimer in the // documentation and/or other materials provided with the distribution. // // 3. The name of the author may be used to endorse or promote products // derived from this software without specific prior written permission. 
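// Example (not part of smalloc.h): the allocation sketch from the doc comment
// above, written out with an explicit isolate and the Environment* that the
// overloads declared here expect. Obtaining an Environment* is an internal
// concern, so treat this strictly as an illustration; MakeFloatArray is a
// made-up name.
#include <stdlib.h>
#include "smalloc.h"
#include "v8.h"

static v8::Local<v8::Object> MakeFloatArray(node::Environment* env,
                                            v8::Isolate* isolate,
                                            size_t array_length) {
  size_t byte_length =
      node::smalloc::ExternalArraySize(v8::kExternalFloatArray);
  v8::Local<v8::Object> obj = v8::Object::New(isolate);
  // The length passed to Alloc() is the total byte size of the allocation.
  char* data = static_cast<char*>(malloc(byte_length * array_length));
  node::smalloc::Alloc(env, obj, data, byte_length * array_length,
                       v8::kExternalFloatArray);
  return obj;   // the memory is free()'d when the object is garbage collected
}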
// // THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED // WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF // MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO // EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, // SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, // PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; // OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, // WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR // OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF // ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. // /////////////////////////////////////////////////////////////////////////////// #ifndef _MSC_VER // [ #error "Use this header only with Microsoft Visual C++ compilers!" #endif // _MSC_VER ] #ifndef _MSC_STDINT_H_ // [ #define _MSC_STDINT_H_ #if _MSC_VER > 1000 #pragma once #endif #include // For Visual Studio 6 in C++ mode and for many Visual Studio versions when // compiling for ARM we should wrap include with 'extern "C++" {}' // or compiler give many errors like this: // error C2733: second C linkage of overloaded function 'wmemchr' not allowed #ifdef __cplusplus extern "C" { #endif # include #ifdef __cplusplus } #endif // Define _W64 macros to mark types changing their size, like intptr_t. #ifndef _W64 # if !defined(__midl) && (defined(_X86_) || defined(_M_IX86)) && _MSC_VER >= 1300 # define _W64 __w64 # else # define _W64 # endif #endif // 7.18.1 Integer types // 7.18.1.1 Exact-width integer types // Visual Studio 6 and Embedded Visual C++ 4 doesn't // realize that, e.g. char has the same size as __int8 // so we give up on __intX for them. 
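//
// Editorial sketch (not part of the original header): with the typedefs and
// constant macros this file provides, fixed-width declarations written for a
// C99 <stdint.h> compile unchanged under older MSVC, e.g.
//
//   uint32_t crc = UINT32_C(0xFFFFFFFF);  /* exactly 32 bits, unsigned */
//   int64_t  big = INT64_C(1) << 40;      /* exactly 64 bits, signed   */
//
// (UINT32_C/INT64_C are defined near the end of this header; in C++ they are
// only available when __STDC_CONSTANT_MACROS is defined before inclusion.)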
#if (_MSC_VER < 1300) typedef signed char int8_t; typedef signed short int16_t; typedef signed int int32_t; typedef unsigned char uint8_t; typedef unsigned short uint16_t; typedef unsigned int uint32_t; #else typedef signed __int8 int8_t; typedef signed __int16 int16_t; typedef signed __int32 int32_t; typedef unsigned __int8 uint8_t; typedef unsigned __int16 uint16_t; typedef unsigned __int32 uint32_t; #endif typedef signed __int64 int64_t; typedef unsigned __int64 uint64_t; // 7.18.1.2 Minimum-width integer types typedef int8_t int_least8_t; typedef int16_t int_least16_t; typedef int32_t int_least32_t; typedef int64_t int_least64_t; typedef uint8_t uint_least8_t; typedef uint16_t uint_least16_t; typedef uint32_t uint_least32_t; typedef uint64_t uint_least64_t; // 7.18.1.3 Fastest minimum-width integer types typedef int8_t int_fast8_t; typedef int16_t int_fast16_t; typedef int32_t int_fast32_t; typedef int64_t int_fast64_t; typedef uint8_t uint_fast8_t; typedef uint16_t uint_fast16_t; typedef uint32_t uint_fast32_t; typedef uint64_t uint_fast64_t; // 7.18.1.4 Integer types capable of holding object pointers #ifdef _WIN64 // [ typedef signed __int64 intptr_t; typedef unsigned __int64 uintptr_t; #else // _WIN64 ][ typedef _W64 signed int intptr_t; typedef _W64 unsigned int uintptr_t; #endif // _WIN64 ] // 7.18.1.5 Greatest-width integer types typedef int64_t intmax_t; typedef uint64_t uintmax_t; // 7.18.2 Limits of specified-width integer types #if !defined(__cplusplus) || defined(__STDC_LIMIT_MACROS) // [ See footnote 220 at page 257 and footnote 221 at page 259 // 7.18.2.1 Limits of exact-width integer types #define INT8_MIN ((int8_t)_I8_MIN) #define INT8_MAX _I8_MAX #define INT16_MIN ((int16_t)_I16_MIN) #define INT16_MAX _I16_MAX #define INT32_MIN ((int32_t)_I32_MIN) #define INT32_MAX _I32_MAX #define INT64_MIN ((int64_t)_I64_MIN) #define INT64_MAX _I64_MAX #define UINT8_MAX _UI8_MAX #define UINT16_MAX _UI16_MAX #define UINT32_MAX _UI32_MAX #define UINT64_MAX _UI64_MAX // 7.18.2.2 Limits of minimum-width integer types #define INT_LEAST8_MIN INT8_MIN #define INT_LEAST8_MAX INT8_MAX #define INT_LEAST16_MIN INT16_MIN #define INT_LEAST16_MAX INT16_MAX #define INT_LEAST32_MIN INT32_MIN #define INT_LEAST32_MAX INT32_MAX #define INT_LEAST64_MIN INT64_MIN #define INT_LEAST64_MAX INT64_MAX #define UINT_LEAST8_MAX UINT8_MAX #define UINT_LEAST16_MAX UINT16_MAX #define UINT_LEAST32_MAX UINT32_MAX #define UINT_LEAST64_MAX UINT64_MAX // 7.18.2.3 Limits of fastest minimum-width integer types #define INT_FAST8_MIN INT8_MIN #define INT_FAST8_MAX INT8_MAX #define INT_FAST16_MIN INT16_MIN #define INT_FAST16_MAX INT16_MAX #define INT_FAST32_MIN INT32_MIN #define INT_FAST32_MAX INT32_MAX #define INT_FAST64_MIN INT64_MIN #define INT_FAST64_MAX INT64_MAX #define UINT_FAST8_MAX UINT8_MAX #define UINT_FAST16_MAX UINT16_MAX #define UINT_FAST32_MAX UINT32_MAX #define UINT_FAST64_MAX UINT64_MAX // 7.18.2.4 Limits of integer types capable of holding object pointers #ifdef _WIN64 // [ # define INTPTR_MIN INT64_MIN # define INTPTR_MAX INT64_MAX # define UINTPTR_MAX UINT64_MAX #else // _WIN64 ][ # define INTPTR_MIN INT32_MIN # define INTPTR_MAX INT32_MAX # define UINTPTR_MAX UINT32_MAX #endif // _WIN64 ] // 7.18.2.5 Limits of greatest-width integer types #define INTMAX_MIN INT64_MIN #define INTMAX_MAX INT64_MAX #define UINTMAX_MAX UINT64_MAX // 7.18.3 Limits of other integer types #ifdef _WIN64 // [ # define PTRDIFF_MIN _I64_MIN # define PTRDIFF_MAX _I64_MAX #else // _WIN64 ][ # define PTRDIFF_MIN _I32_MIN # define 
PTRDIFF_MAX _I32_MAX #endif // _WIN64 ] #define SIG_ATOMIC_MIN INT_MIN #define SIG_ATOMIC_MAX INT_MAX #ifndef SIZE_MAX // [ # ifdef _WIN64 // [ # define SIZE_MAX _UI64_MAX # else // _WIN64 ][ # define SIZE_MAX _UI32_MAX # endif // _WIN64 ] #endif // SIZE_MAX ] // WCHAR_MIN and WCHAR_MAX are also defined in #ifndef WCHAR_MIN // [ # define WCHAR_MIN 0 #endif // WCHAR_MIN ] #ifndef WCHAR_MAX // [ # define WCHAR_MAX _UI16_MAX #endif // WCHAR_MAX ] #define WINT_MIN 0 #define WINT_MAX _UI16_MAX #endif // __STDC_LIMIT_MACROS ] // 7.18.4 Limits of other integer types #if !defined(__cplusplus) || defined(__STDC_CONSTANT_MACROS) // [ See footnote 224 at page 260 // 7.18.4.1 Macros for minimum-width integer constants #define INT8_C(val) val##i8 #define INT16_C(val) val##i16 #define INT32_C(val) val##i32 #define INT64_C(val) val##i64 #define UINT8_C(val) val##ui8 #define UINT16_C(val) val##ui16 #define UINT32_C(val) val##ui32 #define UINT64_C(val) val##ui64 // 7.18.4.2 Macros for greatest-width integer constants #define INTMAX_C INT64_C #define UINTMAX_C UINT64_C #endif // __STDC_CONSTANT_MACROS ] #endif // _MSC_STDINT_H_ ] iojs-v1.0.2-darwin-x64/include/node/tree.h000644 000766 000024 00000147231 12455173732 020356 0ustar00iojsstaff000000 000000 /*- * Copyright 2002 Niels Provos * All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * 2. Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in the * documentation and/or other materials provided with the distribution. * * THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES * OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. * IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT * NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF * THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ #ifndef UV_TREE_H_ #define UV_TREE_H_ #ifndef UV__UNUSED # if __GNUC__ # define UV__UNUSED __attribute__((unused)) # else # define UV__UNUSED # endif #endif /* * This file defines data structures for different types of trees: * splay trees and red-black trees. * * A splay tree is a self-organizing data structure. Every operation * on the tree causes a splay to happen. The splay moves the requested * node to the root of the tree and partly rebalances it. * * This has the benefit that request locality causes faster lookups as * the requested nodes move to the top of the tree. On the other hand, * every lookup causes memory writes. * * The Balance Theorem bounds the total access time for m operations * and n inserts on an initially empty tree as O((m + n)lg n). The * amortized cost for a sequence of m accesses to a splay tree is O(lg n); * * A red-black tree is a binary search tree with the node color as an * extra attribute. 
It fulfills a set of conditions: * - every search path from the root to a leaf consists of the * same number of black nodes, * - each red node (except for the root) has a black parent, * - each leaf node is black. * * Every operation on a red-black tree is bounded as O(lg n). * The maximum height of a red-black tree is 2lg (n+1). */ #define SPLAY_HEAD(name, type) \ struct name { \ struct type *sph_root; /* root of the tree */ \ } #define SPLAY_INITIALIZER(root) \ { NULL } #define SPLAY_INIT(root) do { \ (root)->sph_root = NULL; \ } while (/*CONSTCOND*/ 0) #define SPLAY_ENTRY(type) \ struct { \ struct type *spe_left; /* left element */ \ struct type *spe_right; /* right element */ \ } #define SPLAY_LEFT(elm, field) (elm)->field.spe_left #define SPLAY_RIGHT(elm, field) (elm)->field.spe_right #define SPLAY_ROOT(head) (head)->sph_root #define SPLAY_EMPTY(head) (SPLAY_ROOT(head) == NULL) /* SPLAY_ROTATE_{LEFT,RIGHT} expect that tmp hold SPLAY_{RIGHT,LEFT} */ #define SPLAY_ROTATE_RIGHT(head, tmp, field) do { \ SPLAY_LEFT((head)->sph_root, field) = SPLAY_RIGHT(tmp, field); \ SPLAY_RIGHT(tmp, field) = (head)->sph_root; \ (head)->sph_root = tmp; \ } while (/*CONSTCOND*/ 0) #define SPLAY_ROTATE_LEFT(head, tmp, field) do { \ SPLAY_RIGHT((head)->sph_root, field) = SPLAY_LEFT(tmp, field); \ SPLAY_LEFT(tmp, field) = (head)->sph_root; \ (head)->sph_root = tmp; \ } while (/*CONSTCOND*/ 0) #define SPLAY_LINKLEFT(head, tmp, field) do { \ SPLAY_LEFT(tmp, field) = (head)->sph_root; \ tmp = (head)->sph_root; \ (head)->sph_root = SPLAY_LEFT((head)->sph_root, field); \ } while (/*CONSTCOND*/ 0) #define SPLAY_LINKRIGHT(head, tmp, field) do { \ SPLAY_RIGHT(tmp, field) = (head)->sph_root; \ tmp = (head)->sph_root; \ (head)->sph_root = SPLAY_RIGHT((head)->sph_root, field); \ } while (/*CONSTCOND*/ 0) #define SPLAY_ASSEMBLE(head, node, left, right, field) do { \ SPLAY_RIGHT(left, field) = SPLAY_LEFT((head)->sph_root, field); \ SPLAY_LEFT(right, field) = SPLAY_RIGHT((head)->sph_root, field); \ SPLAY_LEFT((head)->sph_root, field) = SPLAY_RIGHT(node, field); \ SPLAY_RIGHT((head)->sph_root, field) = SPLAY_LEFT(node, field); \ } while (/*CONSTCOND*/ 0) /* Generates prototypes and inline functions */ #define SPLAY_PROTOTYPE(name, type, field, cmp) \ void name##_SPLAY(struct name *, struct type *); \ void name##_SPLAY_MINMAX(struct name *, int); \ struct type *name##_SPLAY_INSERT(struct name *, struct type *); \ struct type *name##_SPLAY_REMOVE(struct name *, struct type *); \ \ /* Finds the node with the same key as elm */ \ static __inline struct type * \ name##_SPLAY_FIND(struct name *head, struct type *elm) \ { \ if (SPLAY_EMPTY(head)) \ return(NULL); \ name##_SPLAY(head, elm); \ if ((cmp)(elm, (head)->sph_root) == 0) \ return (head->sph_root); \ return (NULL); \ } \ \ static __inline struct type * \ name##_SPLAY_NEXT(struct name *head, struct type *elm) \ { \ name##_SPLAY(head, elm); \ if (SPLAY_RIGHT(elm, field) != NULL) { \ elm = SPLAY_RIGHT(elm, field); \ while (SPLAY_LEFT(elm, field) != NULL) { \ elm = SPLAY_LEFT(elm, field); \ } \ } else \ elm = NULL; \ return (elm); \ } \ \ static __inline struct type * \ name##_SPLAY_MIN_MAX(struct name *head, int val) \ { \ name##_SPLAY_MINMAX(head, val); \ return (SPLAY_ROOT(head)); \ } /* Main splay operation. 
* Moves node close to the key of elm to top */ #define SPLAY_GENERATE(name, type, field, cmp) \ struct type * \ name##_SPLAY_INSERT(struct name *head, struct type *elm) \ { \ if (SPLAY_EMPTY(head)) { \ SPLAY_LEFT(elm, field) = SPLAY_RIGHT(elm, field) = NULL; \ } else { \ int __comp; \ name##_SPLAY(head, elm); \ __comp = (cmp)(elm, (head)->sph_root); \ if(__comp < 0) { \ SPLAY_LEFT(elm, field) = SPLAY_LEFT((head)->sph_root, field); \ SPLAY_RIGHT(elm, field) = (head)->sph_root; \ SPLAY_LEFT((head)->sph_root, field) = NULL; \ } else if (__comp > 0) { \ SPLAY_RIGHT(elm, field) = SPLAY_RIGHT((head)->sph_root, field); \ SPLAY_LEFT(elm, field) = (head)->sph_root; \ SPLAY_RIGHT((head)->sph_root, field) = NULL; \ } else \ return ((head)->sph_root); \ } \ (head)->sph_root = (elm); \ return (NULL); \ } \ \ struct type * \ name##_SPLAY_REMOVE(struct name *head, struct type *elm) \ { \ struct type *__tmp; \ if (SPLAY_EMPTY(head)) \ return (NULL); \ name##_SPLAY(head, elm); \ if ((cmp)(elm, (head)->sph_root) == 0) { \ if (SPLAY_LEFT((head)->sph_root, field) == NULL) { \ (head)->sph_root = SPLAY_RIGHT((head)->sph_root, field); \ } else { \ __tmp = SPLAY_RIGHT((head)->sph_root, field); \ (head)->sph_root = SPLAY_LEFT((head)->sph_root, field); \ name##_SPLAY(head, elm); \ SPLAY_RIGHT((head)->sph_root, field) = __tmp; \ } \ return (elm); \ } \ return (NULL); \ } \ \ void \ name##_SPLAY(struct name *head, struct type *elm) \ { \ struct type __node, *__left, *__right, *__tmp; \ int __comp; \ \ SPLAY_LEFT(&__node, field) = SPLAY_RIGHT(&__node, field) = NULL; \ __left = __right = &__node; \ \ while ((__comp = (cmp)(elm, (head)->sph_root)) != 0) { \ if (__comp < 0) { \ __tmp = SPLAY_LEFT((head)->sph_root, field); \ if (__tmp == NULL) \ break; \ if ((cmp)(elm, __tmp) < 0){ \ SPLAY_ROTATE_RIGHT(head, __tmp, field); \ if (SPLAY_LEFT((head)->sph_root, field) == NULL) \ break; \ } \ SPLAY_LINKLEFT(head, __right, field); \ } else if (__comp > 0) { \ __tmp = SPLAY_RIGHT((head)->sph_root, field); \ if (__tmp == NULL) \ break; \ if ((cmp)(elm, __tmp) > 0){ \ SPLAY_ROTATE_LEFT(head, __tmp, field); \ if (SPLAY_RIGHT((head)->sph_root, field) == NULL) \ break; \ } \ SPLAY_LINKRIGHT(head, __left, field); \ } \ } \ SPLAY_ASSEMBLE(head, &__node, __left, __right, field); \ } \ \ /* Splay with either the minimum or the maximum element \ * Used to find minimum or maximum element in tree. \ */ \ void name##_SPLAY_MINMAX(struct name *head, int __comp) \ { \ struct type __node, *__left, *__right, *__tmp; \ \ SPLAY_LEFT(&__node, field) = SPLAY_RIGHT(&__node, field) = NULL; \ __left = __right = &__node; \ \ while (1) { \ if (__comp < 0) { \ __tmp = SPLAY_LEFT((head)->sph_root, field); \ if (__tmp == NULL) \ break; \ if (__comp < 0){ \ SPLAY_ROTATE_RIGHT(head, __tmp, field); \ if (SPLAY_LEFT((head)->sph_root, field) == NULL) \ break; \ } \ SPLAY_LINKLEFT(head, __right, field); \ } else if (__comp > 0) { \ __tmp = SPLAY_RIGHT((head)->sph_root, field); \ if (__tmp == NULL) \ break; \ if (__comp > 0) { \ SPLAY_ROTATE_LEFT(head, __tmp, field); \ if (SPLAY_RIGHT((head)->sph_root, field) == NULL) \ break; \ } \ SPLAY_LINKRIGHT(head, __left, field); \ } \ } \ SPLAY_ASSEMBLE(head, &__node, __left, __right, field); \ } #define SPLAY_NEGINF -1 #define SPLAY_INF 1 #define SPLAY_INSERT(name, x, y) name##_SPLAY_INSERT(x, y) #define SPLAY_REMOVE(name, x, y) name##_SPLAY_REMOVE(x, y) #define SPLAY_FIND(name, x, y) name##_SPLAY_FIND(x, y) #define SPLAY_NEXT(name, x, y) name##_SPLAY_NEXT(x, y) #define SPLAY_MIN(name, x) (SPLAY_EMPTY(x) ? 
NULL \ : name##_SPLAY_MIN_MAX(x, SPLAY_NEGINF)) #define SPLAY_MAX(name, x) (SPLAY_EMPTY(x) ? NULL \ : name##_SPLAY_MIN_MAX(x, SPLAY_INF)) #define SPLAY_FOREACH(x, name, head) \ for ((x) = SPLAY_MIN(name, head); \ (x) != NULL; \ (x) = SPLAY_NEXT(name, head, x)) /* Macros that define a red-black tree */ #define RB_HEAD(name, type) \ struct name { \ struct type *rbh_root; /* root of the tree */ \ } #define RB_INITIALIZER(root) \ { NULL } #define RB_INIT(root) do { \ (root)->rbh_root = NULL; \ } while (/*CONSTCOND*/ 0) #define RB_BLACK 0 #define RB_RED 1 #define RB_ENTRY(type) \ struct { \ struct type *rbe_left; /* left element */ \ struct type *rbe_right; /* right element */ \ struct type *rbe_parent; /* parent element */ \ int rbe_color; /* node color */ \ } #define RB_LEFT(elm, field) (elm)->field.rbe_left #define RB_RIGHT(elm, field) (elm)->field.rbe_right #define RB_PARENT(elm, field) (elm)->field.rbe_parent #define RB_COLOR(elm, field) (elm)->field.rbe_color #define RB_ROOT(head) (head)->rbh_root #define RB_EMPTY(head) (RB_ROOT(head) == NULL) #define RB_SET(elm, parent, field) do { \ RB_PARENT(elm, field) = parent; \ RB_LEFT(elm, field) = RB_RIGHT(elm, field) = NULL; \ RB_COLOR(elm, field) = RB_RED; \ } while (/*CONSTCOND*/ 0) #define RB_SET_BLACKRED(black, red, field) do { \ RB_COLOR(black, field) = RB_BLACK; \ RB_COLOR(red, field) = RB_RED; \ } while (/*CONSTCOND*/ 0) #ifndef RB_AUGMENT #define RB_AUGMENT(x) do {} while (0) #endif #define RB_ROTATE_LEFT(head, elm, tmp, field) do { \ (tmp) = RB_RIGHT(elm, field); \ if ((RB_RIGHT(elm, field) = RB_LEFT(tmp, field)) != NULL) { \ RB_PARENT(RB_LEFT(tmp, field), field) = (elm); \ } \ RB_AUGMENT(elm); \ if ((RB_PARENT(tmp, field) = RB_PARENT(elm, field)) != NULL) { \ if ((elm) == RB_LEFT(RB_PARENT(elm, field), field)) \ RB_LEFT(RB_PARENT(elm, field), field) = (tmp); \ else \ RB_RIGHT(RB_PARENT(elm, field), field) = (tmp); \ } else \ (head)->rbh_root = (tmp); \ RB_LEFT(tmp, field) = (elm); \ RB_PARENT(elm, field) = (tmp); \ RB_AUGMENT(tmp); \ if ((RB_PARENT(tmp, field))) \ RB_AUGMENT(RB_PARENT(tmp, field)); \ } while (/*CONSTCOND*/ 0) #define RB_ROTATE_RIGHT(head, elm, tmp, field) do { \ (tmp) = RB_LEFT(elm, field); \ if ((RB_LEFT(elm, field) = RB_RIGHT(tmp, field)) != NULL) { \ RB_PARENT(RB_RIGHT(tmp, field), field) = (elm); \ } \ RB_AUGMENT(elm); \ if ((RB_PARENT(tmp, field) = RB_PARENT(elm, field)) != NULL) { \ if ((elm) == RB_LEFT(RB_PARENT(elm, field), field)) \ RB_LEFT(RB_PARENT(elm, field), field) = (tmp); \ else \ RB_RIGHT(RB_PARENT(elm, field), field) = (tmp); \ } else \ (head)->rbh_root = (tmp); \ RB_RIGHT(tmp, field) = (elm); \ RB_PARENT(elm, field) = (tmp); \ RB_AUGMENT(tmp); \ if ((RB_PARENT(tmp, field))) \ RB_AUGMENT(RB_PARENT(tmp, field)); \ } while (/*CONSTCOND*/ 0) /* Generates prototypes and inline functions */ #define RB_PROTOTYPE(name, type, field, cmp) \ RB_PROTOTYPE_INTERNAL(name, type, field, cmp,) #define RB_PROTOTYPE_STATIC(name, type, field, cmp) \ RB_PROTOTYPE_INTERNAL(name, type, field, cmp, UV__UNUSED static) #define RB_PROTOTYPE_INTERNAL(name, type, field, cmp, attr) \ attr void name##_RB_INSERT_COLOR(struct name *, struct type *); \ attr void name##_RB_REMOVE_COLOR(struct name *, struct type *, struct type *);\ attr struct type *name##_RB_REMOVE(struct name *, struct type *); \ attr struct type *name##_RB_INSERT(struct name *, struct type *); \ attr struct type *name##_RB_FIND(struct name *, struct type *); \ attr struct type *name##_RB_NFIND(struct name *, struct type *); \ attr struct type *name##_RB_NEXT(struct 
type *); \ attr struct type *name##_RB_PREV(struct type *); \ attr struct type *name##_RB_MINMAX(struct name *, int); \ \ /* Main rb operation. * Moves node close to the key of elm to top */ #define RB_GENERATE(name, type, field, cmp) \ RB_GENERATE_INTERNAL(name, type, field, cmp,) #define RB_GENERATE_STATIC(name, type, field, cmp) \ RB_GENERATE_INTERNAL(name, type, field, cmp, UV__UNUSED static) #define RB_GENERATE_INTERNAL(name, type, field, cmp, attr) \ attr void \ name##_RB_INSERT_COLOR(struct name *head, struct type *elm) \ { \ struct type *parent, *gparent, *tmp; \ while ((parent = RB_PARENT(elm, field)) != NULL && \ RB_COLOR(parent, field) == RB_RED) { \ gparent = RB_PARENT(parent, field); \ if (parent == RB_LEFT(gparent, field)) { \ tmp = RB_RIGHT(gparent, field); \ if (tmp && RB_COLOR(tmp, field) == RB_RED) { \ RB_COLOR(tmp, field) = RB_BLACK; \ RB_SET_BLACKRED(parent, gparent, field); \ elm = gparent; \ continue; \ } \ if (RB_RIGHT(parent, field) == elm) { \ RB_ROTATE_LEFT(head, parent, tmp, field); \ tmp = parent; \ parent = elm; \ elm = tmp; \ } \ RB_SET_BLACKRED(parent, gparent, field); \ RB_ROTATE_RIGHT(head, gparent, tmp, field); \ } else { \ tmp = RB_LEFT(gparent, field); \ if (tmp && RB_COLOR(tmp, field) == RB_RED) { \ RB_COLOR(tmp, field) = RB_BLACK; \ RB_SET_BLACKRED(parent, gparent, field); \ elm = gparent; \ continue; \ } \ if (RB_LEFT(parent, field) == elm) { \ RB_ROTATE_RIGHT(head, parent, tmp, field); \ tmp = parent; \ parent = elm; \ elm = tmp; \ } \ RB_SET_BLACKRED(parent, gparent, field); \ RB_ROTATE_LEFT(head, gparent, tmp, field); \ } \ } \ RB_COLOR(head->rbh_root, field) = RB_BLACK; \ } \ \ attr void \ name##_RB_REMOVE_COLOR(struct name *head, struct type *parent, \ struct type *elm) \ { \ struct type *tmp; \ while ((elm == NULL || RB_COLOR(elm, field) == RB_BLACK) && \ elm != RB_ROOT(head)) { \ if (RB_LEFT(parent, field) == elm) { \ tmp = RB_RIGHT(parent, field); \ if (RB_COLOR(tmp, field) == RB_RED) { \ RB_SET_BLACKRED(tmp, parent, field); \ RB_ROTATE_LEFT(head, parent, tmp, field); \ tmp = RB_RIGHT(parent, field); \ } \ if ((RB_LEFT(tmp, field) == NULL || \ RB_COLOR(RB_LEFT(tmp, field), field) == RB_BLACK) && \ (RB_RIGHT(tmp, field) == NULL || \ RB_COLOR(RB_RIGHT(tmp, field), field) == RB_BLACK)) { \ RB_COLOR(tmp, field) = RB_RED; \ elm = parent; \ parent = RB_PARENT(elm, field); \ } else { \ if (RB_RIGHT(tmp, field) == NULL || \ RB_COLOR(RB_RIGHT(tmp, field), field) == RB_BLACK) { \ struct type *oleft; \ if ((oleft = RB_LEFT(tmp, field)) \ != NULL) \ RB_COLOR(oleft, field) = RB_BLACK; \ RB_COLOR(tmp, field) = RB_RED; \ RB_ROTATE_RIGHT(head, tmp, oleft, field); \ tmp = RB_RIGHT(parent, field); \ } \ RB_COLOR(tmp, field) = RB_COLOR(parent, field); \ RB_COLOR(parent, field) = RB_BLACK; \ if (RB_RIGHT(tmp, field)) \ RB_COLOR(RB_RIGHT(tmp, field), field) = RB_BLACK; \ RB_ROTATE_LEFT(head, parent, tmp, field); \ elm = RB_ROOT(head); \ break; \ } \ } else { \ tmp = RB_LEFT(parent, field); \ if (RB_COLOR(tmp, field) == RB_RED) { \ RB_SET_BLACKRED(tmp, parent, field); \ RB_ROTATE_RIGHT(head, parent, tmp, field); \ tmp = RB_LEFT(parent, field); \ } \ if ((RB_LEFT(tmp, field) == NULL || \ RB_COLOR(RB_LEFT(tmp, field), field) == RB_BLACK) && \ (RB_RIGHT(tmp, field) == NULL || \ RB_COLOR(RB_RIGHT(tmp, field), field) == RB_BLACK)) { \ RB_COLOR(tmp, field) = RB_RED; \ elm = parent; \ parent = RB_PARENT(elm, field); \ } else { \ if (RB_LEFT(tmp, field) == NULL || \ RB_COLOR(RB_LEFT(tmp, field), field) == RB_BLACK) { \ struct type *oright; \ if ((oright = RB_RIGHT(tmp, 
field)) \ != NULL) \ RB_COLOR(oright, field) = RB_BLACK; \ RB_COLOR(tmp, field) = RB_RED; \ RB_ROTATE_LEFT(head, tmp, oright, field); \ tmp = RB_LEFT(parent, field); \ } \ RB_COLOR(tmp, field) = RB_COLOR(parent, field); \ RB_COLOR(parent, field) = RB_BLACK; \ if (RB_LEFT(tmp, field)) \ RB_COLOR(RB_LEFT(tmp, field), field) = RB_BLACK; \ RB_ROTATE_RIGHT(head, parent, tmp, field); \ elm = RB_ROOT(head); \ break; \ } \ } \ } \ if (elm) \ RB_COLOR(elm, field) = RB_BLACK; \ } \ \ attr struct type * \ name##_RB_REMOVE(struct name *head, struct type *elm) \ { \ struct type *child, *parent, *old = elm; \ int color; \ if (RB_LEFT(elm, field) == NULL) \ child = RB_RIGHT(elm, field); \ else if (RB_RIGHT(elm, field) == NULL) \ child = RB_LEFT(elm, field); \ else { \ struct type *left; \ elm = RB_RIGHT(elm, field); \ while ((left = RB_LEFT(elm, field)) != NULL) \ elm = left; \ child = RB_RIGHT(elm, field); \ parent = RB_PARENT(elm, field); \ color = RB_COLOR(elm, field); \ if (child) \ RB_PARENT(child, field) = parent; \ if (parent) { \ if (RB_LEFT(parent, field) == elm) \ RB_LEFT(parent, field) = child; \ else \ RB_RIGHT(parent, field) = child; \ RB_AUGMENT(parent); \ } else \ RB_ROOT(head) = child; \ if (RB_PARENT(elm, field) == old) \ parent = elm; \ (elm)->field = (old)->field; \ if (RB_PARENT(old, field)) { \ if (RB_LEFT(RB_PARENT(old, field), field) == old) \ RB_LEFT(RB_PARENT(old, field), field) = elm; \ else \ RB_RIGHT(RB_PARENT(old, field), field) = elm; \ RB_AUGMENT(RB_PARENT(old, field)); \ } else \ RB_ROOT(head) = elm; \ RB_PARENT(RB_LEFT(old, field), field) = elm; \ if (RB_RIGHT(old, field)) \ RB_PARENT(RB_RIGHT(old, field), field) = elm; \ if (parent) { \ left = parent; \ do { \ RB_AUGMENT(left); \ } while ((left = RB_PARENT(left, field)) != NULL); \ } \ goto color; \ } \ parent = RB_PARENT(elm, field); \ color = RB_COLOR(elm, field); \ if (child) \ RB_PARENT(child, field) = parent; \ if (parent) { \ if (RB_LEFT(parent, field) == elm) \ RB_LEFT(parent, field) = child; \ else \ RB_RIGHT(parent, field) = child; \ RB_AUGMENT(parent); \ } else \ RB_ROOT(head) = child; \ color: \ if (color == RB_BLACK) \ name##_RB_REMOVE_COLOR(head, parent, child); \ return (old); \ } \ \ /* Inserts a node into the RB tree */ \ attr struct type * \ name##_RB_INSERT(struct name *head, struct type *elm) \ { \ struct type *tmp; \ struct type *parent = NULL; \ int comp = 0; \ tmp = RB_ROOT(head); \ while (tmp) { \ parent = tmp; \ comp = (cmp)(elm, parent); \ if (comp < 0) \ tmp = RB_LEFT(tmp, field); \ else if (comp > 0) \ tmp = RB_RIGHT(tmp, field); \ else \ return (tmp); \ } \ RB_SET(elm, parent, field); \ if (parent != NULL) { \ if (comp < 0) \ RB_LEFT(parent, field) = elm; \ else \ RB_RIGHT(parent, field) = elm; \ RB_AUGMENT(parent); \ } else \ RB_ROOT(head) = elm; \ name##_RB_INSERT_COLOR(head, elm); \ return (NULL); \ } \ \ /* Finds the node with the same key as elm */ \ attr struct type * \ name##_RB_FIND(struct name *head, struct type *elm) \ { \ struct type *tmp = RB_ROOT(head); \ int comp; \ while (tmp) { \ comp = cmp(elm, tmp); \ if (comp < 0) \ tmp = RB_LEFT(tmp, field); \ else if (comp > 0) \ tmp = RB_RIGHT(tmp, field); \ else \ return (tmp); \ } \ return (NULL); \ } \ \ /* Finds the first node greater than or equal to the search key */ \ attr struct type * \ name##_RB_NFIND(struct name *head, struct type *elm) \ { \ struct type *tmp = RB_ROOT(head); \ struct type *res = NULL; \ int comp; \ while (tmp) { \ comp = cmp(elm, tmp); \ if (comp < 0) { \ res = tmp; \ tmp = RB_LEFT(tmp, field); \ } \ else if 
(comp > 0) \ tmp = RB_RIGHT(tmp, field); \ else \ return (tmp); \ } \ return (res); \ } \ \ /* ARGSUSED */ \ attr struct type * \ name##_RB_NEXT(struct type *elm) \ { \ if (RB_RIGHT(elm, field)) { \ elm = RB_RIGHT(elm, field); \ while (RB_LEFT(elm, field)) \ elm = RB_LEFT(elm, field); \ } else { \ if (RB_PARENT(elm, field) && \ (elm == RB_LEFT(RB_PARENT(elm, field), field))) \ elm = RB_PARENT(elm, field); \ else { \ while (RB_PARENT(elm, field) && \ (elm == RB_RIGHT(RB_PARENT(elm, field), field))) \ elm = RB_PARENT(elm, field); \ elm = RB_PARENT(elm, field); \ } \ } \ return (elm); \ } \ \ /* ARGSUSED */ \ attr struct type * \ name##_RB_PREV(struct type *elm) \ { \ if (RB_LEFT(elm, field)) { \ elm = RB_LEFT(elm, field); \ while (RB_RIGHT(elm, field)) \ elm = RB_RIGHT(elm, field); \ } else { \ if (RB_PARENT(elm, field) && \ (elm == RB_RIGHT(RB_PARENT(elm, field), field))) \ elm = RB_PARENT(elm, field); \ else { \ while (RB_PARENT(elm, field) && \ (elm == RB_LEFT(RB_PARENT(elm, field), field))) \ elm = RB_PARENT(elm, field); \ elm = RB_PARENT(elm, field); \ } \ } \ return (elm); \ } \ \ attr struct type * \ name##_RB_MINMAX(struct name *head, int val) \ { \ struct type *tmp = RB_ROOT(head); \ struct type *parent = NULL; \ while (tmp) { \ parent = tmp; \ if (val < 0) \ tmp = RB_LEFT(tmp, field); \ else \ tmp = RB_RIGHT(tmp, field); \ } \ return (parent); \ } #define RB_NEGINF -1 #define RB_INF 1 #define RB_INSERT(name, x, y) name##_RB_INSERT(x, y) #define RB_REMOVE(name, x, y) name##_RB_REMOVE(x, y) #define RB_FIND(name, x, y) name##_RB_FIND(x, y) #define RB_NFIND(name, x, y) name##_RB_NFIND(x, y) #define RB_NEXT(name, x, y) name##_RB_NEXT(y) #define RB_PREV(name, x, y) name##_RB_PREV(y) #define RB_MIN(name, x) name##_RB_MINMAX(x, RB_NEGINF) #define RB_MAX(name, x) name##_RB_MINMAX(x, RB_INF) #define RB_FOREACH(x, name, head) \ for ((x) = RB_MIN(name, head); \ (x) != NULL; \ (x) = name##_RB_NEXT(x)) #define RB_FOREACH_FROM(x, name, y) \ for ((x) = (y); \ ((x) != NULL) && ((y) = name##_RB_NEXT(x), (x) != NULL); \ (x) = (y)) #define RB_FOREACH_SAFE(x, name, head, y) \ for ((x) = RB_MIN(name, head); \ ((x) != NULL) && ((y) = name##_RB_NEXT(x), (x) != NULL); \ (x) = (y)) #define RB_FOREACH_REVERSE(x, name, head) \ for ((x) = RB_MAX(name, head); \ (x) != NULL; \ (x) = name##_RB_PREV(x)) #define RB_FOREACH_REVERSE_FROM(x, name, y) \ for ((x) = (y); \ ((x) != NULL) && ((y) = name##_RB_PREV(x), (x) != NULL); \ (x) = (y)) #define RB_FOREACH_REVERSE_SAFE(x, name, head, y) \ for ((x) = RB_MAX(name, head); \ ((x) != NULL) && ((y) = name##_RB_PREV(x), (x) != NULL); \ (x) = (y)) #endif /* UV_TREE_H_ */ iojs-v1.0.2-darwin-x64/include/node/uv-aix.h000644 000766 000024 00000003117 12455173732 020622 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. 
* * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_AIX_H #define UV_AIX_H #define UV_PLATFORM_LOOP_FIELDS \ int fs_fd; \ #define UV_PLATFORM_FS_EVENT_FIELDS \ uv__io_t event_watcher; \ char *dir_filename; \ #endif /* UV_AIX_H */ iojs-v1.0.2-darwin-x64/include/node/uv-bsd.h000644 000766 000024 00000003151 12455173732 020607 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_BSD_H #define UV_BSD_H #define UV_PLATFORM_FS_EVENT_FIELDS \ uv__io_t event_watcher; \ #define UV_IO_PRIVATE_PLATFORM_FIELDS \ int rcount; \ int wcount; \ #define UV_HAVE_KQUEUE 1 #endif /* UV_BSD_H */ iojs-v1.0.2-darwin-x64/include/node/uv-darwin.h000644 000766 000024 00000006215 12455173732 021327 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. 
*/ #ifndef UV_DARWIN_H #define UV_DARWIN_H #if defined(__APPLE__) && defined(__MACH__) # include # include # include # include # define UV_PLATFORM_SEM_T semaphore_t #endif #define UV_IO_PRIVATE_PLATFORM_FIELDS \ int rcount; \ int wcount; \ #define UV_PLATFORM_LOOP_FIELDS \ uv_thread_t cf_thread; \ void* _cf_reserved; \ void* cf_state; \ uv_mutex_t cf_mutex; \ uv_sem_t cf_sem; \ void* cf_signals[2]; \ #define UV_PLATFORM_FS_EVENT_FIELDS \ uv__io_t event_watcher; \ char* realpath; \ int realpath_len; \ int cf_flags; \ uv_async_t* cf_cb; \ void* cf_events[2]; \ void* cf_member[2]; \ int cf_error; \ uv_mutex_t cf_mutex; \ #define UV_STREAM_PRIVATE_PLATFORM_FIELDS \ void* select; \ #define UV_HAVE_KQUEUE 1 #endif /* UV_DARWIN_H */ iojs-v1.0.2-darwin-x64/include/node/uv-errno.h000644 000766 000024 00000021327 12455173732 021171 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_ERRNO_H_ #define UV_ERRNO_H_ #include #define UV__EOF (-4095) #define UV__UNKNOWN (-4094) #define UV__EAI_ADDRFAMILY (-3000) #define UV__EAI_AGAIN (-3001) #define UV__EAI_BADFLAGS (-3002) #define UV__EAI_CANCELED (-3003) #define UV__EAI_FAIL (-3004) #define UV__EAI_FAMILY (-3005) #define UV__EAI_MEMORY (-3006) #define UV__EAI_NODATA (-3007) #define UV__EAI_NONAME (-3008) #define UV__EAI_OVERFLOW (-3009) #define UV__EAI_SERVICE (-3010) #define UV__EAI_SOCKTYPE (-3011) #define UV__EAI_BADHINTS (-3013) #define UV__EAI_PROTOCOL (-3014) /* Only map to the system errno on non-Windows platforms. It's apparently * a fairly common practice for Windows programmers to redefine errno codes. 
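 *
 * (Editorial note, not in the original header: on non-Windows platforms each
 * UV__Exxx constant below is simply the negated system value, so if open()
 * fails with errno == ENOENT the error surfaces through libuv as UV__ENOENT,
 * i.e. -ENOENT, which is -2 on Linux. On Windows, or when the system lacks a
 * given code, the fixed fallback value from the corresponding #else branch
 * (-4058 for ENOENT) is used instead, keeping error numbers stable across
 * platforms.)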
*/ #if defined(E2BIG) && !defined(_WIN32) # define UV__E2BIG (-E2BIG) #else # define UV__E2BIG (-4093) #endif #if defined(EACCES) && !defined(_WIN32) # define UV__EACCES (-EACCES) #else # define UV__EACCES (-4092) #endif #if defined(EADDRINUSE) && !defined(_WIN32) # define UV__EADDRINUSE (-EADDRINUSE) #else # define UV__EADDRINUSE (-4091) #endif #if defined(EADDRNOTAVAIL) && !defined(_WIN32) # define UV__EADDRNOTAVAIL (-EADDRNOTAVAIL) #else # define UV__EADDRNOTAVAIL (-4090) #endif #if defined(EAFNOSUPPORT) && !defined(_WIN32) # define UV__EAFNOSUPPORT (-EAFNOSUPPORT) #else # define UV__EAFNOSUPPORT (-4089) #endif #if defined(EAGAIN) && !defined(_WIN32) # define UV__EAGAIN (-EAGAIN) #else # define UV__EAGAIN (-4088) #endif #if defined(EALREADY) && !defined(_WIN32) # define UV__EALREADY (-EALREADY) #else # define UV__EALREADY (-4084) #endif #if defined(EBADF) && !defined(_WIN32) # define UV__EBADF (-EBADF) #else # define UV__EBADF (-4083) #endif #if defined(EBUSY) && !defined(_WIN32) # define UV__EBUSY (-EBUSY) #else # define UV__EBUSY (-4082) #endif #if defined(ECANCELED) && !defined(_WIN32) # define UV__ECANCELED (-ECANCELED) #else # define UV__ECANCELED (-4081) #endif #if defined(ECHARSET) && !defined(_WIN32) # define UV__ECHARSET (-ECHARSET) #else # define UV__ECHARSET (-4080) #endif #if defined(ECONNABORTED) && !defined(_WIN32) # define UV__ECONNABORTED (-ECONNABORTED) #else # define UV__ECONNABORTED (-4079) #endif #if defined(ECONNREFUSED) && !defined(_WIN32) # define UV__ECONNREFUSED (-ECONNREFUSED) #else # define UV__ECONNREFUSED (-4078) #endif #if defined(ECONNRESET) && !defined(_WIN32) # define UV__ECONNRESET (-ECONNRESET) #else # define UV__ECONNRESET (-4077) #endif #if defined(EDESTADDRREQ) && !defined(_WIN32) # define UV__EDESTADDRREQ (-EDESTADDRREQ) #else # define UV__EDESTADDRREQ (-4076) #endif #if defined(EEXIST) && !defined(_WIN32) # define UV__EEXIST (-EEXIST) #else # define UV__EEXIST (-4075) #endif #if defined(EFAULT) && !defined(_WIN32) # define UV__EFAULT (-EFAULT) #else # define UV__EFAULT (-4074) #endif #if defined(EHOSTUNREACH) && !defined(_WIN32) # define UV__EHOSTUNREACH (-EHOSTUNREACH) #else # define UV__EHOSTUNREACH (-4073) #endif #if defined(EINTR) && !defined(_WIN32) # define UV__EINTR (-EINTR) #else # define UV__EINTR (-4072) #endif #if defined(EINVAL) && !defined(_WIN32) # define UV__EINVAL (-EINVAL) #else # define UV__EINVAL (-4071) #endif #if defined(EIO) && !defined(_WIN32) # define UV__EIO (-EIO) #else # define UV__EIO (-4070) #endif #if defined(EISCONN) && !defined(_WIN32) # define UV__EISCONN (-EISCONN) #else # define UV__EISCONN (-4069) #endif #if defined(EISDIR) && !defined(_WIN32) # define UV__EISDIR (-EISDIR) #else # define UV__EISDIR (-4068) #endif #if defined(ELOOP) && !defined(_WIN32) # define UV__ELOOP (-ELOOP) #else # define UV__ELOOP (-4067) #endif #if defined(EMFILE) && !defined(_WIN32) # define UV__EMFILE (-EMFILE) #else # define UV__EMFILE (-4066) #endif #if defined(EMSGSIZE) && !defined(_WIN32) # define UV__EMSGSIZE (-EMSGSIZE) #else # define UV__EMSGSIZE (-4065) #endif #if defined(ENAMETOOLONG) && !defined(_WIN32) # define UV__ENAMETOOLONG (-ENAMETOOLONG) #else # define UV__ENAMETOOLONG (-4064) #endif #if defined(ENETDOWN) && !defined(_WIN32) # define UV__ENETDOWN (-ENETDOWN) #else # define UV__ENETDOWN (-4063) #endif #if defined(ENETUNREACH) && !defined(_WIN32) # define UV__ENETUNREACH (-ENETUNREACH) #else # define UV__ENETUNREACH (-4062) #endif #if defined(ENFILE) && !defined(_WIN32) # define UV__ENFILE (-ENFILE) #else # define 
UV__ENFILE (-4061) #endif #if defined(ENOBUFS) && !defined(_WIN32) # define UV__ENOBUFS (-ENOBUFS) #else # define UV__ENOBUFS (-4060) #endif #if defined(ENODEV) && !defined(_WIN32) # define UV__ENODEV (-ENODEV) #else # define UV__ENODEV (-4059) #endif #if defined(ENOENT) && !defined(_WIN32) # define UV__ENOENT (-ENOENT) #else # define UV__ENOENT (-4058) #endif #if defined(ENOMEM) && !defined(_WIN32) # define UV__ENOMEM (-ENOMEM) #else # define UV__ENOMEM (-4057) #endif #if defined(ENONET) && !defined(_WIN32) # define UV__ENONET (-ENONET) #else # define UV__ENONET (-4056) #endif #if defined(ENOSPC) && !defined(_WIN32) # define UV__ENOSPC (-ENOSPC) #else # define UV__ENOSPC (-4055) #endif #if defined(ENOSYS) && !defined(_WIN32) # define UV__ENOSYS (-ENOSYS) #else # define UV__ENOSYS (-4054) #endif #if defined(ENOTCONN) && !defined(_WIN32) # define UV__ENOTCONN (-ENOTCONN) #else # define UV__ENOTCONN (-4053) #endif #if defined(ENOTDIR) && !defined(_WIN32) # define UV__ENOTDIR (-ENOTDIR) #else # define UV__ENOTDIR (-4052) #endif #if defined(ENOTEMPTY) && !defined(_WIN32) # define UV__ENOTEMPTY (-ENOTEMPTY) #else # define UV__ENOTEMPTY (-4051) #endif #if defined(ENOTSOCK) && !defined(_WIN32) # define UV__ENOTSOCK (-ENOTSOCK) #else # define UV__ENOTSOCK (-4050) #endif #if defined(ENOTSUP) && !defined(_WIN32) # define UV__ENOTSUP (-ENOTSUP) #else # define UV__ENOTSUP (-4049) #endif #if defined(EPERM) && !defined(_WIN32) # define UV__EPERM (-EPERM) #else # define UV__EPERM (-4048) #endif #if defined(EPIPE) && !defined(_WIN32) # define UV__EPIPE (-EPIPE) #else # define UV__EPIPE (-4047) #endif #if defined(EPROTO) && !defined(_WIN32) # define UV__EPROTO (-EPROTO) #else # define UV__EPROTO (-4046) #endif #if defined(EPROTONOSUPPORT) && !defined(_WIN32) # define UV__EPROTONOSUPPORT (-EPROTONOSUPPORT) #else # define UV__EPROTONOSUPPORT (-4045) #endif #if defined(EPROTOTYPE) && !defined(_WIN32) # define UV__EPROTOTYPE (-EPROTOTYPE) #else # define UV__EPROTOTYPE (-4044) #endif #if defined(EROFS) && !defined(_WIN32) # define UV__EROFS (-EROFS) #else # define UV__EROFS (-4043) #endif #if defined(ESHUTDOWN) && !defined(_WIN32) # define UV__ESHUTDOWN (-ESHUTDOWN) #else # define UV__ESHUTDOWN (-4042) #endif #if defined(ESPIPE) && !defined(_WIN32) # define UV__ESPIPE (-ESPIPE) #else # define UV__ESPIPE (-4041) #endif #if defined(ESRCH) && !defined(_WIN32) # define UV__ESRCH (-ESRCH) #else # define UV__ESRCH (-4040) #endif #if defined(ETIMEDOUT) && !defined(_WIN32) # define UV__ETIMEDOUT (-ETIMEDOUT) #else # define UV__ETIMEDOUT (-4039) #endif #if defined(ETXTBSY) && !defined(_WIN32) # define UV__ETXTBSY (-ETXTBSY) #else # define UV__ETXTBSY (-4038) #endif #if defined(EXDEV) && !defined(_WIN32) # define UV__EXDEV (-EXDEV) #else # define UV__EXDEV (-4037) #endif #if defined(EFBIG) && !defined(_WIN32) # define UV__EFBIG (-EFBIG) #else # define UV__EFBIG (-4036) #endif #if defined(ENOPROTOOPT) && !defined(_WIN32) # define UV__ENOPROTOOPT (-ENOPROTOOPT) #else # define UV__ENOPROTOOPT (-4035) #endif #if defined(ERANGE) && !defined(_WIN32) # define UV__ERANGE (-ERANGE) #else # define UV__ERANGE (-4034) #endif #if defined(ENXIO) && !defined(_WIN32) # define UV__ENXIO (-ENXIO) #else # define UV__ENXIO (-4033) #endif #if defined(EMLINK) && !defined(_WIN32) # define UV__EMLINK (-EMLINK) #else # define UV__EMLINK (-4032) #endif #endif /* UV_ERRNO_H_ */ iojs-v1.0.2-darwin-x64/include/node/uv-linux.h000644 000766 000024 00000003365 12455173732 021205 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. 
and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_LINUX_H #define UV_LINUX_H #define UV_PLATFORM_LOOP_FIELDS \ uv__io_t inotify_read_watcher; \ void* inotify_watchers; \ int inotify_fd; \ #define UV_PLATFORM_FS_EVENT_FIELDS \ void* watchers[2]; \ int wd; \ #endif /* UV_LINUX_H */ iojs-v1.0.2-darwin-x64/include/node/uv-sunos.h000644 000766 000024 00000003701 12455173732 021207 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_SUNOS_H #define UV_SUNOS_H #include #include /* For the sake of convenience and reduced #ifdef-ery in src/unix/sunos.c, * add the fs_event fields even when this version of SunOS doesn't support * file watching. */ #define UV_PLATFORM_LOOP_FIELDS \ uv__io_t fs_event_watcher; \ int fs_fd; \ #if defined(PORT_SOURCE_FILE) # define UV_PLATFORM_FS_EVENT_FIELDS \ file_obj_t fo; \ int fd; \ #endif /* defined(PORT_SOURCE_FILE) */ #endif /* UV_SUNOS_H */ iojs-v1.0.2-darwin-x64/include/node/uv-threadpool.h000644 000766 000024 00000002731 12455173732 022203 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
* * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ /* * This file is private to libuv. It provides common functionality to both * Windows and Unix backends. */ #ifndef UV_THREADPOOL_H_ #define UV_THREADPOOL_H_ struct uv__work { void (*work)(struct uv__work *w); void (*done)(struct uv__work *w, int status); struct uv_loop_s* loop; void* wq[2]; }; #endif /* UV_THREADPOOL_H_ */ iojs-v1.0.2-darwin-x64/include/node/uv-unix.h000644 000766 000024 00000037446 12455173732 021040 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. 
*/ #ifndef UV_UNIX_H #define UV_UNIX_H #include #include #include #include #include #include #include #include #include #include #include #include #include #ifdef __ANDROID__ #include "pthread-fixes.h" #endif #include #include "uv-threadpool.h" #if defined(__linux__) # include "uv-linux.h" #elif defined(_AIX) # include "uv-aix.h" #elif defined(__sun) # include "uv-sunos.h" #elif defined(__APPLE__) # include "uv-darwin.h" #elif defined(__DragonFly__) || \ defined(__FreeBSD__) || \ defined(__OpenBSD__) || \ defined(__NetBSD__) # include "uv-bsd.h" #endif #ifndef NI_MAXHOST # define NI_MAXHOST 1025 #endif #ifndef NI_MAXSERV # define NI_MAXSERV 32 #endif #ifndef UV_IO_PRIVATE_PLATFORM_FIELDS # define UV_IO_PRIVATE_PLATFORM_FIELDS /* empty */ #endif struct uv__io_s; struct uv__async; struct uv_loop_s; typedef void (*uv__io_cb)(struct uv_loop_s* loop, struct uv__io_s* w, unsigned int events); typedef struct uv__io_s uv__io_t; struct uv__io_s { uv__io_cb cb; void* pending_queue[2]; void* watcher_queue[2]; unsigned int pevents; /* Pending event mask i.e. mask at next tick. */ unsigned int events; /* Current event mask. */ int fd; UV_IO_PRIVATE_PLATFORM_FIELDS }; typedef void (*uv__async_cb)(struct uv_loop_s* loop, struct uv__async* w, unsigned int nevents); struct uv__async { uv__async_cb cb; uv__io_t io_watcher; int wfd; }; #ifndef UV_PLATFORM_SEM_T # define UV_PLATFORM_SEM_T sem_t #endif #ifndef UV_PLATFORM_LOOP_FIELDS # define UV_PLATFORM_LOOP_FIELDS /* empty */ #endif #ifndef UV_PLATFORM_FS_EVENT_FIELDS # define UV_PLATFORM_FS_EVENT_FIELDS /* empty */ #endif #ifndef UV_STREAM_PRIVATE_PLATFORM_FIELDS # define UV_STREAM_PRIVATE_PLATFORM_FIELDS /* empty */ #endif /* Note: May be cast to struct iovec. See writev(2). */ typedef struct uv_buf_t { char* base; size_t len; } uv_buf_t; typedef int uv_file; typedef int uv_os_sock_t; typedef int uv_os_fd_t; #define UV_ONCE_INIT PTHREAD_ONCE_INIT typedef pthread_once_t uv_once_t; typedef pthread_t uv_thread_t; typedef pthread_mutex_t uv_mutex_t; typedef pthread_rwlock_t uv_rwlock_t; typedef UV_PLATFORM_SEM_T uv_sem_t; typedef pthread_cond_t uv_cond_t; typedef pthread_key_t uv_key_t; #if defined(__APPLE__) && defined(__MACH__) typedef struct { unsigned int n; unsigned int count; uv_mutex_t mutex; uv_sem_t turnstile1; uv_sem_t turnstile2; } uv_barrier_t; #else /* defined(__APPLE__) && defined(__MACH__) */ typedef pthread_barrier_t uv_barrier_t; #endif /* defined(__APPLE__) && defined(__MACH__) */ /* Platform-specific definitions for uv_spawn support. */ typedef gid_t uv_gid_t; typedef uid_t uv_uid_t; typedef struct dirent uv__dirent_t; #if defined(DT_UNKNOWN) # define HAVE_DIRENT_TYPES # if defined(DT_REG) # define UV__DT_FILE DT_REG # else # define UV__DT_FILE -1 # endif # if defined(DT_DIR) # define UV__DT_DIR DT_DIR # else # define UV__DT_DIR -2 # endif # if defined(DT_LNK) # define UV__DT_LINK DT_LNK # else # define UV__DT_LINK -3 # endif # if defined(DT_FIFO) # define UV__DT_FIFO DT_FIFO # else # define UV__DT_FIFO -4 # endif # if defined(DT_SOCK) # define UV__DT_SOCKET DT_SOCK # else # define UV__DT_SOCKET -5 # endif # if defined(DT_CHR) # define UV__DT_CHAR DT_CHR # else # define UV__DT_CHAR -6 # endif # if defined(DT_BLK) # define UV__DT_BLOCK DT_BLK # else # define UV__DT_BLOCK -7 # endif #endif /* Platform-specific definitions for uv_dlopen support. 
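 * (Editorial note, not in the original header: the uv_lib_t struct defined
 * just below simply carries the handle returned by the platform's dynamic
 * loader together with the most recent error message string for it.)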
*/ #define UV_DYNAMIC /* empty */ typedef struct { void* handle; char* errmsg; } uv_lib_t; #define UV_LOOP_PRIVATE_FIELDS \ unsigned long flags; \ int backend_fd; \ void* pending_queue[2]; \ void* watcher_queue[2]; \ uv__io_t** watchers; \ unsigned int nwatchers; \ unsigned int nfds; \ void* wq[2]; \ uv_mutex_t wq_mutex; \ uv_async_t wq_async; \ uv_rwlock_t cloexec_lock; \ uv_handle_t* closing_handles; \ void* process_handles[2]; \ void* prepare_handles[2]; \ void* check_handles[2]; \ void* idle_handles[2]; \ void* async_handles[2]; \ struct uv__async async_watcher; \ struct { \ void* min; \ unsigned int nelts; \ } timer_heap; \ uint64_t timer_counter; \ uint64_t time; \ int signal_pipefd[2]; \ uv__io_t signal_io_watcher; \ uv_signal_t child_watcher; \ int emfile_fd; \ UV_PLATFORM_LOOP_FIELDS \ #define UV_REQ_TYPE_PRIVATE /* empty */ #define UV_REQ_PRIVATE_FIELDS /* empty */ #define UV_PRIVATE_REQ_TYPES /* empty */ #define UV_WRITE_PRIVATE_FIELDS \ void* queue[2]; \ unsigned int write_index; \ uv_buf_t* bufs; \ unsigned int nbufs; \ int error; \ uv_buf_t bufsml[4]; \ #define UV_CONNECT_PRIVATE_FIELDS \ void* queue[2]; \ #define UV_SHUTDOWN_PRIVATE_FIELDS /* empty */ #define UV_UDP_SEND_PRIVATE_FIELDS \ void* queue[2]; \ struct sockaddr_storage addr; \ unsigned int nbufs; \ uv_buf_t* bufs; \ ssize_t status; \ uv_udp_send_cb send_cb; \ uv_buf_t bufsml[4]; \ #define UV_HANDLE_PRIVATE_FIELDS \ uv_handle_t* next_closing; \ unsigned int flags; \ #define UV_STREAM_PRIVATE_FIELDS \ uv_connect_t *connect_req; \ uv_shutdown_t *shutdown_req; \ uv__io_t io_watcher; \ void* write_queue[2]; \ void* write_completed_queue[2]; \ uv_connection_cb connection_cb; \ int delayed_error; \ int accepted_fd; \ void* queued_fds; \ UV_STREAM_PRIVATE_PLATFORM_FIELDS \ #define UV_TCP_PRIVATE_FIELDS /* empty */ #define UV_UDP_PRIVATE_FIELDS \ uv_alloc_cb alloc_cb; \ uv_udp_recv_cb recv_cb; \ uv__io_t io_watcher; \ void* write_queue[2]; \ void* write_completed_queue[2]; \ #define UV_PIPE_PRIVATE_FIELDS \ const char* pipe_fname; /* strdup'ed */ #define UV_POLL_PRIVATE_FIELDS \ uv__io_t io_watcher; #define UV_PREPARE_PRIVATE_FIELDS \ uv_prepare_cb prepare_cb; \ void* queue[2]; \ #define UV_CHECK_PRIVATE_FIELDS \ uv_check_cb check_cb; \ void* queue[2]; \ #define UV_IDLE_PRIVATE_FIELDS \ uv_idle_cb idle_cb; \ void* queue[2]; \ #define UV_ASYNC_PRIVATE_FIELDS \ uv_async_cb async_cb; \ void* queue[2]; \ int pending; \ #define UV_TIMER_PRIVATE_FIELDS \ uv_timer_cb timer_cb; \ void* heap_node[3]; \ uint64_t timeout; \ uint64_t repeat; \ uint64_t start_id; #define UV_GETADDRINFO_PRIVATE_FIELDS \ struct uv__work work_req; \ uv_getaddrinfo_cb cb; \ struct addrinfo* hints; \ char* hostname; \ char* service; \ struct addrinfo* res; \ int retcode; #define UV_GETNAMEINFO_PRIVATE_FIELDS \ struct uv__work work_req; \ uv_getnameinfo_cb getnameinfo_cb; \ struct sockaddr_storage storage; \ int flags; \ char host[NI_MAXHOST]; \ char service[NI_MAXSERV]; \ int retcode; #define UV_PROCESS_PRIVATE_FIELDS \ void* queue[2]; \ int status; \ #define UV_FS_PRIVATE_FIELDS \ const char *new_path; \ uv_file file; \ int flags; \ mode_t mode; \ unsigned int nbufs; \ uv_buf_t* bufs; \ off_t off; \ uv_uid_t uid; \ uv_gid_t gid; \ double atime; \ double mtime; \ struct uv__work work_req; \ uv_buf_t bufsml[4]; \ #define UV_WORK_PRIVATE_FIELDS \ struct uv__work work_req; #define UV_TTY_PRIVATE_FIELDS \ struct termios orig_termios; \ int mode; #define UV_SIGNAL_PRIVATE_FIELDS \ /* RB_ENTRY(uv_signal_s) tree_entry; */ \ struct { \ struct uv_signal_s* rbe_left; 
\ struct uv_signal_s* rbe_right; \ struct uv_signal_s* rbe_parent; \ int rbe_color; \ } tree_entry; \ /* Use two counters here so we don have to fiddle with atomics. */ \ unsigned int caught_signals; \ unsigned int dispatched_signals; #define UV_FS_EVENT_PRIVATE_FIELDS \ uv_fs_event_cb cb; \ UV_PLATFORM_FS_EVENT_FIELDS \ #endif /* UV_UNIX_H */ iojs-v1.0.2-darwin-x64/include/node/uv-version.h000644 000766 000024 00000003227 12456106751 021526 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ #ifndef UV_VERSION_H #define UV_VERSION_H /* * Versions with the same major number are ABI stable. API is allowed to * evolve between minor releases, but only in a backwards compatible way. * Make sure you update the -soname directives in configure.ac * and uv.gyp whenever you bump UV_VERSION_MAJOR or UV_VERSION_MINOR (but * not UV_VERSION_PATCH.) */ #define UV_VERSION_MAJOR 1 #define UV_VERSION_MINOR 2 #define UV_VERSION_PATCH 1 #define UV_VERSION_IS_RELEASE 1 #define UV_VERSION_SUFFIX "" #endif /* UV_VERSION_H */ iojs-v1.0.2-darwin-x64/include/node/uv-win.h000644 000766 000024 00000074507 12455173732 020651 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. 
*/ #ifndef _WIN32_WINNT # define _WIN32_WINNT 0x0502 #endif #if !defined(_SSIZE_T_) && !defined(_SSIZE_T_DEFINED) typedef intptr_t ssize_t; # define _SSIZE_T_ # define _SSIZE_T_DEFINED #endif #include #if defined(__MINGW32__) && !defined(__MINGW64_VERSION_MAJOR) typedef struct pollfd { SOCKET fd; short events; short revents; } WSAPOLLFD, *PWSAPOLLFD, *LPWSAPOLLFD; #endif #ifndef LOCALE_INVARIANT # define LOCALE_INVARIANT 0x007f #endif #ifndef _malloca # if defined(_DEBUG) # define _malloca(size) malloc(size) # define _freea(ptr) free(ptr) # else # define _malloca(size) alloca(size) # define _freea(ptr) # endif #endif #include #include #include #include #include #include #if defined(_MSC_VER) && _MSC_VER < 1600 # include "stdint-msvc2008.h" #else # include #endif #include "tree.h" #include "uv-threadpool.h" #define MAX_PIPENAME_LEN 256 #ifndef S_IFLNK # define S_IFLNK 0xA000 #endif /* Additional signals supported by uv_signal and or uv_kill. The CRT defines * the following signals already: * * #define SIGINT 2 * #define SIGILL 4 * #define SIGABRT_COMPAT 6 * #define SIGFPE 8 * #define SIGSEGV 11 * #define SIGTERM 15 * #define SIGBREAK 21 * #define SIGABRT 22 * * The additional signals have values that are common on other Unix * variants (Linux and Darwin) */ #define SIGHUP 1 #define SIGKILL 9 #define SIGWINCH 28 /* The CRT defines SIGABRT_COMPAT as 6, which equals SIGABRT on many */ /* unix-like platforms. However MinGW doesn't define it, so we do. */ #ifndef SIGABRT_COMPAT # define SIGABRT_COMPAT 6 #endif /* * Guids and typedefs for winsock extension functions * Mingw32 doesn't have these :-( */ #ifndef WSAID_ACCEPTEX # define WSAID_ACCEPTEX \ {0xb5367df1, 0xcbac, 0x11cf, \ {0x95, 0xca, 0x00, 0x80, 0x5f, 0x48, 0xa1, 0x92}} # define WSAID_CONNECTEX \ {0x25a207b9, 0xddf3, 0x4660, \ {0x8e, 0xe9, 0x76, 0xe5, 0x8c, 0x74, 0x06, 0x3e}} # define WSAID_GETACCEPTEXSOCKADDRS \ {0xb5367df2, 0xcbac, 0x11cf, \ {0x95, 0xca, 0x00, 0x80, 0x5f, 0x48, 0xa1, 0x92}} # define WSAID_DISCONNECTEX \ {0x7fda2e11, 0x8630, 0x436f, \ {0xa0, 0x31, 0xf5, 0x36, 0xa6, 0xee, 0xc1, 0x57}} # define WSAID_TRANSMITFILE \ {0xb5367df0, 0xcbac, 0x11cf, \ {0x95, 0xca, 0x00, 0x80, 0x5f, 0x48, 0xa1, 0x92}} typedef BOOL PASCAL (*LPFN_ACCEPTEX) (SOCKET sListenSocket, SOCKET sAcceptSocket, PVOID lpOutputBuffer, DWORD dwReceiveDataLength, DWORD dwLocalAddressLength, DWORD dwRemoteAddressLength, LPDWORD lpdwBytesReceived, LPOVERLAPPED lpOverlapped); typedef BOOL PASCAL (*LPFN_CONNECTEX) (SOCKET s, const struct sockaddr* name, int namelen, PVOID lpSendBuffer, DWORD dwSendDataLength, LPDWORD lpdwBytesSent, LPOVERLAPPED lpOverlapped); typedef void PASCAL (*LPFN_GETACCEPTEXSOCKADDRS) (PVOID lpOutputBuffer, DWORD dwReceiveDataLength, DWORD dwLocalAddressLength, DWORD dwRemoteAddressLength, LPSOCKADDR* LocalSockaddr, LPINT LocalSockaddrLength, LPSOCKADDR* RemoteSockaddr, LPINT RemoteSockaddrLength); typedef BOOL PASCAL (*LPFN_DISCONNECTEX) (SOCKET hSocket, LPOVERLAPPED lpOverlapped, DWORD dwFlags, DWORD reserved); typedef BOOL PASCAL (*LPFN_TRANSMITFILE) (SOCKET hSocket, HANDLE hFile, DWORD nNumberOfBytesToWrite, DWORD nNumberOfBytesPerSend, LPOVERLAPPED lpOverlapped, LPTRANSMIT_FILE_BUFFERS lpTransmitBuffers, DWORD dwFlags); typedef PVOID RTL_SRWLOCK; typedef RTL_SRWLOCK SRWLOCK, *PSRWLOCK; #endif typedef int (WSAAPI* LPFN_WSARECV) (SOCKET socket, LPWSABUF buffers, DWORD buffer_count, LPDWORD bytes, LPDWORD flags, LPWSAOVERLAPPED overlapped, LPWSAOVERLAPPED_COMPLETION_ROUTINE completion_routine); typedef int (WSAAPI* LPFN_WSARECVFROM) (SOCKET 
socket, LPWSABUF buffers, DWORD buffer_count, LPDWORD bytes, LPDWORD flags, struct sockaddr* addr, LPINT addr_len, LPWSAOVERLAPPED overlapped, LPWSAOVERLAPPED_COMPLETION_ROUTINE completion_routine); #ifndef _NTDEF_ typedef LONG NTSTATUS; typedef NTSTATUS *PNTSTATUS; #endif #ifndef RTL_CONDITION_VARIABLE_INIT typedef PVOID CONDITION_VARIABLE, *PCONDITION_VARIABLE; #endif typedef struct _AFD_POLL_HANDLE_INFO { HANDLE Handle; ULONG Events; NTSTATUS Status; } AFD_POLL_HANDLE_INFO, *PAFD_POLL_HANDLE_INFO; typedef struct _AFD_POLL_INFO { LARGE_INTEGER Timeout; ULONG NumberOfHandles; ULONG Exclusive; AFD_POLL_HANDLE_INFO Handles[1]; } AFD_POLL_INFO, *PAFD_POLL_INFO; #define UV_MSAFD_PROVIDER_COUNT 3 /** * It should be possible to cast uv_buf_t[] to WSABUF[] * see http://msdn.microsoft.com/en-us/library/ms741542(v=vs.85).aspx */ typedef struct uv_buf_t { ULONG len; char* base; } uv_buf_t; typedef int uv_file; typedef SOCKET uv_os_sock_t; typedef HANDLE uv_os_fd_t; typedef HANDLE uv_thread_t; typedef HANDLE uv_sem_t; typedef CRITICAL_SECTION uv_mutex_t; /* This condition variable implementation is based on the SetEvent solution * (section 3.2) at http://www.cs.wustl.edu/~schmidt/win32-cv-1.html * We could not use the SignalObjectAndWait solution (section 3.4) because * it want the 2nd argument (type uv_mutex_t) of uv_cond_wait() and * uv_cond_timedwait() to be HANDLEs, but we use CRITICAL_SECTIONs. */ typedef union { CONDITION_VARIABLE cond_var; struct { unsigned int waiters_count; CRITICAL_SECTION waiters_count_lock; HANDLE signal_event; HANDLE broadcast_event; } fallback; } uv_cond_t; typedef union { /* srwlock_ has type SRWLOCK, but not all toolchains define this type in */ /* windows.h. */ SRWLOCK srwlock_; struct { uv_mutex_t read_mutex_; uv_mutex_t write_mutex_; unsigned int num_readers_; } fallback_; } uv_rwlock_t; typedef struct { unsigned int n; unsigned int count; uv_mutex_t mutex; uv_sem_t turnstile1; uv_sem_t turnstile2; } uv_barrier_t; typedef struct { DWORD tls_index; } uv_key_t; #define UV_ONCE_INIT { 0, NULL } typedef struct uv_once_s { unsigned char ran; HANDLE event; } uv_once_t; /* Platform-specific definitions for uv_spawn support. */ typedef unsigned char uv_uid_t; typedef unsigned char uv_gid_t; typedef struct uv__dirent_s { int d_type; char d_name[1]; } uv__dirent_t; #define UV__DT_DIR UV_DIRENT_DIR #define UV__DT_FILE UV_DIRENT_FILE #define UV__DT_LINK UV_DIRENT_LINK #define UV__DT_FIFO UV_DIRENT_FIFO #define UV__DT_SOCKET UV_DIRENT_SOCKET #define UV__DT_CHAR UV_DIRENT_CHAR #define UV__DT_BLOCK UV_DIRENT_BLOCK /* Platform-specific definitions for uv_dlopen support. */ #define UV_DYNAMIC FAR WINAPI typedef struct { HMODULE handle; char* errmsg; } uv_lib_t; RB_HEAD(uv_timer_tree_s, uv_timer_s); #define UV_LOOP_PRIVATE_FIELDS \ /* The loop's I/O completion port */ \ HANDLE iocp; \ /* The current time according to the event loop. in msecs. */ \ uint64_t time; \ /* Tail of a single-linked circular queue of pending reqs. If the queue */ \ /* is empty, tail_ is NULL. If there is only one item, */ \ /* tail_->next_req == tail_ */ \ uv_req_t* pending_reqs_tail; \ /* Head of a single-linked list of closed handles */ \ uv_handle_t* endgame_handles; \ /* The head of the timers tree */ \ struct uv_timer_tree_s timers; \ /* Lists of active loop (prepare / check / idle) watchers */ \ uv_prepare_t* prepare_handles; \ uv_check_t* check_handles; \ uv_idle_t* idle_handles; \ /* This pointer will refer to the prepare/check/idle handle whose */ \ /* callback is scheduled to be called next. 
This is needed to allow */ \ /* safe removal from one of the lists above while that list being */ \ /* iterated over. */ \ uv_prepare_t* next_prepare_handle; \ uv_check_t* next_check_handle; \ uv_idle_t* next_idle_handle; \ /* This handle holds the peer sockets for the fast variant of uv_poll_t */ \ SOCKET poll_peer_sockets[UV_MSAFD_PROVIDER_COUNT]; \ /* Counter to keep track of active tcp streams */ \ unsigned int active_tcp_streams; \ /* Counter to keep track of active udp streams */ \ unsigned int active_udp_streams; \ /* Counter to started timer */ \ uint64_t timer_counter; \ /* Threadpool */ \ void* wq[2]; \ uv_mutex_t wq_mutex; \ uv_async_t wq_async; #define UV_REQ_TYPE_PRIVATE \ /* TODO: remove the req suffix */ \ UV_ACCEPT, \ UV_FS_EVENT_REQ, \ UV_POLL_REQ, \ UV_PROCESS_EXIT, \ UV_READ, \ UV_UDP_RECV, \ UV_WAKEUP, \ UV_SIGNAL_REQ, #define UV_REQ_PRIVATE_FIELDS \ union { \ /* Used by I/O operations */ \ struct { \ OVERLAPPED overlapped; \ size_t queued_bytes; \ }; \ }; \ struct uv_req_s* next_req; #define UV_WRITE_PRIVATE_FIELDS \ int ipc_header; \ uv_buf_t write_buffer; \ HANDLE event_handle; \ HANDLE wait_handle; #define UV_CONNECT_PRIVATE_FIELDS \ /* empty */ #define UV_SHUTDOWN_PRIVATE_FIELDS \ /* empty */ #define UV_UDP_SEND_PRIVATE_FIELDS \ /* empty */ #define UV_PRIVATE_REQ_TYPES \ typedef struct uv_pipe_accept_s { \ UV_REQ_FIELDS \ HANDLE pipeHandle; \ struct uv_pipe_accept_s* next_pending; \ } uv_pipe_accept_t; \ \ typedef struct uv_tcp_accept_s { \ UV_REQ_FIELDS \ SOCKET accept_socket; \ char accept_buffer[sizeof(struct sockaddr_storage) * 2 + 32]; \ HANDLE event_handle; \ HANDLE wait_handle; \ struct uv_tcp_accept_s* next_pending; \ } uv_tcp_accept_t; \ \ typedef struct uv_read_s { \ UV_REQ_FIELDS \ HANDLE event_handle; \ HANDLE wait_handle; \ } uv_read_t; #define uv_stream_connection_fields \ unsigned int write_reqs_pending; \ uv_shutdown_t* shutdown_req; #define uv_stream_server_fields \ uv_connection_cb connection_cb; #define UV_STREAM_PRIVATE_FIELDS \ unsigned int reqs_pending; \ int activecnt; \ uv_read_t read_req; \ union { \ struct { uv_stream_connection_fields }; \ struct { uv_stream_server_fields }; \ }; #define uv_tcp_server_fields \ uv_tcp_accept_t* accept_reqs; \ unsigned int processed_accepts; \ uv_tcp_accept_t* pending_accepts; \ LPFN_ACCEPTEX func_acceptex; #define uv_tcp_connection_fields \ uv_buf_t read_buffer; \ LPFN_CONNECTEX func_connectex; #define UV_TCP_PRIVATE_FIELDS \ SOCKET socket; \ int delayed_error; \ union { \ struct { uv_tcp_server_fields }; \ struct { uv_tcp_connection_fields }; \ }; #define UV_UDP_PRIVATE_FIELDS \ SOCKET socket; \ unsigned int reqs_pending; \ int activecnt; \ uv_req_t recv_req; \ uv_buf_t recv_buffer; \ struct sockaddr_storage recv_from; \ int recv_from_len; \ uv_udp_recv_cb recv_cb; \ uv_alloc_cb alloc_cb; \ LPFN_WSARECV func_wsarecv; \ LPFN_WSARECVFROM func_wsarecvfrom; #define uv_pipe_server_fields \ int pending_instances; \ uv_pipe_accept_t* accept_reqs; \ uv_pipe_accept_t* pending_accepts; #define uv_pipe_connection_fields \ uv_timer_t* eof_timer; \ uv_write_t ipc_header_write_req; \ int ipc_pid; \ uint64_t remaining_ipc_rawdata_bytes; \ struct { \ void* queue[2]; \ int queue_len; \ } pending_ipc_info; \ uv_write_t* non_overlapped_writes_tail; \ uv_mutex_t readfile_mutex; \ volatile HANDLE readfile_thread; #define UV_PIPE_PRIVATE_FIELDS \ HANDLE handle; \ WCHAR* name; \ union { \ struct { uv_pipe_server_fields }; \ struct { uv_pipe_connection_fields }; \ }; /* TODO: put the parser states in an union - TTY handles are 
always */ /* half-duplex so read-state can safely overlap write-state. */ #define UV_TTY_PRIVATE_FIELDS \ HANDLE handle; \ union { \ struct { \ /* Used for readable TTY handles */ \ HANDLE read_line_handle; \ uv_buf_t read_line_buffer; \ HANDLE read_raw_wait; \ /* Fields used for translating win keystrokes into vt100 characters */ \ char last_key[8]; \ unsigned char last_key_offset; \ unsigned char last_key_len; \ WCHAR last_utf16_high_surrogate; \ INPUT_RECORD last_input_record; \ }; \ struct { \ /* Used for writable TTY handles */ \ /* utf8-to-utf16 conversion state */ \ unsigned int utf8_codepoint; \ unsigned char utf8_bytes_left; \ /* eol conversion state */ \ unsigned char previous_eol; \ /* ansi parser state */ \ unsigned char ansi_parser_state; \ unsigned char ansi_csi_argc; \ unsigned short ansi_csi_argv[4]; \ COORD saved_position; \ WORD saved_attributes; \ }; \ }; #define UV_POLL_PRIVATE_FIELDS \ SOCKET socket; \ /* Used in fast mode */ \ SOCKET peer_socket; \ AFD_POLL_INFO afd_poll_info_1; \ AFD_POLL_INFO afd_poll_info_2; \ /* Used in fast and slow mode. */ \ uv_req_t poll_req_1; \ uv_req_t poll_req_2; \ unsigned char submitted_events_1; \ unsigned char submitted_events_2; \ unsigned char mask_events_1; \ unsigned char mask_events_2; \ unsigned char events; #define UV_TIMER_PRIVATE_FIELDS \ RB_ENTRY(uv_timer_s) tree_entry; \ uint64_t due; \ uint64_t repeat; \ uint64_t start_id; \ uv_timer_cb timer_cb; #define UV_ASYNC_PRIVATE_FIELDS \ struct uv_req_s async_req; \ uv_async_cb async_cb; \ /* char to avoid alignment issues */ \ char volatile async_sent; #define UV_PREPARE_PRIVATE_FIELDS \ uv_prepare_t* prepare_prev; \ uv_prepare_t* prepare_next; \ uv_prepare_cb prepare_cb; #define UV_CHECK_PRIVATE_FIELDS \ uv_check_t* check_prev; \ uv_check_t* check_next; \ uv_check_cb check_cb; #define UV_IDLE_PRIVATE_FIELDS \ uv_idle_t* idle_prev; \ uv_idle_t* idle_next; \ uv_idle_cb idle_cb; #define UV_HANDLE_PRIVATE_FIELDS \ uv_handle_t* endgame_next; \ unsigned int flags; #define UV_GETADDRINFO_PRIVATE_FIELDS \ struct uv__work work_req; \ uv_getaddrinfo_cb getaddrinfo_cb; \ void* alloc; \ WCHAR* node; \ WCHAR* service; \ struct addrinfoW* hints; \ struct addrinfoW* res; \ int retcode; #define UV_GETNAMEINFO_PRIVATE_FIELDS \ struct uv__work work_req; \ uv_getnameinfo_cb getnameinfo_cb; \ struct sockaddr_storage storage; \ int flags; \ char host[NI_MAXHOST]; \ char service[NI_MAXSERV]; \ int retcode; #define UV_PROCESS_PRIVATE_FIELDS \ struct uv_process_exit_s { \ UV_REQ_FIELDS \ } exit_req; \ BYTE* child_stdio_buffer; \ int exit_signal; \ HANDLE wait_handle; \ HANDLE process_handle; \ volatile char exit_cb_pending; #define UV_FS_PRIVATE_FIELDS \ struct uv__work work_req; \ int flags; \ DWORD sys_errno_; \ union { \ /* TODO: remove me in 0.9. 
*/ \ WCHAR* pathw; \ int fd; \ }; \ union { \ struct { \ int mode; \ WCHAR* new_pathw; \ int file_flags; \ int fd_out; \ unsigned int nbufs; \ uv_buf_t* bufs; \ int64_t offset; \ uv_buf_t bufsml[4]; \ }; \ struct { \ double atime; \ double mtime; \ }; \ }; #define UV_WORK_PRIVATE_FIELDS \ struct uv__work work_req; #define UV_FS_EVENT_PRIVATE_FIELDS \ struct uv_fs_event_req_s { \ UV_REQ_FIELDS \ } req; \ HANDLE dir_handle; \ int req_pending; \ uv_fs_event_cb cb; \ WCHAR* filew; \ WCHAR* short_filew; \ WCHAR* dirw; \ char* buffer; #define UV_SIGNAL_PRIVATE_FIELDS \ RB_ENTRY(uv_signal_s) tree_entry; \ struct uv_req_s signal_req; \ unsigned long pending_signum; int uv_utf16_to_utf8(const WCHAR* utf16Buffer, size_t utf16Size, char* utf8Buffer, size_t utf8Size); int uv_utf8_to_utf16(const char* utf8Buffer, WCHAR* utf16Buffer, size_t utf16Size); #ifndef F_OK #define F_OK 0 #endif #ifndef R_OK #define R_OK 4 #endif #ifndef W_OK #define W_OK 2 #endif #ifndef X_OK #define X_OK 1 #endif iojs-v1.0.2-darwin-x64/include/node/uv.h000644 000766 000024 00000145320 12455173732 020046 0ustar00iojsstaff000000 000000 /* Copyright Joyent, Inc. and other Node contributors. All rights reserved. * * Permission is hereby granted, free of charge, to any person obtaining a copy * of this software and associated documentation files (the "Software"), to * deal in the Software without restriction, including without limitation the * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or * sell copies of the Software, and to permit persons to whom the Software is * furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS * IN THE SOFTWARE. */ /* See https://github.com/libuv/libuv#documentation for documentation. */ #ifndef UV_H #define UV_H #ifdef __cplusplus extern "C" { #endif #ifdef _WIN32 /* Windows - set up dll import/export decorators. */ # if defined(BUILDING_UV_SHARED) /* Building shared library. */ # define UV_EXTERN __declspec(dllexport) # elif defined(USING_UV_SHARED) /* Using shared library. */ # define UV_EXTERN __declspec(dllimport) # else /* Building static library. */ # define UV_EXTERN /* nothing */ # endif #elif __GNUC__ >= 4 # define UV_EXTERN __attribute__((visibility("default"))) #else # define UV_EXTERN /* nothing */ #endif #include "uv-errno.h" #include "uv-version.h" #include #if defined(_MSC_VER) && _MSC_VER < 1600 # include "stdint-msvc2008.h" #else # include #endif #if defined(_WIN32) # include "uv-win.h" #else # include "uv-unix.h" #endif /* Expand this list if necessary. 
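 *
 * The entries below expand into the UV_E* constants of the uv_errno_t enum
 * defined further down. As a minimal sketch of reporting such a code,
 * assuming a hypothetical `server` handle and `on_connection` callback and
 * <stdio.h> for fprintf (uv_err_name() and uv_strerror() are declared later
 * in this header):
 *
 *     int r = uv_listen((uv_stream_t*) &server, 128, on_connection);
 *     if (r < 0)
 *       fprintf(stderr, "listen error: %s (%s)\n",
 *               uv_err_name(r), uv_strerror(r));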
*/ #define UV_ERRNO_MAP(XX) \ XX(E2BIG, "argument list too long") \ XX(EACCES, "permission denied") \ XX(EADDRINUSE, "address already in use") \ XX(EADDRNOTAVAIL, "address not available") \ XX(EAFNOSUPPORT, "address family not supported") \ XX(EAGAIN, "resource temporarily unavailable") \ XX(EAI_ADDRFAMILY, "address family not supported") \ XX(EAI_AGAIN, "temporary failure") \ XX(EAI_BADFLAGS, "bad ai_flags value") \ XX(EAI_BADHINTS, "invalid value for hints") \ XX(EAI_CANCELED, "request canceled") \ XX(EAI_FAIL, "permanent failure") \ XX(EAI_FAMILY, "ai_family not supported") \ XX(EAI_MEMORY, "out of memory") \ XX(EAI_NODATA, "no address") \ XX(EAI_NONAME, "unknown node or service") \ XX(EAI_OVERFLOW, "argument buffer overflow") \ XX(EAI_PROTOCOL, "resolved protocol is unknown") \ XX(EAI_SERVICE, "service not available for socket type") \ XX(EAI_SOCKTYPE, "socket type not supported") \ XX(EALREADY, "connection already in progress") \ XX(EBADF, "bad file descriptor") \ XX(EBUSY, "resource busy or locked") \ XX(ECANCELED, "operation canceled") \ XX(ECHARSET, "invalid Unicode character") \ XX(ECONNABORTED, "software caused connection abort") \ XX(ECONNREFUSED, "connection refused") \ XX(ECONNRESET, "connection reset by peer") \ XX(EDESTADDRREQ, "destination address required") \ XX(EEXIST, "file already exists") \ XX(EFAULT, "bad address in system call argument") \ XX(EFBIG, "file too large") \ XX(EHOSTUNREACH, "host is unreachable") \ XX(EINTR, "interrupted system call") \ XX(EINVAL, "invalid argument") \ XX(EIO, "i/o error") \ XX(EISCONN, "socket is already connected") \ XX(EISDIR, "illegal operation on a directory") \ XX(ELOOP, "too many symbolic links encountered") \ XX(EMFILE, "too many open files") \ XX(EMSGSIZE, "message too long") \ XX(ENAMETOOLONG, "name too long") \ XX(ENETDOWN, "network is down") \ XX(ENETUNREACH, "network is unreachable") \ XX(ENFILE, "file table overflow") \ XX(ENOBUFS, "no buffer space available") \ XX(ENODEV, "no such device") \ XX(ENOENT, "no such file or directory") \ XX(ENOMEM, "not enough memory") \ XX(ENONET, "machine is not on the network") \ XX(ENOPROTOOPT, "protocol not available") \ XX(ENOSPC, "no space left on device") \ XX(ENOSYS, "function not implemented") \ XX(ENOTCONN, "socket is not connected") \ XX(ENOTDIR, "not a directory") \ XX(ENOTEMPTY, "directory not empty") \ XX(ENOTSOCK, "socket operation on non-socket") \ XX(ENOTSUP, "operation not supported on socket") \ XX(EPERM, "operation not permitted") \ XX(EPIPE, "broken pipe") \ XX(EPROTO, "protocol error") \ XX(EPROTONOSUPPORT, "protocol not supported") \ XX(EPROTOTYPE, "protocol wrong type for socket") \ XX(ERANGE, "result too large") \ XX(EROFS, "read-only file system") \ XX(ESHUTDOWN, "cannot send after transport endpoint shutdown") \ XX(ESPIPE, "invalid seek") \ XX(ESRCH, "no such process") \ XX(ETIMEDOUT, "connection timed out") \ XX(ETXTBSY, "text file is busy") \ XX(EXDEV, "cross-device link not permitted") \ XX(UNKNOWN, "unknown error") \ XX(EOF, "end of file") \ XX(ENXIO, "no such device or address") \ XX(EMLINK, "too many links") \ #define UV_HANDLE_TYPE_MAP(XX) \ XX(ASYNC, async) \ XX(CHECK, check) \ XX(FS_EVENT, fs_event) \ XX(FS_POLL, fs_poll) \ XX(HANDLE, handle) \ XX(IDLE, idle) \ XX(NAMED_PIPE, pipe) \ XX(POLL, poll) \ XX(PREPARE, prepare) \ XX(PROCESS, process) \ XX(STREAM, stream) \ XX(TCP, tcp) \ XX(TIMER, timer) \ XX(TTY, tty) \ XX(UDP, udp) \ XX(SIGNAL, signal) \ #define UV_REQ_TYPE_MAP(XX) \ XX(REQ, req) \ XX(CONNECT, connect) \ XX(WRITE, write) \ XX(SHUTDOWN, shutdown) \ 
XX(UDP_SEND, udp_send) \ XX(FS, fs) \ XX(WORK, work) \ XX(GETADDRINFO, getaddrinfo) \ XX(GETNAMEINFO, getnameinfo) \ typedef enum { #define XX(code, _) UV_ ## code = UV__ ## code, UV_ERRNO_MAP(XX) #undef XX UV_ERRNO_MAX = UV__EOF - 1 } uv_errno_t; typedef enum { UV_UNKNOWN_HANDLE = 0, #define XX(uc, lc) UV_##uc, UV_HANDLE_TYPE_MAP(XX) #undef XX UV_FILE, UV_HANDLE_TYPE_MAX } uv_handle_type; typedef enum { UV_UNKNOWN_REQ = 0, #define XX(uc, lc) UV_##uc, UV_REQ_TYPE_MAP(XX) #undef XX UV_REQ_TYPE_PRIVATE UV_REQ_TYPE_MAX } uv_req_type; /* Handle types. */ typedef struct uv_loop_s uv_loop_t; typedef struct uv_handle_s uv_handle_t; typedef struct uv_stream_s uv_stream_t; typedef struct uv_tcp_s uv_tcp_t; typedef struct uv_udp_s uv_udp_t; typedef struct uv_pipe_s uv_pipe_t; typedef struct uv_tty_s uv_tty_t; typedef struct uv_poll_s uv_poll_t; typedef struct uv_timer_s uv_timer_t; typedef struct uv_prepare_s uv_prepare_t; typedef struct uv_check_s uv_check_t; typedef struct uv_idle_s uv_idle_t; typedef struct uv_async_s uv_async_t; typedef struct uv_process_s uv_process_t; typedef struct uv_fs_event_s uv_fs_event_t; typedef struct uv_fs_poll_s uv_fs_poll_t; typedef struct uv_signal_s uv_signal_t; /* Request types. */ typedef struct uv_req_s uv_req_t; typedef struct uv_getaddrinfo_s uv_getaddrinfo_t; typedef struct uv_getnameinfo_s uv_getnameinfo_t; typedef struct uv_shutdown_s uv_shutdown_t; typedef struct uv_write_s uv_write_t; typedef struct uv_connect_s uv_connect_t; typedef struct uv_udp_send_s uv_udp_send_t; typedef struct uv_fs_s uv_fs_t; typedef struct uv_work_s uv_work_t; /* None of the above. */ typedef struct uv_cpu_info_s uv_cpu_info_t; typedef struct uv_interface_address_s uv_interface_address_t; typedef struct uv_dirent_s uv_dirent_t; typedef enum { UV_LOOP_BLOCK_SIGNAL } uv_loop_option; typedef enum { UV_RUN_DEFAULT = 0, UV_RUN_ONCE, UV_RUN_NOWAIT } uv_run_mode; UV_EXTERN unsigned int uv_version(void); UV_EXTERN const char* uv_version_string(void); UV_EXTERN uv_loop_t* uv_default_loop(void); UV_EXTERN int uv_loop_init(uv_loop_t* loop); UV_EXTERN int uv_loop_close(uv_loop_t* loop); /* * NOTE: * This function is DEPRECATED (to be removed after 0.12), users should * allocate the loop manually and use uv_loop_init instead. */ UV_EXTERN uv_loop_t* uv_loop_new(void); /* * NOTE: * This function is DEPRECATED (to be removed after 0.12). Users should use * uv_loop_close and free the memory manually instead. 
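 *
 * A minimal sketch of the replacement pattern, with the caller owning the
 * loop memory (assumes <stdlib.h> for malloc/free; error handling omitted):
 *
 *     uv_loop_t* loop = malloc(sizeof(*loop));
 *     uv_loop_init(loop);
 *     // ... register handles and requests, then:
 *     uv_run(loop, UV_RUN_DEFAULT);
 *     uv_loop_close(loop);
 *     free(loop);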
*/ UV_EXTERN void uv_loop_delete(uv_loop_t*); UV_EXTERN size_t uv_loop_size(void); UV_EXTERN int uv_loop_alive(const uv_loop_t* loop); UV_EXTERN int uv_loop_configure(uv_loop_t* loop, uv_loop_option option, ...); UV_EXTERN int uv_run(uv_loop_t*, uv_run_mode mode); UV_EXTERN void uv_stop(uv_loop_t*); UV_EXTERN void uv_ref(uv_handle_t*); UV_EXTERN void uv_unref(uv_handle_t*); UV_EXTERN int uv_has_ref(const uv_handle_t*); UV_EXTERN void uv_update_time(uv_loop_t*); UV_EXTERN uint64_t uv_now(const uv_loop_t*); UV_EXTERN int uv_backend_fd(const uv_loop_t*); UV_EXTERN int uv_backend_timeout(const uv_loop_t*); typedef void (*uv_alloc_cb)(uv_handle_t* handle, size_t suggested_size, uv_buf_t* buf); typedef void (*uv_read_cb)(uv_stream_t* stream, ssize_t nread, const uv_buf_t* buf); typedef void (*uv_write_cb)(uv_write_t* req, int status); typedef void (*uv_connect_cb)(uv_connect_t* req, int status); typedef void (*uv_shutdown_cb)(uv_shutdown_t* req, int status); typedef void (*uv_connection_cb)(uv_stream_t* server, int status); typedef void (*uv_close_cb)(uv_handle_t* handle); typedef void (*uv_poll_cb)(uv_poll_t* handle, int status, int events); typedef void (*uv_timer_cb)(uv_timer_t* handle); typedef void (*uv_async_cb)(uv_async_t* handle); typedef void (*uv_prepare_cb)(uv_prepare_t* handle); typedef void (*uv_check_cb)(uv_check_t* handle); typedef void (*uv_idle_cb)(uv_idle_t* handle); typedef void (*uv_exit_cb)(uv_process_t*, int64_t exit_status, int term_signal); typedef void (*uv_walk_cb)(uv_handle_t* handle, void* arg); typedef void (*uv_fs_cb)(uv_fs_t* req); typedef void (*uv_work_cb)(uv_work_t* req); typedef void (*uv_after_work_cb)(uv_work_t* req, int status); typedef void (*uv_getaddrinfo_cb)(uv_getaddrinfo_t* req, int status, struct addrinfo* res); typedef void (*uv_getnameinfo_cb)(uv_getnameinfo_t* req, int status, const char* hostname, const char* service); typedef struct { long tv_sec; long tv_nsec; } uv_timespec_t; typedef struct { uint64_t st_dev; uint64_t st_mode; uint64_t st_nlink; uint64_t st_uid; uint64_t st_gid; uint64_t st_rdev; uint64_t st_ino; uint64_t st_size; uint64_t st_blksize; uint64_t st_blocks; uint64_t st_flags; uint64_t st_gen; uv_timespec_t st_atim; uv_timespec_t st_mtim; uv_timespec_t st_ctim; uv_timespec_t st_birthtim; } uv_stat_t; typedef void (*uv_fs_event_cb)(uv_fs_event_t* handle, const char* filename, int events, int status); typedef void (*uv_fs_poll_cb)(uv_fs_poll_t* handle, int status, const uv_stat_t* prev, const uv_stat_t* curr); typedef void (*uv_signal_cb)(uv_signal_t* handle, int signum); typedef enum { UV_LEAVE_GROUP = 0, UV_JOIN_GROUP } uv_membership; UV_EXTERN const char* uv_strerror(int err); UV_EXTERN const char* uv_err_name(int err); #define UV_REQ_FIELDS \ /* public */ \ void* data; \ /* read-only */ \ uv_req_type type; \ /* private */ \ void* active_queue[2]; \ void* reserved[4]; \ UV_REQ_PRIVATE_FIELDS \ /* Abstract base class of all requests. */ struct uv_req_s { UV_REQ_FIELDS }; /* Platform-specific request types. */ UV_PRIVATE_REQ_TYPES UV_EXTERN int uv_shutdown(uv_shutdown_t* req, uv_stream_t* handle, uv_shutdown_cb cb); struct uv_shutdown_s { UV_REQ_FIELDS uv_stream_t* handle; uv_shutdown_cb cb; UV_SHUTDOWN_PRIVATE_FIELDS }; #define UV_HANDLE_FIELDS \ /* public */ \ void* data; \ /* read-only */ \ uv_loop_t* loop; \ uv_handle_type type; \ /* private */ \ uv_close_cb close_cb; \ void* handle_queue[2]; \ void* reserved[4]; \ UV_HANDLE_PRIVATE_FIELDS \ /* The abstract base class of all handles. 
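 *
 * Every handle type begins with these fields, so any handle may be passed
 * to the generic handle functions below after a cast to uv_handle_t*.
 * An illustrative teardown sketch that closes whatever handles are still
 * registered with a loop before uv_loop_close() is called (the `close_each`
 * name is invented for this example):
 *
 *     static void close_each(uv_handle_t* handle, void* arg) {
 *       if (!uv_is_closing(handle))
 *         uv_close(handle, NULL);
 *     }
 *
 *     uv_walk(loop, close_each, NULL);
 *     uv_run(loop, UV_RUN_DEFAULT);   // let the close callbacks run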
*/ struct uv_handle_s { UV_HANDLE_FIELDS }; UV_EXTERN size_t uv_handle_size(uv_handle_type type); UV_EXTERN size_t uv_req_size(uv_req_type type); UV_EXTERN int uv_is_active(const uv_handle_t* handle); UV_EXTERN void uv_walk(uv_loop_t* loop, uv_walk_cb walk_cb, void* arg); UV_EXTERN void uv_close(uv_handle_t* handle, uv_close_cb close_cb); UV_EXTERN int uv_send_buffer_size(uv_handle_t* handle, int* value); UV_EXTERN int uv_recv_buffer_size(uv_handle_t* handle, int* value); UV_EXTERN int uv_fileno(const uv_handle_t* handle, uv_os_fd_t* fd); UV_EXTERN uv_buf_t uv_buf_init(char* base, unsigned int len); #define UV_STREAM_FIELDS \ /* number of bytes queued for writing */ \ size_t write_queue_size; \ uv_alloc_cb alloc_cb; \ uv_read_cb read_cb; \ /* private */ \ UV_STREAM_PRIVATE_FIELDS /* * uv_stream_t is a subclass of uv_handle_t. * * uv_stream is an abstract class. * * uv_stream_t is the parent class of uv_tcp_t, uv_pipe_t and uv_tty_t. */ struct uv_stream_s { UV_HANDLE_FIELDS UV_STREAM_FIELDS }; UV_EXTERN int uv_listen(uv_stream_t* stream, int backlog, uv_connection_cb cb); UV_EXTERN int uv_accept(uv_stream_t* server, uv_stream_t* client); UV_EXTERN int uv_read_start(uv_stream_t*, uv_alloc_cb alloc_cb, uv_read_cb read_cb); UV_EXTERN int uv_read_stop(uv_stream_t*); UV_EXTERN int uv_write(uv_write_t* req, uv_stream_t* handle, const uv_buf_t bufs[], unsigned int nbufs, uv_write_cb cb); UV_EXTERN int uv_write2(uv_write_t* req, uv_stream_t* handle, const uv_buf_t bufs[], unsigned int nbufs, uv_stream_t* send_handle, uv_write_cb cb); UV_EXTERN int uv_try_write(uv_stream_t* handle, const uv_buf_t bufs[], unsigned int nbufs); /* uv_write_t is a subclass of uv_req_t. */ struct uv_write_s { UV_REQ_FIELDS uv_write_cb cb; uv_stream_t* send_handle; uv_stream_t* handle; UV_WRITE_PRIVATE_FIELDS }; UV_EXTERN int uv_is_readable(const uv_stream_t* handle); UV_EXTERN int uv_is_writable(const uv_stream_t* handle); UV_EXTERN int uv_stream_set_blocking(uv_stream_t* handle, int blocking); UV_EXTERN int uv_is_closing(const uv_handle_t* handle); /* * uv_tcp_t is a subclass of uv_stream_t. * * Represents a TCP stream or TCP server. */ struct uv_tcp_s { UV_HANDLE_FIELDS UV_STREAM_FIELDS UV_TCP_PRIVATE_FIELDS }; UV_EXTERN int uv_tcp_init(uv_loop_t*, uv_tcp_t* handle); UV_EXTERN int uv_tcp_open(uv_tcp_t* handle, uv_os_sock_t sock); UV_EXTERN int uv_tcp_nodelay(uv_tcp_t* handle, int enable); UV_EXTERN int uv_tcp_keepalive(uv_tcp_t* handle, int enable, unsigned int delay); UV_EXTERN int uv_tcp_simultaneous_accepts(uv_tcp_t* handle, int enable); enum uv_tcp_flags { /* Used with uv_tcp_bind, when an IPv6 address is used. */ UV_TCP_IPV6ONLY = 1 }; UV_EXTERN int uv_tcp_bind(uv_tcp_t* handle, const struct sockaddr* addr, unsigned int flags); UV_EXTERN int uv_tcp_getsockname(const uv_tcp_t* handle, struct sockaddr* name, int* namelen); UV_EXTERN int uv_tcp_getpeername(const uv_tcp_t* handle, struct sockaddr* name, int* namelen); UV_EXTERN int uv_tcp_connect(uv_connect_t* req, uv_tcp_t* handle, const struct sockaddr* addr, uv_connect_cb cb); /* uv_connect_t is a subclass of uv_req_t. */ struct uv_connect_s { UV_REQ_FIELDS uv_connect_cb cb; uv_stream_t* handle; UV_CONNECT_PRIVATE_FIELDS }; /* * UDP support. */ enum uv_udp_flags { /* Disables dual stack mode. */ UV_UDP_IPV6ONLY = 1, /* * Indicates message was truncated because read buffer was too small. The * remainder was discarded by the OS. Used in uv_udp_recv_cb. */ UV_UDP_PARTIAL = 2, /* * Indicates if SO_REUSEADDR will be set when binding the handle. 
* This sets the SO_REUSEPORT socket flag on the BSDs and OS X. On other * Unix platforms, it sets the SO_REUSEADDR flag. What that means is that * multiple threads or processes can bind to the same address without error * (provided they all set the flag) but only the last one to bind will receive * any traffic, in effect "stealing" the port from the previous listener. */ UV_UDP_REUSEADDR = 4 }; typedef void (*uv_udp_send_cb)(uv_udp_send_t* req, int status); typedef void (*uv_udp_recv_cb)(uv_udp_t* handle, ssize_t nread, const uv_buf_t* buf, const struct sockaddr* addr, unsigned flags); /* uv_udp_t is a subclass of uv_handle_t. */ struct uv_udp_s { UV_HANDLE_FIELDS /* read-only */ /* * Number of bytes queued for sending. This field strictly shows how much * information is currently queued. */ size_t send_queue_size; /* * Number of send requests currently in the queue awaiting to be processed. */ size_t send_queue_count; UV_UDP_PRIVATE_FIELDS }; /* uv_udp_send_t is a subclass of uv_req_t. */ struct uv_udp_send_s { UV_REQ_FIELDS uv_udp_t* handle; uv_udp_send_cb cb; UV_UDP_SEND_PRIVATE_FIELDS }; UV_EXTERN int uv_udp_init(uv_loop_t*, uv_udp_t* handle); UV_EXTERN int uv_udp_open(uv_udp_t* handle, uv_os_sock_t sock); UV_EXTERN int uv_udp_bind(uv_udp_t* handle, const struct sockaddr* addr, unsigned int flags); UV_EXTERN int uv_udp_getsockname(const uv_udp_t* handle, struct sockaddr* name, int* namelen); UV_EXTERN int uv_udp_set_membership(uv_udp_t* handle, const char* multicast_addr, const char* interface_addr, uv_membership membership); UV_EXTERN int uv_udp_set_multicast_loop(uv_udp_t* handle, int on); UV_EXTERN int uv_udp_set_multicast_ttl(uv_udp_t* handle, int ttl); UV_EXTERN int uv_udp_set_multicast_interface(uv_udp_t* handle, const char* interface_addr); UV_EXTERN int uv_udp_set_broadcast(uv_udp_t* handle, int on); UV_EXTERN int uv_udp_set_ttl(uv_udp_t* handle, int ttl); UV_EXTERN int uv_udp_send(uv_udp_send_t* req, uv_udp_t* handle, const uv_buf_t bufs[], unsigned int nbufs, const struct sockaddr* addr, uv_udp_send_cb send_cb); UV_EXTERN int uv_udp_try_send(uv_udp_t* handle, const uv_buf_t bufs[], unsigned int nbufs, const struct sockaddr* addr); UV_EXTERN int uv_udp_recv_start(uv_udp_t* handle, uv_alloc_cb alloc_cb, uv_udp_recv_cb recv_cb); UV_EXTERN int uv_udp_recv_stop(uv_udp_t* handle); /* * uv_tty_t is a subclass of uv_stream_t. * * Representing a stream for the console. */ struct uv_tty_s { UV_HANDLE_FIELDS UV_STREAM_FIELDS UV_TTY_PRIVATE_FIELDS }; typedef enum { /* Initial/normal terminal mode */ UV_TTY_MODE_NORMAL, /* Raw input mode (On Windows, ENABLE_WINDOW_INPUT is also enabled) */ UV_TTY_MODE_RAW, /* Binary-safe I/O mode for IPC (Unix-only) */ UV_TTY_MODE_IO } uv_tty_mode_t; UV_EXTERN int uv_tty_init(uv_loop_t*, uv_tty_t*, uv_file fd, int readable); UV_EXTERN int uv_tty_set_mode(uv_tty_t*, uv_tty_mode_t mode); UV_EXTERN int uv_tty_reset_mode(void); UV_EXTERN int uv_tty_get_winsize(uv_tty_t*, int* width, int* height); #ifdef __cplusplus } /* extern "C" */ inline int uv_tty_set_mode(uv_tty_t* handle, int mode) { return uv_tty_set_mode(handle, static_cast(mode)); } extern "C" { #endif UV_EXTERN uv_handle_type uv_guess_handle(uv_file file); /* * uv_pipe_t is a subclass of uv_stream_t. * * Representing a pipe stream or pipe server. On Windows this is a Named * Pipe. On Unix this is a Unix domain socket. 
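 *
 * A minimal pipe-server sketch, assuming an initialized `loop`, a
 * hypothetical socket path and an `on_connection` callback defined
 * elsewhere (error handling omitted):
 *
 *     uv_pipe_t server;
 *     uv_pipe_init(loop, &server, 0);              // 0: not used for handle passing
 *     uv_pipe_bind(&server, "/tmp/example.sock");  // a \\.\pipe\example style
 *                                                  // name would be used on Windows
 *     uv_listen((uv_stream_t*) &server, 128, on_connection);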
*/ struct uv_pipe_s { UV_HANDLE_FIELDS UV_STREAM_FIELDS int ipc; /* non-zero if this pipe is used for passing handles */ UV_PIPE_PRIVATE_FIELDS }; UV_EXTERN int uv_pipe_init(uv_loop_t*, uv_pipe_t* handle, int ipc); UV_EXTERN int uv_pipe_open(uv_pipe_t*, uv_file file); UV_EXTERN int uv_pipe_bind(uv_pipe_t* handle, const char* name); UV_EXTERN void uv_pipe_connect(uv_connect_t* req, uv_pipe_t* handle, const char* name, uv_connect_cb cb); UV_EXTERN int uv_pipe_getsockname(const uv_pipe_t* handle, char* buf, size_t* len); UV_EXTERN void uv_pipe_pending_instances(uv_pipe_t* handle, int count); UV_EXTERN int uv_pipe_pending_count(uv_pipe_t* handle); UV_EXTERN uv_handle_type uv_pipe_pending_type(uv_pipe_t* handle); struct uv_poll_s { UV_HANDLE_FIELDS uv_poll_cb poll_cb; UV_POLL_PRIVATE_FIELDS }; enum uv_poll_event { UV_READABLE = 1, UV_WRITABLE = 2 }; UV_EXTERN int uv_poll_init(uv_loop_t* loop, uv_poll_t* handle, int fd); UV_EXTERN int uv_poll_init_socket(uv_loop_t* loop, uv_poll_t* handle, uv_os_sock_t socket); UV_EXTERN int uv_poll_start(uv_poll_t* handle, int events, uv_poll_cb cb); UV_EXTERN int uv_poll_stop(uv_poll_t* handle); struct uv_prepare_s { UV_HANDLE_FIELDS UV_PREPARE_PRIVATE_FIELDS }; UV_EXTERN int uv_prepare_init(uv_loop_t*, uv_prepare_t* prepare); UV_EXTERN int uv_prepare_start(uv_prepare_t* prepare, uv_prepare_cb cb); UV_EXTERN int uv_prepare_stop(uv_prepare_t* prepare); struct uv_check_s { UV_HANDLE_FIELDS UV_CHECK_PRIVATE_FIELDS }; UV_EXTERN int uv_check_init(uv_loop_t*, uv_check_t* check); UV_EXTERN int uv_check_start(uv_check_t* check, uv_check_cb cb); UV_EXTERN int uv_check_stop(uv_check_t* check); struct uv_idle_s { UV_HANDLE_FIELDS UV_IDLE_PRIVATE_FIELDS }; UV_EXTERN int uv_idle_init(uv_loop_t*, uv_idle_t* idle); UV_EXTERN int uv_idle_start(uv_idle_t* idle, uv_idle_cb cb); UV_EXTERN int uv_idle_stop(uv_idle_t* idle); struct uv_async_s { UV_HANDLE_FIELDS UV_ASYNC_PRIVATE_FIELDS }; UV_EXTERN int uv_async_init(uv_loop_t*, uv_async_t* async, uv_async_cb async_cb); UV_EXTERN int uv_async_send(uv_async_t* async); /* * uv_timer_t is a subclass of uv_handle_t. * * Used to get woken up at a specified time in the future. */ struct uv_timer_s { UV_HANDLE_FIELDS UV_TIMER_PRIVATE_FIELDS }; UV_EXTERN int uv_timer_init(uv_loop_t*, uv_timer_t* handle); UV_EXTERN int uv_timer_start(uv_timer_t* handle, uv_timer_cb cb, uint64_t timeout, uint64_t repeat); UV_EXTERN int uv_timer_stop(uv_timer_t* handle); UV_EXTERN int uv_timer_again(uv_timer_t* handle); UV_EXTERN void uv_timer_set_repeat(uv_timer_t* handle, uint64_t repeat); UV_EXTERN uint64_t uv_timer_get_repeat(const uv_timer_t* handle); /* * uv_getaddrinfo_t is a subclass of uv_req_t. * * Request object for uv_getaddrinfo. */ struct uv_getaddrinfo_s { UV_REQ_FIELDS /* read-only */ uv_loop_t* loop; UV_GETADDRINFO_PRIVATE_FIELDS }; UV_EXTERN int uv_getaddrinfo(uv_loop_t* loop, uv_getaddrinfo_t* req, uv_getaddrinfo_cb getaddrinfo_cb, const char* node, const char* service, const struct addrinfo* hints); UV_EXTERN void uv_freeaddrinfo(struct addrinfo* ai); /* * uv_getnameinfo_t is a subclass of uv_req_t. * * Request object for uv_getnameinfo. */ struct uv_getnameinfo_s { UV_REQ_FIELDS /* read-only */ uv_loop_t* loop; UV_GETNAMEINFO_PRIVATE_FIELDS }; UV_EXTERN int uv_getnameinfo(uv_loop_t* loop, uv_getnameinfo_t* req, uv_getnameinfo_cb getnameinfo_cb, const struct sockaddr* addr, int flags); /* uv_spawn() options. 
*/ typedef enum { UV_IGNORE = 0x00, UV_CREATE_PIPE = 0x01, UV_INHERIT_FD = 0x02, UV_INHERIT_STREAM = 0x04, /* * When UV_CREATE_PIPE is specified, UV_READABLE_PIPE and UV_WRITABLE_PIPE * determine the direction of flow, from the child process' perspective. Both * flags may be specified to create a duplex data stream. */ UV_READABLE_PIPE = 0x10, UV_WRITABLE_PIPE = 0x20 } uv_stdio_flags; typedef struct uv_stdio_container_s { uv_stdio_flags flags; union { uv_stream_t* stream; int fd; } data; } uv_stdio_container_t; typedef struct uv_process_options_s { uv_exit_cb exit_cb; /* Called after the process exits. */ const char* file; /* Path to program to execute. */ /* * Command line arguments. args[0] should be the path to the program. On * Windows this uses CreateProcess which concatenates the arguments into a * string this can cause some strange errors. See the note at * windows_verbatim_arguments. */ char** args; /* * This will be set as the environ variable in the subprocess. If this is * NULL then the parents environ will be used. */ char** env; /* * If non-null this represents a directory the subprocess should execute * in. Stands for current working directory. */ const char* cwd; /* * Various flags that control how uv_spawn() behaves. See the definition of * `enum uv_process_flags` below. */ unsigned int flags; /* * The `stdio` field points to an array of uv_stdio_container_t structs that * describe the file descriptors that will be made available to the child * process. The convention is that stdio[0] points to stdin, fd 1 is used for * stdout, and fd 2 is stderr. * * Note that on windows file descriptors greater than 2 are available to the * child process only if the child processes uses the MSVCRT runtime. */ int stdio_count; uv_stdio_container_t* stdio; /* * Libuv can change the child process' user/group id. This happens only when * the appropriate bits are set in the flags fields. This is not supported on * windows; uv_spawn() will fail and set the error to UV_ENOTSUP. */ uv_uid_t uid; uv_gid_t gid; } uv_process_options_t; /* * These are the flags that can be used for the uv_process_options.flags field. */ enum uv_process_flags { /* * Set the child process' user id. The user id is supplied in the `uid` field * of the options struct. This does not work on windows; setting this flag * will cause uv_spawn() to fail. */ UV_PROCESS_SETUID = (1 << 0), /* * Set the child process' group id. The user id is supplied in the `gid` * field of the options struct. This does not work on windows; setting this * flag will cause uv_spawn() to fail. */ UV_PROCESS_SETGID = (1 << 1), /* * Do not wrap any arguments in quotes, or perform any other escaping, when * converting the argument list into a command line string. This option is * only meaningful on Windows systems. On Unix it is silently ignored. */ UV_PROCESS_WINDOWS_VERBATIM_ARGUMENTS = (1 << 2), /* * Spawn the child process in a detached state - this will make it a process * group leader, and will effectively enable the child to keep running after * the parent exits. Note that the child process will still keep the * parent's event loop alive unless the parent process calls uv_unref() on * the child's process handle. */ UV_PROCESS_DETACHED = (1 << 3), /* * Hide the subprocess console window that would normally be created. This * option is only meaningful on Windows systems. On Unix it is silently * ignored. */ UV_PROCESS_WINDOWS_HIDE = (1 << 4) }; /* * uv_process_t is a subclass of uv_handle_t. 
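 *
 * A minimal uv_spawn() sketch tying the options and flags above together.
 * The command ("sleep", "10") and the `loop` variable are illustrative
 * only; error handling is omitted and <string.h>/<stdio.h> are assumed:
 *
 *     static void on_exit(uv_process_t* proc, int64_t exit_status,
 *                         int term_signal) {
 *       uv_close((uv_handle_t*) proc, NULL);
 *     }
 *
 *     uv_process_t child;
 *     uv_process_options_t options;
 *     char* args[] = { "sleep", "10", NULL };
 *     memset(&options, 0, sizeof(options));
 *     options.exit_cb = on_exit;
 *     options.file = args[0];
 *     options.args = args;
 *     if (uv_spawn(loop, &child, &options) == 0)
 *       printf("spawned pid %d\n", child.pid);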
*/ struct uv_process_s { UV_HANDLE_FIELDS uv_exit_cb exit_cb; int pid; UV_PROCESS_PRIVATE_FIELDS }; UV_EXTERN int uv_spawn(uv_loop_t* loop, uv_process_t* handle, const uv_process_options_t* options); UV_EXTERN int uv_process_kill(uv_process_t*, int signum); UV_EXTERN int uv_kill(int pid, int signum); /* * uv_work_t is a subclass of uv_req_t. */ struct uv_work_s { UV_REQ_FIELDS uv_loop_t* loop; uv_work_cb work_cb; uv_after_work_cb after_work_cb; UV_WORK_PRIVATE_FIELDS }; UV_EXTERN int uv_queue_work(uv_loop_t* loop, uv_work_t* req, uv_work_cb work_cb, uv_after_work_cb after_work_cb); UV_EXTERN int uv_cancel(uv_req_t* req); struct uv_cpu_info_s { char* model; int speed; struct uv_cpu_times_s { uint64_t user; uint64_t nice; uint64_t sys; uint64_t idle; uint64_t irq; } cpu_times; }; struct uv_interface_address_s { char* name; char phys_addr[6]; int is_internal; union { struct sockaddr_in address4; struct sockaddr_in6 address6; } address; union { struct sockaddr_in netmask4; struct sockaddr_in6 netmask6; } netmask; }; typedef enum { UV_DIRENT_UNKNOWN, UV_DIRENT_FILE, UV_DIRENT_DIR, UV_DIRENT_LINK, UV_DIRENT_FIFO, UV_DIRENT_SOCKET, UV_DIRENT_CHAR, UV_DIRENT_BLOCK } uv_dirent_type_t; struct uv_dirent_s { const char* name; uv_dirent_type_t type; }; UV_EXTERN char** uv_setup_args(int argc, char** argv); UV_EXTERN int uv_get_process_title(char* buffer, size_t size); UV_EXTERN int uv_set_process_title(const char* title); UV_EXTERN int uv_resident_set_memory(size_t* rss); UV_EXTERN int uv_uptime(double* uptime); typedef struct { long tv_sec; long tv_usec; } uv_timeval_t; typedef struct { uv_timeval_t ru_utime; /* user CPU time used */ uv_timeval_t ru_stime; /* system CPU time used */ uint64_t ru_maxrss; /* maximum resident set size */ uint64_t ru_ixrss; /* integral shared memory size */ uint64_t ru_idrss; /* integral unshared data size */ uint64_t ru_isrss; /* integral unshared stack size */ uint64_t ru_minflt; /* page reclaims (soft page faults) */ uint64_t ru_majflt; /* page faults (hard page faults) */ uint64_t ru_nswap; /* swaps */ uint64_t ru_inblock; /* block input operations */ uint64_t ru_oublock; /* block output operations */ uint64_t ru_msgsnd; /* IPC messages sent */ uint64_t ru_msgrcv; /* IPC messages received */ uint64_t ru_nsignals; /* signals received */ uint64_t ru_nvcsw; /* voluntary context switches */ uint64_t ru_nivcsw; /* involuntary context switches */ } uv_rusage_t; UV_EXTERN int uv_getrusage(uv_rusage_t* rusage); UV_EXTERN int uv_cpu_info(uv_cpu_info_t** cpu_infos, int* count); UV_EXTERN void uv_free_cpu_info(uv_cpu_info_t* cpu_infos, int count); UV_EXTERN int uv_interface_addresses(uv_interface_address_t** addresses, int* count); UV_EXTERN void uv_free_interface_addresses(uv_interface_address_t* addresses, int count); typedef enum { UV_FS_UNKNOWN = -1, UV_FS_CUSTOM, UV_FS_OPEN, UV_FS_CLOSE, UV_FS_READ, UV_FS_WRITE, UV_FS_SENDFILE, UV_FS_STAT, UV_FS_LSTAT, UV_FS_FSTAT, UV_FS_FTRUNCATE, UV_FS_UTIME, UV_FS_FUTIME, UV_FS_ACCESS, UV_FS_CHMOD, UV_FS_FCHMOD, UV_FS_FSYNC, UV_FS_FDATASYNC, UV_FS_UNLINK, UV_FS_RMDIR, UV_FS_MKDIR, UV_FS_MKDTEMP, UV_FS_RENAME, UV_FS_SCANDIR, UV_FS_LINK, UV_FS_SYMLINK, UV_FS_READLINK, UV_FS_CHOWN, UV_FS_FCHOWN } uv_fs_type; /* uv_fs_t is a subclass of uv_req_t. */ struct uv_fs_s { UV_REQ_FIELDS uv_fs_type fs_type; uv_loop_t* loop; uv_fs_cb cb; ssize_t result; void* ptr; const char* path; uv_stat_t statbuf; /* Stores the result of uv_fs_stat() and uv_fs_fstat(). 
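 *
 * When the uv_fs_* functions declared below are given a NULL callback they
 * run synchronously and leave their result on the request. A small sketch
 * reading a file's size that way (the path is hypothetical; <stdio.h>
 * assumed):
 *
 *     uv_fs_t req;
 *     if (uv_fs_stat(uv_default_loop(), &req, "/tmp/example.txt", NULL) == 0)
 *       printf("size: %llu\n", (unsigned long long) req.statbuf.st_size);
 *     uv_fs_req_cleanup(&req);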
*/ UV_FS_PRIVATE_FIELDS }; UV_EXTERN void uv_fs_req_cleanup(uv_fs_t* req); UV_EXTERN int uv_fs_close(uv_loop_t* loop, uv_fs_t* req, uv_file file, uv_fs_cb cb); UV_EXTERN int uv_fs_open(uv_loop_t* loop, uv_fs_t* req, const char* path, int flags, int mode, uv_fs_cb cb); UV_EXTERN int uv_fs_read(uv_loop_t* loop, uv_fs_t* req, uv_file file, const uv_buf_t bufs[], unsigned int nbufs, int64_t offset, uv_fs_cb cb); UV_EXTERN int uv_fs_unlink(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_fs_cb cb); UV_EXTERN int uv_fs_write(uv_loop_t* loop, uv_fs_t* req, uv_file file, const uv_buf_t bufs[], unsigned int nbufs, int64_t offset, uv_fs_cb cb); UV_EXTERN int uv_fs_mkdir(uv_loop_t* loop, uv_fs_t* req, const char* path, int mode, uv_fs_cb cb); UV_EXTERN int uv_fs_mkdtemp(uv_loop_t* loop, uv_fs_t* req, const char* tpl, uv_fs_cb cb); UV_EXTERN int uv_fs_rmdir(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_fs_cb cb); UV_EXTERN int uv_fs_scandir(uv_loop_t* loop, uv_fs_t* req, const char* path, int flags, uv_fs_cb cb); UV_EXTERN int uv_fs_scandir_next(uv_fs_t* req, uv_dirent_t* ent); UV_EXTERN int uv_fs_stat(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_fs_cb cb); UV_EXTERN int uv_fs_fstat(uv_loop_t* loop, uv_fs_t* req, uv_file file, uv_fs_cb cb); UV_EXTERN int uv_fs_rename(uv_loop_t* loop, uv_fs_t* req, const char* path, const char* new_path, uv_fs_cb cb); UV_EXTERN int uv_fs_fsync(uv_loop_t* loop, uv_fs_t* req, uv_file file, uv_fs_cb cb); UV_EXTERN int uv_fs_fdatasync(uv_loop_t* loop, uv_fs_t* req, uv_file file, uv_fs_cb cb); UV_EXTERN int uv_fs_ftruncate(uv_loop_t* loop, uv_fs_t* req, uv_file file, int64_t offset, uv_fs_cb cb); UV_EXTERN int uv_fs_sendfile(uv_loop_t* loop, uv_fs_t* req, uv_file out_fd, uv_file in_fd, int64_t in_offset, size_t length, uv_fs_cb cb); UV_EXTERN int uv_fs_access(uv_loop_t* loop, uv_fs_t* req, const char* path, int mode, uv_fs_cb cb); UV_EXTERN int uv_fs_chmod(uv_loop_t* loop, uv_fs_t* req, const char* path, int mode, uv_fs_cb cb); UV_EXTERN int uv_fs_utime(uv_loop_t* loop, uv_fs_t* req, const char* path, double atime, double mtime, uv_fs_cb cb); UV_EXTERN int uv_fs_futime(uv_loop_t* loop, uv_fs_t* req, uv_file file, double atime, double mtime, uv_fs_cb cb); UV_EXTERN int uv_fs_lstat(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_fs_cb cb); UV_EXTERN int uv_fs_link(uv_loop_t* loop, uv_fs_t* req, const char* path, const char* new_path, uv_fs_cb cb); /* * This flag can be used with uv_fs_symlink() on Windows to specify whether * path argument points to a directory. */ #define UV_FS_SYMLINK_DIR 0x0001 /* * This flag can be used with uv_fs_symlink() on Windows to specify whether * the symlink is to be created using junction points. */ #define UV_FS_SYMLINK_JUNCTION 0x0002 UV_EXTERN int uv_fs_symlink(uv_loop_t* loop, uv_fs_t* req, const char* path, const char* new_path, int flags, uv_fs_cb cb); UV_EXTERN int uv_fs_readlink(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_fs_cb cb); UV_EXTERN int uv_fs_fchmod(uv_loop_t* loop, uv_fs_t* req, uv_file file, int mode, uv_fs_cb cb); UV_EXTERN int uv_fs_chown(uv_loop_t* loop, uv_fs_t* req, const char* path, uv_uid_t uid, uv_gid_t gid, uv_fs_cb cb); UV_EXTERN int uv_fs_fchown(uv_loop_t* loop, uv_fs_t* req, uv_file file, uv_uid_t uid, uv_gid_t gid, uv_fs_cb cb); enum uv_fs_event { UV_RENAME = 1, UV_CHANGE = 2 }; struct uv_fs_event_s { UV_HANDLE_FIELDS /* private */ char* path; UV_FS_EVENT_PRIVATE_FIELDS }; /* * uv_fs_stat() based polling file watcher. */ struct uv_fs_poll_s { UV_HANDLE_FIELDS /* Private, don't touch. 
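 *
 * An illustrative sketch of this stat-based watcher, using the
 * uv_fs_poll_* functions declared just below; `on_change`, the path and
 * the 2000 ms interval are hypothetical, and `loop` is assumed to be an
 * initialized uv_loop_t*:
 *
 *     static void on_change(uv_fs_poll_t* handle, int status,
 *                           const uv_stat_t* prev, const uv_stat_t* curr) {
 *       if (status == 0 && prev->st_mtim.tv_sec != curr->st_mtim.tv_sec)
 *         printf("file modified\n");
 *     }
 *
 *     uv_fs_poll_t poller;
 *     uv_fs_poll_init(loop, &poller);
 *     uv_fs_poll_start(&poller, on_change, "config.json", 2000);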
*/ void* poll_ctx; }; UV_EXTERN int uv_fs_poll_init(uv_loop_t* loop, uv_fs_poll_t* handle); UV_EXTERN int uv_fs_poll_start(uv_fs_poll_t* handle, uv_fs_poll_cb poll_cb, const char* path, unsigned int interval); UV_EXTERN int uv_fs_poll_stop(uv_fs_poll_t* handle); UV_EXTERN int uv_fs_poll_getpath(uv_fs_poll_t* handle, char* buf, size_t* len); struct uv_signal_s { UV_HANDLE_FIELDS uv_signal_cb signal_cb; int signum; UV_SIGNAL_PRIVATE_FIELDS }; UV_EXTERN int uv_signal_init(uv_loop_t* loop, uv_signal_t* handle); UV_EXTERN int uv_signal_start(uv_signal_t* handle, uv_signal_cb signal_cb, int signum); UV_EXTERN int uv_signal_stop(uv_signal_t* handle); UV_EXTERN void uv_loadavg(double avg[3]); /* * Flags to be passed to uv_fs_event_start(). */ enum uv_fs_event_flags { /* * By default, if the fs event watcher is given a directory name, we will * watch for all events in that directory. This flags overrides this behavior * and makes fs_event report only changes to the directory entry itself. This * flag does not affect individual files watched. * This flag is currently not implemented yet on any backend. */ UV_FS_EVENT_WATCH_ENTRY = 1, /* * By default uv_fs_event will try to use a kernel interface such as inotify * or kqueue to detect events. This may not work on remote filesystems such * as NFS mounts. This flag makes fs_event fall back to calling stat() on a * regular interval. * This flag is currently not implemented yet on any backend. */ UV_FS_EVENT_STAT = 2, /* * By default, event watcher, when watching directory, is not registering * (is ignoring) changes in it's subdirectories. * This flag will override this behaviour on platforms that support it. */ UV_FS_EVENT_RECURSIVE = 4 }; UV_EXTERN int uv_fs_event_init(uv_loop_t* loop, uv_fs_event_t* handle); UV_EXTERN int uv_fs_event_start(uv_fs_event_t* handle, uv_fs_event_cb cb, const char* path, unsigned int flags); UV_EXTERN int uv_fs_event_stop(uv_fs_event_t* handle); UV_EXTERN int uv_fs_event_getpath(uv_fs_event_t* handle, char* buf, size_t* len); UV_EXTERN int uv_ip4_addr(const char* ip, int port, struct sockaddr_in* addr); UV_EXTERN int uv_ip6_addr(const char* ip, int port, struct sockaddr_in6* addr); UV_EXTERN int uv_ip4_name(const struct sockaddr_in* src, char* dst, size_t size); UV_EXTERN int uv_ip6_name(const struct sockaddr_in6* src, char* dst, size_t size); UV_EXTERN int uv_inet_ntop(int af, const void* src, char* dst, size_t size); UV_EXTERN int uv_inet_pton(int af, const char* src, void* dst); UV_EXTERN int uv_exepath(char* buffer, size_t* size); UV_EXTERN int uv_cwd(char* buffer, size_t* size); UV_EXTERN int uv_chdir(const char* dir); UV_EXTERN uint64_t uv_get_free_memory(void); UV_EXTERN uint64_t uv_get_total_memory(void); UV_EXTERN extern uint64_t uv_hrtime(void); UV_EXTERN void uv_disable_stdio_inheritance(void); UV_EXTERN int uv_dlopen(const char* filename, uv_lib_t* lib); UV_EXTERN void uv_dlclose(uv_lib_t* lib); UV_EXTERN int uv_dlsym(uv_lib_t* lib, const char* name, void** ptr); UV_EXTERN const char* uv_dlerror(const uv_lib_t* lib); UV_EXTERN int uv_mutex_init(uv_mutex_t* handle); UV_EXTERN void uv_mutex_destroy(uv_mutex_t* handle); UV_EXTERN void uv_mutex_lock(uv_mutex_t* handle); UV_EXTERN int uv_mutex_trylock(uv_mutex_t* handle); UV_EXTERN void uv_mutex_unlock(uv_mutex_t* handle); UV_EXTERN int uv_rwlock_init(uv_rwlock_t* rwlock); UV_EXTERN void uv_rwlock_destroy(uv_rwlock_t* rwlock); UV_EXTERN void uv_rwlock_rdlock(uv_rwlock_t* rwlock); UV_EXTERN int uv_rwlock_tryrdlock(uv_rwlock_t* rwlock); UV_EXTERN void 
uv_rwlock_rdunlock(uv_rwlock_t* rwlock); UV_EXTERN void uv_rwlock_wrlock(uv_rwlock_t* rwlock); UV_EXTERN int uv_rwlock_trywrlock(uv_rwlock_t* rwlock); UV_EXTERN void uv_rwlock_wrunlock(uv_rwlock_t* rwlock); UV_EXTERN int uv_sem_init(uv_sem_t* sem, unsigned int value); UV_EXTERN void uv_sem_destroy(uv_sem_t* sem); UV_EXTERN void uv_sem_post(uv_sem_t* sem); UV_EXTERN void uv_sem_wait(uv_sem_t* sem); UV_EXTERN int uv_sem_trywait(uv_sem_t* sem); UV_EXTERN int uv_cond_init(uv_cond_t* cond); UV_EXTERN void uv_cond_destroy(uv_cond_t* cond); UV_EXTERN void uv_cond_signal(uv_cond_t* cond); UV_EXTERN void uv_cond_broadcast(uv_cond_t* cond); UV_EXTERN int uv_barrier_init(uv_barrier_t* barrier, unsigned int count); UV_EXTERN void uv_barrier_destroy(uv_barrier_t* barrier); UV_EXTERN int uv_barrier_wait(uv_barrier_t* barrier); UV_EXTERN void uv_cond_wait(uv_cond_t* cond, uv_mutex_t* mutex); UV_EXTERN int uv_cond_timedwait(uv_cond_t* cond, uv_mutex_t* mutex, uint64_t timeout); UV_EXTERN void uv_once(uv_once_t* guard, void (*callback)(void)); UV_EXTERN int uv_key_create(uv_key_t* key); UV_EXTERN void uv_key_delete(uv_key_t* key); UV_EXTERN void* uv_key_get(uv_key_t* key); UV_EXTERN void uv_key_set(uv_key_t* key, void* value); typedef void (*uv_thread_cb)(void* arg); UV_EXTERN int uv_thread_create(uv_thread_t* tid, uv_thread_cb entry, void* arg); UV_EXTERN uv_thread_t uv_thread_self(void); UV_EXTERN int uv_thread_join(uv_thread_t *tid); UV_EXTERN int uv_thread_equal(const uv_thread_t* t1, const uv_thread_t* t2); /* The presence of these unions force similar struct layout. */ #define XX(_, name) uv_ ## name ## _t name; union uv_any_handle { UV_HANDLE_TYPE_MAP(XX) }; union uv_any_req { UV_REQ_TYPE_MAP(XX) }; #undef XX struct uv_loop_s { /* User data - use this for whatever. */ void* data; /* Loop reference counting. */ unsigned int active_handles; void* handle_queue[2]; void* active_reqs[2]; /* Internal flag to signal loop stop. */ unsigned int stop_flag; UV_LOOP_PRIVATE_FIELDS }; /* Don't export the private CPP symbols. */ #undef UV_HANDLE_TYPE_PRIVATE #undef UV_REQ_TYPE_PRIVATE #undef UV_REQ_PRIVATE_FIELDS #undef UV_STREAM_PRIVATE_FIELDS #undef UV_TCP_PRIVATE_FIELDS #undef UV_PREPARE_PRIVATE_FIELDS #undef UV_CHECK_PRIVATE_FIELDS #undef UV_IDLE_PRIVATE_FIELDS #undef UV_ASYNC_PRIVATE_FIELDS #undef UV_TIMER_PRIVATE_FIELDS #undef UV_GETADDRINFO_PRIVATE_FIELDS #undef UV_GETNAMEINFO_PRIVATE_FIELDS #undef UV_FS_REQ_PRIVATE_FIELDS #undef UV_WORK_PRIVATE_FIELDS #undef UV_FS_EVENT_PRIVATE_FIELDS #undef UV_SIGNAL_PRIVATE_FIELDS #undef UV_LOOP_PRIVATE_FIELDS #undef UV_LOOP_PRIVATE_PLATFORM_FIELDS #ifdef __cplusplus } #endif #endif /* UV_H */ iojs-v1.0.2-darwin-x64/include/node/v8-debug.h000644 000766 000024 00000021333 12455173732 021032 0ustar00iojsstaff000000 000000 // Copyright 2008 the V8 project authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef V8_V8_DEBUG_H_ #define V8_V8_DEBUG_H_ #include "v8.h" /** * Debugger support for the V8 JavaScript engine. */ namespace v8 { // Debug events which can occur in the V8 JavaScript engine. enum DebugEvent { Break = 1, Exception = 2, NewFunction = 3, BeforeCompile = 4, AfterCompile = 5, CompileError = 6, PromiseEvent = 7, AsyncTaskEvent = 8, BreakForCommand = 9 }; class V8_EXPORT Debug { public: /** * A client object passed to the v8 debugger whose ownership will be taken by * it. v8 is always responsible for deleting the object. 
*/ class ClientData { public: virtual ~ClientData() {} }; /** * A message object passed to the debug message handler. */ class Message { public: /** * Check type of message. */ virtual bool IsEvent() const = 0; virtual bool IsResponse() const = 0; virtual DebugEvent GetEvent() const = 0; /** * Indicate whether this is a response to a continue command which will * start the VM running after this is processed. */ virtual bool WillStartRunning() const = 0; /** * Access to execution state and event data. Don't store these cross * callbacks as their content becomes invalid. These objects are from the * debugger event that started the debug message loop. */ virtual Handle GetExecutionState() const = 0; virtual Handle GetEventData() const = 0; /** * Get the debugger protocol JSON. */ virtual Handle GetJSON() const = 0; /** * Get the context active when the debug event happened. Note this is not * the current active context as the JavaScript part of the debugger is * running in its own context which is entered at this point. */ virtual Handle GetEventContext() const = 0; /** * Client data passed with the corresponding request if any. This is the * client_data data value passed into Debug::SendCommand along with the * request that led to the message or NULL if the message is an event. The * debugger takes ownership of the data and will delete it even if there is * no message handler. */ virtual ClientData* GetClientData() const = 0; virtual Isolate* GetIsolate() const = 0; virtual ~Message() {} }; /** * An event details object passed to the debug event listener. */ class EventDetails { public: /** * Event type. */ virtual DebugEvent GetEvent() const = 0; /** * Access to execution state and event data of the debug event. Don't store * these cross callbacks as their content becomes invalid. */ virtual Handle GetExecutionState() const = 0; virtual Handle GetEventData() const = 0; /** * Get the context active when the debug event happened. Note this is not * the current active context as the JavaScript part of the debugger is * running in its own context which is entered at this point. */ virtual Handle GetEventContext() const = 0; /** * Client data passed with the corresponding callback when it was * registered. */ virtual Handle GetCallbackData() const = 0; /** * Client data passed to DebugBreakForCommand function. The * debugger takes ownership of the data and will delete it even if * there is no message handler. */ virtual ClientData* GetClientData() const = 0; virtual ~EventDetails() {} }; /** * Debug event callback function. * * \param event_details object providing information about the debug event * * A EventCallback2 does not take possession of the event data, * and must not rely on the data persisting after the handler returns. */ typedef void (*EventCallback)(const EventDetails& event_details); /** * Debug message callback function. * * \param message the debug message handler message object * * A MessageHandler2 does not take possession of the message data, * and must not rely on the data persisting after the handler returns. */ typedef void (*MessageHandler)(const Message& message); /** * Callback function for the host to ensure debug messages are processed. */ typedef void (*DebugMessageDispatchHandler)(); static bool SetDebugEventListener(EventCallback that, Handle data = Handle()); // Schedule a debugger break to happen when JavaScript code is run // in the given isolate. 
static void DebugBreak(Isolate* isolate); // Remove scheduled debugger break in given isolate if it has not // happened yet. static void CancelDebugBreak(Isolate* isolate); // Check if a debugger break is scheduled in the given isolate. static bool CheckDebugBreak(Isolate* isolate); // Break execution of JavaScript in the given isolate (this method // can be invoked from a non-VM thread) for further client command // execution on a VM thread. Client data is then passed in // EventDetails to EventCallback2 at the moment when the VM actually // stops. static void DebugBreakForCommand(Isolate* isolate, ClientData* data); // Message based interface. The message protocol is JSON. static void SetMessageHandler(MessageHandler handler); static void SendCommand(Isolate* isolate, const uint16_t* command, int length, ClientData* client_data = NULL); /** * Run a JavaScript function in the debugger. * \param fun the function to call * \param data passed as second argument to the function * With this call the debugger is entered and the function specified is called * with the execution state as the first argument. This makes it possible to * get access to information otherwise not available during normal JavaScript * execution e.g. details on stack frames. Receiver of the function call will * be the debugger context global object, however this is a subject to change. * The following example shows a JavaScript function which when passed to * v8::Debug::Call will return the current line of JavaScript execution. * * \code * function frame_source_line(exec_state) { * return exec_state.frame(0).sourceLine(); * } * \endcode */ static Local Call(v8::Handle fun, Handle data = Handle()); /** * Returns a mirror object for the given object. */ static Local GetMirror(v8::Handle obj); /** * Makes V8 process all pending debug messages. * * From V8 point of view all debug messages come asynchronously (e.g. from * remote debugger) but they all must be handled synchronously: V8 cannot * do 2 things at one time so normal script execution must be interrupted * for a while. * * Generally when message arrives V8 may be in one of 3 states: * 1. V8 is running script; V8 will automatically interrupt and process all * pending messages; * 2. V8 is suspended on debug breakpoint; in this state V8 is dedicated * to reading and processing debug messages; * 3. V8 is not running at all or has called some long-working C++ function; * by default it means that processing of all debug messages will be deferred * until V8 gets control again; however, embedding application may improve * this by manually calling this method. * * Technically this method in many senses is equivalent to executing empty * script: * 1. It does nothing except for processing all pending debug messages. * 2. It should be invoked with the same precautions and from the same context * as V8 script would be invoked from, because: * a. with "evaluate" command it can do whatever normal script can do, * including all native calls; * b. no other thread should call V8 while this method is running * (v8::Locker may be used here). * * "Evaluate" debug command behavior currently is not specified in scope * of this method. */ static void ProcessDebugMessages(); /** * Debugger is running in its own context which is entered while debugger * messages are being dispatched. This is an explicit getter for this * debugger context. Note that the content of the debugger context is subject * to change. 
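// Illustrative sketch (not part of the bundled headers): registering a debug
// event listener and scheduling a break using the Debug methods declared
// above. Assumes V8 is initialized and an isolate/context is entered;
// OnDebugEvent and InstallDebugger are example-only names.
void OnDebugEvent(const v8::Debug::EventDetails& details) {
  if (details.GetEvent() == v8::Break) {
    // Execution state and event data are only valid inside this callback.
    v8::Handle<v8::Object> exec_state = details.GetExecutionState();
    (void) exec_state;
  }
}

void InstallDebugger(v8::Isolate* isolate) {
  v8::Debug::SetDebugEventListener(OnDebugEvent);
  v8::Debug::DebugBreak(isolate);     // break as soon as script runs again
  v8::Debug::ProcessDebugMessages();  // pump any pending protocol messages
}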
*/ static Local GetDebugContext(); /** * Enable/disable LiveEdit functionality for the given Isolate * (default Isolate if not provided). V8 will abort if LiveEdit is * unexpectedly used. LiveEdit is enabled by default. */ static void SetLiveEditEnabled(Isolate* isolate, bool enable); }; } // namespace v8 #undef EXPORT #endif // V8_V8_DEBUG_H_ iojs-v1.0.2-darwin-x64/include/node/v8-platform.h000644 000766 000024 00000004025 12455173732 021567 0ustar00iojsstaff000000 000000 // Copyright 2013 the V8 project authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef V8_V8_PLATFORM_H_ #define V8_V8_PLATFORM_H_ namespace v8 { class Isolate; /** * A Task represents a unit of work. */ class Task { public: virtual ~Task() {} virtual void Run() = 0; }; /** * V8 Platform abstraction layer. * * The embedder has to provide an implementation of this interface before * initializing the rest of V8. */ class Platform { public: /** * This enum is used to indicate whether a task is potentially long running, * or causes a long wait. The embedder might want to use this hint to decide * whether to execute the task on a dedicated thread. */ enum ExpectedRuntime { kShortRunningTask, kLongRunningTask }; virtual ~Platform() {} /** * Schedules a task to be invoked on a background thread. |expected_runtime| * indicates that the task will run a long time. The Platform implementation * takes ownership of |task|. There is no guarantee about order of execution * of tasks wrt order of scheduling, nor is there a guarantee about the * thread the task will be run on. */ virtual void CallOnBackgroundThread(Task* task, ExpectedRuntime expected_runtime) = 0; /** * Schedules a task to be invoked on a foreground thread wrt a specific * |isolate|. Tasks posted for the same isolate should be execute in order of * scheduling. The definition of "foreground" is opaque to V8. */ virtual void CallOnForegroundThread(Isolate* isolate, Task* task) = 0; /** * Monotonically increasing time in seconds from an arbitrary fixed point in * the past. This function is expected to return at least * millisecond-precision values. For this reason, * it is recommended that the fixed point be no further in the past than * the epoch. **/ virtual double MonotonicallyIncreasingTime() = 0; }; } // namespace v8 #endif // V8_V8_PLATFORM_H_ iojs-v1.0.2-darwin-x64/include/node/v8-profiler.h000644 000766 000024 00000047260 12455173732 021575 0ustar00iojsstaff000000 000000 // Copyright 2010 the V8 project authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef V8_V8_PROFILER_H_ #define V8_V8_PROFILER_H_ #include "v8.h" /** * Profiler support for the V8 JavaScript engine. */ namespace v8 { class HeapGraphNode; struct HeapStatsUpdate; typedef uint32_t SnapshotObjectId; /** * CpuProfileNode represents a node in a call graph. */ class V8_EXPORT CpuProfileNode { public: struct LineTick { /** The 1-based number of the source line where the function originates. */ int line; /** The count of samples associated with the source line. */ unsigned int hit_count; }; /** Returns function name (empty string for anonymous functions.) */ Handle GetFunctionName() const; /** Returns id of the script where function is located. */ int GetScriptId() const; /** Returns resource name for script from where the function originates. 
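// Illustrative sketch (not part of the bundled headers): a deliberately naive
// v8::Platform implementation that runs every task synchronously on the
// calling thread. A real embedder would queue background tasks on a thread
// pool; this only shows which methods of the interface above must be
// provided. Assumes a C++11 compiler for <chrono> and override.
#include <chrono>

class InlinePlatform : public v8::Platform {
 public:
  void CallOnBackgroundThread(v8::Task* task,
                              ExpectedRuntime expected_runtime) override {
    (void) expected_runtime;
    task->Run();   // no real background thread in this sketch
    delete task;   // the platform takes ownership of the task
  }
  void CallOnForegroundThread(v8::Isolate* isolate, v8::Task* task) override {
    (void) isolate;
    task->Run();   // should normally be queued per isolate, in posting order
    delete task;
  }
  double MonotonicallyIncreasingTime() override {
    return std::chrono::duration<double>(
        std::chrono::steady_clock::now().time_since_epoch()).count();
  }
};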
*/ Handle GetScriptResourceName() const; /** * Returns the number, 1-based, of the line where the function originates. * kNoLineNumberInfo if no line number information is available. */ int GetLineNumber() const; /** * Returns 1-based number of the column where the function originates. * kNoColumnNumberInfo if no column number information is available. */ int GetColumnNumber() const; /** * Returns the number of the function's source lines that collect the samples. */ unsigned int GetHitLineCount() const; /** Returns the set of source lines that collect the samples. * The caller allocates buffer and responsible for releasing it. * True if all available entries are copied, otherwise false. * The function copies nothing if buffer is not large enough. */ bool GetLineTicks(LineTick* entries, unsigned int length) const; /** Returns bailout reason for the function * if the optimization was disabled for it. */ const char* GetBailoutReason() const; /** * Returns the count of samples where the function was currently executing. */ unsigned GetHitCount() const; /** Returns function entry UID. */ unsigned GetCallUid() const; /** Returns id of the node. The id is unique within the tree */ unsigned GetNodeId() const; /** Returns child nodes count of the node. */ int GetChildrenCount() const; /** Retrieves a child node by index. */ const CpuProfileNode* GetChild(int index) const; static const int kNoLineNumberInfo = Message::kNoLineNumberInfo; static const int kNoColumnNumberInfo = Message::kNoColumnInfo; }; /** * CpuProfile contains a CPU profile in a form of top-down call tree * (from main() down to functions that do all the work). */ class V8_EXPORT CpuProfile { public: /** Returns CPU profile title. */ Handle GetTitle() const; /** Returns the root node of the top down call tree. */ const CpuProfileNode* GetTopDownRoot() const; /** * Returns number of samples recorded. The samples are not recorded unless * |record_samples| parameter of CpuProfiler::StartCpuProfiling is true. */ int GetSamplesCount() const; /** * Returns profile node corresponding to the top frame the sample at * the given index. */ const CpuProfileNode* GetSample(int index) const; /** * Returns the timestamp of the sample. The timestamp is the number of * microseconds since some unspecified starting point. * The point is equal to the starting point used by GetStartTime. */ int64_t GetSampleTimestamp(int index) const; /** * Returns time when the profile recording was started (in microseconds) * since some unspecified starting point. */ int64_t GetStartTime() const; /** * Returns time when the profile recording was stopped (in microseconds) * since some unspecified starting point. * The point is equal to the starting point used by GetStartTime. */ int64_t GetEndTime() const; /** * Deletes the profile and removes it from CpuProfiler's list. * All pointers to nodes previously returned become invalid. */ void Delete(); }; /** * Interface for controlling CPU profiling. Instance of the * profiler can be retrieved using v8::Isolate::GetCpuProfiler. */ class V8_EXPORT CpuProfiler { public: /** * Changes default CPU profiler sampling interval to the specified number * of microseconds. Default interval is 1000us. This method must be called * when there are no profiles being recorded. */ void SetSamplingInterval(int us); /** * Starts collecting CPU profile. Title may be an empty string. It * is allowed to have several profiles being collected at * once. Attempts to start collecting several profiles with the same * title are silently ignored. 
While collecting a profile, functions * from all security contexts are included in it. The token-based * filtering is only performed when querying for a profile. * * |record_samples| parameter controls whether individual samples should * be recorded in addition to the aggregated tree. */ void StartProfiling(Handle title, bool record_samples = false); /** Deprecated. Use StartProfiling instead. */ V8_DEPRECATED("Use StartProfiling", void StartCpuProfiling(Handle title, bool record_samples = false)); /** * Stops collecting CPU profile with a given title and returns it. * If the title given is empty, finishes the last profile started. */ CpuProfile* StopProfiling(Handle title); /** Deprecated. Use StopProfiling instead. */ V8_DEPRECATED("Use StopProfiling", const CpuProfile* StopCpuProfiling(Handle title)); /** * Tells the profiler whether the embedder is idle. */ void SetIdle(bool is_idle); private: CpuProfiler(); ~CpuProfiler(); CpuProfiler(const CpuProfiler&); CpuProfiler& operator=(const CpuProfiler&); }; /** * HeapSnapshotEdge represents a directed connection between heap * graph nodes: from retainers to retained nodes. */ class V8_EXPORT HeapGraphEdge { public: enum Type { kContextVariable = 0, // A variable from a function context. kElement = 1, // An element of an array. kProperty = 2, // A named object property. kInternal = 3, // A link that can't be accessed from JS, // thus, its name isn't a real property name // (e.g. parts of a ConsString). kHidden = 4, // A link that is needed for proper sizes // calculation, but may be hidden from user. kShortcut = 5, // A link that must not be followed during // sizes calculation. kWeak = 6 // A weak reference (ignored by the GC). }; /** Returns edge type (see HeapGraphEdge::Type). */ Type GetType() const; /** * Returns edge name. This can be a variable name, an element index, or * a property name. */ Handle GetName() const; /** Returns origin node. */ const HeapGraphNode* GetFromNode() const; /** Returns destination node. */ const HeapGraphNode* GetToNode() const; }; /** * HeapGraphNode represents a node in a heap graph. */ class V8_EXPORT HeapGraphNode { public: enum Type { kHidden = 0, // Hidden node, may be filtered when shown to user. kArray = 1, // An array of elements. kString = 2, // A string. kObject = 3, // A JS object (except for arrays and strings). kCode = 4, // Compiled code. kClosure = 5, // Function closure. kRegExp = 6, // RegExp. kHeapNumber = 7, // Number stored in the heap. kNative = 8, // Native object (not from V8 heap). kSynthetic = 9, // Synthetic object, usualy used for grouping // snapshot items together. kConsString = 10, // Concatenated string. A pair of pointers to strings. kSlicedString = 11, // Sliced string. A fragment of another string. kSymbol = 12 // A Symbol (ES6). }; /** Returns node type (see HeapGraphNode::Type). */ Type GetType() const; /** * Returns node name. Depending on node's type this can be the name * of the constructor (for objects), the name of the function (for * closures), string value, or an empty string (for compiled code). */ Handle GetName() const; /** * Returns node id. For the same heap object, the id remains the same * across all snapshots. */ SnapshotObjectId GetId() const; /** Returns node's own size, in bytes. */ V8_DEPRECATED("Use GetShallowSize instead", int GetSelfSize() const); /** Returns node's own size, in bytes. */ size_t GetShallowSize() const; /** Returns child nodes count of the node. */ int GetChildrenCount() const; /** Retrieves a child by index. 
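// Illustrative sketch (not part of the bundled headers): collecting a CPU
// profile with the CpuProfiler declared above and walking the top-down call
// tree. Assumes an entered isolate with an active HandleScope; ProfileScript
// and DumpNode are example-only names.
#include <cstdio>

void DumpNode(const v8::CpuProfileNode* node, int depth) {
  v8::String::Utf8Value name(node->GetFunctionName());
  const char* fn = *name != NULL ? *name : "";  // empty for anonymous functions
  printf("%*s%s (%u hits)\n", depth * 2, "", fn, node->GetHitCount());
  for (int i = 0; i < node->GetChildrenCount(); i++)
    DumpNode(node->GetChild(i), depth + 1);
}

void ProfileScript(v8::Isolate* isolate) {
  v8::CpuProfiler* profiler = isolate->GetCpuProfiler();
  profiler->SetSamplingInterval(500);  // microseconds between samples
  v8::Handle<v8::String> title = v8::String::NewFromUtf8(isolate, "demo");
  profiler->StartProfiling(title, true);  // also record individual samples
  // ... run the JavaScript being measured ...
  v8::CpuProfile* profile = profiler->StopProfiling(title);
  DumpNode(profile->GetTopDownRoot(), 0);
  profile->Delete();  // invalidates all node pointers from this profile
}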
*/ const HeapGraphEdge* GetChild(int index) const; }; /** * An interface for exporting data from V8, using "push" model. */ class V8_EXPORT OutputStream { // NOLINT public: enum WriteResult { kContinue = 0, kAbort = 1 }; virtual ~OutputStream() {} /** Notify about the end of stream. */ virtual void EndOfStream() = 0; /** Get preferred output chunk size. Called only once. */ virtual int GetChunkSize() { return 1024; } /** * Writes the next chunk of snapshot data into the stream. Writing * can be stopped by returning kAbort as function result. EndOfStream * will not be called in case writing was aborted. */ virtual WriteResult WriteAsciiChunk(char* data, int size) = 0; /** * Writes the next chunk of heap stats data into the stream. Writing * can be stopped by returning kAbort as function result. EndOfStream * will not be called in case writing was aborted. */ virtual WriteResult WriteHeapStatsChunk(HeapStatsUpdate* data, int count) { return kAbort; } }; /** * HeapSnapshots record the state of the JS heap at some moment. */ class V8_EXPORT HeapSnapshot { public: enum SerializationFormat { kJSON = 0 // See format description near 'Serialize' method. }; /** Returns heap snapshot UID (assigned by the profiler.) */ unsigned GetUid() const; /** Returns heap snapshot title. */ Handle GetTitle() const; /** Returns the root node of the heap graph. */ const HeapGraphNode* GetRoot() const; /** Returns a node by its id. */ const HeapGraphNode* GetNodeById(SnapshotObjectId id) const; /** Returns total nodes count in the snapshot. */ int GetNodesCount() const; /** Returns a node by index. */ const HeapGraphNode* GetNode(int index) const; /** Returns a max seen JS object Id. */ SnapshotObjectId GetMaxSnapshotJSObjectId() const; /** * Deletes the snapshot and removes it from HeapProfiler's list. * All pointers to nodes, edges and paths previously returned become * invalid. */ void Delete(); /** * Prepare a serialized representation of the snapshot. The result * is written into the stream provided in chunks of specified size. * The total length of the serialized snapshot is unknown in * advance, it can be roughly equal to JS heap size (that means, * it can be really big - tens of megabytes). * * For the JSON format, heap contents are represented as an object * with the following structure: * * { * snapshot: { * title: "...", * uid: nnn, * meta: { meta-info }, * node_count: nnn, * edge_count: nnn * }, * nodes: [nodes array], * edges: [edges array], * strings: [strings array] * } * * Nodes reference strings, other nodes, and edges by their indexes * in corresponding arrays. */ void Serialize(OutputStream* stream, SerializationFormat format) const; }; /** * An interface for reporting progress and controlling long-running * activities. */ class V8_EXPORT ActivityControl { // NOLINT public: enum ControlOption { kContinue = 0, kAbort = 1 }; virtual ~ActivityControl() {} /** * Notify about current progress. The activity can be stopped by * returning kAbort as the callback result. */ virtual ControlOption ReportProgressValue(int done, int total) = 0; }; /** * Interface for controlling heap profiling. Instance of the * profiler can be retrieved using v8::Isolate::GetHeapProfiler. */ class V8_EXPORT HeapProfiler { public: /** * Callback function invoked for obtaining RetainedObjectInfo for * the given JavaScript wrapper object. It is prohibited to enter V8 * while the callback is running: only getters on the handle and * GetPointerFromInternalField on the objects are allowed. 
*/ typedef RetainedObjectInfo* (*WrapperInfoCallback) (uint16_t class_id, Handle wrapper); /** Returns the number of snapshots taken. */ int GetSnapshotCount(); /** Returns a snapshot by index. */ const HeapSnapshot* GetHeapSnapshot(int index); /** * Returns SnapshotObjectId for a heap object referenced by |value| if * it has been seen by the heap profiler, kUnknownObjectId otherwise. */ SnapshotObjectId GetObjectId(Handle value); /** * Returns heap object with given SnapshotObjectId if the object is alive, * otherwise empty handle is returned. */ Handle FindObjectById(SnapshotObjectId id); /** * Clears internal map from SnapshotObjectId to heap object. The new objects * will not be added into it unless a heap snapshot is taken or heap object * tracking is kicked off. */ void ClearObjectIds(); /** * A constant for invalid SnapshotObjectId. GetSnapshotObjectId will return * it in case heap profiler cannot find id for the object passed as * parameter. HeapSnapshot::GetNodeById will always return NULL for such id. */ static const SnapshotObjectId kUnknownObjectId = 0; /** * Callback interface for retrieving user friendly names of global objects. */ class ObjectNameResolver { public: /** * Returns name to be used in the heap snapshot for given node. Returned * string must stay alive until snapshot collection is completed. */ virtual const char* GetName(Handle object) = 0; protected: virtual ~ObjectNameResolver() {} }; /** * Takes a heap snapshot and returns it. Title may be an empty string. */ const HeapSnapshot* TakeHeapSnapshot( Handle title, ActivityControl* control = NULL, ObjectNameResolver* global_object_name_resolver = NULL); /** * Starts tracking of heap objects population statistics. After calling * this method, all heap objects relocations done by the garbage collector * are being registered. * * |track_allocations| parameter controls whether stack trace of each * allocation in the heap will be recorded and reported as part of * HeapSnapshot. */ void StartTrackingHeapObjects(bool track_allocations = false); /** * Adds a new time interval entry to the aggregated statistics array. The * time interval entry contains information on the current heap objects * population size. The method also updates aggregated statistics and * reports updates for all previous time intervals via the OutputStream * object. Updates on each time interval are provided as a stream of the * HeapStatsUpdate structure instances. * The return value of the function is the last seen heap object Id. * * StartTrackingHeapObjects must be called before the first call to this * method. */ SnapshotObjectId GetHeapStats(OutputStream* stream); /** * Stops tracking of heap objects population statistics, cleans up all * collected data. StartHeapObjectsTracking must be called again prior to * calling PushHeapObjectsStats next time. */ void StopTrackingHeapObjects(); /** * Deletes all snapshots taken. All previously returned pointers to * snapshots and their contents become invalid after this call. */ void DeleteAllHeapSnapshots(); /** Binds a callback to embedder's class ID. */ void SetWrapperClassInfoProvider( uint16_t class_id, WrapperInfoCallback callback); /** * Default value of persistent handle class ID. Must not be used to * define a class. Can be used to reset a class of a persistent * handle. */ static const uint16_t kPersistentHandleNoClassId = 0; /** Returns memory used for profiler internal data and snapshots. 
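// Illustrative sketch (not part of the bundled headers): serializing a heap
// snapshot to a FILE* through the OutputStream "push" interface described
// above. Assumes an entered isolate; FileOutputStream and WriteHeapSnapshot
// are example-only names.
#include <cstdio>

class FileOutputStream : public v8::OutputStream {
 public:
  explicit FileOutputStream(FILE* out) : out_(out) {}
  void EndOfStream() override { fflush(out_); }
  int GetChunkSize() override { return 64 * 1024; }
  WriteResult WriteAsciiChunk(char* data, int size) override {
    fwrite(data, 1, static_cast<size_t>(size), out_);
    return kContinue;  // return kAbort to stop serialization early
  }
 private:
  FILE* out_;
};

void WriteHeapSnapshot(v8::Isolate* isolate, FILE* out) {
  v8::HeapProfiler* profiler = isolate->GetHeapProfiler();
  const v8::HeapSnapshot* snapshot =
      profiler->TakeHeapSnapshot(v8::String::NewFromUtf8(isolate, "demo"));
  FileOutputStream stream(out);
  snapshot->Serialize(&stream, v8::HeapSnapshot::kJSON);
  profiler->DeleteAllHeapSnapshots();  // invalidates the snapshot pointer
}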
*/ size_t GetProfilerMemorySize(); /** * Sets a RetainedObjectInfo for an object group (see V8::SetObjectGroupId). */ void SetRetainedObjectInfo(UniqueId id, RetainedObjectInfo* info); private: HeapProfiler(); ~HeapProfiler(); HeapProfiler(const HeapProfiler&); HeapProfiler& operator=(const HeapProfiler&); }; /** * Interface for providing information about embedder's objects * held by global handles. This information is reported in two ways: * * 1. When calling AddObjectGroup, an embedder may pass * RetainedObjectInfo instance describing the group. To collect * this information while taking a heap snapshot, V8 calls GC * prologue and epilogue callbacks. * * 2. When a heap snapshot is collected, V8 additionally * requests RetainedObjectInfos for persistent handles that * were not previously reported via AddObjectGroup. * * Thus, if an embedder wants to provide information about native * objects for heap snapshots, he can do it in a GC prologue * handler, and / or by assigning wrapper class ids in the following way: * * 1. Bind a callback to class id by calling SetWrapperClassInfoProvider. * 2. Call SetWrapperClassId on certain persistent handles. * * V8 takes ownership of RetainedObjectInfo instances passed to it and * keeps them alive only during snapshot collection. Afterwards, they * are freed by calling the Dispose class function. */ class V8_EXPORT RetainedObjectInfo { // NOLINT public: /** Called by V8 when it no longer needs an instance. */ virtual void Dispose() = 0; /** Returns whether two instances are equivalent. */ virtual bool IsEquivalent(RetainedObjectInfo* other) = 0; /** * Returns hash value for the instance. Equivalent instances * must have the same hash value. */ virtual intptr_t GetHash() = 0; /** * Returns human-readable label. It must be a null-terminated UTF-8 * encoded string. V8 copies its contents during a call to GetLabel. */ virtual const char* GetLabel() = 0; /** * Returns human-readable group label. It must be a null-terminated UTF-8 * encoded string. V8 copies its contents during a call to GetGroupLabel. * Heap snapshot generator will collect all the group names, create * top level entries with these names and attach the objects to the * corresponding top level group objects. There is a default * implementation which is required because embedders don't have their * own implementation yet. */ virtual const char* GetGroupLabel() { return GetLabel(); } /** * Returns element count in case if a global handle retains * a subgraph by holding one of its nodes. */ virtual intptr_t GetElementCount() { return -1; } /** Returns embedder's object size in bytes. */ virtual intptr_t GetSizeInBytes() { return -1; } protected: RetainedObjectInfo() {} virtual ~RetainedObjectInfo() {} private: RetainedObjectInfo(const RetainedObjectInfo&); RetainedObjectInfo& operator=(const RetainedObjectInfo&); }; /** * A struct for exporting HeapStats data from V8, using "push" model. * See HeapProfiler::GetHeapStats. */ struct HeapStatsUpdate { HeapStatsUpdate(uint32_t index, uint32_t count, uint32_t size) : index(index), count(count), size(size) { } uint32_t index; // Index of the time interval that was changed. uint32_t count; // New value of count field for the interval with this index. uint32_t size; // New value of size field for the interval with this index. }; } // namespace v8 #endif // V8_V8_PROFILER_H_ iojs-v1.0.2-darwin-x64/include/node/v8-testing.h000644 000766 000024 00000002011 12455173732 021411 0ustar00iojsstaff000000 000000 // Copyright 2010 the V8 project authors. 
All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef V8_V8_TEST_H_ #define V8_V8_TEST_H_ #include "v8.h" /** * Testing support for the V8 JavaScript engine. */ namespace v8 { class V8_EXPORT Testing { public: enum StressType { kStressTypeOpt, kStressTypeDeopt }; /** * Set the type of stressing to do. The default if not set is kStressTypeOpt. */ static void SetStressRunType(StressType type); /** * Get the number of runs of a given test that is required to get the full * stress coverage. */ static int GetStressRuns(); /** * Indicate the number of the run which is about to start. The value of run * should be between 0 and one less than the result from GetStressRuns() */ static void PrepareStressRun(int run); /** * Force deoptimization of all functions. */ static void DeoptimizeAll(); }; } // namespace v8 #endif // V8_V8_TEST_H_ iojs-v1.0.2-darwin-x64/include/node/v8-util.h000644 000766 000024 00000034477 12455173732 020736 0ustar00iojsstaff000000 000000 // Copyright 2014 the V8 project authors. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. #ifndef V8_UTIL_H_ #define V8_UTIL_H_ #include "v8.h" #include #include /** * Support for Persistent containers. * * C++11 embedders can use STL containers with UniquePersistent values, * but pre-C++11 does not support the required move semantic and hence * may want these container classes. */ namespace v8 { typedef uintptr_t PersistentContainerValue; static const uintptr_t kPersistentContainerNotFound = 0; enum PersistentContainerCallbackType { kNotWeak, kWeak }; /** * A default trait implemenation for PersistentValueMap which uses std::map * as a backing map. * * Users will have to implement their own weak callbacks & dispose traits. */ template class StdMapTraits { public: // STL map & related: typedef std::map Impl; typedef typename Impl::iterator Iterator; static bool Empty(Impl* impl) { return impl->empty(); } static size_t Size(Impl* impl) { return impl->size(); } static void Swap(Impl& a, Impl& b) { std::swap(a, b); } // NOLINT static Iterator Begin(Impl* impl) { return impl->begin(); } static Iterator End(Impl* impl) { return impl->end(); } static K Key(Iterator it) { return it->first; } static PersistentContainerValue Value(Iterator it) { return it->second; } static PersistentContainerValue Set(Impl* impl, K key, PersistentContainerValue value) { std::pair res = impl->insert(std::make_pair(key, value)); PersistentContainerValue old_value = kPersistentContainerNotFound; if (!res.second) { old_value = res.first->second; res.first->second = value; } return old_value; } static PersistentContainerValue Get(Impl* impl, K key) { Iterator it = impl->find(key); if (it == impl->end()) return kPersistentContainerNotFound; return it->second; } static PersistentContainerValue Remove(Impl* impl, K key) { Iterator it = impl->find(key); if (it == impl->end()) return kPersistentContainerNotFound; PersistentContainerValue value = it->second; impl->erase(it); return value; } }; /** * A default trait implementation for PersistentValueMap, which inherits * a std:map backing map from StdMapTraits and holds non-weak persistent * objects and has no special Dispose handling. * * You should not derive from this class, since MapType depends on the * surrounding class, and hence a subclass cannot simply inherit the methods. 
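// Illustrative sketch (not part of the bundled headers): driving the stress
// hooks from v8-testing.h above. Each test body is executed once per stress
// run so both optimized and deoptimized paths get coverage. RunTestOnce is a
// made-up placeholder for the embedder's own test function.
void RunTestUnderStress(void (*RunTestOnce)()) {
  v8::Testing::SetStressRunType(v8::Testing::kStressTypeDeopt);
  int runs = v8::Testing::GetStressRuns();
  for (int i = 0; i < runs; i++) {
    v8::Testing::PrepareStressRun(i);  // i ranges over [0, runs)
    RunTestOnce();
  }
  v8::Testing::DeoptimizeAll();  // force deoptimization of all functions
}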
*/ template class DefaultPersistentValueMapTraits : public StdMapTraits { public: // Weak callback & friends: static const PersistentContainerCallbackType kCallbackType = kNotWeak; typedef PersistentValueMap > MapType; typedef void WeakCallbackDataType; static WeakCallbackDataType* WeakCallbackParameter( MapType* map, const K& key, Local value) { return NULL; } static MapType* MapFromWeakCallbackData( const WeakCallbackData& data) { return NULL; } static K KeyFromWeakCallbackData( const WeakCallbackData& data) { return K(); } static void DisposeCallbackData(WeakCallbackDataType* data) { } static void Dispose(Isolate* isolate, UniquePersistent value, K key) { } }; /** * A map wrapper that allows using UniquePersistent as a mapped value. * C++11 embedders don't need this class, as they can use UniquePersistent * directly in std containers. * * The map relies on a backing map, whose type and accessors are described * by the Traits class. The backing map will handle values of type * PersistentContainerValue, with all conversion into and out of V8 * handles being transparently handled by this class. */ template class PersistentValueMap { public: explicit PersistentValueMap(Isolate* isolate) : isolate_(isolate) {} ~PersistentValueMap() { Clear(); } Isolate* GetIsolate() { return isolate_; } /** * Return size of the map. */ size_t Size() { return Traits::Size(&impl_); } /** * Return whether the map holds weak persistents. */ bool IsWeak() { return Traits::kCallbackType != kNotWeak; } /** * Get value stored in map. */ Local Get(const K& key) { return Local::New(isolate_, FromVal(Traits::Get(&impl_, key))); } /** * Check whether a value is contained in the map. */ bool Contains(const K& key) { return Traits::Get(&impl_, key) != kPersistentContainerNotFound; } /** * Get value stored in map and set it in returnValue. * Return true if a value was found. */ bool SetReturnValue(const K& key, ReturnValue returnValue) { return SetReturnValueFromVal(&returnValue, Traits::Get(&impl_, key)); } /** * Call Isolate::SetReference with the given parent and the map value. */ void SetReference(const K& key, const Persistent& parent) { GetIsolate()->SetReference( reinterpret_cast(parent.val_), reinterpret_cast(FromVal(Traits::Get(&impl_, key)))); } /** * Put value into map. Depending on Traits::kIsWeak, the value will be held * by the map strongly or weakly. * Returns old value as UniquePersistent. */ UniquePersistent Set(const K& key, Local value) { UniquePersistent persistent(isolate_, value); return SetUnique(key, &persistent); } /** * Put value into map, like Set(const K&, Local). */ UniquePersistent Set(const K& key, UniquePersistent value) { return SetUnique(key, &value); } /** * Return value for key and remove it from the map. */ UniquePersistent Remove(const K& key) { return Release(Traits::Remove(&impl_, key)).Pass(); } /** * Traverses the map repeatedly, * in case side effects of disposal cause insertions. **/ void Clear() { typedef typename Traits::Iterator It; HandleScope handle_scope(isolate_); // TODO(dcarney): figure out if this swap and loop is necessary. while (!Traits::Empty(&impl_)) { typename Traits::Impl impl; Traits::Swap(impl_, impl); for (It i = Traits::Begin(&impl); i != Traits::End(&impl); ++i) { Traits::Dispose(isolate_, Release(Traits::Value(i)).Pass(), Traits::Key(i)); } } } /** * Helper class for GetReference/SetWithReference. Do not use outside * that context. 
*/ class PersistentValueReference { public: PersistentValueReference() : value_(kPersistentContainerNotFound) { } PersistentValueReference(const PersistentValueReference& other) : value_(other.value_) { } Local NewLocal(Isolate* isolate) const { return Local::New(isolate, FromVal(value_)); } bool IsEmpty() const { return value_ == kPersistentContainerNotFound; } template bool SetReturnValue(ReturnValue returnValue) { return SetReturnValueFromVal(&returnValue, value_); } void Reset() { value_ = kPersistentContainerNotFound; } void operator=(const PersistentValueReference& other) { value_ = other.value_; } private: friend class PersistentValueMap; explicit PersistentValueReference(PersistentContainerValue value) : value_(value) { } void operator=(PersistentContainerValue value) { value_ = value; } PersistentContainerValue value_; }; /** * Get a reference to a map value. This enables fast, repeated access * to a value stored in the map while the map remains unchanged. * * Careful: This is potentially unsafe, so please use with care. * The value will become invalid if the value for this key changes * in the underlying map, as a result of Set or Remove for the same * key; as a result of the weak callback for the same key; or as a * result of calling Clear() or destruction of the map. */ PersistentValueReference GetReference(const K& key) { return PersistentValueReference(Traits::Get(&impl_, key)); } /** * Put a value into the map and update the reference. * Restrictions of GetReference apply here as well. */ UniquePersistent Set(const K& key, UniquePersistent value, PersistentValueReference* reference) { *reference = Leak(&value); return SetUnique(key, &value); } private: PersistentValueMap(PersistentValueMap&); void operator=(PersistentValueMap&); /** * Put the value into the map, and set the 'weak' callback when demanded * by the Traits class. */ UniquePersistent SetUnique(const K& key, UniquePersistent* persistent) { if (Traits::kCallbackType != kNotWeak) { Local value(Local::New(isolate_, *persistent)); persistent->template SetWeak( Traits::WeakCallbackParameter(this, key, value), WeakCallback); } PersistentContainerValue old_value = Traits::Set(&impl_, key, ClearAndLeak(persistent)); return Release(old_value).Pass(); } static void WeakCallback( const WeakCallbackData& data) { if (Traits::kCallbackType != kNotWeak) { PersistentValueMap* persistentValueMap = Traits::MapFromWeakCallbackData(data); K key = Traits::KeyFromWeakCallbackData(data); Traits::Dispose(data.GetIsolate(), persistentValueMap->Remove(key).Pass(), key); Traits::DisposeCallbackData(data.GetParameter()); } } static V* FromVal(PersistentContainerValue v) { return reinterpret_cast(v); } static bool SetReturnValueFromVal( ReturnValue* returnValue, PersistentContainerValue value) { bool hasValue = value != kPersistentContainerNotFound; if (hasValue) { returnValue->SetInternal( *reinterpret_cast(FromVal(value))); } return hasValue; } static PersistentContainerValue ClearAndLeak( UniquePersistent* persistent) { V* v = persistent->val_; persistent->val_ = 0; return reinterpret_cast(v); } static PersistentContainerValue Leak( UniquePersistent* persistent) { return reinterpret_cast(persistent->val_); } /** * Return a container value as UniquePersistent and make sure the weak * callback is properly disposed of. All remove functionality should go * through this. 
*/ static UniquePersistent Release(PersistentContainerValue v) { UniquePersistent p; p.val_ = FromVal(v); if (Traits::kCallbackType != kNotWeak && p.IsWeak()) { Traits::DisposeCallbackData( p.template ClearWeak()); } return p.Pass(); } Isolate* isolate_; typename Traits::Impl impl_; }; /** * A map that uses UniquePersistent as value and std::map as the backing * implementation. Persistents are held non-weak. * * C++11 embedders don't need this class, as they can use * UniquePersistent directly in std containers. */ template > class StdPersistentValueMap : public PersistentValueMap { public: explicit StdPersistentValueMap(Isolate* isolate) : PersistentValueMap(isolate) {} }; class DefaultPersistentValueVectorTraits { public: typedef std::vector Impl; static void Append(Impl* impl, PersistentContainerValue value) { impl->push_back(value); } static bool IsEmpty(const Impl* impl) { return impl->empty(); } static size_t Size(const Impl* impl) { return impl->size(); } static PersistentContainerValue Get(const Impl* impl, size_t i) { return (i < impl->size()) ? impl->at(i) : kPersistentContainerNotFound; } static void ReserveCapacity(Impl* impl, size_t capacity) { impl->reserve(capacity); } static void Clear(Impl* impl) { impl->clear(); } }; /** * A vector wrapper that safely stores UniquePersistent values. * C++11 embedders don't need this class, as they can use UniquePersistent * directly in std containers. * * This class relies on a backing vector implementation, whose type and methods * are described by the Traits class. The backing map will handle values of type * PersistentContainerValue, with all conversion into and out of V8 * handles being transparently handled by this class. */ template class PersistentValueVector { public: explicit PersistentValueVector(Isolate* isolate) : isolate_(isolate) { } ~PersistentValueVector() { Clear(); } /** * Append a value to the vector. */ void Append(Local value) { UniquePersistent persistent(isolate_, value); Traits::Append(&impl_, ClearAndLeak(&persistent)); } /** * Append a persistent's value to the vector. */ void Append(UniquePersistent persistent) { Traits::Append(&impl_, ClearAndLeak(&persistent)); } /** * Are there any values in the vector? */ bool IsEmpty() const { return Traits::IsEmpty(&impl_); } /** * How many elements are in the vector? */ size_t Size() const { return Traits::Size(&impl_); } /** * Retrieve the i-th value in the vector. */ Local Get(size_t index) const { return Local::New(isolate_, FromVal(Traits::Get(&impl_, index))); } /** * Remove all elements from the vector. */ void Clear() { size_t length = Traits::Size(&impl_); for (size_t i = 0; i < length; i++) { UniquePersistent p; p.val_ = FromVal(Traits::Get(&impl_, i)); } Traits::Clear(&impl_); } /** * Reserve capacity in the vector. * (Efficiency gains depend on the backing implementation.) */ void ReserveCapacity(size_t capacity) { Traits::ReserveCapacity(&impl_, capacity); } private: static PersistentContainerValue ClearAndLeak( UniquePersistent* persistent) { V* v = persistent->val_; persistent->val_ = 0; return reinterpret_cast(v); } static V* FromVal(PersistentContainerValue v) { return reinterpret_cast(v); } Isolate* isolate_; typename Traits::Impl impl_; }; } // namespace v8 #endif // V8_UTIL_H_ iojs-v1.0.2-darwin-x64/include/node/v8.h000644 000766 000024 00000721143 12455173732 017754 0ustar00iojsstaff000000 000000 // Copyright 2012 the V8 project authors. All rights reserved. 
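// Illustrative sketch (not part of the bundled headers): holding persistent
// handles in the v8-util.h containers declared above. Assumes an entered
// isolate with an active HandleScope; RememberGlobals is an example-only name.
#include <string>

void RememberGlobals(v8::Isolate* isolate, v8::Local<v8::Object> obj) {
  // Map from std::string keys to non-weak persistent Object values.
  v8::StdPersistentValueMap<std::string, v8::Object> by_name(isolate);
  by_name.Set("config", obj);  // creates a new storage cell for the value
  if (by_name.Contains("config")) {
    v8::Local<v8::Object> again = by_name.Get("config");
    (void) again;
  }

  // An append-only vector of persistents; Clear() disposes every cell.
  v8::PersistentValueVector<v8::Object> history(isolate);
  history.Append(obj);
  history.Clear();
}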
// Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. /** \mainpage V8 API Reference Guide * * V8 is Google's open source JavaScript engine. * * This set of documents provides reference material generated from the * V8 header file, include/v8.h. * * For other documentation see http://code.google.com/apis/v8/ */ #ifndef V8_H_ #define V8_H_ #include #include #include #include "v8config.h" // We reserve the V8_* prefix for macros defined in V8 public API and // assume there are no name conflicts with the embedder's code. #ifdef V8_OS_WIN // Setup for Windows DLL export/import. When building the V8 DLL the // BUILDING_V8_SHARED needs to be defined. When building a program which uses // the V8 DLL USING_V8_SHARED needs to be defined. When either building the V8 // static library or building a program which uses the V8 static library neither // BUILDING_V8_SHARED nor USING_V8_SHARED should be defined. #if defined(BUILDING_V8_SHARED) && defined(USING_V8_SHARED) #error both BUILDING_V8_SHARED and USING_V8_SHARED are set - please check the\ build configuration to ensure that at most one of these is set #endif #ifdef BUILDING_V8_SHARED # define V8_EXPORT __declspec(dllexport) #elif USING_V8_SHARED # define V8_EXPORT __declspec(dllimport) #else # define V8_EXPORT #endif // BUILDING_V8_SHARED #else // V8_OS_WIN // Setup for Linux shared library export. #if V8_HAS_ATTRIBUTE_VISIBILITY && defined(V8_SHARED) # ifdef BUILDING_V8_SHARED # define V8_EXPORT __attribute__ ((visibility("default"))) # else # define V8_EXPORT # endif #else # define V8_EXPORT #endif #endif // V8_OS_WIN /** * The v8 JavaScript engine. */ namespace v8 { class AccessorSignature; class Array; class Boolean; class BooleanObject; class Context; class CpuProfiler; class Data; class Date; class DeclaredAccessorDescriptor; class External; class Function; class FunctionTemplate; class HeapProfiler; class ImplementationUtilities; class Int32; class Integer; class Isolate; class Name; class Number; class NumberObject; class Object; class ObjectOperationDescriptor; class ObjectTemplate; class Platform; class Primitive; class Promise; class RawOperationDescriptor; class Script; class Signature; class StackFrame; class StackTrace; class String; class StringObject; class Symbol; class SymbolObject; class Private; class Uint32; class Utils; class Value; template class Handle; template class Local; template class Eternal; template class NonCopyablePersistentTraits; template class PersistentBase; template > class Persistent; template class UniquePersistent; template class PersistentValueMap; template class PersistentValueVector; template class WeakCallbackObject; class FunctionTemplate; class ObjectTemplate; class Data; template class FunctionCallbackInfo; template class PropertyCallbackInfo; class StackTrace; class StackFrame; class Isolate; class DeclaredAccessorDescriptor; class ObjectOperationDescriptor; class RawOperationDescriptor; class CallHandlerHelper; class EscapableHandleScope; template class ReturnValue; namespace internal { class Arguments; class Heap; class HeapObject; class Isolate; class Object; struct StreamedSource; template class CustomArguments; class PropertyCallbackArguments; class FunctionCallbackArguments; class GlobalHandles; class CallbackData { public: V8_INLINE v8::Isolate* GetIsolate() const { return isolate_; } protected: explicit CallbackData(v8::Isolate* isolate) : isolate_(isolate) {} private: v8::Isolate* isolate_; }; } /** * General purpose unique identifier. 
*/ class UniqueId { public: explicit UniqueId(intptr_t data) : data_(data) {} bool operator==(const UniqueId& other) const { return data_ == other.data_; } bool operator!=(const UniqueId& other) const { return data_ != other.data_; } bool operator<(const UniqueId& other) const { return data_ < other.data_; } private: intptr_t data_; }; // --- Handles --- #define TYPE_CHECK(T, S) \ while (false) { \ *(static_cast(0)) = static_cast(0); \ } /** * An object reference managed by the v8 garbage collector. * * All objects returned from v8 have to be tracked by the garbage * collector so that it knows that the objects are still alive. Also, * because the garbage collector may move objects, it is unsafe to * point directly to an object. Instead, all objects are stored in * handles which are known by the garbage collector and updated * whenever an object moves. Handles should always be passed by value * (except in cases like out-parameters) and they should never be * allocated on the heap. * * There are two types of handles: local and persistent handles. * Local handles are light-weight and transient and typically used in * local operations. They are managed by HandleScopes. Persistent * handles can be used when storing objects across several independent * operations and have to be explicitly deallocated when they're no * longer used. * * It is safe to extract the object stored in the handle by * dereferencing the handle (for instance, to extract the Object* from * a Handle); the value will still be governed by a handle * behind the scenes and the same rules apply to these values as to * their handles. */ template class Handle { public: /** * Creates an empty handle. */ V8_INLINE Handle() : val_(0) {} /** * Creates a handle for the contents of the specified handle. This * constructor allows you to pass handles as arguments by value and * to assign between handles. However, if you try to assign between * incompatible handles, for instance from a Handle to a * Handle it will cause a compile-time error. Assigning * between compatible handles, for instance assigning a * Handle to a variable declared as Handle, is legal * because String is a subclass of Value. */ template V8_INLINE Handle(Handle that) : val_(reinterpret_cast(*that)) { /** * This check fails when trying to convert between incompatible * handles. For example, converting from a Handle to a * Handle. */ TYPE_CHECK(T, S); } /** * Returns true if the handle is empty. */ V8_INLINE bool IsEmpty() const { return val_ == 0; } /** * Sets the handle to be empty. IsEmpty() will then return true. */ V8_INLINE void Clear() { val_ = 0; } V8_INLINE T* operator->() const { return val_; } V8_INLINE T* operator*() const { return val_; } /** * Checks whether two handles are the same. * Returns true if both are empty, or if the objects * to which they refer are identical. * The handles' references are not checked. */ template V8_INLINE bool operator==(const Handle& that) const { internal::Object** a = reinterpret_cast(this->val_); internal::Object** b = reinterpret_cast(that.val_); if (a == 0) return b == 0; if (b == 0) return false; return *a == *b; } template V8_INLINE bool operator==( const PersistentBase& that) const { internal::Object** a = reinterpret_cast(this->val_); internal::Object** b = reinterpret_cast(that.val_); if (a == 0) return b == 0; if (b == 0) return false; return *a == *b; } /** * Checks whether two handles are different. * Returns true if only one of the handles is empty, or if * the objects to which they refer are different. 
* The handles' references are not checked. */ template V8_INLINE bool operator!=(const Handle& that) const { return !operator==(that); } template V8_INLINE bool operator!=( const Persistent& that) const { return !operator==(that); } template V8_INLINE static Handle Cast(Handle that) { #ifdef V8_ENABLE_CHECKS // If we're going to perform the type check then we have to check // that the handle isn't empty before doing the checked cast. if (that.IsEmpty()) return Handle(); #endif return Handle(T::Cast(*that)); } template V8_INLINE Handle As() { return Handle::Cast(*this); } V8_INLINE static Handle New(Isolate* isolate, Handle that) { return New(isolate, that.val_); } V8_INLINE static Handle New(Isolate* isolate, const PersistentBase& that) { return New(isolate, that.val_); } private: friend class Utils; template friend class Persistent; template friend class PersistentBase; template friend class Handle; template friend class Local; template friend class FunctionCallbackInfo; template friend class PropertyCallbackInfo; template friend class internal::CustomArguments; friend Handle Undefined(Isolate* isolate); friend Handle Null(Isolate* isolate); friend Handle True(Isolate* isolate); friend Handle False(Isolate* isolate); friend class Context; friend class HandleScope; friend class Object; friend class Private; /** * Creates a new handle for the specified value. */ V8_INLINE explicit Handle(T* val) : val_(val) {} V8_INLINE static Handle New(Isolate* isolate, T* that); T* val_; }; /** * A light-weight stack-allocated object handle. All operations * that return objects from within v8 return them in local handles. They * are created within HandleScopes, and all local handles allocated within a * handle scope are destroyed when the handle scope is destroyed. Hence it * is not necessary to explicitly deallocate local handles. */ template class Local : public Handle { public: V8_INLINE Local(); template V8_INLINE Local(Local that) : Handle(reinterpret_cast(*that)) { /** * This check fails when trying to convert between incompatible * handles. For example, converting from a Handle to a * Handle. */ TYPE_CHECK(T, S); } template V8_INLINE static Local Cast(Local that) { #ifdef V8_ENABLE_CHECKS // If we're going to perform the type check then we have to check // that the handle isn't empty before doing the checked cast. if (that.IsEmpty()) return Local(); #endif return Local(T::Cast(*that)); } template V8_INLINE Local(Handle that) : Handle(reinterpret_cast(*that)) { TYPE_CHECK(T, S); } template V8_INLINE Local As() { return Local::Cast(*this); } /** * Create a local handle for the content of another handle. * The referee is kept alive by the local handle even when * the original handle is destroyed/disposed. 
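// Illustrative sketch (not part of the bundled headers): typical local-handle
// usage following the rules described above. Locals are created inside a
// handle scope and die with it, so a value that must outlive the scope is
// returned through an EscapableHandleScope. MakeGreeting is an example-only
// name; assumes an entered isolate.
v8::Local<v8::String> MakeGreeting(v8::Isolate* isolate) {
  v8::EscapableHandleScope scope(isolate);
  v8::Local<v8::String> s = v8::String::NewFromUtf8(isolate, "hello");
  // Handles compare by the object they refer to, and up-casts are implicit.
  v8::Handle<v8::Value> as_value = s;  // String is a subclass of Value
  if (as_value->IsString()) {
    v8::Local<v8::String> back = v8::Handle<v8::String>::Cast(as_value);
    (void) back;
  }
  return scope.Escape(s);  // keeps the string alive for the caller's scope
}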
*/ V8_INLINE static Local New(Isolate* isolate, Handle that); V8_INLINE static Local New(Isolate* isolate, const PersistentBase& that); private: friend class Utils; template friend class Eternal; template friend class PersistentBase; template friend class Persistent; template friend class Handle; template friend class Local; template friend class FunctionCallbackInfo; template friend class PropertyCallbackInfo; friend class String; friend class Object; friend class Context; template friend class internal::CustomArguments; friend class HandleScope; friend class EscapableHandleScope; template friend class PersistentValueMap; template friend class PersistentValueVector; template V8_INLINE Local(S* that) : Handle(that) { } V8_INLINE static Local New(Isolate* isolate, T* that); }; // Eternal handles are set-once handles that live for the life of the isolate. template class Eternal { public: V8_INLINE Eternal() : index_(kInitialValue) { } template V8_INLINE Eternal(Isolate* isolate, Local handle) : index_(kInitialValue) { Set(isolate, handle); } // Can only be safely called if already set. V8_INLINE Local Get(Isolate* isolate); V8_INLINE bool IsEmpty() { return index_ == kInitialValue; } template V8_INLINE void Set(Isolate* isolate, Local handle); private: static const int kInitialValue = -1; int index_; }; template class PhantomCallbackData : public internal::CallbackData { public: typedef void (*Callback)(const PhantomCallbackData& data); V8_INLINE T* GetParameter() const { return parameter_; } PhantomCallbackData(Isolate* isolate, T* parameter) : internal::CallbackData(isolate), parameter_(parameter) {} private: T* parameter_; }; template class WeakCallbackData : public PhantomCallbackData
<P>
    { public: typedef void (*Callback)(const WeakCallbackData<T, P>& data); V8_INLINE Local<T> GetValue() const { return handle_; } private: friend class internal::GlobalHandles; WeakCallbackData(Isolate* isolate, P* parameter, Local<T> handle) : PhantomCallbackData
<P>
    (isolate, parameter), handle_(handle) {} Local handle_; }; template class InternalFieldsCallbackData : public internal::CallbackData { public: typedef void (*Callback)(const InternalFieldsCallbackData& data); InternalFieldsCallbackData(Isolate* isolate, T* internalField1, U* internalField2) : internal::CallbackData(isolate), internal_field1_(internalField1), internal_field2_(internalField2) {} V8_INLINE T* GetInternalField1() const { return internal_field1_; } V8_INLINE U* GetInternalField2() const { return internal_field2_; } private: T* internal_field1_; U* internal_field2_; }; /** * An object reference that is independent of any handle scope. Where * a Local handle only lives as long as the HandleScope in which it was * allocated, a PersistentBase handle remains valid until it is explicitly * disposed. * * A persistent handle contains a reference to a storage cell within * the v8 engine which holds an object value and which is updated by * the garbage collector whenever the object is moved. A new storage * cell can be created using the constructor or PersistentBase::Reset and * existing handles can be disposed using PersistentBase::Reset. * */ template class PersistentBase { public: /** * If non-empty, destroy the underlying storage cell * IsEmpty() will return true after this call. */ V8_INLINE void Reset(); /** * If non-empty, destroy the underlying storage cell * and create a new one with the contents of other if other is non empty */ template V8_INLINE void Reset(Isolate* isolate, const Handle& other); /** * If non-empty, destroy the underlying storage cell * and create a new one with the contents of other if other is non empty */ template V8_INLINE void Reset(Isolate* isolate, const PersistentBase& other); V8_INLINE bool IsEmpty() const { return val_ == NULL; } V8_INLINE void Empty() { val_ = 0; } template V8_INLINE bool operator==(const PersistentBase& that) const { internal::Object** a = reinterpret_cast(this->val_); internal::Object** b = reinterpret_cast(that.val_); if (a == NULL) return b == NULL; if (b == NULL) return false; return *a == *b; } template V8_INLINE bool operator==(const Handle& that) const { internal::Object** a = reinterpret_cast(this->val_); internal::Object** b = reinterpret_cast(that.val_); if (a == NULL) return b == NULL; if (b == NULL) return false; return *a == *b; } template V8_INLINE bool operator!=(const PersistentBase& that) const { return !operator==(that); } template V8_INLINE bool operator!=(const Handle& that) const { return !operator==(that); } /** * Install a finalization callback on this object. * NOTE: There is no guarantee as to *when* or even *if* the callback is * invoked. The invocation is performed solely on a best effort basis. * As always, GC-based finalization should *not* be relied upon for any * critical form of resource management! */ template V8_INLINE void SetWeak( P* parameter, typename WeakCallbackData::Callback callback); template V8_INLINE void SetWeak( P* parameter, typename WeakCallbackData::Callback callback); // Phantom persistents work like weak persistents, except that the pointer to // the object being collected is not available in the finalization callback. // This enables the garbage collector to collect the object and any objects // it references transitively in one GC cycle. At the moment you can either // specify a parameter for the callback or the location of two internal // fields in the dying object. template V8_INLINE void SetPhantom(P* parameter, typename PhantomCallbackData
<P>
    ::Callback callback); template V8_INLINE void SetPhantom( void (*callback)(const InternalFieldsCallbackData&), int internal_field_index1, int internal_field_index2); template V8_INLINE P* ClearWeak(); // TODO(dcarney): remove this. V8_INLINE void ClearWeak() { ClearWeak(); } /** * Marks the reference to this object independent. Garbage collector is free * to ignore any object groups containing this object. Weak callback for an * independent handle should not assume that it will be preceded by a global * GC prologue callback or followed by a global GC epilogue callback. */ V8_INLINE void MarkIndependent(); /** * Marks the reference to this object partially dependent. Partially dependent * handles only depend on other partially dependent handles and these * dependencies are provided through object groups. It provides a way to build * smaller object groups for young objects that represent only a subset of all * external dependencies. This mark is automatically cleared after each * garbage collection. */ V8_INLINE void MarkPartiallyDependent(); V8_INLINE bool IsIndependent() const; /** Checks if the handle holds the only reference to an object. */ V8_INLINE bool IsNearDeath() const; /** Returns true if the handle's reference is weak. */ V8_INLINE bool IsWeak() const; /** * Assigns a wrapper class ID to the handle. See RetainedObjectInfo interface * description in v8-profiler.h for details. */ V8_INLINE void SetWrapperClassId(uint16_t class_id); /** * Returns the class ID previously assigned to this handle or 0 if no class ID * was previously assigned. */ V8_INLINE uint16_t WrapperClassId() const; private: friend class Isolate; friend class Utils; template friend class Handle; template friend class Local; template friend class Persistent; template friend class UniquePersistent; template friend class PersistentBase; template friend class ReturnValue; template friend class PersistentValueMap; template friend class PersistentValueVector; friend class Object; explicit V8_INLINE PersistentBase(T* val) : val_(val) {} PersistentBase(PersistentBase& other); // NOLINT void operator=(PersistentBase&); V8_INLINE static T* New(Isolate* isolate, T* that); T* val_; }; /** * Default traits for Persistent. This class does not allow * use of the copy constructor or assignment operator. * At present kResetInDestructor is not set, but that will change in a future * version. */ template class NonCopyablePersistentTraits { public: typedef Persistent > NonCopyablePersistent; static const bool kResetInDestructor = false; template V8_INLINE static void Copy(const Persistent& source, NonCopyablePersistent* dest) { Uncompilable(); } // TODO(dcarney): come up with a good compile error here. template V8_INLINE static void Uncompilable() { TYPE_CHECK(O, Primitive); } }; /** * Helper class traits to allow copying and assignment of Persistent. * This will clone the contents of storage cell, but not any of the flags, etc. */ template struct CopyablePersistentTraits { typedef Persistent > CopyablePersistent; static const bool kResetInDestructor = true; template static V8_INLINE void Copy(const Persistent& source, CopyablePersistent* dest) { // do nothing, just allow copy } }; /** * A PersistentBase which allows copy and assignment. * * Copy, assignment and destructor bevavior is controlled by the traits * class M. * * Note: Persistent class hierarchy is subject to future changes. */ template class Persistent : public PersistentBase { public: /** * A Persistent with no storage cell. 
/**
 * Default traits for Persistent. This class does not allow
 * use of the copy constructor or assignment operator.
 * At present kResetInDestructor is not set, but that will change in a future
 * version.
 */
template<class T>
class NonCopyablePersistentTraits {
 public:
  typedef Persistent<T, NonCopyablePersistentTraits<T> > NonCopyablePersistent;
  static const bool kResetInDestructor = false;
  template<class S, class M>
  V8_INLINE static void Copy(const Persistent<S, M>& source,
                             NonCopyablePersistent* dest) {
    Uncompilable<Object>();
  }
  // TODO(dcarney): come up with a good compile error here.
  template<class O> V8_INLINE static void Uncompilable() {
    TYPE_CHECK(O, Primitive);
  }
};


/**
 * Helper class traits to allow copying and assignment of Persistent.
 * This will clone the contents of storage cell, but not any of the flags, etc.
 */
template<class T>
struct CopyablePersistentTraits {
  typedef Persistent<T, CopyablePersistentTraits<T> > CopyablePersistent;
  static const bool kResetInDestructor = true;
  template<class S, class M>
  static V8_INLINE void Copy(const Persistent<S, M>& source,
                             CopyablePersistent* dest) {
    // do nothing, just allow copy
  }
};


/**
 * A PersistentBase which allows copy and assignment.
 *
 * Copy, assignment and destructor behavior is controlled by the traits
 * class M.
 *
 * Note: Persistent class hierarchy is subject to future changes.
 */
template <class T, class M> class Persistent : public PersistentBase<T> {
 public:
  /**
   * A Persistent with no storage cell.
   */
  V8_INLINE Persistent() : PersistentBase<T>(0) { }
  /**
   * Construct a Persistent from a Handle.
   * When the Handle is non-empty, a new storage cell is created
   * pointing to the same object, and no flags are set.
   */
  template <class S>
  V8_INLINE Persistent(Isolate* isolate, Handle<S> that)
      : PersistentBase<T>(PersistentBase<T>::New(isolate, *that)) {
    TYPE_CHECK(T, S);
  }
  /**
   * Construct a Persistent from a Persistent.
   * When the Persistent is non-empty, a new storage cell is created
   * pointing to the same object, and no flags are set.
   */
  template <class S, class M2>
  V8_INLINE Persistent(Isolate* isolate, const Persistent<S, M2>& that)
      : PersistentBase<T>(PersistentBase<T>::New(isolate, *that)) {
    TYPE_CHECK(T, S);
  }
  /**
   * The copy constructors and assignment operator create a Persistent
   * exactly as the Persistent constructor, but the Copy function from the
   * traits class is called, allowing the setting of flags based on the
   * copied Persistent.
   */
  V8_INLINE Persistent(const Persistent& that) : PersistentBase<T>(0) {
    Copy(that);
  }
  template <class S, class M2>
  V8_INLINE Persistent(const Persistent<S, M2>& that) : PersistentBase<T>(0) {
    Copy(that);
  }
  V8_INLINE Persistent& operator=(const Persistent& that) {  // NOLINT
    Copy(that);
    return *this;
  }
  template <class S, class M2>
  V8_INLINE Persistent& operator=(const Persistent<S, M2>& that) {  // NOLINT
    Copy(that);
    return *this;
  }
  /**
   * The destructor will dispose the Persistent based on the
   * kResetInDestructor flags in the traits class.  Since not calling dispose
   * can result in a memory leak, it is recommended to always set this flag.
   */
  V8_INLINE ~Persistent() {
    if (M::kResetInDestructor) this->Reset();
  }

  // TODO(dcarney): this is pretty useless, fix or remove
  template <class S>
  V8_INLINE static Persistent<T>& Cast(Persistent<S>& that) {  // NOLINT
#ifdef V8_ENABLE_CHECKS
    // If we're going to perform the type check then we have to check
    // that the handle isn't empty before doing the checked cast.
    if (!that.IsEmpty()) T::Cast(*that);
#endif
    return reinterpret_cast<Persistent<T>&>(that);
  }

  // TODO(dcarney): this is pretty useless, fix or remove
  template <class S> V8_INLINE Persistent<S>& As() {  // NOLINT
    return Persistent<S>::Cast(*this);
  }

 private:
  friend class Isolate;
  friend class Utils;
  template<class F> friend class Handle;
  template<class F> friend class Local;
  template<class F1, class F2> friend class Persistent;
  template<class F> friend class ReturnValue;

  template <class S> V8_INLINE Persistent(S* that) : PersistentBase<T>(that) { }
  V8_INLINE T* operator*() const { return this->val_; }
  template<class S, class M2>
  V8_INLINE void Copy(const Persistent<S, M2>& that);
};
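

// Example (sketch, not part of the V8 API): the traits parameter decides
// whether a Persistent may be copied. A copyable alias is convenient for
// standard containers; the `cache` map and `Remember` helper are hypothetical.
//
//   #include <map>
//   #include <string>
//   #include <v8.h>
//
//   typedef v8::Persistent<v8::Value,
//                          v8::CopyablePersistentTraits<v8::Value> >
//       CopyableValue;
//
//   // Copy construction/assignment is allowed, and kResetInDestructor makes
//   // each entry release its storage cell when the map is destroyed.
//   std::map<std::string, CopyableValue> cache;
//
//   void Remember(v8::Isolate* isolate, const std::string& key,
//                 v8::Local<v8::Value> value) {
//     cache[key] = CopyableValue(isolate, value);
//   }

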
/**
 * A PersistentBase which has move semantics.
 *
 * Note: Persistent class hierarchy is subject to future changes.
 */
template<class T>
class UniquePersistent : public PersistentBase<T> {
  struct RValue {
    V8_INLINE explicit RValue(UniquePersistent* obj) : object(obj) {}
    UniquePersistent* object;
  };

 public:
  /**
   * A UniquePersistent with no storage cell.
   */
  V8_INLINE UniquePersistent() : PersistentBase<T>(0) { }
  /**
   * Construct a UniquePersistent from a Handle.
   * When the Handle is non-empty, a new storage cell is created
   * pointing to the same object, and no flags are set.
   */
  template <class S>
  V8_INLINE UniquePersistent(Isolate* isolate, Handle<S> that)
      : PersistentBase<T>(PersistentBase<T>::New(isolate, *that)) {
    TYPE_CHECK(T, S);
  }
  /**
   * Construct a UniquePersistent from a PersistentBase.
   * When the Persistent is non-empty, a new storage cell is created
   * pointing to the same object, and no flags are set.
   */
  template <class S>
  V8_INLINE UniquePersistent(Isolate* isolate, const PersistentBase<S>& that)
      : PersistentBase<T>(PersistentBase<T>::New(isolate, that.val_)) {
    TYPE_CHECK(T, S);
  }
  /**
   * Move constructor.
   */
  V8_INLINE UniquePersistent(RValue rvalue)
      : PersistentBase<T>(rvalue.object->val_) {
    rvalue.object->val_ = 0;
  }
  V8_INLINE ~UniquePersistent() { this->Reset(); }
  /**
   * Move via assignment.
   */
  template<class S>
  V8_INLINE UniquePersistent& operator=(UniquePersistent<S> rhs) {
    TYPE_CHECK(T, S);
    this->Reset();
    this->val_ = rhs.val_;
    rhs.val_ = 0;
    return *this;
  }
  /**
   * Cast operator for moves.
   */
  V8_INLINE operator RValue() { return RValue(this); }
  /**
   * Pass allows returning uniques from functions, etc.
   */
  UniquePersistent Pass() { return UniquePersistent(RValue(this)); }

 private:
  UniquePersistent(UniquePersistent&);
  void operator=(UniquePersistent&);
};


/**
 * A stack-allocated class that governs a number of local handles.
 * After a handle scope has been created, all local handles will be
 * allocated within that handle scope until either the handle scope is
 * deleted or another handle scope is created.  If there is already a
 * handle scope and a new one is created, all allocations will take
 * place in the new handle scope until it is deleted.  After that,
 * new handles will again be allocated in the original handle scope.
 *
 * After the handle scope of a local handle has been deleted the
 * garbage collector will no longer track the object stored in the
 * handle and may deallocate it.  The behavior of accessing a handle
 * for which the handle scope has been deleted is undefined.
 */
class V8_EXPORT HandleScope {
 public:
  HandleScope(Isolate* isolate);

  ~HandleScope();

  /**
   * Counts the number of allocated handles.
   */
  static int NumberOfHandles(Isolate* isolate);

  V8_INLINE Isolate* GetIsolate() const {
    return reinterpret_cast<Isolate*>(isolate_);
  }

 protected:
  V8_INLINE HandleScope() {}

  void Initialize(Isolate* isolate);

  static internal::Object** CreateHandle(internal::Isolate* isolate,
                                         internal::Object* value);

 private:
  // Uses heap_object to obtain the current Isolate.
  static internal::Object** CreateHandle(internal::HeapObject* heap_object,
                                         internal::Object* value);

  // Make it hard to create heap-allocated or illegal handle scopes by
  // disallowing certain operations.
  HandleScope(const HandleScope&);
  void operator=(const HandleScope&);
  void* operator new(size_t size);
  void operator delete(void*, size_t);

  internal::Isolate* isolate_;
  internal::Object** prev_next_;
  internal::Object** prev_limit_;

  // Local::New uses CreateHandle with an Isolate* parameter.
  template<class F> friend class Local;

  // Object::GetInternalField and Context::GetEmbedderData use CreateHandle
  // with a HeapObject* in their shortcuts.
  friend class Object;
  friend class Context;
};


/**
 * A HandleScope which first allocates a handle in the current scope
 * which will be later filled with the escape value.
 */
class V8_EXPORT EscapableHandleScope : public HandleScope {
 public:
  EscapableHandleScope(Isolate* isolate);
  V8_INLINE ~EscapableHandleScope() {}

  /**
   * Pushes the value into the previous scope and returns a handle to it.
   * Cannot be called twice.
   */
  template <class T>
  V8_INLINE Local<T> Escape(Local<T> value) {
    internal::Object** slot =
        Escape(reinterpret_cast<internal::Object**>(*value));
    return Local<T>(reinterpret_cast<T*>(slot));
  }

 private:
  internal::Object** Escape(internal::Object** escape_value);

  // Make it hard to create heap-allocated or illegal handle scopes by
  // disallowing certain operations.
  EscapableHandleScope(const EscapableHandleScope&);
  void operator=(const EscapableHandleScope&);
  void* operator new(size_t size);
  void operator delete(void*, size_t);

  internal::Object** escape_slot_;
};
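

// Example (sketch, not part of the V8 API): a helper that allocates handles
// in its own scope and uses Escape() to hand exactly one of them back to the
// caller's scope. The `MakePair` function name is hypothetical.
//
//   #include <v8.h>
//
//   v8::Local<v8::Array> MakePair(v8::Isolate* isolate,
//                                 v8::Handle<v8::Value> a,
//                                 v8::Handle<v8::Value> b) {
//     v8::EscapableHandleScope scope(isolate);
//     v8::Local<v8::Array> pair = v8::Array::New(isolate, 2);
//     pair->Set(0, a);
//     pair->Set(1, b);
//     // Without Escape(), `pair` would die with this scope; Escape() copies
//     // the value into the slot pre-allocated in the enclosing scope.
//     return scope.Escape(pair);
//   }

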
/**
 * A simple Maybe type, representing an object which may or may not have a
 * value.
 */
template<class T>
struct Maybe {
  Maybe() : has_value(false) {}
  explicit Maybe(T t) : has_value(true), value(t) {}
  Maybe(bool has, T t) : has_value(has), value(t) {}

  bool has_value;
  T value;
};


// Convenience wrapper.
template <class T>
inline Maybe<T> maybe(T t) {
  return Maybe<T>(t);
}


// --- Special objects ---


/**
 * The superclass of values and API object templates.
 */
class V8_EXPORT Data {
 private:
  Data();
};


/**
 * The origin, within a file, of a script.
 */
class ScriptOrigin {
 public:
  V8_INLINE ScriptOrigin(
      Handle<Value> resource_name,
      Handle<Integer> resource_line_offset = Handle<Integer>(),
      Handle<Integer> resource_column_offset = Handle<Integer>(),
      Handle<Boolean> resource_is_shared_cross_origin = Handle<Boolean>(),
      Handle<Integer> script_id = Handle<Integer>())
      : resource_name_(resource_name),
        resource_line_offset_(resource_line_offset),
        resource_column_offset_(resource_column_offset),
        resource_is_shared_cross_origin_(resource_is_shared_cross_origin),
        script_id_(script_id) {}
  V8_INLINE Handle<Value> ResourceName() const;
  V8_INLINE Handle<Integer> ResourceLineOffset() const;
  V8_INLINE Handle<Integer> ResourceColumnOffset() const;
  V8_INLINE Handle<Boolean> ResourceIsSharedCrossOrigin() const;
  V8_INLINE Handle<Integer> ScriptID() const;

 private:
  Handle<Value> resource_name_;
  Handle<Integer> resource_line_offset_;
  Handle<Integer> resource_column_offset_;
  Handle<Boolean> resource_is_shared_cross_origin_;
  Handle<Integer> script_id_;
};


/**
 * A compiled JavaScript script, not yet tied to a Context.
 */
class V8_EXPORT UnboundScript {
 public:
  /**
   * Binds the script to the currently entered context.
   */
  Local